EPA
    United States
    Environmental Protection
    Agency
Framework for Comparing Alternatives
For Water Quality Surveillance and Response Systems
Office of Water (MC 140)
EPA 817-B-15-003
June 2015

-------
                                      Disclaimer

The Water Security Division of the Office of Ground Water and Drinking Water has reviewed and
approved this document for publication. This document does not impose legally binding requirements on
any party. The information in this document is intended solely to recommend or suggest and does not
imply any requirements. Neither the U.S. Government nor any of its employees, contractors or their
employees makes any warranty, expressed or implied, or assumes any legal liability or responsibility for
any third party's use of any information, product or process discussed in this document, or represents that
its use by such party would not infringe on privately owned rights. Mention of trade names or
commercial products does not constitute endorsement or recommendation for use.

Questions concerning this document should be addressed to WQ_SRS@epa.gov or the following contact:

Steve Allgeier
USEPA Water Security Division
26 West Martin Luther King  Drive
Mail Code 140
Cincinnati, OH 45268
(513)569-7131
Allgeier.Steve@epa.gov

-------
                            Acknowledgements

The document was developed by the USEPA Water Security Division, with additional support provided
under USEPA contract EP-C-10-060. Peer review of this document was provided by:
   •  Kevin Gertig, Fort Collins Utilities
   •  Julie Hunt, Trinity River Authority of Texas

-------
                                 Table of Contents
LIST OF FIGURES	iv
LIST OF TABLES	iv
ABBREVIATIONS	v
SECTION 1: INTRODUCTION	1
  1.1     SCOPE OF GUIDANCE	1
  1.2     APPLICATION OF GUIDANCE	1
  1.3     GUIDANCE OVERVIEW	2
SECTION 2: DEVELOP LIFECYCLE COST ESTIMATES	4
  2.1     IDENTIFY UNIQUE COST ELEMENTS FOR EACH ALTERNATIVE	5
  2.2     DEFINE THE ANALYSIS PERIOD AND USEFUL LIFE OF SYSTEM ASSETS	6
  2.3     CALCULATE LIFECYCLE COSTS	7
    2.3.1    Implementation Costs	8
    2.3.2    Operations and Maintenance Costs	9
    2.3.3    Renewal Costs	11
    2.3.4    Value of Remaining Useful Life	13
    2.3.5    Total Lifecycle Cost	14
  2.4     EXAMPLE CALCULATION OF THE LCCE FOR OWQM DESIGN ALTERNATIVES	14
SECTION 3: SCORE ALTERNATIVES WITH RESPECT TO EVALUATION CRITERIA	20
  3.1     ESTABLISH EVALUATION FRAMEWORK	20
    3.1.1    Establish Evaluation Criteria	20
    3.1.2    Weight Each Criterion	21
    3.1.3    Develop a Scoring Scale	21
  3.2     SCORE THE ALTERNATIVES	21
  3.3     EXAMPLE QUALITATIVE EVALUATION OF OWQM DESIGN ALTERNATIVES	22
SECTION 4: SELECT THE PREFERRED ALTERNATIVE	28
  4.1     FINAL ANALYSIS OF THE RESULTS	28
    4.1.1    Scatterplots	28
    4.1.2    Cost vs. Capability Ratio	29
    4.1.3    Interpretation of Analysis Results	29
  4.2     EXAMPLE OF THE FINAL SELECTION PROCESS	30
RESOURCES	32
GLOSSARY	33

-------
                                   List of Figures
Figure 1-1.  Overview of the Process for Comparing Alternatives	3
Figure 2-1.  Steps for Developing a Lifecycle Cost Estimate	4
Figure 2-2.  Example Timeline of Asset Renewal over an Analysis Period	7
Figure 2-3.  Asset Renewal Timeline for the Example LCCE	16
Figure 3-1.  Steps for Scoring Alternatives Using Evaluation Criteria	20
Figure 4-1.  Scatterplot of LCCE vs. Capability Score	29
Figure 4-2.  Scatterplot for Example Alternatives	30



                                    List of Tables

Table 1-1.  Scales of SRS Design Alternatives that can be Considered in a Comparative Analysis	2
Table 2-1.  Example LCCE Cost Elements	6
Table 2-2.  Examples of Implementation Costs	9
Table 2-3.  Examples of Operations and Maintenance Costs	11
Table 2-4.  Examples of Renewal Cost	13
Table 2-5.  Assets Associated with the Example Alternative Designs	15
Table 2-6.  Summary of Costs for Each Asset Used in the Example	15
Table 2-7.  Summary of Costs for the Example Alternatives	19
Table 3-1:  Weights Assigned to the Example Evaluation Criteria	24
Table 3-2.  Scoring Logic for Example Evaluation Criteria	25
Table 3-3.  Qualitative Scoring of Each Alternative	26
Table 3-4.  Developing a Final Capability Score for Each Alternative	27
Table 4-1.  LCCE and Capability Scores for the Example OWQM Design Alternatives	30
Table 4-2.  Capability vs. Cost Ratios for the Example OWQM Design Alternatives	31
-------
                                   Abbreviations

C/C           Cost vs. capability ratio
CCS          Customer Complaint Surveillance
CM           Consequence Management
Cont          Local controller used for data transmission
Conv          Sensor suite to measure conventional parameters
DG           Design goal
DOE          Department of Energy
DPD          N,N-diethyl-p-phenylenediamine
ESM          Enhanced Security Monitoring
IT            Information technology
LCab          Large cabinet
LCCE         Lifecycle cost estimate
NIST          National Institute of Standards and Technology
O&M         Operations and Maintenance
OWQM       Online Water Quality Monitoring
PHS          Public Health Surveillance
PO           Performance objective
RUL          Remaining useful life
S&A          Sampling and Analysis
SCab          Standard cabinet
SPV          Single present value
SRS          Water Quality Surveillance and Response System
TEVA-SPOT  Threat Ensemble Vulnerability Assessment - Sensor Placement Optimization
              Tool
TUL          Total useful life
UPV          Uniform present value
USEPA       United States Environmental Protection Agency
UV           Ultraviolet
UVis          UV-Visible spectral absorption instrument
WHEAT      Water Health and Economic Analysis Tool

-------


                             Section 1:   Introduction
1.1    Scope of Guidance
This document provides guidance for selecting the most appropriate Water Quality Surveillance and
Response System (SRS) design for a utility from a set of viable alternatives.  It provides a framework that
guides the user through an objective, stepwise analysis for ranking multiple alternatives and describes, in
general terms, the types of information necessary to compare the alternatives.

Before the comparative framework described in this document can be applied, the SRS design alternatives
to be compared must be developed.  These design alternatives should be informed by design goals,
performance objectives and constraints established for the SRS (USEPA, 2015).  Design goals define the
specific benefits that a utility would like to realize through deployment of an SRS.  Benefits obtained
through operation of an SRS can be considered in two broad categories: (1) those that support routine
operation and management of the distribution system and (2) those related to detection of and response to
water quality incidents in the distribution system.  Performance objectives define metrics to gauge how
well the SRS achieves the established design goals.  Constraints, often driven by practical and financial
considerations, dictate the requirements or limitations within which the SRS must be designed and
operated.  The same information used to develop the alternatives (design goals, performance objectives
and constraints) may be useful as evaluation criteria in the analysis of alternatives.

                         LIMITATIONS OF THIS GUIDANCE
   The scope of this document is limited to defining a framework for comparison of viable and
   well-defined SRS design alternatives.  It does not describe how to develop a set of viable SRS
   design alternatives.

   For guidance on developing viable SRS designs, please visit the USEPA Water Security website:
   http://water.epa.gov/infrastructure/watersecurity/index.cfm

1.2   Application  of Guidance
The framework presented in this guidance can be applied at a variety of scales including alternatives for
the design of the overall SRS, design of an individual SRS component, and design or selection of a
specific asset.  An asset is a specific piece of equipment or other item used in the  implementation of an
SRS. Table 1-1 describes these three scales, providing example alternatives that  could be  analyzed at
each scale and the level of definition required to complete the analysis. For effective application of the
framework, all alternatives must be adequately and consistently defined at the scale being analyzed.
Furthermore, the same cost elements should be included for all alternatives being compared.

The number of alternatives selected for comparison is limited by the scale of the system being considered.
Comparison of large-scale designs, such as that for an SRS, works better with a relatively small number
of alternatives due to the inherent complexity of the entire system. At a smaller scale, such as a
component or asset, it becomes feasible to compare a larger number of alternatives.

-------
Table 1-1. Scales of SRS Design Alternatives that can be Considered in a Comparative Analysis

 Scale of Comparison: System
   Example Design Alternatives: Alternative SRS designs considering components to be included in the
   system, with a trade-off between cost and capability:
   1.  Base SRS (CM, CCS, PHS, S&A)
   2.  Base SRS + OWQM
   3.  Base SRS + ESM
   Example Required Level of Definition: The components to be included in the SRS.  Attributes of
   conceptual-level design for each component such as equipment, information management systems,
   additional personnel, and partner involvement necessary for each alternative.  Order of magnitude
   cost estimates for each alternative.

 Scale of Comparison: Component
   Example Design Alternatives: Alternative OWQM designs, with a trade-off between number of
   monitoring stations and number of parameters monitored:
   1.  Monitoring for conventional parameters (chlorine residual, pH, and conductivity) at 20 locations
       in the distribution system
   2.  Monitoring for conventional parameters plus UV-Visible spectral absorption at 10 locations in the
       distribution system
   3.  Monitoring for conventional parameters at 12 locations and for conventional parameters plus
       UV-Visible spectral absorption at 3 additional locations in the distribution system
   4.  Monitoring for conventional parameters at 6 locations and for conventional parameters plus
       UV-Visible spectral absorption at 6 additional locations in the distribution system
   Example Required Level of Definition: Parameters to be monitored, instrument types, monitoring
   station design, and types of potential installation locations for each alternative.  Approximate unit
   cost of each station type.

 Scale of Comparison: Asset
   Example Design Alternatives: Alternative technologies for measuring chlorine residual at OWQM
   stations:
   1.  Instrument based on the DPD method, provided by Vendor 1
   2.  Instrument based on the DPD method, provided by Vendor 2
   3.  Instrument based on the amperometric method, provided by Vendor 2
   4.  Instrument based on solid-state technology, provided by Vendor 3
   5.  Instrument based on solid-state technology, provided by Vendor 4
   Example Required Level of Definition: Specific model of the online chlorine residual sensor used in
   each alternative.  Purchase price and estimated annual operations and maintenance cost for each model.

 CM = Consequence Management; CCS = Customer Complaint Surveillance; PHS = Public Health Surveillance;
 S&A = Sampling and Analysis; OWQM = Online Water Quality Monitoring; ESM = Enhanced Security Monitoring;
 DPD = N,N-diethyl-p-phenylenediamine; UV = Ultraviolet

 1.3     Guidance Overview
 An overview of the framework for comparison of alternatives that will be discussed in this document is
 illustrated in Figure 1-1. This framework considers the tradeoff between benefits realized and costs
 incurred among the alternatives. While certain aspects of this analysis are quantitative, there are also
 qualitative factors to be considered, and thus some degree of value judgment is necessary to select a
 preferred alternative.  A number of software tools are available to  support application of this framework.
 In particular, these tools automate several of the calculations and document the analysis.

-------
[Figure 1-1 diagram: Prerequisites (Establish Design Goals, Establish Performance Objectives, Identify
Constraints, Develop Alternatives) lead into the comparison steps: Develop Lifecycle Cost Estimates*,
then Score Alternatives With Respect to Evaluation Criteria**, then Select Alternative.]
 * Further illustrated in Figure 2-1
 ** Further illustrated in Figure 3-1

Figure 1-1. Overview of the Process for Comparing Alternatives

The process presented in this document assumes that the total possible number of alternatives under
consideration has been reduced to a select set of alternatives that are viable, and that any alternatives that
were obviously non-compliant with critical requirements or outside budget constraints were eliminated
from further consideration.

This document is organized as follows:
•   Section 2 describes the process of developing lifecycle cost estimates.
•   Section 3 describes the process of scoring alternatives with respect to evaluation criteria.
•   Section 4 describes the process of selecting an alternative based on the lifecycle cost estimates and
    evaluation scores.
•   Resources presents a comprehensive list of documents, tools and other resources cited in this
    document, including a summary and a link to each resource.
•   Glossary presents definitions of terms used in this document, which are indicated by bold italic font at
    first use in the body of the document.

-------
             Section 2:  Develop Lifecycle  Cost Estimates

The relative lifecycle costs of alternative SRS designs are important to consider when evaluating which
alternative to select. The general terms that comprise the Lifecycle Cost Estimate (LCCE) are shown in
Equation 2-1 and defined below.
      LIFECYCLE COST ESTIMATE = Implementation Costs + Operations and Maintenance Costs

                      + Renewal Costs - Value of Remaining Useful Life
 Equation 2-1.  Lifecycle Cost Estimate Equation

•  Implementation costs include all design, procurement, installation and training costs associated with
   implementing the system.
•  Operations and maintenance costs are ongoing costs for items such as reagents, replacement parts,
   support contracts and the level of effort, including personnel costs, required to maintain the system.
•  Renewal costs account for the cost of replacing assets that have a shorter useful life than the period
   chosen for analysis.  In addition to procurement costs for replacement equipment, renewal costs may
   include the costs of redesign if the new assets differ from the original (such as a new model of water
   quality sensor), installation and initial training, and decommissioning and disposal of the equipment
   being replaced.
•  Value of remaining useful life accounts for the residual value of those assets that have useful life
   remaining at the end of the period chosen for the analysis. Useful life is explained in more detail in
   Section 2.2.

Figure 2-1 expands on Figure  1-1, showing the three basic steps involved in developing the LCCE for
SRS design alternatives. These steps are described in further detail in subsequent sections. An example
that illustrates these steps is provided in Section 2.4.
[Figure 2-1 diagram: the Develop Lifecycle Cost Estimates step from Figure 1-1 expands into three steps:
Identify Unique Cost Factors for Each Alternative, Define the Analysis Period and Useful Life of System
Assets, and Calculate Lifecycle Costs.]
Figure 2-1. Steps for Developing a Lifecycle Cost Estimate

-------
2.1    Identify Unique Cost Elements for Each Alternative
An LCCE is developed to a level of accuracy appropriate for the intended use of the result. In the context
of the framework for comparing alternative SRS designs, the LCCE only needs to include costs that are
different among the alternatives. For example, if all alternatives require procurement of the same
information technology (IT) and communication equipment, the costs associated with this equipment may
be excluded due to their commonality across alternatives.

When selecting cost elements to include in the LCCE, it is  generally preferable to minimize the number of
cost elements included to avoid unnecessary information collection and calculations. However, it is
important to ensure that results are sufficiently detailed to observe meaningful differences among the
alternatives.  The scale of the SRS alternatives under comparison, as described in Section 1.2, will
influence the cost elements that should be considered. For example, an LCCE at the system or
component level may require the inclusion of design and project management costs that are not
necessarily required when developing an LCCE at the asset level.

Also, only those costs large enough to make a difference within the margin of error of the estimate need
to be included. The scale of the  SRS alternatives under comparison (system, component or asset) will
influence the necessary level of detail in the underlying data used to calculate the LCCE.  In general, the
level of detail required in the LCCE increases as the scale of the alternatives under comparison decreases.

The specific cost  elements needed to calculate an LCCE will vary by the SRS component(s) considered in
the designs. Table 2-1 provides examples of general, high-level cost elements that might be included for
each of the SRS components, as  well as cost elements relevant to the entire system.  Specific examples of
cost elements for each SRS  component are provided in Section 2.3.

-------
Table 2-1. Example LCCE Cost Elements*

 Example Cost Element                                              System  OWQM  ESM  CCS  PHS  CM  S&A
 Implementation
   Develop design documentation for the system, component
   or asset                                                           X      X    X    X    X    X    X
   Develop an information management system                          X      X    X    X    -    -    -
   Procure equipment                                                  -      X    X    -    -    -    X
   Develop and implement an initial training and exercise
   program                                                            -      X    X    X    X    X    X
   Coordinate with partner agencies                                   X      -    X         X    X    X
 Operations & Maintenance
   Review and analyze data and investigate alerts                     -      X    X    X    X    -    -
   Maintain equipment                                                 -      X    X    -    -    -    X
   Plan and implement training and exercises                          -      X    X    X    X    X    X
   Procure software licenses                                          X      X    X    X    -    -    -
   Procure consumables                                                -      X    -    -    -    -    X
   Maintain documentation                                             X      X    X    X    X    X    X
 Renewal
   Procure replacement information technology hardware
   and software                                                       X      X    X    X    -    -    -
   Procure replacement equipment                                      -      X    X    -    -    -    X
 Remaining Useful Life
   Value depreciated assets at the end of the analysis period
   (negative cost)                                                    X      X    X    X    -    -    X

 * Note that the four cost element categories shown in this table correspond to the terms listed in Equation 2-1.

 2.2   Define the Analysis Period and Useful Life of System Assets
 A common analysis period is used to develop the LCCE for each alternative considered in the comparison.
 The analysis period must be long enough to demonstrate the differences in the LCCE among the
 alternatives.  However, costs become more uncertain as they are estimated further into the future,
 particularly given that the SRS is heavily dependent on rapidly evolving technologies such as water
 quality sensors and IT equipment. Thus, the analysis period should be kept as short as possible.

             HELPFUL HINT
 The analysis period is defined only for the
 purpose of the LCCE and is not necessarily
 related to the actual life of the SRS.

 Selection of the analysis period should be informed by the useful life of all assets among the alternatives.
 Total useful life is defined in this document as the period of time that an asset is able to be economically
 maintained.  The total useful life of an asset is determined by a number of factors including availability of
 replacement parts, estimated cost of repairs, performance degradation over time, and availability of
 improved technologies.

             HELPFUL HINT
 Set the analysis period equal to the longest
 total useful life among all assets used in the
 alternatives under comparison.

-------


Information available to determine the total useful life of an asset includes:
•   Manufacturer's documentation
•   Utility experience with similar assets
•   Expert or consultant knowledge about similar assets

Figure 2-2 provides an example timeline showing how the total useful life of each asset is applied over
the analysis period. In this example, the alternatives being considered have four unique assets with
different total useful lives. As suggested above, the analysis period is chosen as the longest total useful
life across assets (which is the total useful life of Asset 3). For all assets, the lifecycle starts with the
beginning of project implementation. If the end of the total useful life of an individual asset is reached
within the analysis period, the asset must be replaced to maintain a fully functioning system. This
replacement is termed renewal, and the  figure illustrates the point at which each renewal cost would be
incurred. At the end of the analysis period, three of the assets have remaining useful life, or additional
time they can be viably operated before  renewal.  Both the renewal costs and the value of the remaining
useful life are used in calculating the LCCE, as  described in the next section.
[Figure 2-2 diagram: timeline beginning at the start of project implementation and spanning the analysis
period.  The total useful lives of Assets 1, 2 and 4 repeat across the period, with a renewal marked each
time an asset's total useful life ends within the period; the total useful life of Asset 3 spans the entire
analysis period; and the remaining useful life of the other assets extends past the end of the period.]

Figure 2-2.  Example Timeline of Asset Renewal over an Analysis Period

2.3    Calculate Lifecycle Costs
Costs incurred after the base year are converted to present value using discount factors obtained from the
Annual Supplement to Handbook 135 (NIST, 2014). The 2014 edition is the latest version of this resource
at the publication date of this guidance document. As this resource is updated annually, the most recent
edition should be referenced to obtain discounting factors.  Within this guidance document this resource
will be referred to as the "latest version of the Annual Supplement to Handbook 135 (NIST)" without a
publication year identified.

The following sections describe the approach for developing an LCCE for an SRS design alternative.
Each section describes the method for calculating one of the LCCE terms presented in Equation 2-1.

Throughout this document the terms used in equations (for example, CI[Asset]) are expressed using the
following form of notation:
•   C is used to represent a cost.  V represents a value.
•   Specific LCCE terms, as listed in Equation 2-1, are represented as subscripts, such as I for
    implementation.
•   A bracketed term is used to indicate whether the cost (or value) applies to an asset [Asset] or an
    alternative [Alternative].  The term [Alternative] denotes the summation of the cost or value for all
    assets used in the alternative.
Thus, CI[Asset] represents the implementation costs for an asset.

The Analysis Period is denoted as AP years, and the number of times an asset is renewed during the
analysis period is denoted as N.

The Total Useful Life  of an  asset is denoted as TUL, and the Remaining Useful Life of an asset is denoted
as RUL.

2.3.1  Implementation Costs
Potential implementation costs for an alternative include the procurement of equipment, procurement of
IT hardware and software, design and documentation of the  system, initial training, and all other costs
associated with the initial startup. Table 2-2 provides example implementation costs  for the components
of an SRS.

Utility experience with an asset is the most reliable way to estimate these costs.  Costs can also be
established by requesting a quotation from potential suppliers or from other utilities or organizations who
have undertaken  a similar project.

Calculating the implementation costs for each alternative (CI[Alternative]) involves the simple
summation of the relevant costs identified for the alternative. Implementation costs usually occur during
the base year and therefore do not require discounting. However, if lagging implementation costs occur in
later years, they need to be discounted using the approach described in Section 2.3.3.
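
As a minimal sketch (in Python, with hypothetical asset names, unit counts and unit costs, not values
from this guidance), the summation of implementation costs for one alternative can be expressed as:

    # Minimal sketch: implementation cost for one alternative as the sum of each
    # asset's per-unit implementation cost multiplied by the number of units needed.
    # Asset names, counts and prices below are hypothetical placeholders.

    unit_cost = {"sensor_suite": 7_000, "controller": 8_000, "cabinet": 9_000}
    units_needed = {"sensor_suite": 10, "controller": 10, "cabinet": 10}

    def implementation_cost(unit_cost: dict, units_needed: dict) -> float:
        """CI[Alternative]: sum over assets of unit cost x number of units."""
        return sum(unit_cost[asset] * units_needed[asset] for asset in units_needed)

    print(implementation_cost(unit_cost, units_needed))  # 240000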

-------
Table 2-2. Examples of Implementation Costs
   Component
                           Implementation Cost Example
     System
• Provide management and oversight during SRS implementation
• Develop and document the overall SRS design
• Design and implement an information management system, possibly including a dashboard
  that manages and displays information for multiple components
• Procure IT hardware and software
     OWQM
                 • Procure sensors and necessary supplemental equipment such as tubing and reagents
                 • Design, construct and install OWQM stations
                 • Procure and implement IT hardware, software licenses and communication systems
                 • Procure or develop a data analysis and alert notification system
                 • Train staff on use and maintenance of installed sensors
                 • Develop component alert investigation procedures and train staff on their roles and
                   responsibilities
      ESM
                 • Identify critical facilities and design a custom security monitoring system for each facility
                 • Procure video equipment, intrusion detection systems and communication systems
                 • Procure and implement IT hardware and software to display real-time data from intrusion
                   detection systems and video equipment
                 • Properly commission all security equipment
                 • Train staff on use and maintenance of installed security hardware
                 • Develop component alert investigation procedures and train staff on their roles and
                   responsibilities
      CCS
                  • Design and implement a system to collect and manage all customer feedback related to
                   water quality concerns
                  • Procure or develop a data analysis and alert notification system
                  • Develop component alert investigation procedures and train staff on their roles and
                   responsibilities
      PHS
                  • Establish partnerships with all relevant public health jurisdictions in the drinking water system
                   service area
                  • Develop notification protocols in the case of a public health alert potentially related to drinking
                   water
                  • Implement automated alert generation and notification systems
                  • Develop component alert investigation procedures and train staff on their roles and
                   responsibilities
       CM
                 • Develop consequence management and crisis communication plans
                 • Conduct initial training on these plans
                 • Plan and implement initial exercises to test and evaluate these plans
      S&A
                  • Establish agreements with laboratories that could perform emergency analysis of water
                   samples for contaminants of concern
                  • Procure lab and field testing equipment, as necessary
                  • Train lab and field personnel on equipment, methods and procedures
                  • Establish baseline occurrence data for contaminants of concern
 2.3.2  Operations and Maintenance Costs
Equation 2-2 is used to calculate the total operations and maintenance costs for the alternative over the
analysis period (COM[Alternative]).  Operations and maintenance costs include the labor needed to operate
 and maintain the system, procurement of consumables and spare parts, the recurring cost of software
 licenses and service agreements, the time required to investigate alerts, and the time required for refresher
 training and document updates.

                    COM[Alternative] = CAnnOM[Alternative] x UPV(AP)

   Where:
        COM[Alternative] = Total operations and maintenance costs for the alternative over the analysis period
     CAnnOM[Alternative] = Annual operations and maintenance costs for the alternative
                     AP = Number of years in the analysis period
                UPV(AP) = Uniform Present Value factor for AP years

    Equation 2-2.  Total Operations and Maintenance Costs for an Alternative

CAnnOM[Alternative] is the total annualized cost for all assets used in the alternative. It incorporates a
common simplifying assumption that operating costs are constant for each year in the analysis period.
Any operations and maintenance costs that are not already expressed on an annual basis should be
annualized when calculating this value.  For example, the cost of a multi-year contract can be annualized
by dividing the total cost by the number of years over which the contract applies. The value for the
Uniform Present Value factor (UPV(AP)) can be found in Table A-2, "DOE Discount Rate," of the latest
version of the Annual Supplement to Handbook 135 (NIST).
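
As a sketch of Equation 2-2 (assuming the UPV factor has already been looked up from Table A-2 of the
NIST Annual Supplement; the annual cost and factor below are placeholders for illustration):

    # Minimal sketch of Equation 2-2: total O&M cost over the analysis period is the
    # annualized O&M cost scaled by the Uniform Present Value factor for AP years.
    # The annual cost and UPV factor below are hypothetical placeholders.

    def total_om_cost(annual_om_cost: float, upv_factor: float) -> float:
        """COM[Alternative] = CAnnOM[Alternative] x UPV(AP)."""
        return annual_om_cost * upv_factor

    print(total_om_cost(50_000, 11.94))  # ~ $597,000 for a hypothetical $50,000/year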

Table 2-3 provides  examples of operations and maintenance costs for SRS components.  Cost quotations
and literature from suppliers can be used to estimate these costs.  However, the real-world operational
experience from utilities or other organizations that use the same, or similar, equipment can provide a
more reliable estimate.
-------
Table 2-3. Examples of Operations and Maintenance Costs
  Component
                     Operations and Maintenance Cost Examples
 System
• Maintain the SRS information management system
• Update system documentation and procedures
 OWQM
                 • Investigate and document alerts in real time
                 • Perform routine maintenance and calibration of sensors
                 • Procure reagents and replacement parts as needed
                 • Update the data analysis system and alert algorithms as necessary
                 • Renew software licenses as necessary
                 • Renew service contracts as necessary
                 • Conduct annual drills and exercises on alert investigation procedures
                 • Update alert investigation procedures as necessary
 ESM
                 • Investigate and document alerts in real time
                 • Perform routine maintenance and calibration on intrusion detection systems and video
                   equipment
                 • Renew software licenses as necessary
                 • Renew service contracts as necessary
                 • Conduct annual drills and exercises on alert investigation procedures
                 • Update alert investigation procedures as necessary
 CCS
                • Investigate and document alerts in real time
                • Update the data analysis system and alert algorithms as necessary
                • Renew software licenses as necessary
                • Conduct annual drills and exercises on alert investigation procedures
                • Update alert investigation procedures as necessary
 PHS
                • Investigate and document alerts in real time
                • Hold routine meetings with public health partners
                • Conduct annual drills and  exercises on alert investigation procedures
                • Update alert investigation  procedures as necessary
 CM
                • Conduct annual consequence management drills and exercises in coordination with external
                  response partners
                • Update the consequence management and crisis communication plans based on the outcome
                  of drills, exercises or real-world incidents
 S&A
                • Perform routine maintenance and calibration of lab and field instrumentation
                • Procure reagents and replacement parts as needed
                • Perform routine sampling and analysis as needed to maintain proficiency
                • Renew service contracts as necessary
                • Conduct annual drills and exercises on sampling and analysis procedures
 2.3.3   Renewal Costs
 Renewal costs include the costs for updating or replacing assets that are no longer economically viable to
 maintain in order to continue SRS operations.  Renewal may require updates to the design of the asset or
 the equipment and systems with which the replacement asset needs to interface. Renewal costs may also
 include the cost of removal and disposal of the asset being replaced. As it is difficult to predict costs that
 will be incurred in the future for asset renewal, a simplifying assumption often made is that the renewal
 cost for an asset is equal to that asset's initial implementation cost.  This assumption is incorporated into
 Equation 2-3.

 Renewal costs must be calculated on an asset-by-asset basis. As shown in Figure 2-2, each asset may be
 renewed multiple times during the analysis period.  The renewal costs must be discounted separately for
 each renewal and then summed across the analysis period.
-------
The total number of renewals within the analysis period is determined by dividing the analysis period by
the total useful life of the asset, and rounding down to the nearest integer.  If dividing the analysis period
by the total useful life results in an integer, the number of renewals is one less than that integer, because a
renewal that would fall exactly at the end of the analysis period is not needed (note that the initial
procurement is part of implementation costs and is not counted in the number of renewals).

Equation 2-3 shows that the total cost of renewal for an alternative (CR[Alternative]) is the sum of the renewal
costs of all assets used in the alternative that are renewed at least once during the analysis period. The
total cost of renewal for a specific asset (CR[Asset]) is the sum of the discounted renewal cost for the asset
each time it is renewed during the analysis period. As noted above, each asset's initial implementation
cost (CI[Asset]) is used as the cost each time the asset is renewed. The Single Present Value (SPV) factor
is obtained from Table A-1 of the latest version of the Annual Supplement to Handbook 135 (NIST) using
the "DOE Discount rate" column.
                     CR[Alternative] = Σ CR[Asset]   (summed over all assets)

              where: CR[Asset] = Σ (CI[Asset] x SPV(y))   (summed over renewals 1 to N)

     Where:
       CR[Alternative] = Total renewal costs for an alternative during the analysis period
            CR[Asset] = Total renewal costs for an asset over the analysis period
                    N = Number of times an asset is renewed over the analysis period (integer value)
            CI[Asset] = Total implementation cost for an asset
                    y = An integer value indicating the number of years from the first year of the analysis period
                        that the renewal is planned
               SPV(y) = Single Present Value factor for year y


      Equation 2-3.  Total Renewal Costs for an Alternative over the Analysis Period
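
The sketch below illustrates Equation 2-3 and the renewal-count rule described above. For illustration it
reuses the Conv asset figures from the Section 2.4 example ($7,000 implementation cost, 6-year total
useful life, 15-year analysis period, SPV factors of 0.837 and 0.701); the helper function names are
assumptions for this sketch, not terms from the guidance.

    def number_of_renewals(analysis_period: int, total_useful_life: int) -> int:
        """N: how many times an asset is renewed within the analysis period.
        A renewal that would fall exactly at the end of the period is not counted."""
        n = analysis_period // total_useful_life
        return n - 1 if analysis_period % total_useful_life == 0 else n

    def asset_renewal_cost(implementation_cost: float, total_useful_life: int,
                           analysis_period: int, spv: dict) -> float:
        """CR[Asset]: implementation cost discounted at each renewal year, then summed."""
        n = number_of_renewals(analysis_period, total_useful_life)
        return sum(implementation_cost * spv[total_useful_life * k] for k in range(1, n + 1))

    spv = {6: 0.837, 12: 0.701}                    # SPV factors for renewal years 6 and 12
    print(number_of_renewals(15, 6))               # 2 renewals (Years 6 and 12)
    print(asset_renewal_cost(7_000, 6, 15, spv))   # ~ $10,766, matching CR[Conv] in Section 2.4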

Table 2-4 provides examples of renewal costs for the different components of an SRS. Renewal costs
can be estimated using field experience with the equipment, manufacturer-provided data, and the
recommendations of subject matter experts or consultants.  Calculation of renewal costs requires the total
useful life of an asset to be estimated, which is discussed in the next section.
-------
Table 2-4.  Examples of Renewal Cost

 Component   Renewal Cost Examples
 System      • Replace obsolete components or software used in the SRS information management system
 OWQM        • Select, procure and install new sensors
             • Train staff to operate and maintain new sensors
             • Redesign or reconfigure OWQM stations where required to accommodate new sensors
             • Replace obsolete IT hardware or software
             • Dispose obsolete equipment
 ESM         • Select, procure, install and commission new equipment
             • Train staff to operate and maintain new equipment
             • Replace obsolete IT hardware or software
             • Dispose obsolete equipment
 CCS         • Replace obsolete IT hardware or software
 PHS         • Generally not applicable
 CM          • Generally not applicable
 S&A         • Select and procure new lab and field instrumentation
             • Train staff to operate and maintain new instrumentation
             • Dispose obsolete equipment
2.3.4  Value of Remaining Useful Life
As the analysis period may not be an integer multiple of an asset's total useful life, some assets may have
remaining useful life at the end of the analysis period. The value of remaining useful life accounts for the
potential continued use of the assets past the end of the analysis period. Calculation of the value of
remaining useful life of an asset at the end of the analysis period is performed using a straight line
depreciation of the original implementation cost.  As this value will occur in the future, it is subject
to discounting.

The value of remaining useful life must be calculated on an asset-by-asset basis. The remaining useful
life of an asset (in years) is needed to determine this value and is calculated using Equation 2-4.
                           RUL[Asset] = TUL[Asset] x (N + 1) - AP

    Where:
      RUL[Asset] = Remaining useful life of the asset at the end of the analysis period, in years
      TUL[Asset] = Total useful life of the asset in years
               N = The number of renewals in the analysis period
              AP = The number of years in the analysis period

    Equation 2-4. The Remaining Useful Life for an Asset

The present value of the remaining useful life for an alternative (VRUL[Alternative]) is the sum of the
value of the remaining useful life of all assets (VRUL[Asset]) calculated using Equation 2-5, where
CI[Asset] is the initial implementation cost for an asset and the SPV factor is selected from Table A-1 of
the latest version of the Annual Supplement to Handbook 135 (NIST) using the "DOE Discount rate" for
the final year of the analysis period.
                   VRUL[Alternative] = Σ VRUL[Asset]   (summed over all assets)

           where: VRUL[Asset] = CI[Asset] x (RUL[Asset] / TUL[Asset]) x SPV(AP)

     Where:
       VRUL[Alternative] = Total value of the remaining useful life for the alternative
            VRUL[Asset] = Value of the remaining useful life of the asset
              CI[Asset] = Initial implementation cost for the asset
             RUL[Asset] = Remaining useful life of the asset at the end of the analysis period, in years
             TUL[Asset] = Total useful life of the asset in years
                     AP = Number of years in the analysis period
                SPV(AP) = Single Present Value factor for year AP

      Equation 2-5.  Total Value of the Remaining Useful Life for an Alternative
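
As a sketch of Equations 2-4 and 2-5 (using the local controller figures from the Section 2.4 example for
illustration; the function names are assumptions for this sketch):

    def remaining_useful_life(total_useful_life: int, n_renewals: int, analysis_period: int) -> int:
        """Equation 2-4: RUL[Asset] = TUL[Asset] x (N + 1) - AP."""
        return total_useful_life * (n_renewals + 1) - analysis_period

    def value_of_remaining_useful_life(implementation_cost: float, rul_years: int,
                                       total_useful_life: int, spv_ap: float) -> float:
        """Equation 2-5: straight-line share of the implementation cost, discounted to present value."""
        return implementation_cost * (rul_years / total_useful_life) * spv_ap

    # Local controller from the Section 2.4 example: $8,000 implementation cost,
    # 10-year total useful life, renewed once, 15-year analysis period, SPV(15 Yrs) = 0.642.
    rul = remaining_useful_life(10, 1, 15)
    print(rul)                                                    # 5 years
    print(value_of_remaining_useful_life(8_000, rul, 10, 0.642))  # ~ $2,568, matching VRUL[Cont]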


2.3.5  Total Lifecycle Cost

The lifecycle cost for an alternative is calculated using Equation 2-6, which is the summation of all the
LCCE terms obtained using Equations 2-2,  2-3 and 2-5. Each term represents the total cost for the
alternative and thus includes the costs incurred for all associated assets over the analysis period.
       LCCE = CI[Alternative] + COM[Alternative] + CR[Alternative] - VRUL[Alternative]

    Where:
                 LCCE = Lifecycle cost estimate for the alternative over the analysis period
      CI[Alternative] = Total implementation cost for the alternative
     COM[Alternative] = Total operations and maintenance cost for the alternative over the analysis period
      CR[Alternative] = Total renewal costs incurred for the alternative over the analysis period
    VRUL[Alternative] = Total value of the remaining useful life for the alternative at the end of the analysis
                        period


   Equation 2-6. LCCE for an Alternative
2.4    Example Calculation of the LCCE for OWQM Design Alternatives

This example shows a hypothetical utility's application of this methodology to compare different design
alternatives for OWQM. The OWQM component alternatives shown in Table 1-1 were used in this
example to illustrate the LCCE methodology:
Alternative 1: Monitoring for conventional parameters (chlorine residual, pH and conductivity) at 20
             locations
Alternative 2: Monitoring for conventional parameters plus UV-Visible spectral absorption at 10 locations
Alternative 3: Monitoring for conventional parameters at 12 locations and for conventional parameters
             plus UV-Visible spectral absorption at 3 additional locations
Alternative 4: Monitoring for conventional parameters at 6 locations and for conventional parameters plus
             UV-Visible spectral absorption at 6 additional locations
-------
In this example, the trade-off between number of monitoring locations and number of parameters
monitored is considered. More monitoring locations in the distribution system increases spatial coverage
and provides information about a larger portion of the system. On the other hand, the addition of
instruments (in this case, a UV-Visible spectral absorption instrument) to the suite of conventional
parameter sensors provides more information about water quality at each location and enhances the ability
of the OWQM component to detect a broad range of water quality incidents.

Table 2-5 provides a summary of the assets that are used in the four alternatives. The abbreviations listed
in the second column are used in the example calculations that follow.

Table 2-5. Assets Associated with the Example Alternative Designs

 Asset                                                   Abbreviation   Number of Assets Needed for:
                                                                        Alt. 1   Alt. 2   Alt. 3   Alt. 4
 Sensor equipment to monitor conventional parameters        Conv          20       10       15       12
 UV-Visible spectral absorption instrument                  UVis           0       10        3        6
 Local controller (one needed for each station)             Cont          20       10       15       12
 Standard cabinet (one needed for each station
 monitoring for only conventional parameters)               SCab          20        0       12        6
 Large cabinet (one needed for each station monitoring
 for conventional parameters and UVis)                      LCab           0       10        3        6
Identical communication and information management systems are used for all alternatives and thus
were not considered in this analysis. The costs associated with the assets that differ among the
alternatives are shown in Table 2-6. Note that detailed cost data was rolled up to generate the summary
values shown in this table.

Table 2-6. Summary of Costs for Each Asset Used in the Example

 LCCE Terms                                     Conv      UVis      Cont       SCab      LCab
 Initial implementation cost per asset          $7,000    $11,000   $8,000     $9,000    $10,000
 Support contracts per year per asset           $5,000    $2,000    $1,000     $0        $0
 Annual labor and consumables costs per asset   $3,000    $1,000    $0         $0        $0
 Total useful life                              6 Years   8 Years   10 Years   15 Years  15 Years
Fifteen years was selected as the analysis period for this example because it is the longest total useful life
among the assets across all alternatives, as shown in Table 2-6. The timeline for renewal of the assets
used in the four alternatives is presented graphically in Figure 2-3.  Within the analysis period, each suite
of conventional parameter sensors would have two renewal cycles (at Year 6 and Year 12), each UV-
Visible instrument would have one renewal (at Year 8), and each local controller would have one renewal (at
Year 10).  The cabinets would have no renewals because their total useful life equals the analysis period.
    [Figure 2-3 diagram: renewal timeline beginning at the start of project implementation and spanning
    the 15-year analysis period, showing Conv (TUL = 6 Yrs) renewed at Years 6 and 12, UVis (TUL = 8 Yrs)
    renewed at Year 8, Cont (TUL = 10 Yrs) renewed at Year 10, and the cabinets (TUL = 15 Yrs) with no
    renewals; remaining useful life extends beyond the end of the analysis period.]
Figure 2-3. Asset Renewal Timeline for the Example LCCE
Details of the LCCE calculations for Alternative 3 are presented in the following section to illustrate the
methodology. Alternative 3 was chosen for the example because it uses all five assets and thus better
illustrates the nuances of the calculations. The same calculations would be carried out for the other
three alternatives.

Implementation Costs for Alternative 3
The implementation costs for Alternative 3 are the sum of the implementation costs for the relevant
assets, as shown below. For each asset, the number of units of the asset needed for Alternative 3 (Table
2-5) was used along with the per-unit implementation cost (Table 2-6).

     CI[Alternative 3] = (15 x CI[Conv]) + (3 x CI[UVis]) + (15 x CI[Cont]) + (12 x CI[SCab]) + (3 x CI[LCab])
                       = (15 x $7,000) + (3 x $11,000) + (15 x $8,000) + (12 x $9,000) + (3 x $10,000)
                       = $396,000
Operations and Maintenance Costs for Alternative 3
Equation 2-2 was used to calculate operations and maintenance costs. The annual operations and
maintenance cost for each asset (CAnnOM[Asset]) is the sum of its yearly support contracts and annual labor
and consumables costs, both identified in Table 2-6. The value for the UPV term (11.94) was obtained
from Table A-2 of the Annual Supplement to Handbook 135 (NIST, 2014) using the analysis period of
15 years.
     COM[Alternative 3] = ((15 x CAnnOM[Conv]) + (3 x CAnnOM[UVis]) + (15 x CAnnOM[Cont])
                            + (12 x CAnnOM[SCab]) + (3 x CAnnOM[LCab])) x UPV(15 Yrs)
                        = ((15 x $8,000) + (3 x $3,000) + (15 x $1,000) + (12 x $0) + (3 x $0)) x 11.94
                        = $1,719,360


Renewal Costs for Alternative 3
Renewal costs were calculated separately for each asset and each renewal, then summed for the alternative
using Equation 2-3.  The total number of renewals within the analysis period was determined by dividing
the analysis period by the useful life of the asset, and rounding down to the nearest integer. As noted above, the
sensors to measure conventional parameters would be renewed twice (at 6 and 12 years), the UV-Vis
instruments renewed once (at 8 years), and the local controllers renewed once (at 10 years). The cabinets
would not require renewal during the analysis period and thus were not included in the calculations below.

For each renewal, the initial implementation cost for the asset (CI[Asset]) identified in Table 2-6 was used
as the renewal cost. The SPV factor was obtained from Table A-1 of the Annual Supplement to Handbook
135 (NIST, 2014) using the number of years between the start of the project and the time of renewal.
Specifically, the values for the SPV term are 0.837 for 6 years, 0.789 for 8 years, 0.744 for 10 years and
0.701 for 12 years, all determined relative to the base year.
              CR[Conv] = (CI[Conv] x SPV(6 Yrs)) + (CI[Conv] x SPV(12 Yrs))
                       = ($7,000 x 0.837) + ($7,000 x 0.701)
                       = $10,766


              CR[UVis] = CI[UVis] x SPV(8 Yrs)
                       = $11,000 x 0.789
                       = $8,679


              CR[Cont] = CI[Cont] x SPV(10 Yrs)
                       = $8,000 x 0.744
                       = $5,952


              CR[Alternative 3] = (15 x CR[Conv]) + (3 x CR[UVis]) + (15 x CR[Cont])
                                = $276,807
-------


Value of Remaining Useful Life for Alternative 3
The value of the remaining useful life was calculated for each asset using Equation 2-5. Using
Equation 2-4, the RUL at the end of the analysis period was calculated as three years for the sensor pack
(6x3-15), one year for the UV-Vis instrument (8x2-15), and five years for the local controller (10x2-15).
Both cabinet types have no useful life remaining.

The remaining useful life, total useful life and initial implementation cost for each asset were used to
calculate the value of the remaining useful life using Equation 2-5. The SPV factor used to compute the
present value of these costs was found in Table A-1 of the Annual Supplement to Handbook 135 (NIST,
2014) using the number of years in the analysis period from the base year (in this case 15 years).

              VRUL[Conv] = (CI[Conv] x (RUL[Conv] / TUL[Conv])) x SPV(15 Yrs)
                         = ($7,000 x (3 Yrs / 6 Yrs)) x 0.642
                         = $2,247

              VRUL[UVis] = (CI[UVis] x (RUL[UVis] / TUL[UVis])) x SPV(15 Yrs)
                         = ($11,000 x (1 Yr / 8 Yrs)) x 0.642
                         = $883

              VRUL[Cont] = (CI[Cont] x (RUL[Cont] / TUL[Cont])) x SPV(15 Yrs)
                         = ($8,000 x (5 Yrs / 10 Yrs)) x 0.642
                         = $2,568

              VRUL[Alternative 3] = (15 x VRUL[Conv]) + (3 x VRUL[UVis]) + (15 x VRUL[Cont])
                                  = $74,873

Lifecycle Cost Estimate for Alternative 3
The total lifecycle cost estimate for Alternative 3 was calculated using Equation 2-6 and the results from
the previous steps.

             LCCE = CI[Alternative 3] + COM[Alternative 3] + CR[Alternative 3] - VRUL[Alternative 3]
                  = $396,000 + $1,719,360 + $276,807 - $74,873
                  = $2,317,294

The same computation was performed for each alternative shown in Table 2-5, and the results for all four
alternatives are presented in Table 2-7.
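
The Alternative 3 calculation can also be reproduced with a short script.  The sketch below is one
possible implementation, assuming only the asset data in Tables 2-5 and 2-6 and the SPV/UPV factors
quoted in this example; after rounding it reproduces the published totals.

    # Sketch: recompute the Alternative 3 LCCE from the data in Tables 2-5 and 2-6.
    AP = 15                                                     # analysis period, years
    UPV_15 = 11.94                                              # UPV factor for 15 years
    SPV = {6: 0.837, 8: 0.789, 10: 0.744, 12: 0.701, 15: 0.642}

    # asset: (units in Alternative 3, implementation cost, annual O&M cost, total useful life)
    assets = {
        "Conv": (15, 7_000, 8_000, 6),
        "UVis": (3, 11_000, 3_000, 8),
        "Cont": (15, 8_000, 1_000, 10),
        "SCab": (12, 9_000, 0, 15),
        "LCab": (3, 10_000, 0, 15),
    }

    def renewals(tul: int) -> int:
        """Number of renewals within the analysis period (none at exactly AP years)."""
        n = AP // tul
        return n - 1 if AP % tul == 0 else n

    implementation = sum(units * ci for units, ci, _, _ in assets.values())
    om = sum(units * ann for units, _, ann, _ in assets.values()) * UPV_15
    renewal = sum(units * sum(ci * SPV[tul * k] for k in range(1, renewals(tul) + 1))
                  for units, ci, _, tul in assets.values())
    rul_value = sum(units * ci * ((tul * (renewals(tul) + 1) - AP) / tul) * SPV[AP]
                    for units, ci, _, tul in assets.values())
    lcce = implementation + om + renewal - rul_value

    print(round(implementation), round(om), round(renewal), round(rul_value), round(lcce))
    # 396000 1719360 276807 74873 2317294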
-------
Table 2-7.  Summary of Costs for the Example Alternatives

 LCCE Terms                         Alternative 1   Alternative 2   Alternative 3   Alternative 4
 Implementation Cost                     $480,000        $360,000        $396,000        $360,000
 Operations and Maintenance Cost       $2,149,200      $1,432,800      $1,719,360      $1,504,440
 Renewal Cost                            $334,360        $253,970        $276,807        $252,690
 Value of Remaining Useful Life           $96,300         $56,978         $74,873         $63,077
 Lifecycle Cost Estimate               $2,867,260      $1,989,793      $2,317,294      $2,054,054
-------


                        Section 3:  Score Alternatives
                    with Respect to Evaluation Criteria

When comparing SRS designs, the benefit of each alternative must be considered in addition to the
LCCE. This section describes a rigorous approach for including these factors, which are generally
qualitative, in the comparative analysis. The steps of this process are depicted in Figure 3-1 and further
described in the subsequent sections.  An example that illustrates these steps is provided in Section 3.3.
[Figure 3-1 diagram: the Score Alternatives With Respect to Evaluation Criteria step from Figure 1-1
expands into two steps: Establish Evaluation Framework and Score the Alternatives.]
Figure 3-1. Steps for Scoring Alternatives Using Evaluation Criteria

3.1    Establish Evaluation Framework
The first step is to establish the evaluation framework, which includes establishing the evaluation criteria,
weighting each criterion and developing a scoring scale. The evaluation framework should be developed
without consideration of the alternatives being compared to ensure an objective evaluation based solely
on the design goals and performance objectives established for the SRS.

Evaluation criteria are generally qualitative, though quantitative metrics can be derived for some criteria.
For example, tools such as hydraulic models, the Threat Ensemble Vulnerability Assessment - Sensor
Placement Optimization Tool (TEVA-SPOT) (USEPA, 2010), and the Water Health and Economic Analysis
Tool (WHEAT) (USEPA, 2014) can be used to develop quantitative estimates of the consequences of
contamination.

3.1.1  Establish Evaluation Criteria
The evaluation criteria should reflect the desired outcomes of SRS implementation and operation and
provide an unbiased approach for the comparison. SRS design goals and performance objectives used to
develop the set of viable alternatives can also be used to develop evaluation criteria for comparing
alternative SRS designs.

For this step in the comparative analysis, it is important to clearly define the evaluation criteria.
Characteristics of effective evaluation criteria include:
•  Traceable: each criterion should directly relate to a design goal or performance objective.
•  Unique: each criterion should  be unique  so that there is no "double counting" for a particular goal
   or objective.
•  Measurable: each criterion should be able to be assessed, even if only qualitatively, so that a score
   can be assigned to each criterion for each alternative.


•   Precise: each criterion should be specific and clearly worded such that a score can be easily assigned
    for each alternative without requiring interpretation by the evaluator.
•   Attainable: the criteria should be achievable within the constraints established for the project.
•   Complete: when considered together, the criteria should cover all SRS goals and objectives.

The number of evaluation criteria utilized should be minimized while meeting all of the characteristics
identified above, in particular those of uniqueness and completeness.

3.1.2  Weight Each Criterion
The next step in the process is to develop a weighting scale and assign specific weights to the evaluation
criteria established in the previous step that reflect the relative importance of each criterion.  Weighting of
the evaluation criteria is based on the relative importance that the utility places on each criterion,
independent of the  SRS design alternatives.

The weighting scale chosen should be intuitive and simple to apply, while still providing enough
differentiation among the criteria to support the comparison. An example of a
four-level weighting scale is shown below:
4 = Critical: the criterion is essential to meeting the design goals and performance objectives of the SRS.
3 = High importance: the criterion is important, but not essential, to meeting the design goals and
performance objectives of the SRS.
2 = Moderate importance: the criterion helps differentiate among alternatives, but is not essential or
highly important.
1 = Low importance: the criterion would be nice, but is not important.

3.1.3  Develop a Scoring Scale
A scoring scale needs to be developed so that each alternative can be assigned a score for each of the
evaluation criteria.  The  scoring scale is a means of assigning numbers to criteria that are intrinsically
qualitative and unlikely to lend themselves to a quantitative  assessment. A scoring scale should provide
enough levels to  differentiate scores among the alternatives while also being straightforward to apply in a
consistent manner.  The  score assigned to an alternative should reflect the degree to which the alternative
meets the criterion. A simple four-point scoring scale
follows:
3 = Completely satisfies the criterion
2 = Partially meets the criterion
1 = Minimally meets the criterion
0 = Completely deficient with respect to the criterion
HELPFUL HINT: Well-defined evaluation criteria will ensure that scoring is intuitive. If this is not the
case, it may indicate that the evaluation criteria need to be refined.
3.2    Score the Alternatives
After the evaluation framework has been developed, each alternative is independently scored against each
of the evaluation criteria. Scoring should be objective and based solely on the degree to which the
alternative meets the evaluation criteria. Scores should be assigned in a consistent manner across
alternatives, preferably by the same individual(s).

The final score for each alternative is referred to as the capability score, which characterizes how well
each alternative meets the design goals and performance objectives for the SRS. It is calculated by
multiplying the assigned score by the weighting factor for the individual criterion, and then summing all
the weighted scores across all criteria. This can generally be completed efficiently using a simple
spreadsheet. An illustrative example, without use of a spreadsheet, is provided in Section 3.3.
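For utilities that prefer a short script to a spreadsheet, the calculation can also be automated. The
following Python sketch is illustrative only; the criterion names, weights and scores are hypothetical
placeholders rather than values from this guidance.

    # Illustrative capability score calculation (hypothetical criteria, weights and scores).
    # Weights use the four-level scale from Section 3.1.2; scores use the four-point
    # scale from Section 3.1.3.
    weights = {"criterion_1": 4, "criterion_2": 3, "criterion_3": 2}
    scores = {"criterion_1": 3, "criterion_2": 1, "criterion_3": 2}

    capability_score = sum(weights[c] * scores[c] for c in weights)
    print(capability_score)  # (4 x 3) + (3 x 1) + (2 x 2) = 19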

3.3    Example Qualitative Evaluation  of OWQM Design Alternatives
To illustrate the methodology for performing a qualitative evaluation of SRS design alternatives, each
step in the analysis will be shown for the OWQM design alternatives shown in Table 2-5 as a
continuation of the example presented in Section 2.4.

Establish Evaluation Criteria
The following design goals and performance objectives were established by a hypothetical utility to
develop the OWQM design  alternatives  considered in this example.

Design Goals
1.  Detect water quality incidents: Detect unusual water quality conditions in the distribution system
   including regular system occurrences such as nitrification, pressure transients, rusty or turbid water,
   treatment process failures, pipe breaks and excessive water age. Detect foreign substances in the
   distribution system resulting from leaky pipes, inadvertent cross-connections, backflow incidents,
   chemical overfeeds during treatment and intentional contamination.
2.  Optimize application of treatment chemicals: Provide disinfectant residual data at control points in
   the distribution system to better manage application of treatment chemicals and limit disinfection
   byproduct formation.
3.  Support compliance with water quality goals and regulations:  Identify deteriorating water quality in
   sufficient time to allow for corrective action that avoids potential compliance issues.
4.  Optimize investment: Minimize the  resources required to implement and operate the SRS by
   leveraging existing capabilities, infrastructure and personnel when practical.

Performance Objectives
1.  Spatial coverage:  Maximize the portion of the distribution system that is monitored. Spatial
   coverage is dependent on the number and locations of monitoring stations, as incidents of abnormal
   water quality can only be detected if affected water flows through a monitoring station.
2.  Incident coverage:  Maximize the types of water quality incidents that can be detected.  This is
   dependent on the parameters monitored, as an incident can be detected only if it causes a change in a
   monitored parameter.
3.  Alert occurrence:  Minimize the rate of invalid alerts while maintaining the ability to detect water
   quality incidents. Alerting is primarily impacted by the number of monitoring stations, the accuracy
   of OWQM data produced by the  sensors and the data analysis method(s) used.
4.  Timeliness of detection:  Minimize the time required to detect a water quality incident. Timeliness of
   detection is dependent on the number and locations of monitoring stations as well as the frequency
   with which water quality data is collected and analyzed.
5.  Operational reliability:  Maximize the percentage of time that the component is fully operational.
   This requires proper maintenance of all equipment and information management systems.
6.  Sustainability: Realize benefits related to day-to-day system operation as well as detection of water
   quality incidents that justify the cost and level of effort required to implement and operate the
   OWQM component.

Based on these design goals and performance objectives, the following evaluation criteria were
developed. The list in the parentheses after each criterion indicates the design goal (DG) or performance
objective (PO) from which the criterion was derived.

1.  The ability of each station to provide data for conventional water quality parameters. (DG1, DG2,
    DG3, PO6)
2.  The ability of each station to detect nitrification (through direct measurement of nitrate/nitrite) and
    turbid or discolored water to support early detection and response to frequent water quality issues.
    (DG1, DG2, DG3, PO2, PO6)
3.  The ability of each station to detect a broad range of abnormal substances. These include
    contaminant classes identified by USEPA (USEPA, 2013). (DG1, DG3, PO2, PO3)
4.  The ability of the OWQM component to provide information about water quality throughout the
    distribution system. (DG1, DG3, PO1, PO4)
5.  The degree to which the alternative maximizes the use of existing infrastructure. (DG4, PO6)
6.  The degree to which the alternative maximizes the use of existing knowledge and training. (DG4,
    PO5, PO6)

This set of criteria has the characteristics described in Section 3.1.1. They are traceable, unique,
measurable, precise, attainable, and as a set are complete.
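Because each criterion must remain traceable to the design goals and performance objectives, the mapping
above can be recorded in a simple data structure. The following Python sketch restates the six example
criteria and their DG/PO references and checks the "complete" characteristic; the variable names are
illustrative assumptions.

    # The six example evaluation criteria and the design goals (DG) and performance
    # objectives (PO) from which each was derived.
    criteria = {
        1: ["DG1", "DG2", "DG3", "PO6"],
        2: ["DG1", "DG2", "DG3", "PO2", "PO6"],
        3: ["DG1", "DG3", "PO2", "PO3"],
        4: ["DG1", "DG3", "PO1", "PO4"],
        5: ["DG4", "PO6"],
        6: ["DG4", "PO5", "PO6"],
    }

    # Completeness check: every DG and PO should be covered by at least one criterion.
    covered = {ref for refs in criteria.values() for ref in refs}
    required = {f"DG{i}" for i in range(1, 5)} | {f"PO{i}" for i in range(1, 7)}
    print(sorted(required - covered))  # prints [] because the example set is complete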

Weight Each Criterion
The four-point weighting scale shown in Section 3.1.2 was used for this example:
4 = Critical: the criterion is essential to meeting the design goals and performance objectives of the SRS.
3 = High importance: the criterion is important, but not essential, to meeting the design goals and
performance objectives of the SRS.
2 = Moderate importance: the criterion helps differentiate among alternatives, but is not essential or
highly important.
1 = Low importance: the criterion would be nice, but is not important.

Weights were assigned based on the relative importance the utility placed on the defined design goals and
performance objectives and are listed in Table 3-1, along with the rationale  for the weighting.

Table 3-1. Weights Assigned to the Example Evaluation Criteria

1.  The ability of each station to provide data for conventional water quality parameters.
    Weight: 4
    Rationale: These parameters are critical for optimizing water quality in the distribution system and
    supporting regulatory compliance.

2.  The ability of each station to detect nitrification (through direct measurement of nitrate/nitrite)
    and turbid or discolored water to support early detection and response to frequent water quality
    issues.
    Weight: 3
    Rationale: These are common water quality problems that occur in the utility's distribution system
    and can result in customer complaints and potential compliance issues. Though not critical, time
    and money could be saved by detecting these early, and it could increase customer confidence in
    water quality.

3.  The ability of each station to detect a broad range of abnormal substances. These include
    contaminant classes identified by USEPA.
    Weight: 3
    Rationale: While distribution system contamination incidents are rare, their occurrence would have
    significant consequences for the utility and its customers.

4.  The ability of the OWQM component to provide information about water quality throughout the
    distribution system.
    Weight: 4
    Rationale: Awareness of water quality throughout the system is the main driver of this utility's
    OWQM implementation. They want to better understand how water quality varies throughout the
    distribution system and want to maximize the ability to detect water quality incidents anywhere in
    the system.

5.  The degree to which the alternative maximizes the use of existing infrastructure.
    Weight: 2
    Rationale: Use of existing utility facilities to house OWQM stations is preferred as it would save
    money and provide the utility with direct control over security and access to the monitoring
    stations. However, the utility is willing to install OWQM stations at non-utility owned facilities if it
    would better support the design goals.

6.  The degree to which the alternative maximizes the use of existing knowledge and training.
    Weight: 2
    Rationale: Use of technologies that are familiar to utility staff would reduce the amount of training
    required to operate and maintain the component. Though of secondary importance to the day-to-day
    benefit the system would provide, the utility wants to minimize the burden to maximize staff buy-in
    of the project.
Develop a Scoring Scale and Assign Scores to Alternatives
The four-point scoring system shown in Section 3.1.3 was used for this analysis.  The scoring for each of
the evaluation criteria was based on the approach shown in Table 3-2:
3 = Completely satisfies the criterion
2 = Partially meets the criterion
1 = Minimally meets the criterion
0 = Completely deficient with respect to the criterion

Table 3-2. Scoring Logic for Example Evaluation Criteria

1.  The ability of each station to provide data for conventional water quality parameters.
    Scoring logic: The score assigned to this criterion depends on whether conventional water quality
    parameters are monitored at each OWQM station:
    3 = Conventional water quality parameters are monitored
    0 = Conventional water quality parameters are not monitored

2.  The ability of each station to directly detect nitrification (through direct measurement of
    nitrate/nitrite) and turbid or discolored water to support early detection and response to frequent
    water quality issues.
    Scoring logic: Direct monitoring to detect nitrate/nitrite, turbidity or color is necessary to satisfy
    this criterion. The standard sensor pack does not do this, but UV-Vis instruments do measure these
    parameters. Thus, the presence of a UV-Vis instrument is necessary for a station to completely
    satisfy this criterion. The following logic was used to score the alternatives as a whole:
    3 = All stations have UV-Vis instruments (all stations completely satisfy the criterion)
    2 = 50% to 99% of stations have UV-Vis instruments
    1 = 1% to 49% of stations have UV-Vis instruments
    0 = No stations have UV-Vis instruments

3.  The ability of each station to detect a broad range of abnormal substances. These include
    contaminant classes identified by USEPA.
    Scoring logic: The standard sensor pack has the ability to detect some contaminants of concern and
    thus partially meets this criterion. The addition of UV-Vis instruments significantly increases the
    number of contaminants that can be detected. Thus, the presence of a UV-Vis instrument is
    necessary for a station to completely satisfy this criterion. The following logic was used to score
    the alternatives as a whole:
    3 = All stations have UV-Vis instruments (all stations completely satisfy the criterion)
    2 = Some, but not all, stations have UV-Vis instruments
    1 = No stations have UV-Vis instruments and thus detection is limited to substances detectable by
        conventional parameters
    0 = No detection ability (N/A for these alternatives since all stations monitor conventional
        parameters)

4.  The ability of the OWQM component to provide information about water quality throughout the
    distribution system.
    Scoring logic: The score assigned to this criterion depends on the number of stations installed in
    the distribution system. Based on analysis using their hydraulic model, the utility determined that
    20 stations are required to provide sufficient coverage and completely satisfy this criterion. The
    analysis also showed how coverage would be diminished with fewer stations. The results of this
    analysis were used to develop the following scoring logic:
    3 = Twenty or more stations
    2 = Eleven to nineteen stations
    1 = One to ten stations
    0 = No stations

5.  The degree to which the alternative maximizes the use of existing infrastructure.
    Scoring logic: The utility owns ten facilities that could house OWQM stations, and thus alternatives
    with ten or fewer stations would completely satisfy the criterion of using existing infrastructure.
    Any stations beyond the initial ten will require that the utility either build new facilities to house
    the additional stations or install some OWQM stations in non-utility facilities. The utility used the
    following logic to score the alternatives:
    3 = Ten or fewer stations
    2 = Eleven to nineteen stations
    1 = Twenty or more stations
    0 = Not applicable as all alternatives will be able to use some utility facilities

6.  Maximum use of existing knowledge and training.
    Scoring logic: In this example, the sensors used to monitor conventional parameters are familiar
    technology, while the UV-Vis instruments would require additional training and acquisition of new
    knowledge. Thus, the use of UV-Vis instruments was used as the differentiator for scoring the
    alternatives as follows:
    3 = Use of only conventional parameters (no UV-Vis instruments)
    0 = At least one UV-Vis instrument
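Scoring logic of this kind can also be written down as explicit rules so that it is applied the same way to
every alternative. The Python sketch below encodes two of the rules from Table 3-2 (criteria 2 and 4); the
function names are illustrative assumptions, not part of the example utility's tools.

    # Criterion 2 (Table 3-2): score based on the share of stations with UV-Vis instruments.
    def score_uv_vis_coverage(stations_with_uv_vis, total_stations):
        fraction = stations_with_uv_vis / total_stations
        if fraction == 1.0:
            return 3  # all stations have UV-Vis instruments
        if fraction >= 0.5:
            return 2  # 50% to 99% of stations
        if fraction > 0.0:
            return 1  # 1% to 49% of stations
        return 0      # no UV-Vis instruments

    # Criterion 4 (Table 3-2): score based on the number of monitoring stations installed.
    def score_spatial_coverage(total_stations):
        if total_stations >= 20:
            return 3  # twenty or more stations
        if total_stations >= 11:
            return 2  # eleven to nineteen stations
        if total_stations >= 1:
            return 1  # one to ten stations
        return 0      # no stations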
Scores were assigned to each criterion for each alternative as shown in Table 3-3.

Table 3-3. Qualitative Scoring of Each Alternative

1.  The ability of each station to provide data for conventional water quality parameters.
    Alternative 1: 3    Alternative 2: 3    Alternative 3: 3    Alternative 4: 3

2.  The ability of each station to directly detect nitrification (through direct measurement of
    nitrate/nitrite) and turbid or discolored water to support early detection and response to frequent
    water quality issues.
    Alternative 1: 0    Alternative 2: 3    Alternative 3: 1    Alternative 4: 2

3.  The ability of each station to detect a broad range of abnormal substances. These include
    contaminant classes identified by USEPA.
    Alternative 1: 1    Alternative 2: 3    Alternative 3: 2    Alternative 4: 3

4.  The ability of the OWQM component to provide information about water quality throughout the
    distribution system.
    Alternative 1: 3    Alternative 2: 1    Alternative 3: 2    Alternative 4: 2

5.  The degree to which the alternative maximizes the use of existing infrastructure.
    Alternative 1: 1    Alternative 2: 3    Alternative 3: 2    Alternative 4: 2

6.  Maximum use of existing knowledge and training.
    Alternative 1: 3    Alternative 2: 0    Alternative 3: 0    Alternative 4: 0
Note: in this example all alternatives monitor conventional water quality parameters, so the first
evaluation criterion was not a differentiator.

Calculate Capability Scores
For each alternative, the score for each evaluation criterion (from Table 3-3) was multiplied by the
criterion's weighting factor (from Table 3-1) to produce a final weighted score. The final weighted scores
for each alternative and criterion are presented in Table 3-4. The capability score for each alternative
was calculated by summing the weighted scores for each criterion and is shown in the bottom row of
Table 3-4.

Table 3-4. Developing a Final Capability Score for Each Alternative

1.  The ability of each station to provide data for conventional water quality parameters. (Weight: 4)
    Alternative 1: 12    Alternative 2: 12    Alternative 3: 12    Alternative 4: 12

2.  The ability of each station to directly detect nitrification (through direct measurement of
    nitrate/nitrite) and turbid or discolored water to support early detection and response to frequent
    water quality issues. (Weight: 3)
    Alternative 1: 0    Alternative 2: 9    Alternative 3: 3    Alternative 4: 6

3.  The ability of each station to detect a broad range of abnormal substances. These include
    contaminant classes identified by USEPA. (Weight: 3)
    Alternative 1: 3    Alternative 2: 9    Alternative 3: 6    Alternative 4: 9

4.  The ability of the OWQM component to provide information about water quality throughout the
    distribution system. (Weight: 4)
    Alternative 1: 12    Alternative 2: 4    Alternative 3: 8    Alternative 4: 8

5.  The degree to which the alternative maximizes the use of existing infrastructure. (Weight: 2)
    Alternative 1: 2    Alternative 2: 6    Alternative 3: 4    Alternative 4: 4

6.  Maximum use of existing knowledge and training. (Weight: 2)
    Alternative 1: 6    Alternative 2: 0    Alternative 3: 0    Alternative 4: 0

Capability Score (sum of weighted scores):
    Alternative 1: 35    Alternative 2: 40    Alternative 3: 33    Alternative 4: 39
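The arithmetic behind Table 3-4 can be reproduced with a few lines of code. The Python sketch below
combines the weights from Table 3-1 with the raw scores from Table 3-3; only the variable names are
assumptions.

    # Weights (Table 3-1) and raw scores (Table 3-3), ordered by criterion 1 through 6.
    weights = [4, 3, 3, 4, 2, 2]
    raw_scores = {
        "Alternative 1": [3, 0, 1, 3, 1, 3],
        "Alternative 2": [3, 3, 3, 1, 3, 0],
        "Alternative 3": [3, 1, 2, 2, 2, 0],
        "Alternative 4": [3, 2, 3, 2, 2, 0],
    }

    for alternative, scores in raw_scores.items():
        capability_score = sum(w * s for w, s in zip(weights, scores))
        print(alternative, capability_score)  # 35, 40, 33 and 39, matching Table 3-4
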

               Section 4:  Select the Preferred Alternative

The final step involves selection of an alternative for implementation based on the results of the steps
described in Sections 2 and 3.  The LCCE calculated in Section 2 provides a comparative assessment of
the costs required for each alternative, while the capability scores calculated in Section 3 provide a
comparative assessment of the ability of the alternatives to meet SRS design goals and performance
objectives.  Both of these are important to consider when selecting the SRS design to implement. This
section describes methods for considering the tradeoff between cost and capability when selecting
an alternative.

One outcome of this analysis may be to investigate hybrids of the alternatives identified at the beginning
of the process as the results of the analysis may indicate that adjustments to the preferred alternative will
provide additional capability at minimal additional cost.

4.1     Final Analysis of the Results
This section presents two analysis techniques that can facilitate comparison of alternatives based on the
LCCE and capability scores calculated in the previous sections.  These techniques are:
•  Producing a scatterplot of the LCCE (from Section  2) against the capability score (from Section  3) for
   each alternative.
•  Calculating and comparing the ratios of the capability score to the LCCE.

4.1.1   Scatterplots
A scatterplot can be useful for visualizing the tradeoff between costs and capability. When creating  a
scatterplot for this analysis, one point is created for each alternative, with the capability score plotted on
the x-axis and the LCCE plotted on the y-axis.

Figure 4-1 illustrates how a scatterplot could be  divided into four quadrants to provide a simple, visual
differentiation among alternatives. Alternatives in the bottom right quadrant provide the best solution
with the greatest capability for the lowest cost. Conversely, alternatives that fall into the top left quadrant
are less desirable due to their low capabilities and high cost.
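A plot of this kind can be produced with any charting tool. The Python sketch below uses the matplotlib
library and, for illustration, the LCCEs and capability scores from Table 4-1 in Section 4.2; the placement
of the quadrant dividing lines is an assumption left to the analyst.

    import matplotlib.pyplot as plt

    # Capability score (x-axis) and LCCE in dollars (y-axis) for each alternative (Table 4-1).
    capability_scores = [35, 40, 33, 39]
    lcces = [2867260, 1989793, 2317294, 2054054]
    labels = ["Alternative 1", "Alternative 2", "Alternative 3", "Alternative 4"]

    fig, ax = plt.subplots()
    ax.scatter(capability_scores, lcces)
    for label, x, y in zip(labels, capability_scores, lcces):
        ax.annotate(label, (x, y))

    # Quadrant dividers chosen by the analyst; midpoints of the observed ranges are used here.
    ax.axvline((min(capability_scores) + max(capability_scores)) / 2, linestyle="--")
    ax.axhline((min(lcces) + max(lcces)) / 2, linestyle="--")

    ax.set_xlabel("Capability Score")
    ax.set_ylabel("LCCE ($)")
    plt.show()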

[Figure: a scatterplot with Capability Score on the x-axis and LCCE ($) on the y-axis, divided into four quadrants labeled Higher Cost / Lower Capability, Higher Cost / Greater Capability, Lower Cost / Lower Capability and Lower Cost / Greater Capability.]
Figure 4-1. Scatterplot of LCCE vs. Capability Score

4.1.2  Cost vs. Capability Ratio
Another tool for comparison of alternatives is the cost vs. capability (C/C) ratio, which provides a
numerical indication of the increase in cost for an incremental improvement in capability.  It is calculated
simply by dividing the LCCE by the capability score. The ratio that results is in dollars per unit of
capability score. Low C/C ratios are preferable, as they indicate a lower additional cost required for a unit
increase in capability score.
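As a minimal illustration of this calculation, using the LCCEs and capability scores from Table 4-1 in
Section 4.2 (the function name is an assumption):

    def cost_capability_ratio(lcce, capability_score):
        # Dollars of lifecycle cost per unit of capability score; lower is better.
        return lcce / capability_score

    print(round(cost_capability_ratio(1989793, 40)))  # 49745 for Alternative 2
    print(round(cost_capability_ratio(2054054, 39)))  # 52668 for Alternative 4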

This type of analysis is particularly useful when evaluating alternatives that are significantly different
with respect to both cost and capability.  For example, if one SRS design has a significantly greater
capability but at much higher cost than another alternative, comparison of the C/C ratios between these
two options may provide insight into the relative cost of increased capability.

4.1.3  Interpretation of Analysis Results
The analysis tools described in this section may  demonstrate that one of the alternatives is clearly superior
to the others. However, it is often the case that more than one alternative provides a viable solution, with
multiple alternatives appearing in a similar area on a scatterplot and having similar cost vs. capability ratios. In
such situations, the methodology described in this document could be applied again on the subset of
viable alternatives, using updated cost estimates and capability scoring where:
    •  Revision of the cost estimates may include replacement of order of magnitude cost estimates with
       more detailed estimates for cost factors with a high value, ensuring that all costs have been
       accounted for, and doing further research to obtain more precise values for those cost elements
       that differ across the alternatives being considered.
    •  Revision of the capability scoring may include changing the weighting and scoring for existing
       evaluation criteria, or addition of new criteria based on further deliberation with system designers
       and end users (as long as any new evaluation criteria still relate back to the DGs and POs).

4.2   Example of the Final Selection Process
To illustrate the final analysis and selection of the preferred OWQM design alternative, the methodology
discussed above is applied to the example alternatives presented in Table 2-5, as a continuation of the
examples presented in Sections 2.4 and 3.3.  The results of the LCCE and final evaluation scores
calculated in those sections are summarized in Table 4-1.

Table 4-1. LCCE and Capability Scores for the Example OWQM Design Alternatives

Alternative    LCCE          Capability Score
1              $2,867,260    35
2              $1,989,793    40
3              $2,317,294    33
4              $2,054,054    39
A scatterplot of the results presented in Table 4-1 is shown in Figure 4-2.  This plot clearly shows that
Alternative 1 and Alternative 3 can be removed from further consideration as they both provide less
capability at higher cost compared with the other two alternatives.
[Figure: a scatterplot of the four example alternatives with Capability Score on the x-axis and LCCE (in millions of dollars) on the y-axis. Alternative 1 and Alternative 3 fall in the higher cost / lower capability region of the plot, while Alternative 2 and Alternative 4 fall in the lower cost / greater capability region.]
Figure 4-2. Scatterplot for Example Alternatives

Of the remaining alternatives, Alternative 2 best meets the design goals and performance objectives and
has a marginally lower LCCE than Alternative 4. Alternative 2 therefore appears as the best option based
on the scatterplot above, followed closely by Alternative 4.

Table 4-2 provides the C/C ratio for each of the alternatives, ordered from lowest (best) to highest (worst)
C/C ratio. Again, Alternatives 1 and 3 can be eliminated from further consideration, as their C/C ratios
are significantly higher than the other alternatives. Alternatives 2 and 4 have similar C/C ratios, though
Alternative 4 is shown here as marginally more expensive per unit of capability score.  Thus,
consideration of the C/C ratio reinforces Alternative 2 as the best option for implementation, although
only slightly better than Alternative 4.

Table 4-2. Cost vs. Capability Ratios for the Example OWQM Design Alternatives

Alternative    LCCE          Capability Score    Cost/Capability Ratio
2              $1,989,793    40                  $49,745
4              $2,054,054    39                  $52,668
3              $2,317,294    33                  $70,221
1              $2,867,260    35                  $81,922
As noted above, there is only a marginal difference between Alternatives 2 and 4 in this example. Thus,
the utility may choose to perform a more detailed cost or capability analysis of only these two
alternatives. For example, Alternative 4 requires installation of equipment at facilities not belonging to
the utility but Alternative 2 does not, so the utility could perform a more detailed evaluation of the
additional cost and effort required for installation and operation of monitoring stations at non-utility
facilities.  Similarly, the weighting and scoring scale for the evaluation criteria related to the use of
existing infrastructure and existing knowledge could be refined to better differentiate between
Alternatives 2 and 4.


                                     Resources


Introduction
System Engineering Principles of Water Quality Surveillance and Response System Design
       http://water.epa.gov/infrastructure/watersecurity/lawsregs/initiative.cfm
       This document provides information about how system engineering principles can be applied to
       the design and implementation of an SRS to ensure that the SRS functions as an integrated whole,
       and is designed effectively to perform its intended function.

Develop Lifecycle Cost Estimates
Energy Price Indices and Discount Factors for Life-Cycle Cost Analysis: Annual Supplement to
       Handbook 135
       http://www1.eere.energy.gov/femp/pdfs/ashb14.pdf
       This supplement to Handbook 135 provides the tables with discounting factors for use in the
       calculation of lifecycle cost estimates.

Score Alternatives with Respect to Evaluation Criteria
Water Security Initiative: Guidance for Building Laboratory Capabilities to Respond to Drinking
       Water Contamination
       http://nepis.epa.gov/Exe/ZyPDF.cgi/P100KLAN.PDF?Dockey=P100KLAN.PDF
       This document includes a section on the capabilities which should be developed to address the
       contaminants of concern, and identifies and classifies those contaminants.

Sensor Network Design for Drinking Water Contamination Warning Systems: A Compendium of
       Research Results and Case Studies using the TEVA-SPOT
       http://nepis.epa.gov/Exe/ZyPDF.cgi/P10077WZ.PDF?Dockey=P10077WZ.PDF
       This document provides information on the use of TEVA-SPOT for estimating consequences of
       distribution system contamination and assessing the capabilities of different SRS designs for
       minimizing those consequences. It also includes a number of case  studies demonstrating the
       application of TEVA-SPOT for these purposes.

Water Health and Economic Analysis Tool (WHEAT)
       http://water.epa.gov/infrastructure/watersecurity/techtools/wheat.cfm (accessed April 8, 2015)
       This website provides details and the downloadable software for the WHEAT tool. The tool
       provides the ability to develop scenarios and estimate the financial  consequences of a distribution
       system contamination incident.


                                        Glossary

alert. An indication from an SRS surveillance component that an anomaly has been detected in a
datastream monitored by that component. Alerts may be visual or audible, and may initiate automatic
notifications such as pager, text or e-mail messages.

analysis period. The period of time used for the LCCE analysis.  It must be common across all assets
and alternatives which are part of the analysis.

asset. A piece of equipment, IT system,  instrument or other physical resource used in the implementation
of an SRS component or system.

benefit. An outcome associated with the implementation and operation of an SRS that promotes the
welfare of a utility and the community it  serves.  Benefits can be derived from a reduction in the
consequences of a contamination incident and from improvements to routine operations.

capability score. A score which provides an indication of the degree to which an SRS, component or
asset design meets evaluation criteria derived from the design goals and performance objectives
established  for the SRS.

component. One of the primary functional areas of an SRS.  There are four surveillance components:
Online Water Quality Monitoring; Enhanced Security Monitoring; Customer Complaint Surveillance; and
Public Health Surveillance.  There are two response components:  Consequence Management and
Sampling and Analysis.

Consequence Management (CM).  One of the response components of an SRS.  This component
encompasses actions taken to plan for and respond to possible drinking water contamination incidents to
minimize response and recovery timelines, and ultimately to minimize consequences to a utility and
the public.

constraints. Requirements or limitations that may impact the viability of an alternative. The primary
constraints for an SRS project are typically schedule, budget and policy issues (for example, zoning
restrictions, IT restrictions and union prohibitions).

Customer Complaint Surveillance (CCS). One of the surveillance components of an  SRS. CCS
monitors water quality complaint data in  call or work management processes and identifies abnormally
high volumes or spatial clustering of complaints that may be indicative of a contamination incident.

dashboard. A visually-oriented user interface that integrates data from multiple SRS components to
provide a holistic view of distribution system water quality.  The integrated display of information in a
dashboard allows for more efficient and effective management of  distribution system water quality, and
the timely investigation of water quality incidents.

data analysis. The process of analyzing  data to support routine system operation, rapid identification of
water quality anomalies and generation of alert notifications.

datastream. A collection of time-series  data for a specific parameter or set of parameters.

design goals (DG).  The specific benefits to be realized through deployment of an SRS  and each of its
components. A fundamental design goal of an SRS is detecting and responding to distribution system
contamination incidents. Additional design goals for an SRS are established by a utility and may relate to
the operational benefits derived from SRS operations.

discounting. An accounting term that describes the technique for adjusting costs realized in the future to
express them in today's value.

Enhanced Security Monitoring (ESM). One of the surveillance components of an SRS. ESM includes
the equipment and procedures to detect and respond to security breaches at distribution system facilities
that are vulnerable to contamination.

evaluation criteria.  A set of criteria used to evaluate the capability of SRS design alternatives. The
evaluation criteria are based on the design goals and performance objectives established by a utility for
their SRS.

implementation costs. Costs to procure and install equipment, IT components and subsystems necessary
to deploy an operational system.

information management system. The combination of hardware, software, tools and processes that
collectively supports an SRS and provides users with data needed to monitor real-time system conditions.
The system allows users to efficiently identify, investigate and respond to water quality anomalies.

invalid alert.  An alert from an SRS surveillance component that is not due to a water quality incident or
public health incident.

lifecycle cost. The total cost of a system, component or asset over its useful life.  Lifecycle cost includes
the cost of implementation, operation & maintenance and renewal.

lifecycle cost estimate (LCCE).  An estimate of the total cost of an alternative including all costs
associated with implementation, operations and maintenance, and renewal.  The value of the remaining
useful life of any assets that have useful life at the end of the analysis period is subtracted from the LCCE.

monitoring  station.  A configuration of one or more water quality sensors and associated support
systems, such as plumbing, electric and communications that is deployed to monitor water quality in real
time at a specific location in a drinking water distribution system.

Online Water Quality Monitoring (OWQM). One of the surveillance components of an SRS.  OWQM
utilizes data collected from monitoring stations that are deployed at strategic locations in a distribution
system. Monitored parameters can include common water quality parameters (such as disinfectant
residual, pH, specific conductance and turbidity) and advanced parameters (such as total organic carbon
and UV-Vis spectral data). Data from distribution system monitoring locations is transferred to a central
location and analyzed for water quality anomalies.

operations and maintenance (O&M) costs.  Expenses incurred to sustain operation of a system at an
acceptable level of performance.  O&M costs are typically reported on an annual basis, and include labor
and other expenditures (supplies and purchased services).

performance objectives (PO). Measurable indicators of how well an SRS or its components meet
established design goals.

Public Health Surveillance (PHS).  One of the surveillance components of an SRS. PHS involves the
analysis of public health data to identify public health incidents and investigation of such incidents to
determine whether they may be due to drinking water contamination.

real-time. A mode of operation in which data describing the current state of a system is available in
sufficient time for analysis and subsequent use to support assessment, control and decision functions
related to the monitored system.

remaining useful life (RUL). The amount of useful life of an asset remaining at the end of the analysis
period.

renewal. The replacement of an asset at the end of its useful life to maintain a fully functioning system.

renewal cost.  The cost of replacing an asset at the end of its useful life to ensure that the functionality of
the asset is provided until the end of the analysis period.

Sampling and Analysis (S&A).  One of the response components of an SRS.  S&A is activated during
Consequence Management to help confirm or rule out possible water contamination through field and
laboratory analyses of water samples. In addition to laboratory analyses, S&A includes all the activities
associated with Site Characterization: site investigation, site safety screening, rapid field testing and
sample collection. S&A continues to be active throughout remediation and recovery if contamination
is confirmed.

single present value (SPV).  A factor that can be used to determine how much a single, future
expenditure would cost in today's dollars.

Surveillance and Response System  (SRS). See Water Quality Surveillance and Response System.

total useful life (TUL). The total period of time that an asset can be economically maintained.

uniform present value (UPV). A factor that can be used to determine how much an annually recurring
future expenditure would cost in today's dollars.
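For readers who prefer to compute rather than look up these factors, the sketch below applies the standard
present-value formulas (d is the real discount rate and n the number of years); this is an assumption
consistent with, but not reproduced from, the Handbook 135 tables cited in the Resources section.

    # Standard present-value factors (assumed formulas, not taken from this guidance).
    def single_present_value(d, n):
        # Value today of $1 spent once, n years from now.
        return 1.0 / (1.0 + d) ** n

    def uniform_present_value(d, n):
        # Value today of $1 spent at the end of each of the next n years.
        return sum(single_present_value(d, t) for t in range(1, n + 1))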

useful life. The period of time that an asset is able to be economically maintained. Total useful life refers
to the total period of time that an asset is able to be economically maintained, and remaining useful life is
the useful life of an asset remaining at the end of the  analysis period.

value of remaining useful life. The  value of the useful life remaining at the end of the analysis period
discounted to the base year.

Water Quality Surveillance and Response System (SRS). A system that employs one or more
surveillance  components to monitor and manage distribution system water quality in real time. An SRS
utilizes a variety of data analysis techniques to detect water quality anomalies and generate alerts.
Procedures guide the investigation of alerts and the response to validated water quality incidents that
might impact any aspect of operations, public health  or utility infrastructure.