[Table: results by Ownership Type/Water Source (Public, Private, and Ancillary Systems; Primarily Ground, Primarily Surface, Primarily Purchased water sources), with estimates and confidence intervals. The caption and the tabular values are not legible in the source.]
[Table: major capital expenses by Ownership Type/Major Capital Expense Category (Water Quality Improvement, Replacement or Major Repair, System Expansion), with confidence intervals and observation counts for Public, Private, and Ancillary Systems. Data: Q. 36. Notes: Table includes only (remainder of note not legible); ** : Insufficient observations. The tabular values are not legible in the source.]
PART 3
METHODOLOGY REPORT
1.
INTRODUCTION
1.1
Study Background and Survey Overview
Study Background
In compliance with Executive Order 12866, the Regulatory Flexibility Act, and the Safe
Drinking Water Act (SDWA), EPA's Office of Ground Water and Drinking Water conducts periodic
surveys of the financial and operating characteristics of community water systems. Previous surveys were
conducted in 1976, 1982, and 1986. This report documents the methodology employed for the latest point
in this series, the 1994 Community Water System Survey. The information in this survey will be used to
help EPA and State program offices develop and implement the administration's proposals for
reauthorizing the SDWA. It will be used, for example, to help make water system capacity-building
evaluations, particularly for small systems, and to determine the need for and design of special Best
Available Technology (BAT) programs, again primarily benefiting smaller systems. The information is
also essential to support economic analyses of the costs and benefits of new regulations and changes to
existing regulations on consumers, the water supply industry, and the nation. The information will also be
used to measure the financial burden of EPA's regulations on consumers and the water supply industry.
Furthermore, data from the survey will help EPA identify, evaluate, and develop guidance for Best
Management Practices (BMPs) used in water treatment and distribution systems.
1.2
Survey Overview
This section is intended to provide the reader with an overview of the design and conduct of
the Community Water System Survey. The topics presented in this section will then be discussed at greater
length in the following chapters.
The Community Water System Survey (CWSS) was a mail survey designed to collect
operating and financial information from a representative sample of community water systems. In order to
identify the eligible systems and appropriate respondents for the mail questionnaire, the survey utilized a
computer assisted telephone interview (CATI) questionnaire to conduct a preliminary screening of the
sampled systems. An initial Phase I sample of 5,856 water systems was drawn from over 57,000 systems
contained on the Federal Reporting Data System (FRDS) file, which was the source of the sample frame
for the CWSS. To meet EPA's analytical objectives, the sample was stratified into 38 groups (strata)
defined by the system's type of ownership, water sources, and number of residents served.
Telephone interviewers contacted and screened the systems to verify FRDS data, collect
additional data for stratifying the mail sample frame, and collect the name and address of the person who
would complete the mail survey questionnaire. The CATI survey identified 4,729 eligible water systems.
These systems were re-stratified based on the information they provided during the screening survey to
form the frame for Phase II of the survey. From these 4,729 systems, 3,681 were subsampled to
receive the main survey questionnaire.
Three versions of the survey questionnaire were developed to be administered by mail to
publicly owned, privately owned, and ancillary community water systems. An ancillary system is one that
operates a drinking water system as a secondary component of its main business, such as a trailer park.
After the data collection instrument was pretested and revised, a complete pilot test of survey instruments,
procedures, and operations was conducted with approximately 80 water systems. The instruments,
systems, and procedures were then revised for the full study, as dictated by the findings of the pilot test.
Immediately prior to the mailout of the questionnaires, the systems received a notification call
alerting them that they would receive a mail questionnaire. During mail data collection, a toll-free support
line was manned to answer any technical or administrative questions that respondents might have. At
appropriate points in time, reminder calls were also made to non-responding systems. Requests for
questionnaire remailings were received through the support line and in conjunction with reminder calls; the
requests were batched and remails were done on a weekly basis.
As the completed questionnaires were returned, they were logged into a receipt control system.
Next, they went through an extensive data quality review and possible data retrieval to clarify or correct
anomalous items or collect missing responses. The questionnaires were key-entered using independent
double-key entry. Finally, they were run through automated cleaning and editing programs.
A series of sample weights, non-response adjustments, and other statistical techniques were
created and applied to the final set of responding sampled water systems, so that the responses of the
sampled systems could be properly expanded to represent the national universe of community water
systems (CWSs).
Work on the planning and design of the Community Water System Survey began in July,
1993, and continued through July, 1994. The pilot test was conducted from August to September, 1994,
and the final design was developed in October and November, 1994. Data collection occurred from
November to December, 1994 (telephone screener) and from June, 1995, through March, 1996 (mail
survey). Data processing, analysis, and reporting covered the period December, 1995, through September,
1996.
EPA secured the services of several contractors who performed a variety of tasks in support
of the survey design, conduct, data processing, and analysis. Prime contractors included The Cadmus
Group, and Westat, Inc. Other contractors included The Washington Consulting Group and The
Government Finance Group. The Cadmus Group has been supporting EPA and other clients in the
assessment and analysis of the water industry for over 15 years. Cadmus' primary responsibilities were for
overall project management, expert QA review of survey data, and report preparation. Westat was
responsible for assisting EPA with questionnaire design; for sample design and selection; design, conduct
and management of the data collection process; editing and preparation of the data into final form for
delivery; calculation of appropriate sample weights; data tabulations; and delivery of the final survey
database with documentation. The Washington Consulting Group conducted advance notification calls to
the sampled systems immediately prior to questionnaire mailout. EPA provided management and guidance
across all of these tasks. The Government Finance Group (GFG), expert in the field of public finance,
reviewed and helped design the financial section of the CWSS questionnaire. They also provided QA
review of survey results. Grant Thornton, Inc., accountants and management consultants, conducted a peer
review of the financial section of the survey questionnaire and provided insights on the classification of
revenues and expenses based on the principles of Enterprise Fund accounting. Peer review of the CWSS
questionnaires was also conducted by John Trax of the National Rural Water Association and by Vern
Achtormann, Waterstats Manager for American Water Works Association until the Summer of 1996.
EPA also was supported by several well known consultants to the water industry. Mr. Jim
McFarland, a consultant who has been conducting RIAs for EPA's Office of Ground Water and Drinking
Water (OGWDW) for over a decade, has been involved in all facets of the CWSS project. He supported
the design of the CWSS questionnaire, helped develop the QA review procedures that were applied, and
participated in report preparation and review. Mr. Dan Fraser, an engineer and expert in the operational
characteristics of water systems, participated in the review of the operational section of the survey
questionnaire and QA review of survey results. Dr. Janice Beecher of Indiana University has conducted
peer review of the CWSS Report. Dr. Beecher, an expert in the water industry, serves as Senior Research
Scientist and Director of Regulatory Studies at Indiana University's Center for Urban Policy and the
Environment. She also serves as a Senior Institute Research Specialist at the National Regulatory
Research Institute (NRRI).
2. SAMPLE DESIGN AND WEIGHTING
2.1
Sample Design and Selection
This section describes the sample design for the Community Water System Survey (CWSS).
It includes a description of the sampling frame, target sample size, stratification variables and sampling
method.
The survey utilized a two-phase, single stage, stratified sample design. Phase I was a
telephone screening survey which provided a sampling frame for the main data collection effort in Phase II.
The Phase II mail survey was tailored to the type of ownership of the water system and provided data for
the substantive analysis. At both stages, sampling strata were defined by various combinations of the
water systems' size (residential population served), ownership (public, private, or ancillary), and primary
water source (ground or surface).
A screener questionnaire was administered by telephone to the sampled water systems to
determine their eligibility and to obtain precise stratification and accurate contact information. In Phase II,
the eligible Phase I respondents were stratified utilizing the screening data and a sample of 3,681 systems
were selected for the mail survey.
2.1.1
FRDS Sampling Frame and Coverage
CWSS Sample Frame. The CWSS sample frame was developed from the Federal Reporting
Data System (FRDS), EPA's permanent database of all U.S. water systems. This database contains
records of water systems and water system facilities; it consists of information that is reported to EPA on a
quarterly basis by each individual state. FRDS's coverage of the target population is relatively complete.
Alternate list frames all suffer from substantial undercoverage. An area frame approach would be more
costly than the list frame approach, and the possible improvement in the sample would not justify the
increased cost.
Because of the longitudinal nature of the FRDS file, the formal definition of a community
water system (as specified in the Code of Federal Regulations), and the specific definition of systems
eligible for this survey, not every record on the file represented a community water system that was eligible
for the survey. Thus, to construct a valid operational sample frame, it was necessary to review the
standard FRDS data file documentation and to conduct summary and individual analyses of the FRDS
records to assess what kinds of information were available. The information was targeted to fulfill three
principal purposes:
• Eligibility: to determine whether a record represented a currently operating, sample-
eligible community water system;
• Sample Stratification: to determine whether the necessary data elements (variables)
existed on FRDS, containing suitable data values, to construct a frame for a sample
design that would meet EPA's ultimate data and analytical needs from the survey; and
• Data Collection: to determine the quality and extent of the water system location and
contact information available on FRDS to support an efficient and effective data
collection effort.
In addition to the presence or absence of specific items, the frame development process also
took steps to analyze the reliability and consistency of the data within and across records and to measure
the degree to which data was present or missing for each data element. Finally, FRDS contains numerous
data elements that were not relevant to the CWSS; and there are potentially multiple sources for certain
pieces of information. Hence, the FRDS review developed a decision-list of all the specific FRDS data
elements that would be used for sampling and data collection, as well as clear documentation of their
contents in terms of the CWSS design definitions.
This review and analysis was conducted at two stages: prior to the first draft sample design,
and again after the results of the survey pilot test identified additional issues concerning the structure,
content, and quality of the FRDS data that could have a bearing on sample design and data collection.
Frame Preparation Protocol. To prepare the CWSS sample frame, the needed data elements
were extracted from the FRDS database, then checked, processed and cleaned according to a documented
protocol developed during the FRDS review and analysis. The principal processing goals during the
extraction and frame building process included identifying and eliminating duplicate records, improving
missing or ambiguous ownership data, checking for valid telephone numbers, and stratifying the resulting
frame according to the sample design.
The sample design called for a stratified sample. Two of the stratification dimensions were
the systems' size, as defined by residential population served, and type of ownership. Analysis determined
that population served was missing for about 0.5% of the FRDS records, and the frame preparation
protocol imputed these missing values using a hot-deck imputation procedure. The survey design assumed
that this imputation would be used only for the first phase of the sampling process, and that actual
population data would be collected during the telephone screening of the first phase sample.
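For illustration only, the following is a minimal Python sketch of a hot-deck style imputation of the kind described above; the data-frame layout, column names, and donor-cell definition are assumptions made for the example, not the actual frame-preparation program.

```python
import numpy as np
import pandas as pd

def hot_deck_impute(frame: pd.DataFrame, rng: np.random.Generator) -> pd.DataFrame:
    """Fill missing population_served values by borrowing the value of a randomly
    chosen donor record in the same (hypothetical) ownership/source cell."""
    frame = frame.copy()
    for _, cell in frame.groupby(["ownership_type", "water_source"]):
        donors = cell["population_served"].dropna().to_numpy()
        missing = cell.index[cell["population_served"].isna()]
        if len(donors) and len(missing):
            frame.loc[missing, "population_served"] = rng.choice(donors, size=len(missing))
    return frame

# Roughly 0.5% of FRDS records lacked a population-served value; something like
# frame = hot_deck_impute(frame, np.random.default_rng(1994)) would fill them.
```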
Ownership type was assigned to the CWSS ownership categories (publicly owned or privately
owned) when values in the FRDS record mapped explicitly to the CWSS ownership definitions. However,
these values were sometimes ambiguous or missing. In these cases, the frame development protocol
incorporated a qualitative evaluation of the water system names to make a judgment as to the ownership
type, based on certain types of names or specific words appearing in the name. A decision tree was
implemented within both automated and manual review processes. If the automated processing identified
certain key words, phrases, or abbreviations in the name, then the appropriate ownership type was
assigned. For example, the presence of the word "Municipal" in its various forms, cases, and abbreviations
was regarded as evidence of public ownership, while similar variants of the word "Incorporated" were
regarded as evidence of private ownership. A lengthy list of similar indicators was employed. If the
automated decision logic could not make an explicit decision, the record was printed for review by the
CWSS project staff, who used both the system name and other sources of information to assign the
ownership category.
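As a sketch of how such a keyword-based decision rule can be coded (the indicator lists below are short illustrative examples, not the lengthy list actually employed for the CWSS):

```python
import re

# Illustrative indicators only; the CWSS protocol used a much longer list.
PUBLIC_HINTS = [r"\bmunicipal\b", r"\bmun\b", r"\bcity of\b", r"\btown of\b", r"\bvillage of\b"]
PRIVATE_HINTS = [r"\bincorporated\b", r"\binc\b", r"\bcorp\b", r"\bcompany\b"]

def classify_ownership(system_name: str) -> str:
    """Assign 'public' or 'private' from name keywords; otherwise flag for manual review."""
    name = system_name.lower()
    if any(re.search(pattern, name) for pattern in PUBLIC_HINTS):
        return "public"
    if any(re.search(pattern, name) for pattern in PRIVATE_HINTS):
        return "private"
    return "manual review"  # printed for review by the project staff

# classify_ownership("Springfield Municipal Water Works")  -> 'public'
# classify_ownership("Clearwater Utilities, Inc.")         -> 'private'
```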
Several specific data content issues were dealt with during the processing of the FRDS
records. Telephone numbers, which were needed for conducting the Phase I telephone screening, serve as a
good example. The preliminary review of the rate of completely missing data in the telephone number
fields identified the need for the CWSS data collection plan to incorporate procedures to acquire phone
numbers from sources other than FRDS. However, during the actual frame processing, a more structured
review of the telephone numbers indicated that the mere presence of numbers in the appropriate field did
not necessarily provide useful telephone numbers. For example, the structured review of the telephone
numbers looked for ones in which all characters consisted of a "9" or "0", or whose area codes began with
"0" or "1." Such numbers were not valid numbers and would need to be identified prior to the data
collection so that they could be put through the intensive tracing process described below in Section 3.3.
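A hedged sketch of this kind of structured check follows; the exact rules applied during frame preparation may have differed in detail.

```python
def needs_phone_tracing(number: str) -> bool:
    """Return True when a frame telephone number is missing or clearly not usable."""
    digits = "".join(ch for ch in number if ch.isdigit())
    if len(digits) != 10:
        return True                      # missing or malformed entry
    if set(digits) <= {"0", "9"}:
        return True                      # filler values made up of 0s and 9s
    if digits[0] in ("0", "1"):
        return True                      # area code cannot begin with 0 or 1
    return False

# needs_phone_tracing("(999) 999-9999") -> True; such cases were routed to the
# intensive tracing process described in Section 3.3.
```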
In general, most active water systems are considered part of the current FRDS inventory (i.e.,
FRDS data element C109 classifies their record as "current"), while systems that are no longer active water
systems are classified as "historical." In some cases, however, a record for an active water system may
become classified as historical, due to problems experienced during data processing for updating FRDS.
The results of the pilot test confirmed that most of the FRDS records that were classified as historical were
not active systems. The final frame development protocol included file processing specifications for
eliminating the historical records when the frame was prepared for drawing the sample, except for rare
instances when there was external evidence that records for specific groups of active water systems had
been systematically classified as historical due to FRDS processing requirements.
2.1.2
Phase I Sample Design and Selection
Sample Eligibility. To be eligible for the CWSS, a water system had to meet several criteria.
First, it had to meet the CFR definition of a community water system; principally, a water system providing
drinking water to 25 or more permanent residents or to 15 permanent connections. (See 40 CFR 141.2 for
complete definition.) In addition, the CWSS excluded federally and state-owned or operated systems since
these are not affected by regulatory and economic forces in the same way as other systems. To the extent
possible, all ineligible systems were identified on FRDS and removed from the frame. However, many
such systems either could not be identified on the frame or could not be so identified with confidence. Any
such systems remaining on the frame were identified by the screening process and subsequently excluded
from the Phase II sample.
The CWSS analytical plan specified minimum precision levels to be achieved for
subpopulations, which meant that specific sample sizes had to be achieved for each subpopulation. The
stratification variables that were available on the sampling frame were not accurate enough to ensure the
initial selection of the given overall sample size needed for the required precision levels, in a cost-effective
manner. Thus, it was necessary to draw a larger sample in Phase I, both to ensure that enough sample would be
available and to obtain accurate stratification information to be used in the Phase II sample size
allocations within each stratum.
The domains of the population of interest for EPA are based on three major characteristics of
the systems: size of the population served by the system, type of system ownership, and type of primary
source of non-purchased water. The domains of the population are shown in Table 2-1. The regulatory
impact models require as inputs estimates of parameters of reasonable precision in each of these domains.
The sample size in each domain should be large enough to provide a sufficient number of completed
questionnaires to obtain estimates with reasonable precision. Thus, the sample was designed to provide
estimates of percentages with an error not exceeding 10 percent (except for a 1 in 20 chance) within each
domain. For example, suppose 50 percent of the systems in a domain responded that they boost chlorine
residuals in their distribution system. The sample was designed so that EPA could be 95 percent confident
that between 40 percent and 60 percent of the systems in this domain boost chlorine residuals. The
minimum sample size required under this design would obtain an estimate for a 50 percent statistic with an
error not exceeding ±10 percent (except for a 1 in 20 chance) in each domain. A 50 percent statistic was
used because the standard error is largest when the population percentage is 50 percent. The error will be
smaller for other population percentages.
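As a worked illustration of this requirement, the minimum number of completed questionnaires per domain follows from the usual normal-approximation formula; the short calculation below assumes simple random sampling within the domain and ignores design effects and the finite population correction.

```python
import math

def min_completes(p: float = 0.5, half_width: float = 0.10, z: float = 1.96) -> int:
    """Smallest n with z * sqrt(p * (1 - p) / n) <= half_width."""
    return math.ceil(z ** 2 * p * (1 - p) / half_width ** 2)

# min_completes() -> 97 completed questionnaires per domain for a +/-10 percent
# bound on a 50 percent statistic; the mail-out sizes in Table 2-6 differ because
# they also allow for anticipated nonresponse, frame errors, and take-all strata.
```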
For some domains, the FRDS frame could provide population counts only for broader
domains than needed by EPA. For example, the data from FRDS were not sufficiently descriptive to divide
privately owned systems into ancillary vs. nonancillary categories. Thus, the estimated proportions from
the 1986 CWSS were used to estimate the population counts for the detailed subdivisions. The response
and eligibility rates from the 1986 survey were used to obtain the estimates for the number of initially
sampled systems required to yield the target number of completes.
The Phase I sample was obtained by drawing a systematic sample of systems from the
cleaned, de-duplicated FRDS frame, within each sampling stratum. Thirty-two sampling strata were
formed as the intersection of the eight population size classes, two ownership types, and two types of water
source. Table 2-1 shows the frame and the sample counts for the Phase I sample. In total, 5,856 systems
were selected for Phase I.
In order to reduce the response burden on the water systems, attempts were made to minimize
the overlap between the 1995 Drinking Water Infrastructure Needs Survey (DWINS) and the CWSS
samples. The systems in the mid-size strata, i.e., population served between 3,301 and 50,000, were
selected conditional upon their selection to the DWINS sample. The conditional selection was used to
reduce the overlap between the two samples. The overlap among small systems was anticipated to be quite
small, and almost all large systems had to be selected according to both survey sample designs, so this
procedure was only applied to mid-size systems. The systems in the mid-size strata were further stratified
by the DWINS sample strata, and then were selected with probabilities conditional upon their inclusion for
the DWINS sample. This modified Keyfitz (Brick, Morganstein, and Wolters, 1987) method of selection
ensured the desired CWSS selection probabilities while minimizing the overlap between the DWINS and
the CWSS samples.
The sampling method ensured that the unconditional chance of selection to the CWSS sample
is equal to the desired CWSS sampling fraction while minimizing the overlap between the two samples.
This selection procedure is optimal in that no other selection procedure which provides the desired CWSS
Table 2-1. Phase I Frame and Sample Sizes by the Phase I Strata

Size of population served | Ownership type | Source of water | Frame size | Sample size | Sampling rate
100 or less | Public | Ground | 1,300 | 161 | 0.124
100 or less | Public | Surface | 272 | 119 | 0.438
100 or less | Private | Ground | 14,268 | 783 | 0.055
100 or less | Private | Surface | 593 | 401 | 0.676
101-500 | Public | Ground | 4,439 | 197 | 0.044
101-500 | Public | Surface | 1,191 | 180 | 0.151
101-500 | Private | Ground | 10,759 | 667 | 0.062
101-500 | Private | Surface | 778 | 418 | 0.537
501-1,000 | Public | Ground | 2,744 | 179 | 0.065
501-1,000 | Public | Surface | 778 | 159 | 0.204
501-1,000 | Private | Ground | 1,927 | 209 | 0.108
501-1,000 | Private | Surface | 324 | 149 | 0.460
1,001-3,300 | Public | Ground | 3,857 | 175 | 0.045
1,001-3,300 | Public | Surface | 1,757 | 167 | 0.095
1,001-3,300 | Private | Ground | 1,632 | 180 | 0.110
1,001-3,300 | Private | Surface | 536 | 154 | 0.287
3,301-10,000 | Public | Ground | 1,845 | 164 | 0.089
3,301-10,000 | Public | Surface | 1,497 | 165 | 0.110
3,301-10,000 | Private | Ground | 455 | 165 | 0.363
3,301-10,000 | Private | Surface | 193 | 128 | 0.663
10,001-50,000 | Public | Ground | 987 | 143 | 0.145
10,001-50,000 | Public | Surface | 1,276 | 157 | 0.123
10,001-50,000 | Private | Ground | 203 | 110 | 0.542
10,001-50,000 | Private | Surface | 127 | 84 | 0.661
50,001-100,000 | Public | Ground | 117 | 78 | 0.667
50,001-100,000 | Public | Surface | 244 | 106 | 0.434
50,001-100,000 | Private | Ground | 29 | 29 | 1.000
50,001-100,000 | Private | Surface | 31 | 31 | 1.000
> 100,000 | Public | Ground | 66 | 66 | 1.000
100,001-500,000 | Public | Surface | 177 | 44 | 0.249
> 100,000 | Private | Ground | 18 | 18 | 1.000
> 100,000 | Private | Surface | 35 | 35 | 1.000
> 500,000 | Public | Surface | 35 | 35 | 1.000
All | All | All | 54,490 | 5,856 |
unconditional selection probabilities would retain fewer DWINS sample units in the CWSS sample. In
total, 210 DWINS systems in the mid-size strata were retained. If the CWSS sample had been drawn
independently of the DWINS sample, it was expected that approximately 506 DWINS sample systems
would have been selected for the CWSS. Thus, the procedure provided a nearly 60 percent reduction in the
size of the overlap.
In the small and large size strata, the systems, before selection, were placed in a sort order by
the EPA region, and within the EPA region by the size of the population served. The direction of the size
sort alternated across the regions. This implicit stratification ensured the geographical dispersion among
the sample systems, and increased the probability that a range of population sizes within a stratum were
sampled. With the systems in this sort order an equal probability systematic sample of systems was drawn
within each stratum. The sampling was independent across the strata.
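A minimal sketch of an equal-probability systematic draw within a single stratum, assuming the frame records have already been placed in the serpentine region-by-size sort order described above; the function and variable names are illustrative.

```python
import numpy as np

def systematic_sample(sorted_ids: list, n_sample: int, rng: np.random.Generator) -> list:
    """Draw an equal-probability systematic sample of n_sample units from a sorted frame."""
    N = len(sorted_ids)
    interval = N / n_sample                 # sampling interval; every unit has chance n/N
    start = rng.uniform(0, interval)        # random start within the first interval
    return [sorted_ids[int(start + k * interval)] for k in range(n_sample)]

# For example, 161 of the 1,300 systems in the smallest public/ground stratum of
# Table 2-1 could be drawn as systematic_sample(frame_ids, 161, np.random.default_rng(7)).
```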
Table 2-1 shows the frame and sample sizes and the sampling rate for each stratum. The
publicly owned systems with surface water serving more than 100,000 persons were selected with
certainty.
2.1.3
Phase II Sample Design and Selection
Out of 5,856 water systems selected for the Phase I sample, 4,729 successfully completed the
interviews and were eligible for sampling for Phase II. The screener collected data on the stratification
variables, i.e. size of the population served by the system, ownership of the system, and the source of water
to the system. The responses to the screener showed inaccuracies in the FRDS-based stratification of the
water systems. This is consistent with experiences from previous surveys that have used the FRDS as a
sampling frame (e.g., the National Pesticide Survey and the 1986 Survey of Community Water Systems).
The migration of the systems to different strata as a result of the screener responses
introduced inefficiency in the sample design through a loss of sample size and/or by introducing unequal
sampling rates. An iterative optimization algorithm was applied to achieve optimum sample size
allocations within each screener-based stratum by equalizing the sampling rates across the FRDS-based
strata. The intersections of the FRDS-based strata and the screener-based strata resulted in a total of
337 strata.
After the optimum sampling rates were determined, the systems were placed in a sort order by
EPA region, and within the EPA region by the size of the population served within each stratum. The
direction of the size sort alternated across the regions. In total, 3,681 systems were selected by drawing an
equal probability systematic sample of systems in each of the 337 strata.
For each of the screener-based strata, Table 2-2 shows the number of water systems that were
eligible for Phase II sampling and the number of systems that were selected.
2.1.4
Stratum Migration
The errors in the FRDS frame classification of the water systems into size, ownership, and
source of water categories introduced inefficiency in the sample design through a loss of sample size and/or
by introducing unequal sampling rates. Among the Phase I respondents, 81 percent reported the same size
class as the FRDS data indicated. A larger proportion, about 86 percent, reported the same ownership
type, and about 91 percent reported the same water source as the FRDS data.
Size of the System
Table 2-3 shows the cross-tabulation of the eligible screener respondents by FRDS-based and
screener-based system size classes. In all size classes, more than 94 percent of the systems confirmed their
original size class or reported a service population in a class immediately adjacent. A greater percentage of
systems migrated to smaller size classes than to larger size classes. In general, size confirmation rates for
the FRDS data can be considered high. However, a few systems that were reported in small size classes in
FRDS migrated to a much larger size class in the screener. Nine systems that were reported as smaller
than 3,301 in FRDS migrated to size classes larger than 50,000. The systems migrating to larger size
classes can have adverse effects on the precision of the estimates. This stratum migration and its effects on
the precision of the estimates are discussed in detail below.
Ownership
Table 2-4 shows the cross-tabulation of the screener respondents by FRDS and screener-
based system ownership classifications. About 91 percent of systems that were classified as publicly
owned in FRDS were confirmed as publicly owned, while a smaller percentage, 82 percent, of the systems
Table 2-2. Phase II Frame and Sample Sizes by the Phase II Strata

Size of population served | Ownership type | Source of water | Frame size | Sample size
100 or less | Public | Ground | 98 | 98
100 or less | Public | Surface | 68 | 68
100 or less | Private | Ground | 263 | 158
100 or less | Private | Surface | 101 | 101
100 or less | Ancillary | Ground | 321 | 148
100 or less | Ancillary | Surface | 76 | 76
101-500 | Public | Ground | 224 | 137
101-500 | Public | Surface | 194 | 150
101-500 | Private | Ground | 312 | 154
101-500 | Private | Surface | 170 | 170
101-500 | Ancillary | Ground | 212 | 163
101-500 | Ancillary | Surface | 57 | 57
501-1,000 | Public | Ground | 177 | 132
501-1,000 | Public | Surface | 145 | 145
501-1,000 | Private | Ground | 133 | 133
501-1,000 | Private | Surface | 83 | 83
1,001-3,300 | Public | Ground | 212 | 128
1,001-3,300 | Public | Surface | 198 | 127
1,001-3,300 | Private | Ground | 151 | 139
1,001-3,300 | Private | Surface | 83 | 83
3,301-10,000 | Public | Ground | 190 | 132
3,301-10,000 | Public | Surface | 172 | 127
3,301-10,000 | Private | Ground | 127 | 127
3,301-10,000 | Private | Surface | 66 | 66
10,001-50,000 | Public | Ground | 182 | 117
10,001-50,000 | Public | Surface | 166 | 114
10,001-50,000 | Private | Ground | 84 | 84
10,001-50,000 | Private | Surface | 48 | 48
50,001-100,000 | Public | Ground | 89 | 89
50,001-100,000 | Public | Surface | 90 | 90
50,001-100,000 | Private | Ground | 20 | 20
50,001-100,000 | Private | Surface | 30 | 30
> 100,000 | Public | Ground | 67 | 67
> 100,000 | Public | Surface | 82 | 82
> 100,000 | Private | Ground | 16 | 16
> 100,000 | Private | Surface | 22 | 22
All | All | All | 4,729 | 3,681
[Table 2-3. Eligible Screener Respondents by the FRDS-Based and Screener-Based Size Classes. The cross-tabulation values are not legible in the source.]
Table 2-4. Eligible Screener Respondents by the FRDS-Based and Screener-Based Ownership Types (Consistent Responses are Shown in Bold)

Screener-Based Ownership Type | Small: Public | Small: Private | Mid-size: Public | Mid-size: Private | Large: Public | Large: Private | All: Public | All: Private
Public (Count) | 984 | 317 | 585 | 127 | 312 | 29 | 1,881 | 473
Public (Percent) | 85 | 15 | 96 | 28 | 99 | 26 | 91 | 18
Private (Count) | 168 | 1,775 | 25 | 324 | 2 | 81 | 195 | 2,180
Private (Percent) | 15 | 85 | 4 | 72 | 1 | 74 | 9 | 82

Note: Columns are the FRDS-based ownership type within each size group. Percent is column percent, that is, the percentage of systems in the original FRDS-based ownership type that fell into each ownership type based on the screener data.
indicated as privately owned on FRDS were confirmed as privately owned. However, these percentages
varied by system size. For small (size of the population served is 3,300 persons or less) systems, the
percentages were equal, about 85 percent. For the systems identified on FRDS as publicly owned systems,
the percentage confirmed as publicly owned increased by size of the system, to 99 percent for the largest
systems. For the systems identified on FRDS as privately owned systems, the percentage confirmed as
privately owned was lower for the larger systems, about 74 percent for the largest systems.
Water Source
Table 2-5 shows the cross-tabulation of the eligible screener respondents by FRDS-based and
screener-based water source classifications. About 97 percent of systems identified as ground-source water
systems in FRDS confirmed this information in the screener; however, only 86 percent of the systems
identified as surface-source systems in FRDS were confirmed as surface systems. (The lower rate of
confirmation for surface systems can be attributed to differences between the survey's definition of surface
source water system and the FRDS definition of surface source water system.) This also varied by size of
the system. For small and mid-size systems, about 97 percent of the ground water systems in FRDS
confirmed while only 85 percent of the surface water systems confirmed FRDS data. For large systems
(serving more than 50,000) the respective percentages were almost equal, about 88 percent.
In Table 2-6, it can be observed that in none of the domains did the effect of the stratum
migration cause the projected sampling error to exceed 15 percent. Seven out of a total of thirty-six strata
Table 2-5. Eligible Screener Respondents by the FRDS-Based and Screener-Based Source of Water Types (Consistent Responses are Shown in Bold)

Screener-Based Water Source | Small: Ground | Small: Surface | Mid-size: Ground | Mid-size: Surface | Large: Ground | Large: Surface | All: Ground | All: Surface
Ground (Count) | 1,878 | 196 | 543 | 72 | 162 | 27 | 2,583 | 295
Ground (Percent) | 98 | 15 | 97 | 14 | 88 | 11 | 97 | 14
Surface (Count) | 40 | 1,130 | 14 | 432 | 22 | 213 | 76 | 1,775
Surface (Percent) | 2 | 85 | 3 | 86 | 12 | 89 | 3 | 86

Note: Columns are the FRDS-based water source within each size group. Percent is column percent, that is, the percentage of systems in the original FRDS-based water-source type that fell into each water-source type based on screener data.
have an expected error between ±14 and ±15 percent, fifteen have an expected error between ±11 and ±13
percent, and the remaining fourteen about ±10 percent. This table is discussed in detail in the following
section.
Impact of Stratum Migration on the Accuracy of Domain Estimates
For each domain, Table 2-6 shows the minimum Phase II mail-out sample size required to
obtain an estimate for a 50 percent statistic with an error not exceeding ±10 percent (except for a 1 out of
20 chance). The planned sample size assumed the FRDS frame correctly identified all subdomain
members, while the required size is the number required to achieve this level of accuracy given the observed
frame discrepancies, and the actual size is the mail-out sample size being implemented. The actual is
smaller than the required sample size when the first-stage sample could not produce enough cases that were
actually in the subdomain given the frame discrepancies. The table also shows the half-width of the
confidence interval resulting from the actual mail sample sizes (and the anticipated response rate based on the
1986 CWSS), and the increase that this represents over the planned 0.100 half-width.
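The half-widths in Table 2-6 follow from the same normal-approximation logic as the planned sample sizes. The sketch below is only an illustration: the anticipated response rate is supplied by the user, and design effects from unequal weighting are ignored.

```python
import math

def expected_half_width(mailout: int, response_rate: float, p: float = 0.5, z: float = 1.96) -> float:
    """Approximate 95% half-width for a p-statistic given a mail-out size and an assumed response rate."""
    completes = mailout * response_rate
    return z * math.sqrt(p * (1 - p) / completes)

# With an assumed 60 percent response rate, a mail-out of 98 systems gives
# expected_half_width(98, 0.60) ~= 0.128, in the neighborhood of the 0.125
# reported for the first stratum of Table 2-6.
```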
The seven domains that had an anticipated sampling error between ±14 and ±15 percent were
all in strata using surface water with a population served size of 50,000 or less. Six of the seven domains
have private ownership. Lower precision in these domains mainly resulted from classification errors in the
FRDS frame. As discussed above, a substantial number of systems that were classified as privately owned
and surface water in FRDS reported in the screener as publicly owned and ground water. In fact, almost
Table 2-6. Phase II Sample Sizes and Impact on Accuracy, for Every Design Stratum

Stratum | Planned mail-out | Required mail-out | Actual mail-out | Half-width of 95% confidence interval | Increase in half-width over 0.100
<100, Pub, Grd | 111 | 154 | 98 | 0.125 | 0.025
<100, Pub, Srf | 82 | 152 | 68 | 0.150 | 0.050
<100, Pvt, Grd | 158 | 155 | 158 | 0.099 | -
<100, Pvt, Srf | 114 | 222 | 101 | 0.148 | 0.048
<100, Anc, Grd | 148 | 146 | 148 | 0.099 | -
<100, Anc, Srf | 108 | 106 | 76 | 0.118 | 0.018
101-500, Pub, Grd | 135 | 137 | 137 | 0.100 | -
101-500, Pub, Srf | 124 | 147 | 150 | 0.099 | -
101-500, Pvt, Grd | 154 | 152 | 154 | 0.099 | -
101-500, Pvt, Srf | 128 | 212 | 170 | 0.112 | 0.012
101-500, Anc, Grd | 163 | 160 | 163 | 0.099 | -
101-500, Anc, Srf | 135 | 110 | 57 | 0.139 | 0.039
501-1,000, Pub, Grd | 130 | 132 | 132 | 0.100 | -
501-1,000, Pub, Srf | 115 | 168 | 145 | 0.108 | 0.008
501-1,000, Pvt, Grd | 140 | 167 | 133 | 0.112 | 0.012
501-1,000, Pvt, Srf | 100 | 170 | 83 | 0.143 | 0.043
1,001-3,300, Pub, Grd | 126 | 127 | 128 | 0.100 | -
1,001-3,300, Pub, Srf | 121 | 125 | 127 | 0.099 | -
1,001-3,300, Pvt, Grd | 121 | 139 | 139 | 0.100 | -
1,001-3,300, Pvt, Srf | 103 | 155 | 83 | 0.137 | 0.037
3,301-10,000, Pub, Grd | 126 | 131 | 132 | 0.100 | -
3,301-10,000, Pub, Srf | 124 | 125 | 127 | 0.099 | -
3,301-10,000, Pvt, Grd | 108 | 193 | 127 | 0.123 | 0.023
3,301-10,000, Pvt, Srf | 80 | 149 | 66 | 0.150 | 0.050
10,001-50,000, Pub, Grd | 109 | 116 | 117 | 0.100 | -
10,001-50,000, Pub, Srf | 112 | 112 | 114 | 0.099 | -
10,001-50,000, Pvt, Grd | 82 | 128 | 84 | 0.123 | 0.023
10,001-50,000, Pvt, Srf | 66 | 94 | 48 | 0.140 | 0.040
50,001-100,000, Pub, Grd | 57 | 126 | 89 | 0.119 | 0.019
50,001-100,000, Pub, Srf | 78 | 106 | 90 | 0.109 | 0.009
50,001-100,000, Pvt, Grd | 23 | 24 | 20 | 0.110 | 0.010
50,001-100,000, Pvt, Srf | 24 | 39 | 30 | 0.114 | 0.014
>100,000, Pub, Grd | 42 | 115 | 67 | 0.131 | 0.031
>100,000, Pub, Srf | 75 | 105 | 82 | 0.113 | 0.013
>100,000, Pvt, Grd | 15 | 19 | 16 | 0.109 | 0.009
>100,000, Pvt, Srf | 26 | 26 | 22 | 0.109 | 0.009
Total | 3,663 | 4,644 | 3,681 | |
50 percent of FRDS mid-size systems that were indicated as privately owned/surface water migrated to
another ownership/water source type domain. This resulted in substantial sample size loss in the privately
owned/surface water domains. At the same time, the systems that migrated to privately owned/surface
water domains, although few in number, had in general smaller sampling rates than the systems originally
sampled in those strata (and hence larger sampling weights) which further increased the variance.
The remaining domain with an anticipated ±15 percent sampling error is the smallest
publicly owned system size category with surface water. The ground water domain in this size/ownership
category also was projected to have a somewhat lower precision, about ±13 percent. The lower precision
expected in these domains resulted mainly from realization of lower than expected eligibility rates in these
domains.
In two mid-size (3,301-50,000) privately owned/ground water domains the anticipated error
percentages were about ±12 percent. The slightly lower precision levels were caused mainly by the
differential sampling rates introduced by the migration of systems into these domains.
All privately owned systems that are larger than 50,000, as indicated by the FRDS data, were
selected with certainty. In these domains, the projected error rate was slightly above ±10 percent (about
±11 percent). This is mainly due to the expected nonresponse rate for the mail survey (about 20 percent) in
these domains. For surface water domains the variation in the sampling rates resulting from migration also
contributed to slightly higher error percentages.
All systems that were classified using FRDS data as larger than 100,000/publicly
owned/ground water were included in the Phase I sample. In this domain, the error is expected to be about
±13 percent. This is because of migration of a few systems to this domain from much smaller Phase I size
classes that were selected with a much lower sampling rate for the Phase I (consequently having a large
sampling weight). Although all the systems in this domain, as indicated by FRDS, were included in the
sample, it was not possible to obtain a higher precision because of classification errors in FRDS. The
projected precision level of ±11 percent for the >100,000/publicly owned/surface water domain is somewhat
overstated, because the systems that are larger than 500,000 in this domain were included with certainty.
2.2
Weighting and Estimation
A sampling weight is attached to each responding water system record (1) to account for
differential probabilities of selection, and (2) to reduce the potential bias resulting from nonresponse. The
sampling weights are necessary for unbiased estimation of the population characteristics of interest.
2.2.1
Derivation of the Phase I Base Weight and Nonresponse Adjustment
The base weight for the Phase I sample was computed as the ratio of the number of water
systems in the sampling frame to the number of systems sampled in each Phase I sampling stratum. That
is, the Phase I base weight for the h-th sampling stratum, $W_h$, is:

$$W_h = \frac{N_h}{n_h}$$

where $N_h$ and $n_h$ are the frame and sample sizes, respectively, for the h-th sampling stratum. Table 2-7
shows the Phase I sampling base weight for each Phase I sampling stratum.
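As a one-line check of the formula, using the first stratum of Table 2-7 (the function name is illustrative):

```python
def phase1_base_weight(frame_size: int, sample_size: int) -> float:
    """Phase I base weight W_h = N_h / n_h for a sampling stratum."""
    return frame_size / sample_size

# Smallest public/ground stratum in Table 2-7: 1,300 systems on the frame, 161 sampled.
# phase1_base_weight(1300, 161) -> 8.07 (rounded to two decimals)
```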
Eligibility could be determined for 5,192 water systems out of a total of 5,856 systems
sampled for Phase I. The 664 systems where eligibility could not be determined were included in the
screener result categories as non-locatable, refusal, no answer, maximum contact, and other as discussed in
detail in Section 3.4 (this includes 9 systems that were classified in Table 3-1 as ineligible through external
sources). Out of a total 4,826 systems determined to be eligible, 4,729 were identified as eligible through
the screener interview. The remaining 97 systems were initially identified as ineligible in the screener and
consequently excluded from the Phase II sampling. However, these systems were later determined to be
eligible through a review process of the screener results which is discussed in detail in Section 3.5. Table
2-7 presents the number of systems for which eligibility could be determined, the number of eligible
systems, and the number of eligible systems that were identified as eligible through the screener
questionnaire and consequently included for the Phase II sampling (the second-to-last column on Table
2-7).
The Phase I nonresponse adjustment factor was computed as a product of two factors. The
first adjustment factor compensated for those 664 systems for which eligibility could not be determined. It
was computed as the ratio of the number of sampled systems to the number of systems where eligibility
Table 2-7. The Phase I Base Weight and the Nonresponse Adjustment Factors by the Phase I Sampling Strata

Size of population served | Ownership type | Source of water | Frame size | Sample size | Base weight | Eligibility determined | Eligible | Eligible respondent | Nonresponse adjustment factor
100 or less | Public | Ground | 1,300 | 161 | 8.07 | 135 | 111 | 108 | 1.226
100 or less | Public | Surface | 272 | 119 | 2.29 | 104 | 90 | 85 | 1.212
100 or less | Private | Ground | 14,268 | 783 | 18.22 | 562 | 496 | 466 | 1.483
100 or less | Private | Surface | 593 | 401 | 1.48 | 316 | 233 | 216 | 1.369
101-500 | Public | Ground | 4,439 | 197 | 22.53 | 180 | 175 | 172 | 1.114
101-500 | Public | Surface | 1,191 | 180 | 6.62 | 167 | 149 | 148 | 1.085
101-500 | Private | Ground | 10,759 | 667 | 16.13 | 546 | 516 | 509 | 1.238
101-500 | Private | Surface | 778 | 418 | 1.86 | 357 | 320 | 305 | 1.228
501-1,000 | Public | Ground | 2,744 | 179 | 15.33 | 168 | 165 | 165 | 1.065
501-1,000 | Public | Surface | 778 | 159 | 4.89 | 151 | 145 | 145 | 1.053
501-1,000 | Private | Ground | 1,927 | 209 | 9.22 | 185 | 178 | 177 | 1.136
501-1,000 | Private | Surface | 324 | 149 | 2.17 | 137 | 128 | 126 | 1.105
1,001-3,300 | Public | Ground | 3,857 | 175 | 22.04 | 170 | 167 | 167 | 1.029
1,001-3,300 | Public | Surface | 1,757 | 167 | 10.52 | 163 | 162 | 162 | 1.025
1,001-3,300 | Private | Ground | 1,632 | 180 | 9.07 | 168 | 155 | 154 | 1.078
1,001-3,300 | Private | Surface | 536 | 154 | 3.48 | 149 | 142 | 139 | 1.056
3,301-10,000 | Public | Ground | 1,845 | 164 | 11.25 | 162 | 161 | 161 | 1.012
3,301-10,000 | Public | Surface | 1,497 | 165 | 9.07 | 164 | 157 | 157 | 1.006
3,301-10,000 | Private | Ground | 455 | 165 | 2.76 | 157 | 150 | 149 | 1.058
3,301-10,000 | Private | Surface | 193 | 128 | 1.51 | 127 | 118 | 116 | 1.025
10,001-50,000 | Public | Ground | 987 | 143 | 6.90 | 141 | 141 | 141 | 1.014
10,001-50,000 | Public | Surface | 1,276 | 157 | 8.13 | 156 | 151 | 151 | 1.006
10,001-50,000 | Private | Ground | 203 | 110 | 1.85 | 108 | 106 | 106 | 1.019
10,001-50,000 | Private | Surface | 127 | 84 | 1.51 | 82 | 80 | 80 | 1.024
50,001-100,000 | Public | Ground | 117 | 78 | 1.50 | 78 | 78 | 76 | 1.026
50,001-100,000 | Public | Surface | 244 | 106 | 2.30 | 105 | 104 | 103 | 1.019
50,001-100,000 | Private | Ground | 29 | 29 | 1.00 | 28 | 28 | 28 | 1.036
50,001-100,000 | Private | Surface | 31 | 31 | 1.00 | 31 | 31 | 31 | 1.000
> 100,000 | Public | Ground | 66 | 66 | 1.00 | 65 | 64 | 64 | 1.015
100,001-500,000 | Public | Surface | 177 | 44 | 4.02 | 44 | 43 | 42 | 1.024
> 100,000 | Private | Ground | 18 | 18 | 1.00 | 16 | 16 | 16 | 1.125
> 100,000 | Private | Surface | 35 | 35 | 1.00 | 35 | 35 | 35 | 1.000
> 500,000 | Public | Surface | 35 | 35 | 1.00 | 35 | 31 | 29 | 1.069
All | All | All | 54,490 | 5,856 | | 5,192 | 4,826 | 4,729 |
could be determined within each stratum. The adjustment was done under the assumption that within each
sampling stratum, the eligibility rate for the systems with unknown eligibility was the same as for the
systems with known eligibility. The second adjustment factor compensated for those 97 eligible systems
that were excluded from Phase II sampling because they were not identified as eligible until the review
process. It was derived as the ratio of the number of eligible systems to the number of eligible systems that
were identified as eligible through the screener. That is, the screener nonresponse adjustment factor for the
h-th stratum, $\lambda_h$, was computed as:

$$\lambda_h = \frac{n_h}{n_h^{(D)}} \cdot \frac{n_h^{(E)}}{n_h^{(SC)}}$$

where $n_h^{(D)}$ is the number of systems for which eligibility could be determined in the h-th sampling stratum,
$n_h^{(E)}$ is the number of eligible systems in the h-th sampling stratum, and $n_h^{(SC)}$ is the number of eligible systems
that were identified as eligible through the screener questionnaire in the h-th sampling stratum. The
component numbers that went into the screener nonresponse adjustment factors are shown in the second,
fourth, fifth, and sixth numeric columns of Table 2-7.
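A short sketch of this two-factor adjustment, checked against the first stratum of Table 2-7; the argument names are illustrative.

```python
def screener_adjustment(n_sampled: int, n_determined: int, n_eligible: int, n_screener_eligible: int) -> float:
    """Phase I nonresponse adjustment: (n_h / n_h^(D)) * (n_h^(E) / n_h^(SC))."""
    return (n_sampled / n_determined) * (n_eligible / n_screener_eligible)

# Smallest public/ground stratum in Table 2-7: 161 sampled, 135 with eligibility
# determined, 111 eligible, 108 identified as eligible through the screener.
# screener_adjustment(161, 135, 111, 108) -> 1.226 (rounded to three decimals)
```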
2.2.2
Derivation of the Phase II Base Weight and Nonresponse Adjustment
Base Weight
As discussed in section 2.1.3, the intersections of the FRDS-based and screener-based strata
resulted in 337 Phase II sampling strata. The Phase II base weight was computed for each Phase II
sampling stratum as the ratio of the number of systems that were determined to be eligible based on the
screener questionnaire to the number of systems that were selected for Phase II within the stratum.
After the Phase II base weight was obtained, the mail survey sampling weight was computed
as the product of the Phase I base weight, Phase I nonresponse adjustment factor and the Phase II base
weight. That is, for a water system in the h-th FRDS-based stratum and the k-th screener-based stratum,
the mail survey sampling weight, $W_{hk}$, was computed as:

$$W_{hk} = W_h \, \lambda_h \, W_{hk}^{(2)} \qquad (2\text{-}1)$$

where $W_{hk}^{(2)}$ is the Phase II sample base weight.
Aggregated Cases
For a few cases, the water system management could respond to the mail questionnaire only by providing
aggregated data for two or more systems from the sampling frame (including not only the sampled listings
but also nonsampled listings). Those aggregated systems were recorded in the survey database as a single
record. The mail survey sampling weights for each aggregate were therefore adjusted to reflect the joint
chances of selection of all systems represented in the respective aggregate. This adjustment was made
under the assumptions that the water systems included in the aggregate were sampled independently and the
systems responded to the screener independently of each other. The water systems were selected
independently across the strata, but the selection was not independent within the stratum. It was also
assumed that the nonsampled systems represented in the aggregates would have responded to the screener
with propensity equal to the average response rate of the sampled water systems in their FRDS-based
stratum.
Thus, the sampling weight for the g-th aggregate, $W_g$, was computed as:

(2-2)

where $S_g$ is the set of water systems that were included in aggregate g and $W_{gi}$ is the mail survey
sampling weight for the i-th listing in the g-th aggregate, computed in Equation (2-1).
Nonresponse Adjustment
Out of 3,681 water systems selected for the mail survey, 3,664 were eligible (1 was found
ineligible, 3 were out-of-business, and the remaining 13 responded but were retired cases because their data
were aggregated with other sampled systems). Out of the 2,004 returned questionnaires, a total of 1,980
(54.0% of the sample) were included in the final weighted sample, after further adjustments for aggregated
cases and unusable questionnaires. Nonresponse adjustment factors were computed to compensate for the
water systems that did not respond to the mail survey. Since less than 1 percent of the respondents were
ineligible and all Phase II sampled systems had responded to the screener survey, it was assumed that all
nonrespondents were eligible.
Although nonresponse adjustment could potentially reduce bias, it can also increase the
variance of the estimates. Small adjustment classes and/or low response rates (yielding large nonresponse
adjustment factors) may increase the variance substantially and give rise to unstable estimates. In order to
prevent excessive increases in variance and thereby adverse effects on the mean square error of the
estimates, lower limits were placed on the size of the adjustment classes to avoid large adjustment factors.
In general, the nonresponse adjustment classes were equivalent to the screener-based strata.
The only two exceptions were that the privately owned ground and privately owned surface water strata
were collapsed across the two size strata exceeding 50,000. The strata were collapsed to ensure at least 20
respondents within the adjustment class and thereby provide stable estimates. The nonresponse adjustment classes
are shown in Table 2-8.
A nonresponse adjustment factor was computed for each nonresponse adjustment class. The
nonresponse adjustment factor for the t-th adjustment class, $\phi_t$, was computed as:

$$\phi_t = \frac{\sum_{i \in E_t} W_i}{\sum_{i \in R_t} W_i} \qquad (2\text{-}3)$$

where

$R_t$ is the set of water systems responding to the mail survey in the t-th nonresponse adjustment class,

$E_t$ is the set of water systems that were found to be eligible in the mail survey in the t-th nonresponse adjustment class, and

$W_i$ is the mail survey sampling weight (computed in Equation 2-1 or in Equation 2-2 for the aggregates) for the i-th water system in the t-th nonresponse adjustment class.
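A minimal sketch of Equation (2-3) for one adjustment class; the column names are illustrative assumptions, not the survey's processing code.

```python
import pandas as pd

def nonresponse_factor(cases: pd.DataFrame) -> float:
    """Weighted eligible total divided by weighted respondent total within one class."""
    eligible_total = cases.loc[cases["eligible"], "weight"].sum()
    respondent_total = cases.loc[cases["responded"], "weight"].sum()
    return eligible_total / respondent_total

# With equal weights this reduces to (number eligible) / (number responding);
# the factors in Table 2-8 differ from that ratio because the mail survey
# sampling weights vary within an adjustment class.
```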
Table 2-8 shows the response rates and the nonresponse adjustment factors by nonresponse
adjustment classes for the mail survey. The smallest adjustment class contained 24 completes and the
adjustment factors varied from 1.315 to a maximum of 2.606.
Table 2-8. The Phase II Response Rates and the Nonresponse Adjustment Factors by the Phase II Nonresponse Adjustment Classes

Size of population served | Ownership type | Source of water | Number of eligibles | Number of completes | Response rate (percent) | Nonresponse adjustment factor
100 or less | Public | Ground | 98 | 49 | 50.0 | 1.879
100 or less | Public | Surface | 63 | 32 | 50.8 | 1.725
100 or less | Private | Ground | 158 | 64 | 40.5 | 2.453
100 or less | Private | Surface | 101 | 41 | 40.6 | 2.292
100 or less | Ancillary | Ground | 148 | 65 | 43.9 | 2.287
100 or less | Ancillary | Surface | 76 | 30 | 39.5 | 2.249
101-500 | Public | Ground | 136 | 79 | 58.1 | 1.716
101-500 | Public | Surface | 149 | 74 | 49.7 | 1.956
101-500 | Private | Ground | 153 | 68 | 44.4 | 2.259
101-500 | Private | Surface | 169 | 86 | 50.9 | 1.829
101-500 | Ancillary | Ground | 163 | 70 | 42.9 | 2.330
101-500 | Ancillary | Surface | 57 | 27 | 47.4 | 2.304
501-1,000 | Public | Ground | 132 | 74 | 56.1 | 1.775
501-1,000 | Public | Surface | 143 | 79 | 55.2 | 1.882
501-1,000 | Private | Ground | 133 | 68 | 51.1 | 1.975
501-1,000 | Private | Surface | 83 | 50 | 60.2 | 1.627
1,001-3,300 | Public | Ground | 127 | 82 | 64.6 | 1.548
1,001-3,300 | Public | Surface | 128 | 70 | 54.7 | 1.812
1,001-3,300 | Private | Ground | 138 | 71 | 51.4 | 1.878
1,001-3,300 | Private | Surface | 83 | 47 | 56.6 | 2.224
3,301-10,000 | Public | Ground | 132 | 78 | 59.1 | 1.707
3,301-10,000 | Public | Surface | 127 | 69 | 54.3 | 1.826
3,301-10,000 | Private | Ground | 127 | 87 | 68.5 | 1.448
3,301-10,000 | Private | Surface | 66 | 39 | 59.1 | 1.808
10,001-50,000 | Public | Ground | 116 | 66 | 56.9 | 1.754
10,001-50,000 | Public | Surface | 114 | 74 | 64.9 | 1.542
10,001-50,000 | Private | Ground | 84 | 43 | 51.2 | 2.166
10,001-50,000 | Private | Surface | 47 | 30 | 63.8 | 1.725
50,001-100,000 | Public | Ground | 87 | 55 | 63.2 | 1.462
50,001-100,000 | Public | Surface | 90 | 66 | 73.3 | 1.315
> 50,000 | Private | Ground | 36 | 24 | 66.7 | 1.561
> 50,000 | Private | Surface | 52 | 27 | 51.9 | 2.059
> 100,000 | Public | Ground | 67 | 38 | 56.7 | 2.606
> 100,000 | Public | Surface | 81 | 58 | 71.6 | 1.392
All | All | All | 3,664 | 1,980 | 54.0 |
After the nonresponse adjustment factors were computed, the nonresponse adjusted weight for
the respondents was obtained as a product of the mail survey sampling weight and the nonresponse
adjustment factor. The nonresponse adjusted weight was set equal to zero for the nonrespondents and
ineligibles.
2.2.3
The Final Weight
As discussed in section 2.1.4, the migration of the water systems across the size, ownership
and water source strata, largely due to inaccuracies in the FRDS frame, resulted in considerable weight
variation within the final reported strata, based on mail questionnaire data.
Such a large variation in weights would inflate the design effect and reduce the effective
sample size. This would have a deleterious impact on the precision of the estimates. Trimming the largest
weights can reduce this effect and increase the precision of the sampling estimates. However, at the same
time trimming may introduce bias into the estimates. The goal of trimming is to balance any increase in
bias with reduction in sampling error so as to minimize the mean square error.
The distribution of the nonresponse adjusted weights was examined within estimation strata.
The estimation strata were formed as the intersections of eight population-served size classes and three
water sources (ground, surface, and purchased), both obtained from the mail questionnaire, and the
ownership types as originally reported in the screener survey. Within each stratum, the records were
ranked from the highest weight value to the lowest. The weights were truncated sequentially at each level
of the weight and the design effect for the unequal weights computed.
A sampling weight was trimmed when its trimming provided more than a 20 percent reduction
in variance. In total, the weights of 13 water systems were trimmed. The trimmed portion of the weights
was distributed across the water systems within the trimming (estimation) class to produce the original
weight total. Table 2-9 shows the number of cases trimmed, the weight trimming level, and the factor
applied to the weights to distribute the trimmed portion of the weight within each trimming (estimation)
class. In the largest size class (larger than 100,000) the trimmed portion of the weights was not distributed
(the weight redistribution factor is 1), because to do so would have resulted in a sample estimate much
higher than the frame total for the number of systems in this size class.
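The sketch below illustrates this kind of trimming calculation for a single estimation stratum. It uses the Kish approximation to the unequal-weighting design effect as the variance measure; the 20 percent criterion and the redistribution of the trimmed weight follow the description above, but the function names and the exact rule for choosing the cut-off are assumptions made for the example, not the production algorithm.

    import numpy as np

    def kish_deff(w):
        """Unequal-weighting design effect, n * sum(w^2) / (sum(w))^2."""
        w = np.asarray(w, dtype=float)
        return len(w) * (w ** 2).sum() / w.sum() ** 2

    def trim_weights(weights, reduction=0.20):
        """Within one estimation stratum: truncate the weights at successively
        lower observed weight values (highest first) for as long as each cut
        reduces the design effect by more than `reduction`, then spread the
        trimmed mass over the stratum so the weight total is preserved."""
        w = np.asarray(weights, dtype=float)
        trimmed = w.copy()
        for level in np.sort(np.unique(w))[::-1][1:]:     # candidate cut-off values
            candidate = np.minimum(trimmed, level)
            if kish_deff(candidate) < (1.0 - reduction) * kish_deff(trimmed):
                trimmed = candidate
            else:
                break
        factor = w.sum() / trimmed.sum()                  # weight redistribution factor
        return trimmed * factor, factor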
Table 2-9 Weight Trimming Classes, Trimmed Weights and Weight Redistribution Factors
Weight trimming class
Size of
population
served
100 or less
100 or less
101-500
101-500
101-500
501-1,000
10,001-50,000
> 100,000
> 100,000
Ownership
type
Public
Private
Private
Private
Ancillary
Private
Private
Public
Public
Source
of
water
Surface
Surface
Surface
Purchase
Surface
Purchase
Surface
Ground
Purchase
Number of
cases
trimmed
1
1
1
1
1
2
1
1
1
1
1
1
Weight
before
trimming
17.1
61.9
45.8
73.4
32.5
82.3
76.9
56.3
42.6
14.1
45.6
16.7
Weight
cut-off
value
4.8
6.3
16.5
13.1
6.7
20.7
3.4
2.9
5.7
Total
number of
cases in
the class
11
24
29
48
17
42
25
31
23
Weight
redistribution
factor
1.267
1.807
1.447
1.721
1.548
1.072
1.157
1.000
1.000
The weight trimming introduced some potential bias to the estimates to the extent that the
trimmed cases differed in terms of the characteristics of interest from the other water systems in the
estimation stratum. However, for all 13 trimmed cases, the reduction in variance was so large that it is
anticipated to outweigh any potential bias introduced.
2.2.4 Variance Estimation
In this survey, a replication method, jackknife, was used to estimate the sampling variance.
This method provides unbiased estimates of variance for the two-phase sample design used in this survey.
The variance estimation was carried out in three steps: (1) forming the replicates, (2)
constructing the replicate weights, and (3) computing the estimates of the variances of the survey estimates.
After the replicates were formed and the replicate weights were constructed, the variance
estimate (and the confidence interval) was computed easily for any statistic of interest using WesVarPC.
The variance estimate for a statistic was computed as the sum, over replicates, of the squared differences between the estimate obtained using each replicate weight and the estimate obtained using the full sample weight, with each squared difference multiplied by the finite population correction factor. That is, the estimate of the variance of a statistic Ŷ, denoted v(Ŷ), was:

    v(Ŷ) = Σ_{r=1}^{M} δ_r (Ŷ_r - Ŷ)²

where

    Ŷ_r is the weighted estimate obtained using the r-th replicate weight,
    Ŷ is the weighted estimate obtained using the full sample weight,
    δ_r is the finite population correction factor assigned for the r-th replicate, and
    M is the number of replicate weights.
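For illustration only (this is not the WesVarPC implementation), the formula can be evaluated directly from the full-sample and replicate estimates; the function and argument names below are assumptions for the example.

    import numpy as np

    def jackknife_variance(theta_full, theta_reps, fpc):
        """theta_full : estimate computed with the full sample weight
           theta_reps : estimates computed with each of the M replicate weights
           fpc        : finite population correction factor for each replicate
           Returns v(Y-hat) = sum_r fpc_r * (Y-hat_r - Y-hat)**2."""
        theta_reps = np.asarray(theta_reps, dtype=float)
        fpc = np.asarray(fpc, dtype=float)
        return float(np.sum(fpc * (theta_reps - theta_full) ** 2))

    # A 95 percent confidence interval then follows as
    # theta_full +/- 1.96 * sqrt(jackknife_variance(...)).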
(See K.M. Wolter, 1985, for more information about variance estimation.) We now discuss
the development of the replicates to estimate the variability within each Phase I variance stratum.
Replicates
The CWSS sample included 27 noncertainty Phase I sampling strata. The variability within
each noncertainty sampling stratum was estimated by constructing two replicates for each stratum. Within
each stratum the replicates were constructed as follows:
The Phase I sample water systems were placed in their sample selection order and
paired successively while in this sort order. Within each pair, the systems were
alternately assigned to variance unit 1 and 2 with a random start. The pairs were
further collapsed to form two clusters, and each cluster was then assigned to one of the
two variance strata.
Thus, in total, 54 variance strata (two for each noncertainty sampling stratum) were assigned to the systems that were selected from the noncertainty strata.
The water systems that were selected with certainty were not assigned to any variance stratum
because they do not contribute to sampling variability.
Replicate Weights
The r-th replicate base weight for the k-th water system in the j-th variance unit and l-th variance stratum, W_ljk^(r), was derived as:

    W_ljk^(r) = 2 W_ljk   if r = l and j = 1,
              = 0         if r = l and j = 2,
              = W_ljk     if r ≠ l,

where

    r = 1, 2, ..., 54, and
    W_ljk is the full sample base weight for the k-th water system in the j-th variance unit and l-th variance stratum.
The 54 replicate base weights were all set equal to the full sample base weight for those cases that were
selected with certainty and consequently not assigned to a variance stratum.
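The following sketch shows one way such replicate base weights could be built. It assumes the conventional paired-jackknife rule, in which the replicate for a given variance stratum doubles the weights in variance unit 1 and zeroes those in variance unit 2 of that stratum; the function, its arguments, and that convention are illustrative assumptions, not a transcription of the production weighting programs.

    import numpy as np

    def replicate_base_weights(base_weight, var_stratum, var_unit, n_reps=54):
        """base_weight : full sample base weight per system (length-n array)
           var_stratum : variance stratum 1..n_reps per system, 0 for certainty cases
           var_unit    : variance unit (1 or 2) per system
           Returns an (n, n_reps) array of replicate base weights."""
        base_weight = np.asarray(base_weight, dtype=float)
        reps = np.tile(base_weight[:, None], (1, n_reps))
        for r in range(1, n_reps + 1):
            in_stratum = (np.asarray(var_stratum) == r)
            reps[in_stratum & (np.asarray(var_unit) == 1), r - 1] *= 2.0  # doubled unit
            reps[in_stratum & (np.asarray(var_unit) == 2), r - 1] = 0.0   # dropped unit
        return reps   # certainty systems keep the full sample weight in every replicate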
All the remaining full sample weighting steps leading to the final full sample weight were
performed on each replicate. These included the Phase I nonresponse adjustment, the weighting for the
Phase II sampling, the weight adjustment for the aggregates, and the Phase II nonresponse adjustment.
By repeating the various weight adjustment procedures on each set of replicate weights, the
effect of these procedures on the sampling variance of the estimator Ŷ is appropriately reflected in the
variance estimator, v(Ŷ), defined above.
Finite Population Correction Factor
A finite population correction factor was derived for each first-phase variance stratum. The
factor was computed as 1 minus the ratio of the number of systems that completed the mail-out
questionnaire to the estimated number of eligible systems in the frame for the stratum. The factors were
then assigned to the replicates corresponding to the strata.
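Written out with an illustrative stratum index h, this is δ_h = 1 - m_h / N_h, where m_h is the number of systems in the stratum that completed the mail-out questionnaire and N_h is the estimated number of eligible systems on the frame for that stratum; δ_h then serves as the correction factor δ_r for the replicate(s) formed from that stratum.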
The confidence intervals that were computed using the replicate weights and WesVarPC are
presented in Part 2 of this report. Each confidence interval presented in Part 2 is based on the assumption
that the average values for the systems represented in a given table cell are normally distributed. In
general, this assumption is true. However, calculations based on small numbers of systems may violate
this assumption. In such cases the reported confidence intervals will not be correct. Most of these can be
identified by noticing when the plus/minus confidence interval width is larger than, or almost as large as,
the calculated average itself. To compute correct confidence intervals for such situations requires
examination of the empirical distributions for each variable in the tabulation and is beyond the scope of this
report.
References:
Brick, J.M., Morganstein, D.R., and Wolters, C.L. "Additional Uses for Keyfitz Selection." Proceedings of the Section on Survey Research Methods, American Statistical Association, 1987.
Wolter, K.M. Introduction to Variance Estimation. Springer-Verlag, New York, 1985.
3. TELEPHONE SCREENER SURVEY
3.1 CATI Screener Design and Programming
Westat developed a telephone screening questionnaire in cooperation with EPA (see Appendix
A). The main purposes of this instrument were to screen the first phase sample for eligible water systems,
collect the data needed to place the system in the correct sample stratum, and collect the name and address
of the appropriate respondent for the mail survey.
The screener questionnaire was a computer-assisted telephone interview (CATI) instrument.
This computerized questionnaire was programmed according to the document entitled Community Water
System 1994 Survey, CATI Screening Questionnaire, CATI Specs #6 (Final), December 22, 1994. This
document appears as Appendix A.
Because FRDS records often contained ambiguous or incomplete information about the water
systems, Westat also developed a paper Contact Questionnaire that was used to assist the interviewers in
finding the sampled water system and the appropriate person to complete the screener questionnaire. The
Contact Questionnaire is presented in Appendix B.
3.2 Telephone Interviewer Training
Training for the telephone portion of the CWSS took place on November 17, 1994. The
training was conducted by a Westat telephone operations manager and CWSS project staff using a
prepared agenda and a series of prepared training scripts. The training agenda appears as Exhibit 3-1.
Telephone supervisors who would be responsible for supervising the actual work of the interviewers also
attended the training sessions.
The training of the telephone interviewers and supervisors maximized trainee involvement and
participation in the learning experience while providing the trainers ample opportunity to observe and
evaluate individual trainee performance.
Exhibit 3-1 CWSS Telephone Interviewer Training Agenda
Subject Time
Introduction and Overview 30 Minutes
Interactive Exercise I 30 Minutes
Contact Procedures 60 Minutes
Break 20 Minutes
Contact Interactive Exercises and Roleplays 60 Minutes
Lunch 60 Minutes
Interactive Exercise II 60 Minutes
Refusal Avoidance 45 Minutes
Break 20 Minutes
Role Plays 90 Minutes
Questions and Answers 30 Minutes
Interviewers received eight hours of training in areas specific to the CWSS, including
overview of the study, questionnaire content, specific question issues, sample, making contact, recording
the call results, sensitivity to respondents, handling problems, and avoiding refusals. Previously, they had
received eight hours of training in general telephone interviewing techniques and the use of the CATI
system. The CWSS interviewing staff was comprised of experienced interviewers, who had previously
conducted telephone surveys of business establishments.
Each CWSS interviewer received an Interviewer Manual containing general instructions,
specific procedures for conducting the telephone interviews, and detailed explanations of each telephone
interview question.
The principal training tool consisted of several interactive group exercises. Interviewers took
turns administering the paper Contact Questionnaire and the CATI screener to the trainer (who played the
role of the respondent), by reading the questions aloud and recording the responses in the instruments. The
trainer's responses followed scripted answers prepared in advance. The answers were composed to
simulate specific situations and problems that could be encountered in an actual interview. Because of the
occasional ambiguity of information recorded in FRDS, emphasis was placed on making contact with and
confirming the identity of the actual sampled water system.
At the conclusion of training, interviewers paired off to practice what they had learned by
following scripted role-play interviews, using the actual instruments, telephones and CATI terminals in the
interviewer carrels. For each role play, one interviewer conducted the interview, while the other played
respondent. Training staff monitored the role plays, and interviewers were assigned to live calling only
after having successfully conducted the role play interviews.
3.3 Telephone Data Collection
Pursuant to the sampling design discussed in Chapter 2, the CWSS Phase I (telephone)
sample consisted of 5,856 water systems. The remainder of this section documents the data collection
procedures and results of the CWSS telephone effort.
Telephone Data Collection Procedures
Tracing. The starting point for contacting the water system was the name, address, and
telephone number available in FRDS. However, 40 percent of the sampled systems had no valid phone
number on their records. In addition, the presence of a number on the FRDS record was no guarantee that
the number was correct. (It is important to keep in mind that a large number of water systems are very
small and are managed out of private residences.) Therefore, prior to the start of telephone data collection,
the sample was processed through Bizmatch, an automated telephone number look-up service for
businesses. A new number found through this process was added to the contact information (if any)
obtained from the FRDS records. Systems with no telephone number were, at this point, sent to trained
tracers, who attempted to find them through Directory Assistance prior to the start of calling. As new
telephone numbers were found, they were entered on the Respondent Information Sheet (RISs are described
below), batched, and then entered into an automated telephone number updating system. Final RISs, with
all available telephone numbers, were printed for the start of telephoning.
Systems whose telephone numbers were still non-locatable at the start of data collection, as
well as those determined to have "bad" numbers during calling, went through additional tracing. Tracing
steps consisted of a progressive series of contacts or lookups with the following potential sources of
information about the identity and phone number of individual water systems:
• Small Systems ( < 1,000 residents served)
- Directory Assistance
- National Rural Water Assoc. (NRWA) State Association
- City/County Administrative Offices (e.g., City Hall, County Health Department)
• Medium and Large Systems (1,000 or more residents served)
- Directory Assistance
- Intensive searches of CD ROM telephone listing databases
- NRWA State Association
- City/County Administrative Offices (e.g., City Hall, County Health Department)
For smaller systems, tracers also looked for phone numbers under the names of any
individuals, listed as contacts in the FRDS records, since smaller systems are often operated as personal
businesses out of an individual's residence, or are managed from the residence of an officer of a residential
cooperative association. Tracers filled out a specially designed CWSS Tracing Procedures Report Form
for each tracing case. It was used to record a detailed description of what was done for each step. This
form stayed with the case throughout data collection; if further tracing was needed, the next logical step in
the sequence was carried out. As a new telephone number was obtained, it was recorded on the RIS.
Case Management. Each water system was treated as an individual case to be carefully
managed and controlled throughout the various steps of the telephone data collection process. Telephone
interviewers for the CWSS used a Respondent Information Sheet, paper Contact Questionnaire, Call
Record, and CATI questionnaire when working a case. Each of these is described below:
• Respondent Information Sheet (RIS) - Contained contact information about the CWS,
including name, address, and possible telephone numbers. The form was used to identify
the specific CWS the interviewer was trying to reach, how to reach it, and to record any
new contact or location information.
• Contact Questionnaire - The Contact Questionnaire was a short series of questions
administered on paper. It was used to verify that the interviewer had reached the exact
CWS listed on the RIS, and to identify an appropriate respondent for the CATI interview.
• Call Record - Basic document that was used to record the results of all calls and to
manage the flow of work. Each case was accompanied by a Call Record.
• CATI Questionnaire - The screening questionnaire that was administered to a person
deemed knowledgeable about the CWS. It collected information about the CWS's size,
ownership type, water sources, and related topics, and concluded by securing the name
and address of the most appropriate person to respond to the mailed CWSS questionnaire.
Using these forms and systems, the interviewers were able to fulfill the objectives of the Phase
I telephone screening process:
• Confirm or correct existing information about each system and collect some additional
information about it, such as its size, type of ownership, and water sources, to determine
if the system met the criteria to participate in the full study.
• Identify a person at the water system who was knowledgeable enough to respond for the
CWS about the topics addressed in the telephone interview questions.
• Identify the name and address of the most appropriate representative of the water system
to respond to the mail questionnaire.
3.4 Telephone Data Collection Results
The final outcomes of the telephone data collection process for the 5,856 Phase I water
systems can be categorized into eight groups and are summarized in Table 3-1.
• Complete - The interview was successfully completed with the CWS and the responses to
the screener determined that the water system was eligible to be sampled for the main
survey.
• Ineligible - A system was considered ineligible for the survey under three different
scenarios:
- Based on responses to CATI Questionnaire items S1 - S3, it was determined
that the system did not meet the definition of a Community Water System
(Water systems that provide piped drinking water to at least 25 permanent
residents or 15 household water connections)
- Based on responses to CATI Questionnaire item S12, it was determined that
the CWS is owned or operated by a state or the Federal government
- Without being able to administer the CATI Questionnaire, it was determined
from other reported information that the CWS was ineligible (e.g., an
institutional water system had ceased to provide water to a nearby
community).
• Non-locatable - The CWS could not be located through any available telephone numbers
or tracing efforts. Only five percent of the CWS sample was assigned this disposition.
• Refusal - The CWS refused the interview or refused to answer the critical eligibility
questions S1 - S3. (To minimize any possible impact of the survey effort on relations
between EPA and the regulated community, the CWSS design omitted the survey
technique of making a second effort to "convert" those who initially refuse cooperation.)
• No Answer - There was no answer at the best available phone for the CWS after at least
eight calls; call attempts were spread out over several weeks at different times of day and
days of the week, including weekends; directory assistance checks revealed no other
possible phone number for any entity with a name similar to the CWS.
• Maximum Contact - After establishing contact with the CWS and making at least eight
calls, it was not possible to complete an interview with the CWS.
• Out of Business - Telephone contact or other source provided definitive information that
the CWS was no longer in business (ceased operations, integrated operationally into
another system, etc.).
• Other - This code was used when no other final code applied.
Table 3-1 quantifies these results of telephone data collection. Due to the extensive tracing
efforts, Westat succeeded in positively locating 93 percent of the sample; 5.2 percent were non-locatable
and 1.8 percent never answered the phone. The final response rate was 89 percent. This was calculated by
the following formula:
    Response rate = (Total Complete + Ineligibles) / (Total Sample - Out of Business)
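Using the counts in Table 3-1, this works out to (4,729 + 393) / (5,856 - 79) = 5,122 / 5,777, or about 88.7 percent, which rounds to the 89 percent figure cited above.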
Note that this response rate and Table 3-1 are presented as operational results, that is, they
reflect the status of cases as of the end of telephone data collection. As discussed in Sections 2.1.3, 2.2.2,
and 3.5, some further adjustments to these statuses and/or assumptions about the eligibility of non-
interviewed cases were made at later stages of quality assurance and sample weighting.
Table 3-1 CWSS Telephone Screener Results
Screener Outcome | Number of Cases | Percent of Sample
Complete | 4,729 | 80.8
Ineligible | 393 | 6.7
Non-locatable | 302 | 5.2
Refusal | 149 | 2.5
No Answer | 106 | 1.8
Maximum Contact | 97 | 1.7
Out of Business | 79 | 1.3
Other | 1 | 0.0
Total | 5,856 | 100.0
The Phase II (mail) sample was drawn from the 4,729 CWSs that completed the CATI
interview.
3.5 Detailed Examination of Screener Outcomes
Section 2.2.3 discussed the derivations of the final CWSS sample weights. As part of the
quality control process for the weighting, the estimate of the total number of active Community Water
Systems derived from the initial round of weights was compared to independent estimates derived from
ongoing EPA inventory efforts associated with the maintenance of the FRDS database. This comparison
revealed a small difference between the two estimates that could not be explained, after allowing for normal
sampling error or for definitional differences between the CWSs covered by the survey and the universe of
Public Water Systems contained in FRDS. Further analysis revealed that the likeliest explanation for this
small remaining difference lay in the eligibility rates encountered during the Phase I telephone survey.
Review of screening results suggested that a number of systems may have provided mistaken
responses to screening questions that led to their being erroneously classified as ineligible (i.e., did not meet
the CFR definitions of a CWS). To investigate how often this occurred, the Cadmus Group, Inc., worked
with Westat, Inc., to review the attrition of systems from the FRDS file to the Phase I Sample Frame and
the Phase II Sample Frame. Attrition was due principally to determination of either of two circumstances
during the two sampling phases: ineligibility (a system was not really a Community Water System under
the CFR definition), or it was out of business (no longer actively operating as a water system).
Specifically, the analysis investigated whether there might be evidence that systems classified as ineligible
or out of business might actually have been eligible and actively operating, specifically within a timeframe
reasonably close to the survey sampling reference period, late 1994. To alter the findings of the Phase I
CATI screener, the types of evidence finally accepted by all analysts working on this issue included the
following:
1. The system is a large system serving 10,000 or more people. Six systems met this criterion, and three of the six had populations over 100,000.
2. The system was classified as "out of business" during the CATI screener survey, but
FRDS records indicate various activities that strongly suggest that the system was in
business in late 1994. Examples of activity in FRDS include: violations, enforcement
actions, or changes in inventory data such as population served or number of
connections.
3. The system was classified as "not meeting the definition of a Community Water
System" during the CATI screener survey, but FRDS records show that the system was
an active CWS. A stricter standard was applied to these cases before reversing the
screener findings, because survey interviewers had received information directly from
the system personnel that led to classifying the system as "not meeting the definition of
a CWS." If the FRDS inventory showed that the system met the definition of a CWS,
and if FRDS also showed that the system had enforcement actions taken against it, the
analysts concluded that the system was an active CWS. Enforcement actions require
substantial investments of time and effort by State drinking water agencies, and it
seemed unlikely that States would make such investments if the system were not an
active CWS.
4. The system was classified as "not meeting the definition of a CWS" during the CATI
screener survey, but FRDS records show that the system was a CWS. Again, a strict
standard was required, for the same reason as in paragraph 3. The evidence required to
prove that a system was a CWS was that the FRDS inventory showed that the system
met the definition of a CWS, together with a FRDS record of complex violation patterns (e.g.,
violation of a maximum contaminant level, a repeat or minor routine violation of the
Total Coliform Rule, a violation of the Surface Water Treatment Rule treatment
technique standard). These violations, like the enforcement activity above, require a
substantial investment of time and energy by the State drinking water agency. It
seemed unlikely that such investments would occur if the system were not an active
CWS.
For consistency with the CATI Screener, evidence from FRDS was collected for the fourth
calendar quarter of 1994 and also extended through all four calendar quarters of 1995, to reflect any
lagging information about the system's status.
The result of this examination of screener outcomes was to increase the final estimated
number of eligible CWSs on the FRDS frame by one to two percent over the preliminary estimates
developed in the initial stages of the weighting process. Ninety-seven systems classified as ineligible or out
of business during the telephone screening were reclassified as eligible and active by the review process.
The revised estimate of eligible CWSs and the concomitant adjustments to the final sample weights resulted
in survey estimates of a total number of CWSs that are reasonably similar to the estimates from
independent EPA inventory activity.
4. MAIL SURVEY
The principal components of the CWS mail survey were the data collection instruments and
the operational process of distributing the questionnaires, assuring a sufficient response rate, and handling
the returned questionnaires. Chapter 4 describes the important aspects of these components.
4.1 Mail Questionnaire Design
EPA took the lead in designing the content of the mail questionnaires. Separate questionnaire
forms were tailored to the specific requirements of the three types of ownership: Public (publicly owned),
Private (privately owned), and Ancillary Community Water Systems. An ancillary system is one that
operates a drinking water system as a secondary component of its main business, such as a trailer park.
These instruments appear as Appendices C through E. The bulk of the contents of these three
questionnaires consisted of an identical "core" series of questions pertaining to the water system's
operating and financial characteristics; however, a few questions unique to ownership type were also
developed for each of the three. These differences primarily occurred in the Financial Information section
of the questionnaires. For example, questions about revenue sources and billing structures differed across
the questionnaires.
Westat worked with EPA on methodological aspects, such as:
• Wording and organization of the questions;
• Feasibility of asking various kinds of questions;
• Maximizing response rates and data accuracy for certain detailed or complex items;
and
• Layout and design of questionnaire forms.
Westat was also responsible for documenting and incorporating all revisions over the various
design and test versions of the questionnaires.
During the initial design process, and during the redesigns following the pretest and pilot test
(see Section 5.1, below), the EPA project officer consulted with a range of EPA regulatory and analytical
staff, representing expert advisors and future users of the data, to identify and correctly present the broad
survey topics and specific survey questions to be included in the questionnaires. These covered such areas
as water production and storage, distribution, treatment plant operations, source water protection, detailed
financial information about water sales revenue and customer data, operating expenses, balance sheet
items, and capital investment. Experts inside and outside EPA were consulted in such areas as engineering,
pollution control, finance, and regulatory development.
The goal of the mail questionnaire design process was to strike a balance between collecting
complete, accurate, detailed data and reducing respondent burden. Towards this end, data from the pilot
test responses were analyzed to determine areas that needed clarification or should simply be eliminated
from the full study. In general, pilot respondents used the questionnaire form correctly and followed
instructions. Most respondents seemed able and willing to provide the responses to most questions. For a
more detailed discussion of the pilot test, refer to Section 5.1.
Prior to finalizing the forms for the full study, improvements were made to the general
questionnaire layout, instructions, and some individual items. For example, one finding from the pilot test
was that a surprising number of systems recorded dollar amounts to the penny, probably because they were
transcribing from financial records. Since fractional dollars were not being entered into the data file, the
reporting of pennies could introduce errors into the keyed data and the need to edit out the pennies added to
processing time. Therefore, the forms were re-designed to make it very clear that only whole dollars should
be recorded. It was also possible to significantly reduce the burden and complexity of the final instruments
by eliminating a separate question that was asked for each line item in questions about financial data: an
indicator of whether the data were being reported directly from accounting records. Based on response
rates to these questions and EPA's assessment of the feasibility of incorporating these responses into future
analyses, the entire set of questions was dropped from the final instruments.
4.2 Mail Survey Operations
EPA and Westat designed the questionnaire forms so that they were "self-mailers", that is,
they did not require an external carrier envelope for mailing. The cover of the questionnaire was card
stock, making it durable enough to withstand postal processing. An EPA logo, the name of the survey, and
return address were placed in the upper left corner of one side of the cover. The other side contained a
large EPA logo and the survey name, specific to ownership type (e.g., Survey of Public Community Water
Systems). The forms were designed so that a return envelope could be glued to the inside front cover. The
package was closed with a "wafer" seal.
Westat prepared camera-ready versions of each of the three mail questionnaires as well as the
business-reply envelope. Westat coordinated with EPA's print facility to produce the bulk copies.
Westat produced three mailing information and control labels for each sampled system's
questionnaire. Information for the mailing labels (CWS Address Labels) was extracted from CATI
screener Question S17, which recorded the name and address of the person the screener respondent felt
would be best suited to respond to the mail questionnaire. Additionally, an indicator of the type of CWS
(publicly owned, privately owned, or ancillary) was placed in the corner of the label. A second label, with
the specific FRDS name and address of the CWS, was produced for the inside of the questionnaire,
explicitly instructing the respondent to refer only to the sampled system when answering the questions.
This CWS Information label was a critical quality assurance measure for the data collection process, since
the appropriate respondents were often located somewhere other than at the address of the sampled system,
worked for entities with different names from the sampled system, and/or managed more than one water
system. While explicitly pinpointing the sampled system was essential to data validity, especially in multi-
system operator situations, the effectiveness of this technique was marginally affected when the FRDS
identifying information contained problematic data. However, it was better to provide anomalous
information than to provide no indication: it was assumed that the data as recorded in FRDS would have
meaning for personnel at the water system, since the system had at some point provided this information.
The third label contained the Cadmus toll-free telephone support line information.
Actual mail processing operations occurred in three steps. Each of these is described below.
Pre-Production Processing
Prior to mail processing, there were several tasks that needed to be accomplished:
• Because it was decided to use a single color for the covers of all questionnaire
booklets, Westat affixed color-coded wafers to the exterior back cover of each
questionnaire. Doing so would make handling the questionnaires easier during the
data processing stage. When the wafers were attached, the workers checked the
print quality of every 50th questionnaire to ensure that there were no batch printing
problems.
• Business-reply envelopes were glued into the interior front cover of all
questionnaires.
• Telephone support line labels were affixed in the appropriate location inside the
questionnaire.
Production Processing
The Washington Consulting Group, who conducted the advance notification calls to the
CWSs to alert them that a questionnaire was being sent to them, provided the results of these calls to
Westat in two waves. The mailing was dependent on call results because EPA honored the wishes of a few
systems who indicated during the phone call that they did not choose to participate in the survey. These
systems were immediately recorded in the CWSS receipt control system as refusals (see below). Therefore,
questionnaires were processed in two waves: Wave 1 was mailed on June 1, 1995; Wave 2 was sent on
June 9, 1995. The set of CWS Information and Address labels for each system was printed. Within each
wave, questionnaires were processed in two batches: Multi-system respondents and all other cases
("regular" cases).
• Multi-System Respondents - For several hundred systems, the CATI screener
identified respondents who would be responsible for anywhere from two to seven
of the sampled systems. Often, these people were named independently on a case-
by-case basis during the direct contacts with individual systems. Westat used a
combination of computer logic and manual review of all sampled systems to
identify situations where the same respondent had been identified for more than
one system. For reasons of courtesy, improved response rate, and respondent
efficiency, an extra effort was made to identify these multi-system respondents and
send them all their questionnaires in a single batch. Special indicators were put on
the computer records of multi-system respondents and a computer program
segregated and associated the mailing labels for each multi-system respondent.
The questionnaires for a given respondent were placed in a single envelope, along
with a multi-system cover letter from EPA, acknowledging the respondent's extra
effort in completing multiple questionnaires. Before the envelopes were sealed,
cross checks were made against a listing of multi-system respondents to ensure that
each respondent was being sent the correct questionnaires.
• Regular Cases - Regular questionnaires were processed in batches by system
ownership type, thus ensuring that each respondent got the type of questionnaire
appropriate to his system.
The mail operations manager conducted spot checks of the questionnaires before
they were sealed to ensure that the respondent was receiving the appropriate type
of questionnaire for his system as determined by the type of system code on the
address label; and that the case control ID number on the Address Label and CWS
Information Label matched.
Across the two mail waves, Westat sent questionnaires to 3,640 of the 3,681 systems
sampled. The remaining 41 systems are accounted for as follows:
• Pilot Test (11) - For reasons of statistical validity, systems that had been sampled
for the pilot test were not excluded from the sample frame for the full study. Some
of these systems were sampled for the full study. However, questionnaires were
sent only to systems that did not reply during the pilot test. The eleven pilot test
systems that did reply were excluded from further contact during data collection
either because they had returned a completed questionnaire during the pilot (n =
8), or they had refused to participate in the pilot (n = 3). Operationally, these
eleven systems were handled as if they had responded to or refused participation in
the full survey. Discrepancies between the pilot test questionnaire and final
versions were reconciled during the expert review and data retrieval stage.
• Large Multi-System Operator (15) - A large operator of water systems was
provided an advance copy of the questionnaire as a courtesy. Fifteen of these
systems used the advance copy to respond slightly ahead of the data collection
period, and therefore were not sent a questionnaire during the actual data
collection.
• Advance Calling (15) - The Washington Consulting Group made advance calls to
the water systems immediately before the questionnaires were mailed; 12 systems
refused to participate and three were out of business.
Post-Production Processing
In order to enhance the response rate as well as improve the quality of the data, several operations
were implemented after the questionnaires were mailed. These included:
• Reminder Calls - During the summer of 1995, Cadmus conducted a series of
reminder calls to systems that had not yet responded to the survey. Callers stressed the
importance of the survey data and encouraged systems to respond as soon as possible.
Cadmus senior staff assisted respondents who requested help in completing the survey.
Late in the response period, Cadmus conducted a second round of reminder calls to
follow up on respondents who had indicated they would submit surveys during the first
round of calls.
• Toll-free Support Line - During data collection, Cadmus maintained a toll-free
support line to answer respondents' technical and administrative questions. Cadmus
senior staff walked through survey questions with callers as necessary and provided
guidance on how to represent respondents' specific situations on the survey
instrument.
• Remails - Respondents, either as a result of the reminder calls or from contacting
the support line, sometimes requested that a new questionnaire be sent to them.
Westat processed these requests in batches, typically on a weekly basis.
As questionnaires were received from the water systems, Westat logged them into a receipt
control system. The typical path of a returned questionnaire was as follows. Completed questionnaires
were first received at Westat. They were then sent in batches to The Cadmus Group for data quality
review and possible data retrieval. Upon return to Westat, they were logged back into the system and then
passed through a detailed data review and preparation process, to ensure consistent and clear recording of
the responses on the forms. Any special problems were noted in a permanent log and resolved at the end of
the process. The questionnaires were then key-entered using 100 percent verified double-key entry. After
entry, the data were run through automated cleaning and editing programs that checked each variable for
proper values and ranges, and checked skip patterns. Items failing these checks were examined and either
confirmed or corrected. Questionnaires that reached this stage were considered to be entered and cleaned.
(Subsequent to this point, the data were subjected to intensive computer checks that fell outside the scope
of the mail receipt and processing operation. See Section 5.7.)
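As a simple illustration of what such range and skip-pattern edit checks look like (the field names and rules below are hypothetical, not the actual CWSS edit specifications):

    def edit_checks(record: dict) -> list:
        """Return a list of edit-check failures for one keyed questionnaire record."""
        problems = []
        # Range check: population served must be a plausible positive whole number.
        pop = record.get("population_served")
        if pop is None or not (1 <= pop <= 10_000_000):
            problems.append("population_served out of range")
        # Skip-pattern check: treatment items apply only if the system treats its water.
        if record.get("treats_water") == "No" and record.get("treatment_type"):
            problems.append("treatment_type answered although the skip pattern says skip")
        return problems

    flagged = edit_checks({"population_served": 0, "treats_water": "No",
                           "treatment_type": "Chlorination"})
    # -> both checks fire; such items were examined and either confirmed or corrected.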
Table 4-1 presents an example of the information that was provided to The Cadmus Group
and EPA in the weekly receipt control status report. By examining this report, project managers could
easily track the progress of the survey. The status codes included:
• Received/At Cadmus - Cases assigned this code had been received by Westat and
batched to The Cadmus Group for review.
• Remailed - Cases had been remailed a questionnaire.
• Postal Return - Cases with insufficient or incorrect address that were returned by
the US Postal Service as undeliverable and were not relocated on subsequent
contact.
• Refused - Respondents returned a blank questionnaire or a note indicating that they
did not wish to participate.
• Out of Business - CWS was no longer in business.
• Outstanding - No returned questionnaire was received as yet and no remailing had
been requested.
• Back from Cadmus - Cases that were returned to Westat from The Cadmus Group
upon completion of the data quality review.
• In Data Entry - Cases whose data were being entered.
• Clean - Cases whose data had been entered and successfully passed the standard
automated cleaning process.
Table 4-1 Example of Data from CWS Receipt Control Status Report
Number of Questionnaires
Status | Public | Private | Ancillary | Total
Received/At Cadmus | 3 | 2 | 0 | 5
Remailed | 259 | 173 | 72 | 504
Postal Return | 3 | 3 | 5 | 11
Refused | 148 | 187 | 90 | 425
Out of Business | 1 | 2 | 0 | 3
Outstanding | 318 | 306 | 84 | 708
Back from Cadmus | 0 | 0 | 0 | 0
In Data Entry | 0 | 0 | 0 | 0
Clean | 1,052 | 757 | 193 | 2,002
Total | 1,784 | 1,430 | 444 | 3,658
4.3 Mail Survey Results
Receipt of returned questionnaires was closed out on March 28, 1996. Table 4-2 presents the
final status of the entire CWSS mail sample. Of the 3,681 cases in the mail sample, 2,004 returned a
questionnaire. The overall response rate for the mail survey is 54 percent. Table 4-3 shows the response
rate broken down for each of the 38 sample strata.
Table 4-2 Final Status of CWS Mail Cases

Case Disposition | Number of Cases | Percent of Sample
Complete | 2,004 | 54.4
Questionnaire not returned | 1,212 | 32.9
Refused | 425 | 11.5
Non-usable returns | 25 | 0.7
Post office return | 11 | 0.3
Ineligible/Out of Business | 4 | 0.1
Total | 3,681 | 100.0
Table 4-3 CWSS Mail Response Rates by Stratum
Phase II (Mail Survey) Sample Strata
Ownership Type | Population Served | Primary Water Source | Response Rate (%)
Public | <100 | Ground | 53.6
Public | <100 | Surface | 56.7
Private | <100 | Ground | 41.6
Private | <100 | Surface | 42.4
Ancillary | <100 | Ground | 43.9
Ancillary | <100 | Surface | 39.4
Public | 100-500 | Ground | 58.4
Public | 100-500 | Surface | 50.3
Private | 100-500 | Ground | 45.0
Private | 100-500 | Surface | 51.5
Ancillary | 100-500 | Ground | 42.9
Ancillary | 100-500 | Surface | 48.2
Public | 501-1,000 | Ground | 56.9
Public | 501-1,000 | Surface | 55.6
Private | 501-1,000 | Ground | 51.5
Private | 501-1,000 | Surface | 61.0
Public | 1,001-3,300 | Ground | 64.8
Public | 1,001-3,300 | Surface | 56.3
Private | 1,001-3,300 | Ground | 52.6
Private | 1,001-3,300 | Surface | 56.6
Public | 3,301-10,000 | Ground | 59.1
Public | 3,301-10,000 | Surface | 54.3
Private | 3,301-10,000 | Ground | 69.0
Private | 3,301-10,000 | Surface | 59.1
Public | 10,001-50,000 | Ground | 57.3
Public | 10,001-50,000 | Surface | 64.9
Private | 10,001-50,000 | Ground | 51.2
Private | 10,001-50,000 | Surface | 64.6
Public | 50,001-100,000 | Ground | 64.0
Public | 50,001-100,000 | Surface | 73.3
Private | 50,001-100,000 | Ground | 72.2
Private | 50,001-100,000 | Surface | 51.7
Public | > 100,000 | Ground | 60.6
Public | > 100,000 | Surface | 75.9
Private | > 100,000 | Ground | 68.8
Private | > 100,000 | Surface | 54.5
5. QUALITY ASSURANCE AND PEER REVIEW
The quality assurance plan for the CWSS encompassed specific measures to check and ensure
the validity of the survey data during the various data handling and data processing stages, as well as
general quality assurance measures for the other survey components. Sections 5.1.1 and 5.1.2 discuss the
questionnaire pretest and survey pilot test. Section 5.2 presents quality assurance measures taken during
sampling. Sections 5.3 and 5.4 reference procedures undertaken during telephone and mail data collection.
Sections 5.5 through 5.11 describe quality assurance measures pertaining to the processing of questionnaire
data. In some instances, these discussions will highlight, in summary form, certain aspects of previous
discussions that relate directly to the topic of quality assurance. Section 5.12 describes quality assurance
measures taken during the preparation of this report to ensure accurate presentation of findings.
The Center for Environmental Statistics in EPA's Office of Policy, Planning and Evaluation
peer-reviewed both this Methodology Chapter, to ensure adherence to sound surveying principles, and the
data presented in Volumes 1 and 2, to ensure that the findings were appropriately represented. Particular
aspects of this survey were peer-reviewed by subject matter experts at different stages of survey
development. These reviews are described, where they occur, in the following sections of this chapter.
5.1 Draft Questionnaire Pretest and Survey Pilot Test
A significant component of the survey quality assurance plan was to thoroughly test the
questionnaire design, the survey design, and data collection procedures prior to implementing the full study.
Confirming the validity and effectiveness of these designs, or revising them when the tests revealed
problems, errors, or difficulties, led to design and process improvements that would have a positive effect
on the quality of the survey in such areas as data reliability, data completeness, accuracy of the sample
frame, and response rates. Peer review of draft questionnaires was provided by John Trax of the National
Rural Water Association and by Vern Achtormann, Waterstats Manager for American Water Works
Association until the Summer of 1996.
5.1.1 Pretest
When the questionnaire design had reached a point where the initial data collection objectives
had been identified and shaped into a working draft instrument, EPA conducted a pretest of this draft with
nine water systems in Maryland and Delaware, with the assistance of environmental engineer John Trax,
from the National Rural Water Association. These systems were recruited knowing they were participating
in a test. The main objective was to gauge the respondents' reactions to the questionnaire itself. The test
did not address any of the actual survey operations and response rate issues that would later be tested in the
full-scale pilot test.
The recruited systems received the questionnaire in March, 1994. All but one completed it.
The EPA project manager then conducted in-person debriefings with the respondents to explore such areas
as question comprehensibility, use of clear and appropriate terminology, provision of suitable response
categories, questionnaire layout, respondent's ease or difficulty in providing answers, respondent's
immediate knowledge of or access to the information required to fill out the questionnaire, and overall
reactions to the survey. Westat provided some guidelines to EPA to help determine what issues to address
in the debriefings and how to address them.
The pretest supported the view that the general objectives and design of the data collection
instrument were feasible. The pretest found no systematic problems in the respondents' ability to provide
answers to the questions. Principal outcomes of the pretest included the addition of some response
categories to certain questions to cover likely responses and changes in terminology to simplify wording or
reflect actual technical usage within the water industry. The ordering and grouping of the questions was
also improved.
5.1.2 Pilot Test
A full pilot test was conducted in August and September, 1994, to test all three versions of the
questionnaires and the major operational components of the survey design. The goal was to produce
findings to improve the design and procedures for the full study. A full report on the pilot test was
delivered to EPA on September 30, 1994 (Community Water System Survey Pilot Test Report). The
following sections highlight some of the quality assurance measures represented by the pilot test and its
findings.
Sampling. The pilot test incorporated the procedures and specifications developed for
processing, cleaning, and extracting needed data elements from the Federal Reporting Data Systems
(FRDS) database and for drawing the sample from the extracted records and variables. These procedures
performed as expected and achieved their intended goals. The principal goals included identifying and
eliminating duplicate records on the frame, improving missing or ambiguous ownership data, checking for
valid telephone numbers, stratifying the resulting frame according to the sample design, and drawing a
systematic random sample. Eighty-two water systems were selected for the pilot study. One of these,
which had participated in the pretest, was excluded at EPA's request, to avoid further burden on the
system.
Based on lessons learned during the pilot, adjustments were made to the frame preparation
and sampling plan. These changes were implemented to improve location rate, eligibility rate, and sample
yields, either across the entire sample or within specific strata. For example, analysis of ineligible systems
identified through the pilot test telephone screener supported the hypothesis that most FRDS records coded
as "historical" records should be excluded from the frame, since those were found to be ineligible or out of
business. In the interest of frame validity and coverage, it was necessary to test this situation empirically
and not rely exclusively on the general FRDS documentation, since adherence to FRDS protocols can vary
state by state.
Telephone Survey. The pilot-tested CATI questionnaire was programmed according to the
document entitled Community Water Systems 1994 Survey, CATI Screening Questionnaire, CATI Specs
#3 (7/25/94). The CATI software programs functioned in full conformity with these specifications
throughout the pilot test. In general, the design of the questions and the questionnaire structure also
accomplished their objective of screening the Phase I sample for eligible water systems, collecting the data
needed to place the system in the correct sample stratum, and collecting the name and address of the
appropriate respondent for the mail survey. Sixty-seven systems completed the CATI screener
questionnaire; 62 were eligible systems.
Based on monitoring and observing the pilot interviews by CWSS project managers and
feedback received from the telephone center managers and pilot test interviewers, changes were made to the
CATI questionnaire. These included question rewording to clarify technical distinctions for the respondent,
converting complex single questions into two simpler questions, and eliminating a few less important
questions and response categories.
Another pilot test finding in the area of data collection also contributed significantly to the
validity and coverage of the sample. Because of the sparseness or ambiguity of some of the system
information on the FRDS record, it was determined that the CWSS interviewer training should incorporate
additional instruction and practice in interpreting the FRDS sample and contact information, in order to
ensure successfully locating and accurately identifying sampled systems in the full study.
Mail Survey. For Phase II of the pilot test, questionnaires were mailed out to the 62 eligible
systems. Westat maintained a toll-free support line for sampled system respondents to call if they had
technical or administrative questions. Seven of the 62 systems used the line. Calls were a mix of technical
and administrative questions.
Twenty-one systems, about 39 percent, completed the mail questionnaire. The short pilot test
data collection period did not permit testing any of the response-rate enhancement measures carried out in
the full survey, such as reminder calls to non-respondents. Therefore a 39 percent response rate based on a
single mailing was satisfactory enough to confirm the basic designs of the questionnaires and data
collection plan.
The returned questionnaires were completely reviewed and coded by the Westat CWSS data
preparation manager. The data were double-key entered with 100 percent verification. Review of the
actual completed questionnaires and of frequency distributions of the keyed data led to some revisions in
the design, layout, wording, content, and structure of the mail questionnaires. Incorporating these changes
for the full study led to a clearer instrument with consistent response categories, particularly for items
related to financial data. For example, improvements were made to make recording units of volume,
length, and distance as simple and unambiguous as possible for the systems. Instructions were clarified
about how to record financial data. Some explicit instructions alerted respondents when items in different
questions should add up to the same totals.
Conclusion. The pilot tested many important study assumptions, designs, procedures, and
systems. A range of resulting adjustments and improvements were implemented prior to fielding the full
survey. The pilot also provided the opportunity to fully develop, try out, and gain hands-on experience
with many of the operational designs, procedures, systems, and tools before committing the resources of the
full study.
5.2 Sampling Quality Assurance
Quality assurance of the sampling process for the CWSS involved four principal areas:
• Review and cleaning of the FRDS file,
• Sampling specifications,
• Use of software designed specifically to draw complex stratified samples, and
• Review of sample tabulations.
FRDS File Review and Cleaning. For purposes of designing the sample and of preparing the
frame for drawing the sample, extensive qualitative and quantitative review was performed on the records
in the FRDS file. This FRDS review and cleaning addressed such issues as:
• The presence of necessary variables and appropriate values within those variables to
support a sample design that would meet the analytical objectives of the survey;
• The accuracy of the data on the file, for both sampling and data collection purposes; and
• The rate of missing data within each variable.
Based on the findings of this review process, a viable sample design was created and a set of
automated and manual data processing procedures were designed, documented, and carried out to improve
the quality of the frame data. This improvement was designed for both sampling purposes and also for
using the information on the frame to correctly locate and verify, during the CATI screening phase, the
exact water system represented by a sampled record. This process was described above, in Section 2.1.
Sampling Specifications. In order to carry out the two-phase sampling processes, the survey
statisticians prepared detailed specifications that served as directions for performing the sampling and as
permanent documentation of the process. In addition to clearly specifying the process of drawing the
sample, these specifications addressed the organization of the frame prior to sampling to ensure both proper
stratification and also normal distributions of other attributes of interest, such as geography. These
specifications ensured that the sample was drawn in conformity with the sample design and in a statistically
valid manner. Prior to being implemented, each set of specifications was reviewed by a senior statistician
and by a senior systems analyst.
Sampling Software. The CWSS samples were drawn using WESSAMP, a set of macros for
the SAS data analysis software package. WESSAMP was developed to standardize and automate the
drawing of survey samples; it has been extensively tested and proven over hundreds of surveys. It
minimizes the chance for human error and eliminates the need to write extensive custom programs to draw
a sample. It specifically supports and simplifies the process of drawing a complex stratified sample as
required for the CWSS.
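WESSAMP itself is proprietary Westat software, so the sketch below is only a generic illustration of the kind of within-stratum systematic selection that such a package automates; the function and its arguments are assumptions for the example, not WESSAMP code.

    import random

    def systematic_sample(stratum_records, n_sample, seed=None):
        """Select n_sample records from one (pre-sorted) stratum by systematic
        sampling: compute the interval, take a random start, then every k-th record.
        Assumes 0 < n_sample <= len(stratum_records)."""
        rng = random.Random(seed)
        k = len(stratum_records) / n_sample          # sampling interval
        start = rng.uniform(0, k)                    # random start in [0, k)
        picks = [int(start + i * k) for i in range(n_sample)]
        return [stratum_records[p] for p in picks]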
Review of Sample Tabulations. WESSAMP automatically produces appropriate tabulations
of sample statistics to be used by the sampling statistician to verify that the sample as drawn actually
conforms to the sample design and sampling specifications. The review of these tabulations confirmed that
the sample counts matched the design within each size/ownership/water source stratum, and also that the
selected systems conformed within stratum to other meaningful normal distributions that had been identified
on the frame.
5.3 Telephone Survey Quality Assurance
Several quality assurance measures were in place for the CWS telephone survey. These
measures, listed under the appropriate telephone survey component, are discussed below.
CATI Questionnaire Design and Programming
• The final version of the CATI questionnaire incorporated improvements made as a result
of the pilot test.
• Complete and detailed question-by-question specifications were prepared for every
questionnaire item, to unambiguously document for interviewers and analysts the
meaning, purpose, and context of all questions and responses.
• The CATI questionnaire was programmed according to detailed specifications. The
programmed CATI questionnaire was tested extensively prior to the start of data
collection to ensure that it was performing according to specification.
Telephone Interviewer Training and Performance
• CWSS interviewers had previously received 8 hours of general training in interviewing
techniques and CATI operation, and then received another 8 hours on specific CWSS
topics, such as procedures for contacting and verifying the water systems, administering
the CWSS CATI questionnaires, and sensitivity in presenting the survey to the sampled
water systems. The training followed a carefully prepared agenda. Exercises and role
plays were scripted in advance, to ensure consistency and comprehensive coverage of
important issues.
• Supervisory staff monitored the interviewers throughout the telephone data collection
period. Silent monitoring equipment ensured that interviewers could not determine when
an individual monitoring session was occurring. As needed, general adjustments or
specific instructions for the interviewing process or individual interviewers were made as
a result of the monitoring findings.
Telephone Data Collection
• Special data collection operations, such as tracing, improved response rates and sample
coverage and helped locate the correct sampled CWS.
• Where it was possible to identify them in advance, water systems owned or managed by a
single entity were grouped together ahead of time, to make the telephone contact more
efficient and less intrusive for the respondent.
Telephone Data Quality Assurance
There were a number of automated measures implemented for the telephone data collection to
address specific data quality items:
• Use of the CATI questionnaire assured consistent and accurate administration of the
questionnaire, with correct skip patterns, question choice, question wording, and response
categories, as appropriate to each system's response patterns.
• On-line range checks for every variable produced immediate interviewer prompts and
respondent probes whenever an entered value was outside of the expected or allowed
range.
• Water systems that could not provide an answer to several critical screening questions
could not be included in the final sample, since their eligibility would be unknown. A
series of special prompts and interviewer probes were designed and automatically called
up by the CATI system when a respondent initially failed to provide an answer to a
critical question. Those probes helped the interviewer work cooperatively with respondents
to arrive at an answer to the question. (A minimal sketch of this prompt-and-probe logic
follows this list.)
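The CATI software used for the survey was proprietary, but the range-check and critical-item probe behavior described above can be illustrated with a short, hypothetical Python sketch. The function and parameter names below are illustrative assumptions, not elements of the survey's actual system.

    # Hypothetical sketch of the on-line range check and critical-item probe logic
    # described above; the real CATI instrument implemented this behavior internally.

    def ask_with_checks(prompt, valid_range, is_critical, get_response):
        """Ask one item, re-prompting when the value is out of range or missing."""
        low, high = valid_range
        answer = get_response(prompt)

        # On-line range check: out-of-range entries trigger an immediate interviewer
        # prompt and a respondent probe before the value is accepted.
        if answer is not None and not (low <= answer <= high):
            answer = get_response(
                f"{prompt}\n[ENTERED VALUE OUTSIDE EXPECTED RANGE {low}-{high}; PLEASE VERIFY]"
            )

        # Critical screening items call up a special probe when initially unanswered,
        # so the interviewer can work with the respondent toward an answer.
        if answer is None and is_critical:
            answer = get_response(
                f"{prompt}\n[CRITICAL ITEM; PROBE RESPONDENT FOR A BEST ESTIMATE]"
            )
        return answer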
On a periodic basis during data collection and on a cumulative basis at its conclusion,
data preparation staff performed a series of range checks, skip pattern checks, and
problem resolution to monitor the collected data and correct it, if necessary. These
measures checked the operation of the CATI instrument, writing of data to the database
variables, use of response categories by interviewers, assignment of result codes, and
qualitative notes recorded in the CATI system and on paper by telephone staff. This data
review did not detect any errors in the automated data collection.
After completion of the telephone screening, systems classified as ineligible or out of
business were back-checked against FRDS data, using a series of strict criteria applied to
FRDS data elements that were independent of the original sample frame dimensions. This
provided an independent and parallel test of the reasonableness of the screener outcomes.
As a result, 97 systems were reclassified, leading to improvements in the sample weights
and final analytical tabulations of the weighted survey data.
5.4  Mail Survey Quality Assurance
Each component of the CWS mail survey was implemented pursuant to detailed written
specifications that clearly stipulated how each design was to be implemented.
Questionnaire Design
• The various drafts of the three mail questionnaires were the product of several rounds of
close review and comments by EPA, Westat, and independent peer review. Additionally,
improvements were made based on the pretest and pilot test (see Section 5.1).
• Peer review of the questionnaire was provided by the National Drinking Water Advisory
Council; John E. Peterson of the Government Finance Group; and Don Fraser, an
independent consultant with expertise in the operational characteristics of drinking water
systems.
• Questionnaire version control was maintained through the various drafts by hand-writing
all edits onto the hard copy master of the current version. Once the edits were made in the
master word processing file, the previous hard copy version was placed in a notebook and
the new version became the master. Each new version was dated and serially numbered.
• Design of the questionnaire form paid particular attention to the presentation and layout
of questions, response categories, response recording blocks, and instructions to clarify
and simplify for respondents the provision of the highly detailed and complex data
required for this survey. Graphic devices were used to make the forms clearer, simpler to
use, and attractive. These devices included choice and consistency in type fonts, sizes,
weights, and styles; question borders, text boxes, and shading.
Mail Data Collection
• The production of the physical components of the mailing package was designed to
minimize the chance of human error in assembling each water system's package.
• Workers preparing the questionnaires for mailing were provided with detailed written
specifications for the job. They were supervised by the mail operations manager, who had
assisted in the design of the specifications.
• While preparing the questionnaires for mailing, workers checked the print quality of every
50th questionnaire to ensure that there were no batch printing problems.
• The mail operations manager conducted spot checks of the questionnaires before they
were sealed to ensure that: a) the respondent was receiving the appropriate type of
questionnaire (Public, Private, or Ancillary) for his system as determined by the type of
system code on the address label; and b) Mail control ID numbers on the Address Label
and CWS Information Label matched.
• Counts of the prepared questionnaires were done prior to mailing to verify that the correct
number of questionnaires were mailed. Spot checks were also done on individual states,
comparing the number of questionnaires processed to the sample frequency counts for
those states.
• Instances where one respondent was responsible for multiple systems were given special
handling. This was done to ensure that the respondent was provided with all of his
questionnaires in one envelope, rather than inundated with multiple packages. Before the
envelopes were sealed, cross checks were made against a listing of multi-system
respondents to ensure that each respondent was being sent the correct questionnaires.
• Special data collection operations, such as the toll-free support line and reminder calls,
were instituted with the goal of improving the response rate and sample coverage. They
also improved the quality of reported data because respondents had resources available
for resolving technical questions about the meaning of questionnaire items and about how
to report their own special circumstances in terms of the generic questionnaire.
• The receipt control system ensured proper tracking and control of all questionnaires from
the point of sampling until data were entered and cleaned. In addition to supporting the
overall management of the project, the periodic reports of case statuses enabled The
Cadmus Group to identify particular response rate problem areas and take appropriate
telephone follow-up measures.
5.5  Expert Review of Critical Questions and Data Retrieval
Cadmus reviewed the completeness and accuracy of responses to eight questions designated as
"critical" by EPA staff because the information provided in response to the questions was essential for
conducting subsequent analyses in support of regulatory development and implementation efforts. Table 5-1
lists the critical questions. These questions can be found in the CWSS mail questionnaires, which appear
as Appendices to this report. The remainder of this section provides an overview of the review effort.
Table 5-1
Critical Questions

Question    Description
4           Sources of Water
11          Population Served and Number of Connections
18          Treatment Facility Information
20          Information on Intakes with No Treatment
29          Water Sales by Customer Type
30          Other Water Related Revenue Sources
33          Routine Operating Expenses
34          Assets, Liabilities, and Debt
Categorization
To expedite the review process, each questionnaire was categorized according to the level of
difficulty required to review it. Specifically, Cadmus reviewers placed each questionnaire into one of the
following three categories:
Category 1: Complete responses to all critical questions and requiring the least amount
of time to review and correct.
Category 2: Incomplete responses or apparent errors in response to less than three of
the critical questions. These questionnaires required additional time to
correct or contact the respondent for clarification.
Category 3: Incomplete or erroneous responses to more than three critical questions.
Because these questionnaires would require a time-consuming effort to
remedy, Cadmus omitted most of them from further review.
In order to maximize the number of questionnaires reviewed, Cadmus staff reviewed questionnaires in
Category 1 first and Category 2 second. To the extent that the budget and project schedule allowed,
Cadmus staff reviewed the Category 3 systems.
Within each category, Cadmus reviewers gave priority to systems with apparent errors in
responses over systems with missing data. Cadmus staff reasoned that a system's filling out a question
indicated both the availability of data and a willingness to provide it.
General Approach
Where a system provided information in response to a survey question, Cadmus' review
focused on determining whether the response was consistent with responses to other questions. To ensure
consistency in the reviews, Cadmus developed a table for each critical question listing the steps to be
followed in checking the question. In these tables, typical errors associated with the responses to the
critical questions were listed, along with appropriate actions to take. In general, the strategy for solving
problems was as follows:
• First, wherever possible, the reviewer attempted to derive an answer using information
provided in response to other questions.
• Second, the reviewer contacted a Cadmus senior advisor to estimate an answer using best
professional judgment.
• Third, the reviewer attempted to contact the respondent directly for clarification.
Cadmus reviewers did not estimate responses if adequate information was not provided by the respondent
or contained in responses to other questions. If the respondent provided income statements, balance sheets,
rate schedules, or other supporting documentation, Cadmus reviewers and/or senior staff used the
documentation to derive missing answers.
In order to ensure a complete record of the review, Cadmus reviewers filled out a
Questionnaire Review Sheet (QRS) during the review and attached it to the survey instrument. The QRS
required reviewers to list problems encountered during the review, actions taken to resolve the problems,
and any unresolved issues at the conclusion of the review. Cadmus senior staff used the QRS forms to
perform Quality Assurance checks of the reviews and to identify any problems or bottlenecks in the review
process. The QRS and any attachments included by the respondents became permanent parts of the
questionnaire form for future processing and archiving purposes.
Cadmus reviewers were successful in obtaining missing data for the majority of Category 1
surveys and a significant proportion of Category 2 surveys. In many cases, reviewers utilized answers to
other questions in the survey to derive missing data, especially for the operational questions. Similarly,
reviewers corrected numerous erroneous responses. Some of the most common problems were resolved
quickly and with minimal effort. For example, surveys with incorrect units and/or mathematical errors
typically were corrected without contacting the respondent. More complex problems typically required a
call to the respondent. If a problematic answer could not be corrected, reviewers crossed it out on the
survey instrument, and the item was handled as missing data in future processing steps.
5.6  Manual Editing, Coding, and Data Entry
Following the expert review and data retrieval process for critical items, the questionnaires
were subjected to a 100 percent editing review in preparation for entering the data. This editing process
examined every response field on every form, to check skip patterns, clarify handwriting that would be
difficult for the data entry staff to read, standardize the recording of quantitative data, and identify any
potential problems, such as marginal notes or potential order-of-magnitude reporting errors in the
volumetric non-critical questions. General protocols were developed to guide the data preparation staff in
reviewing the forms and in handling generic problems. When the editing process resulted in a change to a
form other than routine coding actions like deleting penny amounts from responses in financial data fields,
each change was individually recorded by ID number in a permanent coding-decision log. Problems that
were too idiosyncratic or too significant to be handled generically were recorded in the log, then reviewed
by a supervisor or manager who had the appropriate knowledge to resolve the issue. Any changes resulting
from this level of review were also recorded in the log.
After the initial edit, the questionnaires moved to the data entry process. Each form was
entered with 100 percent verification, that is, using independent double key entry. The automated data
entry program was customized to each of the three questionnaire forms.
As the data were entered, the batches of entered records passed through a data cleaning
process, consisting of standard computer edits that examined each variable for conformity to appropriate
values or data ranges and also checked the small number of skip patterns that existed in the survey
instruments. A report identified each variable for each case that failed any of these tests. A data
preparation supervisor then examined the original questionnaire forms to determine whether the anomaly
occurred in the original data and, if so, whether to confirm it as correct or to record it in the problem log for
resolution as described in the preceding paragraph. The standard computerized edits were repeated for all
the data until no cases failed the edits, except for any that had been specifically confirmed as valid outliers
during a previous review.
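As a rough illustration of what such standard edits look like, the following Python sketch checks each entered record against allowable ranges and a simple skip pattern and reports failures by case ID. The variable names and rules shown are hypothetical, not the survey's actual edit specifications.

    # Illustrative sketch of batch range and skip-pattern edits; all rules and
    # variable names are hypothetical.

    RANGE_RULES = {
        "POP_SERVED": (25, 10_000_000),       # allowable range for population served
        "N_CONNECTIONS": (15, 5_000_000),     # allowable range for service connections
    }
    SKIP_RULES = [
        # (gate variable, value meaning "skip", dependent variable that should be blank)
        ("PURCHASES_WATER", 2, "PURCHASED_SOURCE"),
    ]

    def edit_report(cases):
        """Return (case_id, variable, problem) tuples for every failed edit."""
        failures = []
        for case in cases:
            for var, (low, high) in RANGE_RULES.items():
                value = case.get(var)
                if value is not None and not low <= value <= high:
                    failures.append((case["ID"], var, f"outside range {low}-{high}"))
            for gate, skip_value, dependent in SKIP_RULES:
                if case.get(gate) == skip_value and case.get(dependent) is not None:
                    failures.append((case["ID"], dependent, "response present in skipped item"))
        return failures

A data preparation supervisor would then review each flagged case against the original questionnaire, as described above.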
After all questions had been edited, entered, and cleaned to the degree permitted by these
processes, the resulting keyed database passed to a process of detailed automated logical edits that enabled
expert staff to conduct a highly focused review of data values and relationships.
5.7  Automated Data Validation Checks
In preparing the final database, EPA, The Cadmus Group, and Westat designed, produced,
and analyzed a series of computer validation checks. These validation checks were run on the full survey
database after the data had been entered and passed the standard computer edits for values and ranges on a
variable-by-variable basis. The checks included the following:
• Distribution frequencies for all categorical variables, plus continuous character variables
(e.g., OTHER (SPECIFY) fields);
• Distribution frequencies for all continuous numerical variables formatted into four
categories (non-zero responses, zero-responses, legitimately skipped, and missing);
• Univariates for each continuous variable;
• Item-specific cross-tabulations of categorical data;
• Item-specific cross-univariates of continuous data; and
• Item-specific advanced logic edits.
All the validation checks were programmed and run using the SAS data analysis software
package. Univariates are a standard SAS procedure producing various descriptive statistics for
quantitative data (e.g., mean, median, and percentiles). A cross-tabulation or cross-univariate further
breaks out the values or descriptive statistics for a given variable into subsets defined by the values of a
second variable. For example, many of the crossed-univariates produced statistics for key survey variables
such as number of gallons produced or water sales revenues, and further broke them down into the eight
population-served size categories used to define the sample strata. This permitted a more precise review of
the reasonableness of the responses in such variables, since the magnitude of their response value is closely
related to the size of the system.
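The tabulations themselves were produced in SAS; purely as an illustration of the idea, a crossed univariate can be sketched in a few lines of Python with pandas. The column names used here are hypothetical.

    # Minimal sketch of a "crossed univariate": descriptive statistics for one
    # continuous survey variable, broken out by the population-served size category.
    import pandas as pd

    def crossed_univariate(df: pd.DataFrame, value_col: str, stratum_col: str) -> pd.DataFrame:
        """Count, mean, and selected percentiles of value_col within each stratum."""
        return df.groupby(stratum_col)[value_col].describe(
            percentiles=[0.05, 0.25, 0.50, 0.75, 0.95]
        )

    # Example reasonableness review (hypothetical column names):
    # report = crossed_univariate(survey, "WATER_SALES_REVENUE", "POP_SIZE_CATEGORY")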
Output for the automated series was produced on a flow basis, allowing project staff to
examine and modify the database on a continual basis. In general, frequency output was produced first,
then univariates of continuous data, followed by item-specific crosses (tabulations and univariates) and the
advanced logic checks. This hierarchical approach allowed for the efficient building of the validated
database by examining simple output, such as frequencies, before proceeding to running more complex
validation checks.
The advanced logic edits provided the most focused view into the data. The objective of the
advanced logic edits was either to test the relationship between two or more variables or to identify extreme
values as determined by expert opinion. In some cases, multiple logic checks were developed to achieve
both objectives. An example of a logic edit is to check that respondents reported data consistently between
Questions 29 and 31: a series of checks determines that if responses are reported in a line-item in Question
29 (water sales customer categories) then responses should also be reported in the parallel line-item in
Question 31 (billing structure for each customer category).
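A hedged sketch of that kind of parallel line-item check is shown below; the customer categories and the variable naming convention are assumptions made for illustration, not the survey's actual variable names.

    # Sketch of the Question 29 / Question 31 consistency edit: a customer category
    # reported in Q29 (water sales) should also be reported in Q31 (billing structure).

    CUSTOMER_CATEGORIES = ["RESIDENTIAL", "COMMERCIAL", "INDUSTRIAL", "WHOLESALE"]  # hypothetical

    def q29_q31_inconsistencies(case):
        """Return the customer categories where Q29 has a response but Q31 is blank."""
        return [
            category
            for category in CUSTOMER_CATEGORIES
            if case.get(f"Q29_{category}") is not None
            and case.get(f"Q31_{category}") is None
        ]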
Often, the logic checks created composite variables from a related set of survey variables, and
tested the composite against other survey data or against external measures of reasonableness. For
example, the three variables containing water production data (Question 4) were summed and compared to
the sum of the eight variables containing water storage data (Question 8). A logic test then determined if
systems reported more than ten days production worth of storage. The parameters for any absolute range
tests or tolerances between variables were refined during the review process. The final edits were run using
the revised tolerances.
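The storage check just described can be sketched as follows; the field names are hypothetical, and the ten-day threshold is the tolerance stated above.

    # Sketch of the composite storage edit: total reported storage (Question 8, eight
    # fields) is compared against total daily production (Question 4, three fields);
    # systems reporting more than ten days' worth of storage are flagged for review.

    def flag_excess_storage(case, max_days=10.0):
        production = sum(case.get(f"Q4_PRODUCTION_{i}", 0) or 0 for i in range(1, 4))
        storage = sum(case.get(f"Q8_STORAGE_{i}", 0) or 0 for i in range(1, 9))
        if production <= 0:
            return False                  # cannot evaluate; handled separately
        return (storage / production) > max_days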
The logic edits were designed so that if the expected logic condition was not met (for
example, if data fell outside the acceptable range), then the survey record would be flagged as containing a
possible error. The logic edits were numbered sequentially and a report was generated for each edit that
listed the ID number of each system failing the edit, followed by all of its relevant survey variables and
composite variables involved in the error check. Exhibit 5-1 is an example of a report page from an
advanced logic edit.
It is important to note that the logic checks were diagnostic tools. None of the logic tests or
other automated edits were ever used to make computerized global changes to the database based solely on
failure of the logical check. Rather, they were used to guide expert review of individual cases. Review of
the edit reports often confirmed the reasonableness of the data. In other cases, the error conditions were
extreme enough that it was decided to review the original questionnaires to determine whether the data
appeared to be truly invalid. When expert review determined that an individual data value was highly
unreliable, it was corrected whenever possible, or removed from the record in relatively few extreme cases.
The advanced logic edits confirmed the effectiveness of the initial expert review of the critical
questions. The number of items identified for the critical questions was considerably smaller than those
flagged for the non-critical questions.
Altogether, 44 different advanced logic edits were developed. Overall, these edits examined,
in one or more ways, approximately 500 out of the 600 survey variables, including all variables in the
questions identified as critical questions by EPA.
Many of the data points that were changed as a result of the various data reviews and
automated checks were found to be associated with order-of-magnitude confusion on the part of the
respondents; for example, reporting in gallons when the question specified millions of gallons.
Finally, the various reviews and checks resulted in corrections to clear any extreme data
outliers whose inclusion in the survey database as originally reported would have markedly skewed
subsequent analyses, especially at subgroup levels. However, while the effect of these corrections on the
overall data quality is considerable, they represent a trivial number of changes to the total number of items
reported. For example, the survey database contains approximately 1.2 million data points. The changes
resulting from review of the advanced logic edits affected about 3,000 items. This represents about 0.25
percent of the data points. In terms of a possible trade-off between data quality and data integrity, there is a
very small exposure to reduced data integrity and a very large yield in survey data quality.
Exhibit 5-1 Report from an Advanced Logic Check

[The exhibit, a sample report page listing the ID numbers and relevant variable values of systems failing one advanced logic edit, is not reproducible in this text version.]
5.8  Quality Assurance for Financial Ratio Analysis in Volume I
Publicly owned and privately owned water systems record and report financial data in various
ways. Comparing these data necessarily involves making assumptions about, and adjustments to, data that
were reported in the questionnaires. In the following paragraphs, we describe how we developed the
financial ratios contained in Volume I. We also describe how we made adjustments for the different
accounting practices of publicly owned and privately owned systems.
Background
It is difficult to obtain comparable financial measurements between privately owned and
publicly owned water systems, and even between water systems in each sector. This problem is created in
part by the wide variations in size, objectives, and complexity of systems. Small ancillary systems, for
example, provide potable water as an ancillary service; they may keep no records of the costs or revenues
associated with water production and sale. (Indeed, the financial characteristics of ancillary systems are so
unique that they were excluded from much of the financial discussion in Volume I.) The private and public
sectors also may have fundamentally different objectives. For example, in an investor-owned company,
provision of drinking water may be a means of creating returns on invested capital. In publicly owned
systems, provision of drinking water may be part of an array of public services that may be supported by
general revenues. Alternatively, in publicly owned systems that use an enterprise fund approach,
consumers generally pay the costs of their water through water (or water and sewer) charges.
Differences in system complexity, size, and purpose are reflected in the methods by which
these entities keep their books and report financial results. In the private sector, accounting systems and
principles are governed by tax and regulatory requirements, and in the case of large companies with
publicly held securities, by disclosures that may be required under federal securities laws. If several water
systems are owned by a single company, or if water supply is only part of a larger package of services
provided by the firm, it may be difficult to assemble and compare the precise costs and revenues associated
with water supply. For example, some large water systems have ancillary businesses such as laboratories,
or they provide contract operations and maintenance (O&M) services for other systems. The equity
ownership and various classes of stock outstanding in investor-owned systems also inhibit direct
comparisons with government-owned systems, whose equity interest is undivided, undistributed, and
effectively held in the aggregate by the public.
There can be significant differences in how publicly owned systems account for and report
their financial condition. The major difference (alluded to above) is between systems that account on an
enterprise or full accrual basis (much as an investor owned system would) and those that use governmental
fund or modified accrual accounting precepts. While there are several distinctions between the two broad
categories of accounting concepts, the major difference is in the treatment of depreciation expense and debt
service expenditures. Enterprise accounting systems generally calculate accrued depreciation and interest
as a current expense, while governmental funds do not. (In some cases, even enterprise accounting systems
may vary in their treatment of depreciation expense due to a lack of historical records or alternative
methods of calculation.) Governmental fund accounting systems, in contrast, do not record depreciation as
an expenditure; and to the extent that information related to the water system's debt is reported, the focus is
on debt service payments (principal and interest). Moreover, if a water system is treated for accounting
purposes as a governmental unit and its accounts are incorporated into the general fund structure, it is very
likely that there will not be a separate and identifiable debt service charge exclusively related to water supply.
In such a case, although the water system may be self-supporting in terms of meeting its current costs of
operation, the debt service (in effect) will be paid from general tax revenues, and the capital investment will
be carried in the general asset accounts of the government.
These barriers to consistent financial information and uniform analysis have been somewhat
overcome in public sector debt analysis by various adjustments to financial data that seek to make them
comparable across the different accounting systems. The focus of these efforts has been to isolate revenues
and cash expenses that are seen as most relevant to credit analysis, and to determine a water system's
ability to repay bonded debt where water-related revenues such as user charges and dedicated taxes are
pledged to bond repayment.
Given the different accounting techniques used in the public sector, it is common practice for
bond market credit analysts to adjust cashflows to make water system finances as comparable as possible
when revenues are used as a source of security for bond repayment. One of the more prominent
adjustments is to disregard depreciation as a charge and not to deduct interest expense as an operating
expense (even though it would be counted as an expense under commercial accounting practices). The
objective is to derive "net available revenue," which focuses on the cash available to pay debt service after
current O&M expenses have been paid. Another adjustment is to distinguish between recurrent operating
revenues derived from the sale of water and those other revenues such as connection fees, inspection fees,
and developer fees, that are more episodic in nature.1 If revenues are pledged to the repayment of bonds,
the bond contract will typically require the water utility to set rates at levels that generate sufficient
revenues to provide certain levels of "coverage."
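As a simplified illustration of those adjustments, the sketch below derives net available revenue by adding depreciation and interest back into reported operating expenses and then computes a coverage ratio. The function names are ours, and the treatment of episodic revenues (connection fees, developer fees, and the like) is omitted for simplicity.

    # Simplified sketch of the "net available revenue" and coverage adjustments
    # described above; not the report's actual computation.

    def net_available_revenue(operating_revenue, reported_om_expenses,
                              depreciation=0.0, interest=0.0):
        """Cash available for debt service: depreciation and interest are not
        treated as operating expenses, so they are added back if reported."""
        cash_om_expenses = reported_om_expenses - depreciation - interest
        return operating_revenue - cash_om_expenses

    def debt_service_coverage(net_revenue, annual_debt_service):
        """Coverage ratio compared against the level required by the bond contract."""
        return net_revenue / annual_debt_service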
To the extent permitted by the structure of our data on system finance, we attempted to use
the adjustments outlined in the paragraph above to create comparable data in Volume I across all system
types.
5.9  Data Processing Quality Assurance
The final, clean survey database represented the product of the various review, editing, data
entry, and data validation steps described in Sections 5.5 through 5.7. Once this database was prepared,
there were a number of subsequent data processing steps required to create a variety of files suitable for
analyses and tabulations and for final delivery of a permanent database to EPA. The principal processing
steps included:
• Appending needed variables from external files, including sample and contact information
from the FRDS and Phase I CATI screener databases.
• Analyzing the hard copy questionnaires and the frequency distribution of continuous and
categorical variables to devise rules for handling missing data.
• Zero-filling blank responses. Because of the large number of blanks appearing in
quantitative data fields in the returned questionnaires, it was determined that respondents
tended to leave blanks where zero was the actual response. Therefore, a detailed series of
rules was developed for assessing such blank responses and determining whether to regard
these as zeroes or missing values. In general, blank quantity fields were treated as zero,
except when there was external evidence in a logically related item that the response
should not be zero. A detailed set of programming specifications was then designed to
implement these rules, then computer processes created pursuant to the specifications.
These processes were run on the database to convert blank fields either to zeroes or to
explicit missing values, as determined by the rules. (A minimal sketch of this rule appears after this list.)
__________
1 Analysts attempted to exclude connection fees and inspection fees from operating revenues, but they faced a practical problem. If one excludes these
fees from revenues, proper accounting methods require that one should also exclude any associated expenses (e.g., connection expenses and
inspection expenses) from total operating expenses. Given the limitations of our survey data, we could not disaggregate expenses in this manner.
Therefore, connection fees and inspection fees were not excluded from operating revenue when calculating the operating ratio.
• Creating new derived variables from the survey data to categorize systems into strata
comparable to the original sampling strata, but based on the final survey responses rather
than the FRDS data and first-phase screening data.

• Attaching the sample weights to the analytical file.

• For the final delivery of the database to EPA, deriving and attaching the numerous
composite variables created for production of the analytical tables in Volume II of this
report.
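A minimal sketch of the blank-response rule referenced above, assuming a hypothetical mapping from each quantity field to a logically related item, might look like the following; it is intended only to illustrate the decision logic, not the actual programming specifications.

    # Sketch of the blank-response rule: blank quantity fields default to zero unless
    # a logically related item indicates the true value should not be zero, in which
    # case the field is set to an explicit missing value. The mapping is hypothetical.

    MISSING = None
    RELATED_ITEM = {
        "Q29_RESIDENTIAL_SALES": "Q31_RESIDENTIAL_BILLING",
    }

    def resolve_blank(case, field):
        value = case.get(field)
        if value is not None:
            return value                  # respondent supplied a value; keep it
        related = RELATED_ITEM.get(field)
        if related is not None and case.get(related) not in (None, 0):
            return MISSING                # external evidence the value is nonzero
        return 0                          # otherwise treat the blank as zero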
Each of these steps was carefully planned in advance. Detailed specifications were written to
guide the programming and data processing needed to perform each step. In addition to these
specifications, the processing of files and flow of data throughout these various steps were planned,
controlled, and documented through data flow diagrams. The data flow diagrams are schematic
representations of how files, data records, data elements, and individual data point values are handled,
combined, extracted, and moved from one stage to the next. These diagrams are crucial quality assurance
tools to help ensure that programmers and systems analysts have a clear and common understanding of the
entire process of data management, that the processing stages fit together in logical order and accomplish
the intended objectives, and that there is an unambiguous audit trail of the condition of the data at each
processing stage.
Version control was maintained for all computer programs, and interim stages of all data files
were permanently archived. This meant that, when changes were made to a program or process, it was
clear which was the current version, and there was always a clear record of the sequential changes that had been made from
one version to the next. If earlier versions of data files were needed because it was desirable to revert to
values or restore data items that had been updated or modified as a result of the various review and
processing steps, it was always possible to restore any earlier version in full or to merge selected data from
the old version to the new version.
The combination of the processing specifications, data flow diagrams, version control, and
data archiving ensured that no process was irreversible, that it was always possible to recover from any
deliberate or inadvertent changes to the data, and that the characteristics of the survey data were fully
known at each processing stage.
5.10  Tabulation Quality Assurance
The tabulations of survey results presented in the 1994 Community Water System Survey
report are varied, detailed, and complex. Rather than being a simple presentation of individual survey
variables, each table usually presents the results of multiple calculations involving a variety of survey
variables. Many tables present several such results within a single table. There were often several
different ways of defining or calculating an item of interest, and sometimes there were different direct or
derived sources of data for the calculation available on the survey database. Hence, the following steps
were taken to help assure that each table accurately summarized and presented the data contained in the
final survey database.
• Identify important, relevant, and useful information that could be developed from analyses
of the survey data;

• Design each table to effectively present the analytical results or to juxtapose related
results in the same table;

• Clearly describe the contents of each table;

• Define in detail the variables, values, formulas, and derivations that went into each
calculation;

• Prepare clear and detailed data processing specifications for carrying out the tabulations
according to the calculation definitions;

• Develop computer programs to process the data pursuant to the tabulation specifications;

• Conduct an independent review of the resulting programs against the data processing
specifications;

• Review the initial tabular output for:
- Consistency with the design of the table contents,
- Conformity with the definitional and programming specifications, and
- Reasonable agreement with expected values based on external measures and expert
knowledge of water system operations and finances;

• Review definitions, specifications, programs, and underlying data for tabulations
exhibiting data anomalies or outliers;
• Revise any definitions, specifications, or programs if the review process identifies errors
or the need for modifications to previous decisions; and
• Repeat previous tabulation quality assurance steps and re-run tabulations until no further
unacceptable data anomalies or outliers are found.
In addition to the above steps, the CWSS tabulation process employed one other measure to
help assure the accuracy of the tabulated results. The tabulation process was fully automated, from the
underlying source data through all processing stages to the final formatted tables. There were no
intermediate stages requiring manual transfer or entry of data from one stage to the next. This eliminated
human transcription error. Equally importantly, it also expedited the process of successive iterations of the
tabulations during the quality review process, since each time a table was produced the output data
automatically were transferred into the same final table form as on the previous iteration. This ensured that
any new anomalies identified in later iterations did not result from transcription errors, and allowed the
review staff to focus their investigations on the table data, specifications, and programs.
5.11  Examination of Potential Nonresponse Bias
The mail survey response rate was approximately 50 percent. If the nonrespondents have
different characteristics from the respondents, this could result in a significant nonresponse bias in the
survey estimates. There are a variety of characteristics that may be useful to explore in this regard. For
this report, EPA was particularly interested to investigate whether patterns of compliance and non-
compliance with EPA regulations were associated with systems' propensity to respond to the survey.
For the 3,681 water systems included in the mail survey sample, noncompliance data from
FRDS were reviewed for the three-year period from January 1, 1993 to December 31, 1995. Systems with
violations of the Maximum Contaminant Level (MCL) were flagged as having at least one MCL violation.
Systems with monitoring and/or reporting (M/R) violations were classified as having at least one M/R
violation.
If the percentage of systems with at least one violation were significantly different between
respondents and nonrespondents, it would indicate that the estimates from the survey may be subject to
significant nonresponse biases, in addition to their sampling error. It is important to look for differences
that are both statistically significant (not likely to be due to random chance) and practically significant
(differences that might be large enough to impact the estimates to a noticeable degree). Differences were
examined for all systems; by size, water source, and ownership type; and by combinations of these three
variables. Differences were also examined for MCL violations, M/R violations, either type of violation,
and both violations. Given the many breakdowns examined, a statistical significance level of .01 was used
for each test of statistical significance. A difference of more than 15 percent was used to determine
practical significance. For purposes of this analysis, we will use the term "significantly" different to mean
differences that are both statistically and practically significant.
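For illustration only, this dual criterion can be sketched as a two-proportion z-test at the .01 level combined with the practical-significance threshold. The sketch assumes the 15 percent threshold refers to a gap of 15 percentage points; it is not the analysis code actually used.

    # Sketch of the dual statistical/practical significance test described above.
    from math import sqrt
    from statistics import NormalDist

    def significantly_different(viol_resp, n_resp, viol_nonresp, n_nonresp,
                                alpha=0.01, practical_gap=0.15):
        """True only if respondent and nonrespondent violation rates differ both
        statistically (two-proportion z-test) and practically (gap > 15 points)."""
        p1, p2 = viol_resp / n_resp, viol_nonresp / n_nonresp
        pooled = (viol_resp + viol_nonresp) / (n_resp + n_nonresp)
        se = sqrt(pooled * (1 - pooled) * (1 / n_resp + 1 / n_nonresp))
        if se == 0:
            return False
        p_value = 2 * (1 - NormalDist().cdf(abs(p1 - p2) / se))
        return p_value < alpha and abs(p1 - p2) > practical_gap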
When systems were examined that had both types of violations during the three year period,
no significant differences were found for any breakdowns.
When systems were examined that had MCL violations during the three year period, the only
significant difference was for privately owned surface-water systems, with 17 percent of responding
systems having at least one violation and 33 percent of nonresponding systems.
When systems were examined that had M/R violations during the three year period, a number
of differences were detected. Among responding systems, four out of 11 (36 percent) of privately owned
ground-water systems serving over 100,000 people had a violation, while none of the five nonresponding
systems had violations.
Systems serving between 500 and 1,000 people had more M/R violations among
nonrespondents (57% compared to 40%), with similar results found for subsets of these systems. Publicly
owned systems serving 3,300 to 10,000 people had 32 percent violations for respondents and 50 percent for
nonrespondents. Similar results were found for publicly owned ground-water systems and privately owned
surface-water systems of this size and publicly owned surface-water systems serving 10,000 to 50,000
people. Among ancillary ground-water systems, 48 percent of completed systems had a violation,
compared to 65 percent of nonrespondents.
Results for systems that had either type of violations during the three year period were
similar to those with M/R violations. Systems serving 50,000 to 100,000 people tended to have more
violations among respondents than did nonrespondents (46% to 14% for privately owned ground-water
systems and 50% to 29% for publicly owned surface-water systems).
Surface-water systems serving between 500 and 1,000 people and between 3,300 and 10,000 people showed
a higher percentage with violations among nonrespondents, as did ancillary ground-water systems and a few
other categories.
Given these results there appears to be no obvious systematic bias from nonresponse, at least
for variables correlated with the presence of a violation. There are obviously many other types of
differences that might exist between responding systems and nonresponding systems. However, there are
no easily accessible data sources that contain indicators of these differences.
5.12  Quality Assurance During Report Preparation
As noted in the introduction to this chapter, EPA's Center for Environmental Statistics
provided peer review of the data presented in this report to ensure that the findings have been appropriately
described and presented. Additional peer review for this purpose was obtained from Mr. John Peterson of
The Government Finance Group (GFG), an expert in the field of public finance; Mr. Dan Fraser, an engineer
and expert in the operational characteristics of water systems; Dr. Janice Beecher, Senior Research
Scientist and Director of Regulatory Studies at Indiana University's Center for Urban Policy and the
Environment; and Mr. Barry Liner, Water Research Center, an expert in benchmarking applied to water
and waste water utilities.
APPENDIX A
CWSS CATI Screener Questionnaire
OMB No.: 2040-0173
Expires: 07/31/97
Community Water Systems
1994 Survey
CATI SCREENING
Questionnaire
- CATI SPECS #6 (FINAL) - December 22,1994
This telephone survey is estimated to require approximately 4 minutes to complete. This
includes time for listening to the question and instructions, and reporting the requested data.
Send comments regarding the burden estimate or any other aspect of this survey, indicating
suggestions for reducing this burden to: Chief, Information Policy Branch, 2136 • U.S.
Environmental Protection Agency • 401 M Street, S.W. • Washington, D.C. 20460, and
Desk Officer for EPA • Office of Information and Regulatory Affairs • Office of Management
and Budget • Washington, D.C. 20503
1994 COMMUNITY WATER SYSTEM SURVEY
Telephone (CATI) Questionnaire
SCREENING SURVEY
SECTION A. CONTACT PROCEDURES
C1. Hello, my name is {INTERVIEWER'S NAME} from Westat, Inc. Have I reached the {CWS NAME}?
[I'm calling for a study being conducted for the U.S. Environmental Protection Agency.]
YES ................................ 1 → [READ THE CWS NAME AND ADDRESS
                                          PRINTED IN BOX 1 TO VERIFY THAT
                                          IT IS CORRECT AND GO TO C2]
YES, BUT NAME/ADDRESS
HAS CHANGED ........................ 2 → [RECORD CLARIFICATION INFORMATION
                                          IN BOX 4 AND GO TO C2]
NO ................................. → [GO TO SUGGESTIONS FOR C1 ON
                                        BLUE TIP SHEET]
COMMENTS:
C2. The U.S. Environmental Protection Agency is conducting a survey to collect information to help
develop regulations and guidelines for community water systems.
C3. I would like to speak with someone knowledgeable about the water system who can answer a few
questions about the system size, ownership, and water sources. Would that be you or someone
else?
May I have your/his/her first name (LAST NAME/TITLE/PHONE NUMBER) please?
CONTACT NAME:.
CONTACT TITLE:
(First)
(Last)
CONTACT TELEPHONE NUMBER: ( )_
Ext._
CONTACT IS THE SPEAKER 1 -* [GO TO C5]
CONTACT IS NOT THE SPEAKER 2
C3a. Would you please transfer me to {CONTACT GIVEN IN C3}?

AVAILABLE .......................... 1 → [GO TO C4]
NOT AVAILABLE ...................... 2 → [END CALL. SELECT THE
                                          APPROPRIATE RESULT
                                          CODE AND RECORD THE
                                          INFORMATION ON THE CALL
                                          RECORD]
CONTACT IS REACHABLE
AT DIFFERENT NUMBER ................ 3 → [END CALL, RECORD THE
                                          INFORMATION ON CALL
                                          RECORD. REDIAL ON CATI
                                          SCREEN. ENTER THE NEW
                                          TELEPHONE NUMBER AND
                                          BEGIN AT C4]
FOR OTHER OUTCOMES, GO TO SUGGESTIONS FOR C3/C3a ON THE GREEN TIP SHEET OR SELECT
APPROPRIATE RESULT CODE AND RECORD THE INFORMATION ON THE CALL RECORD.
C4. Hello, my name is {INTERVIEWER'S NAME} from Westat, Inc. I am calling for a study being
conducted for the U.S. Environmental Protection Agency. When we spoke to someone else at
{CWS NAME}, you were identified as being knowledgeable about your system's size, ownership,
and water sources. EPA is conducting a survey to collect information to help develop regulations
and guidelines for community water systems. [CONTINUE WITH C5]
C5. For this study, the Environmental Protection Agency will select a sample of water systems from
across the country. Yours may be selected. While your participation is voluntary, it is crucial to
the success of this project. I would like to ask you a few questions now to verify our records.

CONTINUE WITH CATI PORTION
OF SCREENER ........................ 1
RESPONDENT SAYS HE/SHE IS NOT
KNOWLEDGEABLE ...................... → [ATTACH ADDITIONAL
                                         CONTACT QUESTIONNAIRE
                                         AND BEGIN AT C3 WITH CURRENT
                                         RESPONDENT]
SECTION B. CWS CATI SCREENER
Part III. Screening Questions
S1. Does {CWS NAME} supply drinking water to its customers?
STRA
YES 1
NO .................... 2 → [(THANK 01) Thank you very much, but we are only
                             interviewing suppliers of piped drinking water. Thank
                             you for your cooperation. AUTOCODE RESULT AS
                             'I' AND DISPLAY RESULT MESSAGE]
REFUSED ............... -7 → [GO TO REFUSAL PROBE1]
DON'T KNOW ............ -8 → [GO TO DK PROBE1]
[QUESTION S1A IS NEW]
S1 A. Does {CWS NAME} deliver that drinking water through a system of pipes [that is. water pipes, water
STRA lines, or water mains]?
YES 1
NO .................... 2 → [(THANK 01) Thank you very much, but we are only
                             interviewing suppliers of piped drinking water.
                             Thank you for your cooperation. AUTOCODE
                             RESULT AS 'I' AND DISPLAY RESULT MESSAGE]
REFUSED ............... -7 → [GO TO REFUSAL PROBE1]
DON'T KNOW ............ -8 → [GO TO DK PROBE1]
S2. Does {CWS NAME} have at least 15 service connections used by year-round residents?
STRA
YES ................... 1 → [SKIP TO S3A]
NO .................... 2
REFUSED ............... -7 → [GO TO REFUSAL PROBE1]
DON'T KNOW ............ -8
S3. Does {CWS NAME} serve at least 25 year-round residents with piped drinking water?
STRA
YES ................... 1
NO .................... 2 → [(THANK 02) Thank you very much, but we are
                             only interviewing providers with 25 or more
                             residents. Thank you for your cooperation.
                             AUTOCODE RESULT AS 'I' AND DISPLAY
                             RESULT MESSAGE]
REFUSED ............... -7 → [GO TO REFUSAL PROBE1]
DON'T KNOW ............ -8 → [GO TO DK PROBE1]
[QUESTION S3A IS NEW]
S3A. Does {CWS NAME}, or any parent company or agency, own or operate any other water systems
STRA besides {CWS NAME} in {CWS CITY [C137]}, {CWS STATE [C139]}?
YES 1
NO 2
REFUSED -7
DONTKNOW -8
[BOX 5 IS NEW]
BOX 5
NOTE TO PROGRAMMER:
IF S3A = 1, -7, OR -8, DISPLAY THE FOLLOWING PROMPT:
When you answer the following questions, please refer in your answers only to {CWS NAME} in
{CWS CITY [C137]}, {CWS STATE [C139]}. If this presents a difficulty for you at any point,
please explain the problem to me so that I can make a note of it.
CONTINUE WITH BOX 10.
ELSE CONTINUE WITH BOX 10.
Part IV. Stratification Questions
BOX 10
NOTE TO PROGRAMMER:
USE THE FOLLOWING TABLE TO DETERMINE THE HANDLING OF S4,
S4A, S5, S6, S7, AND S8 WHEN ANY COMBINATION OF REFUSAL PROBE1,
DK PROBE1, AND S7 VERIFICATION PROBE IS TRIGGERED.
MITIAL
RESPONSE
TO 84 -88
R
DISPLAY
PROBE
DKPROBE2
REF PROBE2
SECOND
RESPONSE
TOS4-S8
(FROM
PROBE)
DK
R
R
DK
FMAL
VALUE
M 84-88
DK
R
R
DK
GOTO
S7
OUTCOME
GOTO
S8
THE FOLLOWING COMBINATIONS CAN OCCUR FOR S7 ONLY:
DK
R
ANY VALUE
OUTSIDE S7
VERIFICATION
RANGE
DK PROBE2
REF PROBE2
S7
VERIFICATION
PROBE
ANY VALUE
OUTSIDE S7
VERIFICATION
RANGE
ANY VALUE
OUTSIDE S7
VERIFICATION
RANGE
DK
R
OUTLIER
VALUE
OUTLIER
VALUE
DK
R
NA
GOTO
S8
GOTO
S13
I
NA
FOR ALL OTHER RESPONSES (REGULAR CATEGORICAL VALUES), FOLLOW THE MAIN PATH
INDICATED FOR EACH CATEGORICAL VALUE IN THE RESPECTIVE QUESTION. FOR S7, THIS
APPLIES ONLY IF THE CATEGORICAL VALUE IS ALSO WITHIN THE S7 VERIFICATION RANGE.
THESE SPECIFICATIONS PREVENT REPEATED PROBING OF THE SAME QUESTION IF
ALTERNATING SEQUENCES OF RESPONSES LEADING TO DIFFERENT PROBES WERE TO
OCCUR. SPECIFICALLY, NO RANGE VERIFICATION PROBE IS ATTEMPTED ON AN OUTLIER
RESPONSE TO S7 WHEN THE RESPONSE FOLLOWS A DK OR REFUSED PROBE, ON THE
PREMISE THAT THIS SEQUENCE ALREADY INDICATES QUESTIONABLE DATA.
S4. Does {CWS NAME} purchase any of the water it distributes?
STRA
YES 1
NO .................... 2 → [SKIP TO S6]
REFUSED -7 -» [GO TO REFUSAL PROBE2 AND
SKIP TO S7 IF CONFIRMED]
DONTKNOW -8 -*• [GO TO DK PROBE2 AND
SKIP TO S7 IF CONFIRMED]
[QUESTION S4A IS NEW]
S4A. Does {CWS NAME} purchase 100 percent of the water it distributes?
STRA
YES 1
NO 2 - [SKIP TO S6]
REFUSED -7 -* [GO TO REFUSAL PROBE2 AND SKIP TO S7 IF
CONFIRMED]
DONTKNOW -8 -" [GO TO DK PROBE2 AND SKIP TO S7 IF
CONFIRMED]
S5. Is {CWS NAME}'s primary source of purchased water from ground water or surface water?
STRA
GROUND WATER .......... 1 → [SKIP TO S7]
SURFACE WATER ......... 2 → [SKIP TO S7]
REFUSED -7 -» [GO TO REFUSAL PROBE2 AND
SKIP TO S7 IF CONFIRMED]
DONTKNOW -8 - [GOTO DKPROBE2 AND
SKIP TO S7 IF CONFIRMED]
[Ground water: wells, springs, aquifers, etc.]
[Surface water: lakes, reservoirs, ponds, rivers, streams, etc.]
S6. Is {CWS NAME}'s primary source of non-purchased water from ground water or surface water?
STRA
GROUND WATER 1
SURFACE WATER 2
REFUSED -7 •* [GO TO REFUSAL PROBE2 AND
CONTINUE WITH S7 IF CONFIRMED]
DONTKNOW -8 -» [GO TO DK PROBE2 AND
CONTINUE WITH S7 IF CONFIRMED]
[Ground water: wells, springs, aquifers, etc.]
(Surface water: lakes, reservoirs, ponds, rivers, streams, etc.]
S7. What is the total year-round residential population served directly by {CWS NAME}? Would you
STRA say...
[ENTER ONE ONLY]
100 or less 1
101-500 2
501-1,000 3
1,001-3,300 4
3,301-10,000 5
10,001-50,000 6
50,001-100,000, or 7
Over 100,000? 8
REFUSED -7
DONTKNOW -8
-" [GO TO REFUSAL PROBE2 AND
CONTINUE WITH S8 IF CONFIRMED]
•* [GOTODKPROBE2AND
CONTINUE WITH S8 IF CONFIRMED]
NOTE TO PROGRAMMER:
BOX 15
THE FOLLOWING RANGE VERIFICATION WILL BE APPLIED TO QUESTION S7.
IF THE RETAIL POPULATION COUNT FOR THE WATER SYSTEM (LOADED FROM
FRDS FILE DATA ELEMENT C117) EXCEEDS THE UPPER BOUND OF THE SELECTED
RESPONSE CATEGORY BY AN AMOUNT GREATER THAN 25% OF THE VALUE OF
C117, OR
IF C117 IS BELOW THE LOWER BOUND OF THE SELECTED RESPONSE CATEGORY
BY AN AMOUNT GREATER THAN 25% OF THE VALUE OF C117,
DISPLAY THE FOLLOWING PROBE AND REQUIRE RE-ENTRY OF THE RESPONSE:
That amount is significantly different from the information in our records. I would like
to confirm that I recorded your answer correctly. It may help to mention that the
question concerns the year-round residential population served by {CWS NAME}.
This may be different from the total census population of the area, the total number of
year-round and transient people served, the number of residential water connections,
or the total residential population served by any other systems you might also operate.
REDISPLAY S7.
ACCEPT SECOND ENTRY WITHOUT FURTHER VERIFICATION.
NOTE: FORMULA LOGIC IS AS FOLLOWS:
IF S7 IS NOT MISSING AND C117 IS NOT MISSING
IF C117 >= 25
IF (C117 - UPPER BOUND) > 0.25 * C117 OR
IF (LOWER BOUND - C117) > 0.25 * C117
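The same verification logic can be expressed as a hypothetical Python sketch (the category bounds are the S7 response categories above, and C117 is the FRDS retail population count); this is an illustration only, not the CATI system's code.

    # Hedged sketch of the BOX 15 range verification for question S7.

    S7_BOUNDS = {1: (0, 100), 2: (101, 500), 3: (501, 1_000), 4: (1_001, 3_300),
                 5: (3_301, 10_000), 6: (10_001, 50_000), 7: (50_001, 100_000),
                 8: (100_001, float("inf"))}

    def needs_s7_verification_probe(s7_category, c117):
        """Display the probe only when FRDS C117 falls well outside the category."""
        if s7_category is None or c117 is None or c117 < 25:
            return False
        lower, upper = S7_BOUNDS[s7_category]
        return (c117 - upper) > 0.25 * c117 or (lower - c117) > 0.25 * c117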
S8. Is the {CWS NAME}'s water system...
STRA
Owned or operated by
a government or public agency, ............. 1 → [SKIP TO S12]
Owned privately and operated primarily
as a water business, or .................... 2 → [CONTINUE WITH S10]
Owned privately and operated as a necessary
part of another business? .................. 3 → [SKIP TO S11]
REFUSED .................................... -7 → [GO TO REFUSAL PROBE2 AND
                                                   SKIP TO S13 IF CONFIRMED]
DON'T KNOW ................................. -8 → [GO TO DK PROBE2 AND
                                                   SKIP TO S13 IF CONFIRMED]
[Government includes: city, town, township, village, or any municipal government;
county, borough, parish; special district or authority; state or Federal
government; or any other publicly owned or operated system.]
S9. S9 DELETED
S10. Is the water business...
STRA
Investor owned; operating separately but financially
dependent on a parent company, 1
Investor owned; operating separately and not financially
dependent on a parent company, 2
Owned and operated by a homeowners' association or [ [SKIP TO S13]
subdivision, or 3
Something else? 4
REFUSED .7
DONT KNOW -8
S11. Is the other business a...
STRA
Mobile home park, 1
Hospital, 2
School 3
Some other institution, 4
Restaurant 5
Campground 6
Resort, 7
Apartments, 8
Condominiums, or 9
Something else? 10
REFUSED -7
DONT KNOW -8
> [SKIPTOS13]
S12. Is that government a...
STRA
Town, township, village or municipal
government, 1
County, borough, or parish
government, 2
State government 3
Special district government 4
An authority, 5
The Federal government, or 6
Some other government? 7
REFUSED -7
DONT KNOW -8
[BOX 20 IS NEW]
BOX 20
NOTE TO PROGRAMMER:
FEDERAL AND STATE CWSs WERE EXCLUDED FROM THE SAMPLE FRAME PRIOR TO SAMPLING.
IF ANY WERE INCORRECTLY CODED IN FRDS, THEY WILL BE ELIMINATED HERE.
IF S12 = 3 OR 6, TERMINATE INTERVIEW AND AUTOCODE FINAL RESULT CODE AS '12.' DISPLAY
THANK YOU:
Those are all the questions I have. Thank you for your participation.
DISPLAY RESULT MESSAGE SCREEN.
S13. Is {CWS NAME} located on Native American or American Indian land, such as a reservation or
STRA other tribal land?
YES 1
NO 2
REFUSED -7
DONT KNOW -8
S14. S14 DELETED
S15. S15 DELETED
S16. S16 DELETED
Part V. Mailing and Contact information
S17. Your water system qualifies as the type of system we might be interested in for the Survey of
STRA Community Water Systems. Please give me the name, title, mailing address, and telephone number
of the person who would be best qualified to answer some more detailed questions about the
operations and finances of {CWS NAME} if it is selected for the survey.
CONTACT NAME:
TITLE:
MA1LFNAM (20 COLS)
(FIRST)
MAILTITL (40 COLS)
MA1LLNAM (25 COLS)
(LAST)
ORGANIZATION NAME: MAILCWS (40 COLS)
ADDRESS:
MAILADDR (40 COLS)
CITY:
MAILCITY
STATE: MAILSTAT
ZIP CODE:
(30 COLS)
(2 COLS)
MAILZIP
I_J_J_I.
MA1LAREA
MAILEXCH
TELEPHONE: ( |_|_|_| )
MA1LLOCL
- I_|_|_|
MA1LBCT
EXT.
[NOTE TO PROGRAMMER: REQUIRE ENTRY IN ALL FIELDS OF S17 EXCEPT TITLE AND EXT
ALLOW ENTRY OF REFUSED OR DONT KNOW IN ALL FIELDS.]
Thank you very much for your participation. [NOTE TO PROGRAMMER: AUTOCODE RESULT AS
'C' AND DISPLAY RESULT MESSAGE SCREEN]
REFUSAL PROBE1
This information is very important for the EPA water systems survey. Your answers will be kept
confidential. I would be happy to discuss your concerns, so that you might reconsider answering
this question.
[ENTER RESPONDENTS ANSWER IN THE ENTRY FIELD. IF RESPONDENT STILL REFUSES, RE-
ENTER 'REFUSED,' (SHIFT/7).]
NOTE TO PROGRAMMER:
USE THE FOLLOWING TABLE TO DETERMINE THE HANDLING OF S1, S1A, S2, AND
S3 WHEN REFUSAL PROBE1 IS TRIGGERED BY INITIAL ITEM REFUSAL:
2nd Response           Final Value
to S1, S1A, S2, S3     in S1, S1A, S2, S3     Outcome
R                      R                      Terminate Interview and Treat as Refusal
DK                     DK                     S1, S1A, S3: Terminate Interview and Treat as Refusal
                                              S2: Go To S3
1, 2                   1, 2                   Follow Main Path Instructions
REFUSAL TREATMENT NOTE: REFUSALS RESULTING AUTOMATICALLY FROM
REFUSAL PROBE1 WILL PRODUCE A RESULT MESSAGE TO INTERVIEWERS TO
RECORD RESULT CODE '12' ON CALL RECORD. OTHER REFUSALS WILL BE
CODED BY INTERVIEWERS AS STANDARD INTERIM REFUSAL CODE '2'. THIS WILL
PERMIT SEPARATE HANDLING OF REFUSALS DUE TO ITEM-REFUSAL IN
ELIGIBILITY QUESTIONS S1, S1A, S2, OR S3.
CATI WILL NOT ASSIGN INDIVIDUAL INTERIM RESULT CODES. ALL INTERIMS WILL
BE CODED BY CATI AS RESULT CODE '9'.
TELEPHONE SUPERVISORS WILL REVIEW CALL RECORDS AND UPDATE ALL
CASES AS FINAL REFUSALS, 'RB,' WHEN ANY COMBINATION OF RESULT CODES '2'
OR '12' REACHES A TOTAL COUNT OF TWO.
REFUSAL PROBE2 !
Your answer to this question will provide information needed to correctly categorize your water
system. It would be very helpful if you would reconsider answering It, so that we can be sure of
asking you the appropriate questions.
[ENTER RESPONDENTS ANSWER IN THE ENTRY FIELD. IF RESPONDENT STILL REFUSES RE-
ENTER 'REFUSED,' (SHIFT/7).]
NOTE TO PROGRAMMER:
UPON SECOND ENTRY IN DATA ENTRY FIELD (AFTER FIRST ENTRY INVOKED
REFUSAL PROBE 2), FOLLOW THE PATH INDICATED IN THE QUESTIONNAIRE FOR
THE ENTERED VALUE, EXCEPT FOR A SECOND REFUSED (= -7). IF -7 IS ENTERED
FOR A SECOND TIME, CONTINUE WITH NEXT QUESTION OR SKIP TO THE
INDICATED QUESTION, AS INDICATED IN THE QUESTIONNAIRE FOR A CONFIRMED
REFUSAL
DK PROBE1
This information is very important for the EPA water systems survey. If you don't feel certain enough
to answer this question, I would be glad to speak with someone else who could provide this
Information. If you feel you can give a reasonably certain answer, I can accept that, or would you
rather give me the name of someone else to contact?
STRA.DK PRB1
RESPONDENT CAN ANSWER QUESTION ....................... 1 → [REDISPLAY QUESTION]
RESPONDENT CAN PROVIDE NAME OF
SOMEONE ELSE ......................................... 2 → [DISPLAY COLLECT NAME
                                                            SCREEN]
RESPONDENT CANNOT ANSWER QUESTION AND
CANNOT PROVIDE NAME OF SOMEONE ELSE .................. 3 → [TERMINATE INTERVIEW AND
                                                            TREAT AS REFUSAL]
NOTE TO PROGRAMMER:
USE THE FOLLOWING TABLE TO DETERMINE THE HANDLING OF S1, S1A, AND S3
WHEN DK PROBE1 IS TRIGGERED BY AN INITIAL DON'T KNOW RESPONSE:
Response       2nd Response     Final Value
to DK PROBE1   to S1, S1A, S3   in S1, S1A, S3   Outcome
1              1,2              1,2              Follow Main Path Instructions
1              DK               DK               Terminate Interview and Treat as Refusal
1              R                R                Terminate Interview and Treat as Refusal
2              —                DK               Collect Name
3              —                DK               Terminate Interview and Treat as Refusal
REFUSAL TREATMENT NOTE: REFUSALS RESULTING AUTOMATICALLY FROM DK
PROBE1 WILL PRODUCE A RESULT MESSAGE TO INTERVIEWERS TO RECORD
RESULT CODE '12' ON CALL RECORD. OTHER REFUSALS WILL BE CODED BY
INTERVIEWERS AS STANDARD INTERIM REFUSAL CODE '2'. THIS WILL PERMIT
SEPARATE HANDLING OF REFUSALS DUE TO DON'T KNOW RESPONSES AND
UNWILLINGNESS TO PROVIDE A BETTER RESPONDENT FOR ELIGIBILITY QUESTIONS
S1-S3.
CATI WILL NOT ASSIGN INDIVIDUAL INTERIM RESULT CODES. ALL INTERIMS WILL
BE CODED BY CATI AS RESULT CODE '9'.
TELEPHONE SUPERVISORS WILL REVIEW CALL RECORDS AND UPDATE ALL
CASES AS FINAL REFUSALS, 'RB,' WHEN ANY COMBINATION OF RESULT CODES '2'
OR '12' REACHES A TOTAL COUNT OF TWO.
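NOTE (ILLUSTRATIVE ONLY): the DK PROBE1 table above amounts to the following branching logic.
The sketch is in Python with invented names and is not part of the CATI system.

# Illustrative sketch only -- restates the DK PROBE1 handling table above.

def handle_dk_probe1(probe_response, second_response=None):
    """probe_response is 1, 2, or 3 from DK PROBE1; second_response applies only when it is 1."""
    if probe_response == 1:                   # respondent will try to answer S1, S1A, or S3
        if second_response in (1, 2):
            return "FOLLOW_MAIN_PATH"
        return "TERMINATE_AS_REFUSAL"         # a repeated Don't Know or a refusal ends the interview
    if probe_response == 2:                   # respondent names someone else to contact
        return "COLLECT_NAME"                 # item is stored as Don't Know
    return "TERMINATE_AS_REFUSAL"             # cannot answer and cannot name anyone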
-------
PAGE S -16
COLLECT NAME SCREEN (NEWCONT)
RECORD NEW CONTACT PERSON'S NAME, TITLE, AND PHONE NUMBER IN CONTACT UPDATE
SECTION OF RESPONDENT INFORMATION SHEET [RIS].
IF NEW CONTACT PERSON IS AT CURRENT PHONE NUMBER, ASK IF PERSON IS AVAILABLE.
IF NOT, SCHEDULE CALLBACK ON CALL RECORD.
STRA.NEWCONT ( )
NEW CONTACT AVAILABLE          [RESTART INTERVIEW AT VERIFICATION SCREEN, VERF02]
NEW CONTACT NOT AVAILABLE      [DISPLAY RESULT SCREEN RESULT 01]
-------
PAGE S-17
DK PROBE2
Your answer to this question will provide information needed to correctly categorize your water
system. If you feel you can give a reasonably certain answer, I can accept that.
[ENTER RESPONDENT'S ANSWER IN THE ENTRY FIELD. IF RESPONDENT STILL ANSWERS
'DON'T KNOW,' RE-ENTER 'DON'T KNOW,' (SHIFT/8).]
NOTE TO PROGRAMMER:
UPON SECOND ENTRY IN DATA ENTRY FIELD (AFTER FIRST ENTRY INVOKED DK
PROBE2), FOLLOW THE PATH INDICATED IN THE QUESTIONNAIRE FOR THE
ENTERED VALUE, EXCEPT FOR A SECOND DON'T KNOW (= -8). IF -8 IS ENTERED
FOR A SECOND TIME, CONTINUE WITH NEXT QUESTION OR SKIP TO THE
INDICATED QUESTION, AS INDICATED IN THE QUESTIONNAIRE FOR A CONFIRMED
DON'T KNOW.
-------
PAGE S-18
There will be four front-end screens:
1. Case ID entry screen
2. Information Verification Screen for selected case, to enable interviewer to confirm
correct entry of ID by matching screen information against RIS information:
CWS Name
CWS Address
CWS City, State, ZIP
CWS FRDS Number
3. Telephone number entry screen, to enable interviewer to enter current target number
from RIS into CATI, which will pass it to the autodialer. This will reduce dialing error,
since the dialed number will be visible on screen for the interviewer to verify before sending it
to the autodialer.
4. Information Verification Screen for selected case, to enable interviewer to a) re-confirm
correct entry of ID after dialing (by matching screen information against RIS
Information or information provided by contact), before proceeding into CATI
instrument, or b) redial/exit case if not confirmed by contact.
There will be two back-end screens.
1. Thank you screen, with reminder to interviewer to code result on paper Call Record.
2. Thank you and result message screen, displayed to interviewer after autocoding of
result by CATI; will display the assigned code and prompt the interviewer to record
that code on the paper Call Record.
In addition to the standard CATI displays at the top of each screen, the following items
should also be displayed: CWS Name, City, State, FRDS number.
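NOTE (ILLUSTRATIVE ONLY): read as a sequence, the screens above could be outlined as follows.
The identifiers are invented for illustration and do not correspond to actual CATI screen names.

# Illustrative outline of the screen flow described above; identifiers are invented.

FRONT_END_SCREENS = [
    "CASE_ID_ENTRY",         # 1. interviewer keys the case ID
    "VERIFY_CASE_INFO",      # 2. confirm CWS name, address, and FRDS number against the RIS
    "PHONE_NUMBER_ENTRY",    # 3. number shown on screen, then passed to the autodialer
    "RECONFIRM_AFTER_DIAL",  # 4. re-confirm with the contact, or redial/exit the case
]

BACK_END_SCREENS = [
    "THANK_YOU",             # interviewer codes the result on the paper Call Record
    "THANK_YOU_WITH_RESULT", # CATI autocodes the result and prompts the interviewer to record it
]

HEADER_FIELDS = ["CWS Name", "City", "State", "FRDS number"]  # displayed on every screen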
-------
APPENDIX B
CWSS Telephone Contact Questionnaire
B-1
-------
-------
CONTACT QUESTIONNAIRE #
Westat ID
C1. Hello, my name is {INTERVIEWER'S NAME} from Westat, Inc. Have I reached the {CWS
NAME}? [I'm calling for a study being conducted for the U.S. Environmental Protection
Agency.]
YES 1 → [READ THE CWS NAME AND ADDRESS
        PRINTED IN BOX 1 TO VERIFY THAT
        IT IS CORRECT AND GO TO C2]
YES, BUT NAME/ADDRESS
HAS CHANGED 2 → [RECORD CLARIFICATION INFORMATION
                IN BOX 4 AND GO TO C2]
NO 3 → [GO TO SUGGESTIONS FOR C1 ON
       BLUE TIP SHEET]
COMMENTS:
C2. The U.S. Environmental Protection Agency is conducting a survey to collect information to
help develop regulations and guidelines for community water systems.
C3. I would like to speak with someone knowledgeable about the water system who can
answer a few questions about the system size, ownership, and water sources. Would that
be you or someone else?
May I have your/his/her first name (LAST NAME/TITLE/PHONE NUMBER) please?
CONTACT NAME:
(First)
(Last)
CONTACT TITLE:
CONTACT TELEPHONE NUMBER: ( )_
Ext.
CONTACT IS THE SPEAKER 1 → [GO TO C5]
CONTACT IS NOT THE SPEAKER 2
-------
Westat ID
CONTACT QUESTIONNAIRE #
C3a. Would you please transfer me to {CONTACT GIVEN IN C3}?
AVAILABLE 1 → [GO TO C4]
NOT AVAILABLE 2 → [END CALL. SELECT THE APPROPRIATE RESULT
                  CODE AND RECORD THE INFORMATION ON THE CALL
                  RECORD]
CONTACT IS REACHABLE
AT DIFFERENT NUMBER → [END CALL. RECORD THE INFORMATION ON THE CALL
                      RECORD. REDIAL ON CATI SCREEN. ENTER THE NEW
                      TELEPHONE NUMBER AND BEGIN AT C4]
FOR OTHER OUTCOMES, GO TO SUGGESTIONS FOR C3/C3a ON THE GREEN TIP SHEET OR
SELECT APPROPRIATE RESULT CODE AND RECORD THE INFORMATION ON THE CALL
RECORD.
C4. Hello, my name is {INTERVIEWER'S NAME} from Westat, Inc. I am calling for a study
being conducted for the U.S. Environmental Protection Agency. When we spoke to
someone else at {CWS NAME}, you were identified as being knowledgeable about your
system's size, ownership, and water sources. EPA is conducting a survey to collect
information to help develop regulations and guidelines for community water systems.
[CONTINUE WITH C5]
C5. For this study, the Environmental Protection Agency will select a sample of water systems
from across the country. Yours may be selected. While your participation is voluntary, it is
crucial to the success of this project. I would like to ask you a few questions now to verify
our records.
CONTINUE WITH CATI PORTION
OF SCREENER
RESPONDENT SAYS HE/SHE IS NOT
KNOWLEDGEABLE 1 → [ATTACH ADDITIONAL CONTACT QUESTIONNAIRE
                  AND BEGIN AT C3 WITH CURRENT RESPONDENT]
-------
APPENDIX C
Public CWSS Mail Questionnaire
C-1
-------
-------
OMB No.: 2040-0173
Expires: 7/31/97
United States
Environmental Protection Agency
SURVEY OF PUBLIC
COMMUNITY WATER SYSTEMS
12/28/94
-------
Please return this questionnaire in the enclosed postage-paid envelope
or mail to:
EPA Community Water Systems Survey
1650 Research Boulevard
Room GA 45
Rockville, MD 20850-9973
The following questionnaire is estimated to require 45 minutes to an hour to complete.
This includes time for reviewing instructions, gathering and reporting the requested data, and reviewing the questionnaire.
Send comments regarding the burden estimate or any other aspect of this survey, indicating suggestions for reducing this burden, to:
Chief, Information Policy Branch, 2136 • U.S. Environmental Protection Agency • 401 M Street, S.W. • Washington, DC 20460, and
Desk Officer for EPA • Office of Information and Regulatory Affairs • Office of Management and Budget • Washington, DC 20503.
-------
Please respond about:
If you have any questions about the survey or how to
complete the questionnaire, please call:
Please return your completed questionnaire in the
enclosed postage-paid envelope by March 10, 1995.
-------
-------
1994 Community Water Systems Survey:
Public Systems Questionnaire
GENERAL INSTRUCTIONS
This questionnaire asks three preliminary questions and then is divided into two major parts:
PART I - OPERATING CHARACTERISTICS (Questions 4-27); and
PART II - FINANCIAL CHARACTERISTICS (Questions 28-40).
Please complete the questionnaire as follows:
• In Question 1, provide the best contact person for each part (I and II);
• In Question 2, indicate the latest full-year reporting periods for which your operating information,
and financial information are available;
• In Part I of the questionnaire, use the period indicated in Question 2(A) to report "last year's"
operating data; and in Part II, use the period indicated in Question 2(B) to report "last year's"
financial data;
• In Part II of the questionnaire, record dollar amounts as whole dollars;
• Please record your answers for the questionnaire by filling in the blank(s)
or circling the appropriate number(s) for each item; and
• Make a copy of the completed questionnaire for your records before sealing it in the enclosed
envelope.
1.
Please provide the name, title and telephone number of the most knowledgeable person to
contact for information on:
(A) PART I - OPERATING CHARACTERISTICS:
Name: Title:
Tel. No. ( ) - Fax No.
T
(B) PART II - FINANCIAL CHARACTERISTICS
(Write "SAME" if same as above)
Name: Title:
Tel. No. ( )_
Fax No.
-------
2. Please specify the end date of the most recent 12-month reporting period for which your
drinking water system can provide operating and financial information.
T
Can be reported
for the
12 months ending
(A) Operating information.
(B) Financial information .
3.
Please indicate, by circling the appropriate numbers in columns A, B, and C, whether the
organizations or people listed below provide your drinking water system with:
(A) Information on drinking water requirements and guidance;
(B) Operator training; and
(C) Technical assistance.
(Circle all numbers that apply for each information source)
INFORMATION SOURCE
T
(A)
Source providing
information
on drinking
water requirements
and guidance
1. State Department of Natural Resources, state
Health Department, or state EPA 1
2. Other state government departments or
extension services 1
3. U.S. Environmental Protection Agency 1
4. Other federal agencies or extension services
(e.g., FmHA, Rural Development Administration) 1
5. County government 1
6. Local government 1
7. State rural water associations 1
8. Other associations 1
9. Rural community assistance program 1
10. Contracted engineering services 1
11. Citizen volunteers 1
12. Electronic bulletin boards 1
13. Technical publications 1
14. Radio or television 1
15. Local newspapers 1
16. Federal register 1
17. Other (Please specify) 1
18. 1
T
(B)
Source
providing
operator
training
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
T
(C)
Source
providing
technical
assistance
3
3
3
3
3
3
3
3
3
3
3
3
3
3
3
3
-------
PART I - OPERATING CHARACTERISTICS
PRODUCTION AND STORAGE
4. For each type of water source listed below, please indicate which ones you use and:
(A) the number of gallons (in millions of gallons) produced in the last year (i.e., the amount
of water going into the distribution system); and
(B) the number of water intake points with disinfection.
WATER SOURCE
Ground water . . . .
Surface water . . . .
Water purchased from
other systems . . .
Do you obtain water
from this source?
YES
1
1
1
NO
2
2
If YES, enter the number of:
(A)
Gallons
produced In
the last year
(In millions)
(B)
Number of
Intake
points with
disinfection
5. What was your system's peak daily production of non-purchased drinking water during the past
year, and what is the system's maximum daily treatment design capacity?
V
Gallons
per day
(A) Peak daily production
(B) Maximum daily treatment design capacity
6. You reported your system's maximum daily treatment design capacity in Part B of Question 5.
There are several possible factors that may have resulted in a maximum capacity of this size.
Some possibilities are listed below. Please circle the number to the right of each factor that
indicates how important that factor was in determining your system's maximum design capacity.
Factor Determining Maximum Design Capacity               How important was this factor?
                                                         Very important <---------> Not important at all
1. Current peak needs (beyond average daily flow)         1    2    3    4    5
2. Seasonal demand (e.g., irrigation)                     1    2    3    4    5
3. Emergency flows (e.g., fire, drought)                  1    2    3    4    5
4. Expected growth                                        1    2    3    4    5
5. Limited choice in package plant sizes                  1    2    3    4    5
6. Other (Please specify)                                 1    2    3    4    5
-------
Do you have treated water storage?
1 Yes
2 No —»
Go to Question 9
8. Please indicate whether you have the following types of treated water storage listed below; and
if so, for each type of storage:
(A) how many tanks do you have;
(B) what is their storage capacity (in millions of gallons); and
(C) do you disinfect water in these tanks after storage?
TYPE OF TREATED
WATER STORAGE
Does your water
system have this
type of treated water
storage?
YES NO
If YES, complete the following:
(A) (B) (C)
Total Disinfect
storage capacity after
Number (In millions of storage?
of tanks gallons) YES NO
GROUND LEVEL OR SUB-SURFACE STORAGE
Natural materials (e.g., wood, earth):
1. Uncovered 1 2
2. Covered. 1 2
Synthetic materials
(e.g., steel, concrete):
3. Uncovered 1 2
4. Covered. 1 2
2
2
2
2
ELEVATED STORAGE
Natural materials (e.g., wood):
s. Uncovered
6. Covered.
Synthetic materials
(e.g., steel):
7. Uncovered . . ,
a. Covered. . . . ,
2
2
2
2
2
2
2
2
-------
9. Please indicate the types of pipe used in your distribution system. For each type of pipe, what
is the number of:
(A) miles (or feet) of existing pipe;
(B) miles (or feet) of pipe replaced in the last year;
(C) water main repairs in the last year; and
(D) months between flushes for that type of pipe.
TYPE OF PIPE
IRON:
1. w/Cement Lining. . .
2. w/o Cement Lining . .
ASBESTOS CEMENT:
3. w/Vinyl
4. w/o Vinyl
5. PVC
6. Other plastic
7. Other (Please specify)
Does your
distribution
system
have this
type of
pipe?
YES NO
If YES, enter the number of:
(A)
Miles or
feet
(specify
which)
of exist-
ing pipe
(B)
Miles or
Miles feet (specify Miles
or which) of or
feet? pipe replaced feet?
(Circle in the (Circle
MorF) last year MorF)
(C)
Water
main
repairs in
the last
year for
type of pipe
(D)
Months
between
flushes for
type of pipe
2
2
2
2
2
2
M
F
M
F
M
F
M
F
M
F
M
F
M
F
M
F
M
F
M
F
M
F
M
F
M
F
M
F
10. How many miles (or feet) of new pipe (for expansion purposes) have you installed in the last 5
years? Enter response for either miles or feet, but not both.
(If zero, enter "0")
. MILES OF NEW PIPE OR
FEET OF NEW PIPE
-------
11.
How many people and connections does your system currently serve with piped drinking water,
and how many did it serve 5 years ago?
(Please estimate if you don't know the exact number.)
NOTE:
If your system serves a population that changes on a seasonal basis (for example, a winter
or summer resort area), please indicate the highest seasonal number of people served or
active connections.
(A)
(B)
T
Currently
T
5 years ago
PEOPLE SERVED WITH PIPED DRINKING WATER
ACTIVE CONNECTIONS WITH PIPED
DRINKING WATER
12. What are the ZIP codes of your service area? (If your system's service area covers more than
10 ZIP codes, record only the first 3 digits of the ZIP code(s), i.e., all ZIP codes covered by the
same 3 starting digits can be summarized as one ZIP code by recording the first 3 digits.)
OR
FIRST 3 DIGITS OF ZIP CODES (For systems whose service area covers more than 10 ZIP codes)
XX XX XX XX XX
OPERATOR TRAINING
13. Do you have any drinking water treatment plant operators currently employed by your system?
T
1 Yes
2 No —> Go to Question 15
-------
14. Please indicate whether the treatment plant operators you employ have attained any of the
training level categories listed below. Provide the number of operators and average operator
work week (in hours) for each applicable training level category:
TRAINING LEVEL CATEGORY
STATE CERTIFIED (i.e., with state-approved
certified training for drinking water)
- Full time operator(s) [Definition:
Works at least 35 hours a week]
- Part time operator(s) who also operate other
drinking water plants (e.g., "circuit riders") .
- Other part time, state certified
operators
Do you employ
drinking water
treatment
operators who
have attained this
training level
category?
YES NO
How many
operators
do you have?
(Number)
If "YES"
Average hours per week
per operator:
Other
Drinking
Treatment Water
Duties Duties
(Hrs) (Hrs)
2
2
2
TRAINED THROUGH A NATIONAL OR STATE
PROGRAM, BUT NOT STATE CERTIFIED
- Full time operator(s) (see definition above). .
- Part time operator(s) who also operate other
drinking water plants (e.g., "circuit riders") .
- Other part time, trained operators
2
2
OTHER TRAINING LEVEL
(e.g., on-the-job training)
- Full time operator(s) (see definition above). . 1
- Part time operator(s) who also operate other
drinking water plants (e.g., "circuit riders") . 1
- Other part time operators not
classified above 1
2
2
2
WATER SOURCES AND TREATMENT
15. Is your water system interconnected to another system that you can use for emergency
purposes (e.g., hot summers)?
T
1 Yes
2 No
-------
16. If your primary source of drinking water became permanently unusable due to contamination,
please indicate whether or not you would adopt any of the solutions listed below:
SOLUTION
1. Draw more heavily upon other sources on the present system. .
2, Draw upon another system to which you are now connected. . .
3. Draw upon alternative sources (e.g., hook up to another system)
4. Implement a water management plan (e.g., rationing)
5. Drill new well(s).
6. Curtail service
7. Other (Please specify)
If primary water
sources became
unusable, would you
adopt this solution?
YES NO
2
2
2
2
2
2
2
17. If you are currently interconnected to your long term alternate water source, check the box
indicated and go to Question 18.
Currently interconnected —>
Go to Question 18
If you have no long term alternate water source(s), check the box indicated and go to Question 18.
No long term alternate water source(s) —>
Go to Question 18
What is the name of your long term alternate water source(s) and how many miles is it from the
nearest connection point on your current system?
Name of long term
alternate water source(s)
1.
2.
3.
T
Distance
from
system
(to nearest mile)
If distance is under
one mile, please
estimate distance
in feet
-------
18. [Treatment facilities table (printed in landscape orientation)]
-------
19. Are there places in your distribution system other than those reported in your answer to
question 8C (storage) or question 18 (treatment facilities) where you boost disinfectant
residuals?
1
2
Yes
No
20. Please supply the following information for each well or surface water intake not receiving
treatment (include only sources that were active in 1994). If you have more than five wells and
intakes, please check here □. (Record the information about the additional wells and intakes
on a photocopy of this page or use a blank sheet of your own.)
T
AQUIFER OR
SURFACE WATER
SOURCE NAME
(e.g., Ogallala Aquifer
or Ohio River)
SOURCE
TYPE (enter
G for Ground
orS for Surface)
LOCATION OF WELL OR INTAKE:
(Enter latitude and longitude from
local plat map or permit)
Latitude Longitude
(Degrees Min. Sec.) (Degrees Min. Sec.)
Average
flow
(Gal/day)
1..
If well,
please list:
Potential
flow
(Gal/day)
Well
depth
(feet)
SOURCE WATER
21. Does your drinking water system participate in a source-water or wellhead protection program?
V
1 Yes
2 No —»
Go to Question 25
22. Please indicate whether or not the following measures are being adopted in your source-water
or wellhead protection program:
MEASURE
Is this measure
adopted in your
source-water or
wellhead protection
program?
YES NO
1.
2.
3.
4.
5.
Education on land use impacts
Ownership of a watershed . .
Zoning or land use controls. .
Best Management Practices
(such as run-off controls, fertilizer scheduling,
less toxic road maintenance materials) . . . .
Other (Please specify)
2
2
2
2
2
10
-------
23. Who leads or manages this program?
(Circle only one number)
1 Local government
2 Regional authority (e.g., Section 208 Agency)
3 State agency
4 Other (Please specify)
24. How is the management area delineated?
(Circle all numbers that apply and fill in the blanks if 3, 4 or 5 is circled)
T
1
2
3
4
5
By watershed boundaries
By aquifer boundaries
By a fixed radius around well of
.feet
By a fixed distance from a surface water body of
Other (Please specify)
feet
25. Please indicate if any of the potential sources of contamination listed below exist within 2 miles
of your water supply intakes:
POTENTIAL SOURCE OF CONTAMINATION
Does this potential source of contamination
exist within 2 miles of your water supply?
1. Industrial or manufacturing facilities
2. Agricultural runoff.
3. Animal feed lots
4. Urban runoff
5. Sewage discharge
6. Hazardous waste site
7. Solid waste disposal
8. Nitrates
9. Pesticides, rodenticides, fungicides
(e.g., mixing or storage facilities)
10. Mining, oil, or gas activities
11. Petroleum products (e.g., auto repair shops)
12. Solvents (e.g., dry cleaners)
13. Septic systems or other sewage discharges.
14. Other (Please specify)
YES
1
1
1
1
1
1
1
NO
2
2
2
2
2
2
2
2
2
2
2
2
2
2
11
-------
26. Who performs laboratory analysis on your drinking water?
LAB ANALYSIS PROVIDER
The state
A private firm. . . .
In-house employees
Other (Specify)
Does this provider perform your lab analysis for ...
Metals/ other
inorganics?
YES
1
1
1
1
NO
2
2
2
2
Microbials?
YES
1
1
1
1
NO
2
2
2
2
VOCs'? Organics?
YES NO YES
1 2 1
1 2 1
1 2 1
1 2 1
NO
2
2
2
2
*VOCs = Volatile organic compounds (e.g., carbon
tetrachloride, benzene, THMs, etc.)
27. How do you pay for your laboratory analysis?
PAYMENT METHOD
Direct payment for tests to state or private lab.
Included as part of state permit
Don't pay
Other (Please specify)
Do you use this payment method?
YES NO
2
2
2
2
PART II - FINANCIAL INFORMATION
REVENUES AND EXPENSES
28.
Are your financial reports or income and expense statements for your drinking water system
completed in accordance with Generally Accepted Accounting Principles (GAAP)?
(Circle one number)
T
1 Yes
2 No
3 Don't have separate income and expense
statements for our drinking water system
4 Don't know
12
-------
To simplify your task of providing financial information, please follow the guidelines
below when filling out the remainder of the questionnaire.
PROVIDING ESTIMATES:
The following questions ask for information on drinking water supply operations, exclusive of other
activities with other types of operations. Where possible, please provide exact information from
your system's records. Otherwise provide your best estimate of financial information that is
applicable to your drinking water system only.
ROUNDING:
Please record your dollar amounts to the nearest dollar. DO NOT record fractional
dollars (i.e., dollars and cents).
13
-------
29. During the last year [as defined in your response to Question 2(B)] what were your drinking
water system's revenues from water sales for each of the following customer categories:
(If zero, enter "0")
1.
2.
3.
4.
5.
6.
7.
8.
9.
WATER SALES
CUSTOMER CATEGORIES
Residential customers $_
Commercial customers $_
Industrial customers $_
Wholesale customers (i.e., those
who redistribute your water
to other users) $_
Local municipal government $_
Other government customers $_
Agricultural customers $_
Other (Specify) $_
TOTAL $
T
Water Sales
Revenues
Gallons delivered
(in millions)
30. Please indicate your drinking water system's revenues during the last year from the other water-
related revenue sources listed below.
(If zero, enter "0")
WATER RELATED REVENUE SOURCE
(EXCLUDING WATER SALES)
1. Connection fees $_
2. Inspection fees $_
3. Developerfees $_
4. Other fees $_
5. General fund revenues (operating transfers in) $_
6. Interest earnings (on water fund, etc.) $_
7. Fines/penalties $_
8.
9.
Please specify other water system
revenues (not elsewhere reported)
$_
$_
T
Revenues
14
-------
31. For each customer category listed below, please identify your drinking water system's billing
structure, indicate the'year and percent of the two most recent rate increases, and provide the
number of metered and unmetered active connections.
(If zero, enter "0")
T V
1.
2.
3.
4.
5.
6.
7.
8.
CUSTOMER CATEGORY
Residential customers
Commercial customers
Industrial customers
Wholesale customers (i.e., those who
redistribute your water to other users)
Local municipal government
Other government customers
Agricultural customers
Other (Specify)
1
1
1
Billing
structure
(Circle all
code(s)
from Box 2
that apply)
2 3 4 5 6
3 4 5 6
3 4 5 6
2 3 4 5 6
3 4 5 6
3 4 5 6
3 4 5 6
2 3 4 5 6
7
7
7
Year and percent of
two most recent rate
increases
YR. % YR. %
Number
of active
connections
Metered/Unmetered
/
/
/
1
1
Note: The total of all metered and unmetered connections should be the same as the
current active connections reported in question 11 (B).
Metered Charges
CODE Billing Structure
1 Uniform rate
2 Declining block rate
3 Increasing block rate
4 Peak period rate
(e.g., seasonal)
BOX 2 - BILLING STRUCTURE
Unmetered Charges
CODE Billing Structure
5 Separate flat rate for water
6 Combined flat rate for water and other services
(e.g., rental fees, association fees, pad fees)
Other Type of Charges
CODE Billing structure
7 Other (Specify)
15
-------
32. How many gallons (or dollar equivalents) of uncompensated usage did your water system have
in the last year for each of the usage categories listed below:
UNCOMPENSATED USAGE CATEGORY
1. Free service to municipal buildings and parks.
2. Fire protection, street cleaning,
hydrant flushing
3. Leaks, breaks, failed meters
4. Uncollected bills
5. Other (Specify)
Uncompensated usage
(Enter either millions of gallons
or dollar equivalent, if gallons unknown)
million gals, or $
.million gals, or $_
.million gals, or $_
.million gals, or $_
..million gals, or $_
The next question is intended to account for all of your drinking water expenses. Please list
your:
• Routine operating expenses in Part A;
• Capital-related expenses (including interest or
principal repayment) in Part B; and
• Other expenses in Part C.
16
-------
33A.
Please enter the routine operating expenses of your drinking water system in the last year, according
to the operating expense categories listed below:
PART A
OPERATING EXPENSES
Last year's expenses
DIRECT COMPENSATION (wages, salaries, bonuses, etc.):
1. Managers $
2. Operators $
3. Others $
4. Benefits (health & insurance premiums, FICA,
FUTA, and pension contributions) $
ENERGY COSTS:
5. Electricity $
6. Other energy (gas, oil, etc.) $
CHEMICALS:
7. Disinfectants $
8. Precipitant chemicals $
9. Other chemicals $
10. Materials and supplies $
11. Outside analytical lab services $
12. Other outside contractor services $
13. Depreciation expenses $
14. Water purchase expense
□ raw water   □ treated water $
15. Payments in lieu of taxes or other
cash transfers out $
16. Other operating expenses (general and administra-
tive expenses not reported elsewhere) $
17. TOTAL ALL OPERATING EXPENSES $
B. Please enter the amount of debt service expenditures for your drinking water system in the last year.
PART B
DEBT SERVICE EXPENDITURES
18. Interest payments $
19. Principal payments $__
20. Other debt service
expenditures (Specify) $
21. TOTAL ALL DEBT SERVICE
EXPENDITURES $
C. Please enter the amount of other expenses (excluding operating and debt service expenses reported
in Parts A and B) for your drinking water system in the last year.
PART C
OTHER EXPENSES
22. Capital improvements (e.g.,
expansion, new treatment) $
23. Advance contributions to sinking funds $
24. Other (Specify) $
25. TOTAL OTHER EXPENSES $
26. TOTAL ALL EXPENSES (FROM PARTS A - C) . . . . $
17
-------
ASSETS & DEBT
34. Please provide the following information on your drinking water system's total assets and
liabilities, outstanding debt, and total capital reserve fund.
                                                 Amount at end
                                                 of last year
1. TOTAL ASSETS                                  $
2. TOTAL LIABILITIES                             $
   TOTAL DEBT OUTSTANDING:
   DIRECT NET DEBT (see definition below):
3.   Due within 5 years                          $
4.   Longer than 5 years                         $
5. Revenue Bond Debt                             $
6. All Other Debt                                $
7. TOTAL CAPITAL RESERVE FUND                    $
DEFINITION:
Direct Net Debt - Gross direct debt (owed directly by a jurisdiction) less debt that is self-supporting
(revenue bonds) and double-barreled bonds (general obligation bonds secured by earmarked revenues
which flow outside the general fund).
CAPITAL INVESTMENT
35. Have you paid for major capital improvements, repairs or expansion since January 1, 1987?
T
1 Yes
2 No —>
Go to Question 37
18
-------
36. What sources of funds did you use to pay for these major capital improvements, repairs, or
expansion?
SOURCE OF FUNDS FOR
CAPITAL INVESTMENT
Debt Financing
1. Revenue or industrial development
bond
2. General obligation bond
3. Bank loan
STATE OR FEDERAL SUBSIDIZED LOAN:
4. Rural Development
Administration (RDA)
5. Farmers Home
Administration (FmHA)
6. State Agencies (Specify)
Other Sources of Funds
7. Payment from capital reserve fund . .
8. Special assessment
9. Cash flow from current revenues. . .
STATE OR FEDERAL SUBSIDIZED GRANT:
10. Rural Development
Administration (RDA)
11. Farmers Home
Administration (FmHA)
12. Other (Specify)
Was this source of
funds used since
1/1/87?
YES
NO
2
2
2
2
2
2
2
2
2
2
2
If YES, how much was secured or provided for
each of the following?
Water quality Replacement System
improvement or major repairs expansion
$_
$_
$_
$_
$_
$_
$_
$_
$_
$_
$_
$_
$_
$_
$_
$_
$_
$_
$.
$_
$_
$_
$_
$_
$_
$_
37. Have you ever had to reduce or cancel plans for major capital improvements, repairs, or
expansion of your drinking water system because you were unable to secure an adequate loan
from any source; and if so, what was the amount of the loan sought?
1 Yes —»
2 No
Amount of Loan
$
Reason for Loan Denial (if known)
IF YOU HAVE NOT USED BONDS FOR FINANCING, GO TO QUESTION 40.
19
-------
38. Have your bonds ever been rated by a rating service?
T
1 Yes
2 No —»
Go to Question 39C
39A. What was your system's latest bond rating?
RATING SERVICE
T
Rating
Moody's
Standard and Poor's
Other (Specify)
(e.g., Baa1)
(e.g., BBB+)
39B. What was the year of your system's latest bond rating?
T
19
39C. What was the type of bond that was last issued by your system?
(Circle one number)
1 Revenue or industrial development bond
2 General obligation bond
3 Other (Specify)
40. Please enter any additional comments (optional):
THANK YOU FOR COMPLETING THIS QUESTIONNAIRE. YOUR
TIME AND EFFORT ARE GREATLY APPRECIATED.
MAILING INSTRUCTIONS ARE INSIDE THE FRONT COVER.
20
-------
APPENDIX D
Private CWSS Mail Questionnaire
D-1
-------
-------
OMB No.: 2040-0173
Expires: 7/31/97
United States
Environmental Protection Agency
SURVEY OF PRIVATE
COMMUNITY WATER SYSTEMS
12/28/94
-------
Please return this questionnaire in the enclosed postage-paid envelope
or mail to:
EPA Community Water Systems Survey
1650 Research Boulevard
Room GA 45
Rockville, MD 20850-9973
The following questionnaire is estimated to require 45 minutes to an hour to complete.
This includes time for reviewing instructions, gathering and reporting the requested data, and reviewing the questionnaire.
Send comments regarding the burden estimate or any other aspect of this survey, indicating suggestions for reducing this burden, to:
Chief, Information Policy Branch, 2136 • U.S. Environmental Protection Agency • 401 M Street, S.W. • Washington, DC 20460, and
Desk Officer for EPA • Office of Information and Regulatory Affairs • Office of Management and Budget • Washington, DC 20503.
-------
Please respond about:
If you have any questions about the survey or how to
complete the questionnaire, please call:
Please return your completed questionnaire in the
enclosed postage-paid envelope by March 10, 1995.
-------
-------
1994 Community Water Systems Survey:
Private Systems Questionnaire
GENERAL INSTRUCTIONS
This questionnaire asks three preliminary questions and then is divided into two major parts:
PART I - OPERATING CHARACTERISTICS (Questions 4-27); and
PART II - FINANCIAL CHARACTERISTICS (Questions 28-40).
Please complete the questionnaire as follows:
• In Question 1, provide the best contact person for each part (I and II);
• In Question 2, indicate the latest full-year reporting periods for which your operating information,
and financial information are available;
• In Part I of the questionnaire, use the period indicated in Question 2(A) to report "last year's"
operating data; and in Part II, use the period indicated in Question 2(B) to report "last year's"
financial data;
• In Part II of the questionnaire, record dollar amounts as whole dollars;
• Please record your answers for the questionnaire by filling in the blank(s)
or circling the appropriate number(s) for each item; and
• Make a copy of the completed questionnaire for your records before sealing it in the enclosed
envelope.
1. Please provide the name, title and telephone number of the most knowledgeable person to
contact for information on:
(A) PART I - OPERATING CHARACTERISTICS:
Name: Title:
Tel. No. ( ) - Fax No..
T
(B) PART II - FINANCIAL CHARACTERISTICS
(Write "SAME" if same as above)
Name:
Tel. No. ( )_
Title:
Fax No.
-------
2. Please specify the end date of the most recent 12-month reporting period for which your
drinking water system can provide operating and financial information.
Can be reported
for the
12 months ending
(A) Operating information.
(B) Financial information .
3.
Please indicate, by circling the appropriate numbers in columns A, B, and C, whether the
organizations or people listed below provide your drinking water system with:
(A) Information on drinking water requirements and guidance;
(B) Operator training; and
(C) Technical assistance.
(Circle all numbers that apply for each information source)
INFORMATION SOURCE
T
(A)
Source providing
information
on drinking
water requirements
and guidance
1. State Department of Natural Resources, state
Health Department, or state EPA 1
2. Other state government departments or
extension services 1
3. U.S. Environmental Protection Agency 1
4. Other federal agencies or extension services
(e.g., FmHA, Rural Development Administration) 1
5. County government 1
6. Local government 1
7. State rural water associations 1
8. Other associations 1
9. Rural community assistance program 1
10. Contracted engineering services 1
11. Citizen volunteers 1
12. Electronic bulletin boards 1
13. Technical publications 1
14. Radio or television 1
15. Local newspapers 1
16. Federal register 1
17. Other (Please specify)__ 1
18. 1
T
(B)
Source
providing
operator
training
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
(C)
Source
providing
technical
assistance
3
3
3
3
3
3
3
3
3
3
3
3
3
3
3
3
-------
PART I - OPERATING CHARACTERISTICS
PRODUCTION AND STORAGE
4. For each type of water source listed below, please indicate which ones you use and:
(A) the number of gallons (in millions of gallons) produced in the last year (i.e., the amount
of water going into the distribution system); and
(B) the number of water intake points with disinfection.
WATER SOURCE
Ground water ....
Surface water . . . .
Water purchased from
other systems . . .
Do you obtain water
from this source?
YES
1
1
NO
2
2
If YES, enter the number of:
(A) (B)
Gallons Number of
produced in intake
the last year points with
(in millions) disinfection
5. What was your system's peak daily production of non-purchased drinking water during the past
year, and what is the system's maximum daily treatment design capacity?
T
Gallons
per day
(A) Peak daily production
(B) Maximum daily treatment design capacity
6. You reported your system's maximum daily treatment design capacity in Part B of Question 5.
There are several possible factors that may have resulted in a maximum capacity of this size.
Some possibilities are listed below. Please circle the number to the right of each factor that
indicates how important that factor was in determining your system's maximum design capacity.
Factor Determining Maximum Design Capacity               How important was this factor?
                                                         Very important <---------> Not important at all
1. Current peak needs (beyond average daily flow)         1    2    3    4    5
2. Seasonal demand (e.g., irrigation)                     1    2    3    4    5
3. Emergency flows (e.g., fire, drought)                  1    2    3    4    5
4. Expected growth                                        1    2    3    4    5
5. Limited choice in package plant sizes                  1    2    3    4    5
6. Other (Please specify)                                 1    2    3    4    5
-------
Do you have treated water storage?
1 Yes
2 No —»
Go to Question 9
Please indicate whether you have the following types of treated water storage listed below; and
if so, for each type of storage:
(A) how many tanks do you have;
(B) what is their storage capacity (in millions of gallons); and
(C) do you disinfect water in these tanks after storage?
TYPE OF TREATED
WATER STORAGE
Does your water
system have this
type of treated water
storage?
YES NO
If YES, complete the following:
(A) (B) (C)
Total Disinfect
storage capacity after
Number (in millions of storage?
of tanks gallons) YES NO
GROUND LEVEL OR SUB-SURFACE STORAGE
Natural materials (e.g., wood, earth):
1. Uncovered 1
2. Covered. 1
Synthetic materials
(e.g., steel, concrete):
3. Uncovered . . . .
4. Covered
2
2
2
2
2
2
2
2
ELEVATED STORAGE
Natural materials (e.g., wood):
s. Uncovered
6. Covered.
Synthetic materials
(e.g., steel):
7. Uncovered . . ,
8. Covered. . . . ,
2
2
2
2
2
2
2
2
-------
9. Please indicate the types of pipe used in your distribution system. For each type of pipe, what
is the number of:
(A) miles (or feet) of existing pipe;
(B) miles (or feet) of pipe replaced in the last year;
(C) water main repairs in the last year; and
(D) months between flushes for that type of pipe.
TYPE OF PIPE
IRON:
1. w/Cement Lining.
2. w/o Cement Lining
ASBESTOS CEMENT:
3. w/Vinyl
4. w/o Vinyl
5. PVC
6. Other plastic
7. Other (Please specify)
Does your
distribution
system
have this
type of
pipe?
YES NO
If YES, enter the number of:
(A)
Miles or
feet
(specify
which)
of exist-
ing pipe
(B)
Miles or
Miles feet (specify Miles
or which) of or
feet? pipe replaced feet?
(Circle in the (Circle
M or F) last year M or F)
(C)
Water
main
repairs In
the last
year for
type of pipe
(D)
Months
between
flushes for
type of pipe
2
2
2
2
2
2
M
F
M
F
M
F
M
F
M
F
M
F
M
F
M
F
M
F
M
F
M
F
M
F
M
F
M
F
10. How many miles (or feet) of new pipe (for expansion purposes) have you installed in the last 5
years? Enter response for either miles or feet, but not both.
(If zero, enter "0")
. MILES OF NEW PIPE OR
FEET OF NEW PIPE
-------
11.
How many people and connections does your system currently serve with piped drinking water,
and how many did it serve 5 years ago?
(Please estimate if you don't know the exact number.)
NOTE:
If your system serves a population that changes on a seasonal basis (for example, a winter
or summer resort area), please indicate the highest seasonal number of people served or
active connections.
(A)
(B)
V
Currently
5 years ago
PEOPLE SERVED WITH PIPED DRINKING WATER
ACTIVE CONNECTIONS WITH PIPED
DRINKING WATER
12. What are the ZIP codes of your service area? (If your system's service area covers more than
10 ZIP codes, record only the first 3 digits of the ZIP code(s), i.e., all ZIP codes covered by the
same 3 starting digits can be summarized as one ZIP code by recording the first 3 digits.)
OR
FIRST 3 DIGITS OF ZIP CODES (For systems whose service area covers more than 10 ZIP codes)
XX XX XX XX XX
OPERATOR TRAINING
13. Do you have any drinking water treatment plant operators currently employed by your system?
1 Yes
2 No —> Go to Question 15
-------
14. Please indicate whether the treatment plant operators you employ have attained any of the
training level categories listed below. Provide the number of operators and average operator
work week (in hours) for each applicable training level category:
TRAINING LEVEL CATEGORY
STATE CERTIFIED (i.e., with state-approved
certified training for drinking water)
- Full time operator(s) [Definition:
Works at least 35 hours a week]
- Part time operator(s) who also operate other
drinking water plants (e.g., "circuit riders") .
- Other part time, state certified
operators
Do you employ
drinking water
treatment
operators who
have attained this
training level
category?
YES NO
How many
operators
do you have?
(Number)
If "YES"
Average hours per week
per operator:
Other
Drinking
Treatment Water
Duties Duties
(Hrs) (Hrs)
1 2
1 2
1 2
TRAINED THROUGH A NATIONAL OR STATE
PROGRAM, BUT NOT STATE CERTIFIED
- Full time operator(s) (see definition above). .
- Part time operator(s) who also operate other
drinking water plants (e.g., "circuit riders") .
- Other part time, trained operators
2
2
OTHER TRAINING LEVEL
(e.g., on-the-job training)
- Full time operator(s) (see definition above). . 1
- Part time operator(s) who also operate other
drinking water plants (e.g., "circuit riders") . 1
- Other part time operators not
classified above 1
2
2
2
WATER SOURCES AND TREATMENT
15. Is your water system interconnected to another system that you can use for emergency
purposes (e.g., hot summers)?
V
1 Yes
2 No
-------
16. If your primary source of drinking water became permanently unusable due to contamination,
please indicate whether or not you would adopt any of the solutions listed below:
SOLUTION
1. Draw more heavily upon other sources on the present system. .
2. Draw upon another system to which you are now connected. . .
3. Draw upon alternative sources (e.g., hook up to another system)
4. Implement a water management plan (e.g., rationing)
5. Drill new well(s).
6. Curtail service
7. Other (Please specify)
If primary water
sources became
unusable, would you
adopt this solution?
YES NO
2
2
2
2
2
2
2
17. If you are currently interconnected to your long term alternate water source, check the box
indicated and go to Question 18.
Q
Currently interconnected —>
Go to Question 18
If you have no long term alternate water source(s), check the box indicated and go to Question 18.
T
Q
No long term alternate water source(s) —>
Go to Question 18
What is the name of your long term alternate water source(s) and how many miles is it from the
nearest connection point on your current system?
Name of long term
alternate water source(s)
1.
2.
3.
Distance
from
system
(to nearest mile)
If distance is under
one mile, please
estimate distance
in feet
-------
18. [Treatment facilities table (printed in landscape orientation)]
-------
19. Are there places in your distribution system other than those reported in your answer to
question 8C (storage) or question 18 (treatment facilities) where you boost disinfectant
residuals?
V
1 Yes
2 No
20. Please supply the following information for each well or surface water intake not receiving
treatment (include only sources that were active in 1994). If you have more than five wells and
intakes, please check here □. (Record the information about the additional wells and intakes
on a photocopy of this page or use a blank sheet of your own.)
T
AQUIFER OR
SURFACE WATER
SOURCE NAME
(e.g., Ogallala Aquifer
or Ohio River)
SOURCE
TYPE (enter
G for Ground
or S for Surface)
LOCATION OF WELL OR INTAKE:
(Enter latitude and longitude from
local plat map or permit)
Latitude Longitude
(Degrees Min. Sec.) (Degrees Min. Sec.)
Average
flow
(Gal/day)
If well,
please list:
Potential
flow
(Gal/day)
Well
depth
(feet)
1.
21. Does your drinking water system participate in a source-water or wellhead protection program?
1 Yes
2 No —» Go to Question 25
22. Please indicate whether or not the following measures are being adopted in your source-water
or wellhead protection program:
MEASURE
Is this measure
adopted in your
source-water or
wellhead protection
program?
YES NO
1.
2.
3.
4.
Education on land use impacts
Ownership of a watershed . .
Zoning or land use controls. .
Best Management Practices
(such as run-off controls, fertilizer scheduling,
less toxic road maintenance materials) . . . .
Other (Please specify)
2
2
2
2
2
10
-------
23. Who leads or manages this program?
(Circle only one number)
1 Local government
2 Regional authority (e.g., Section 208 Agency)
3 State agency
4 Other (Please specify)
24. How is the management area delineated?
(Circle all numbers that apply and fill in the blanks if 3, 4 or 5 is circled)
T
1
2
3
4
5
By watershed boundaries
By aquifer boundaries
By a fixed radius around well of
feet
By a fixed distance from a surface water body of
Other (Please specify)
feet
25. Please indicate if any of the potential sources of contamination listed below exist within 2 miles
of your water supply intakes:
POTENTIAL SOURCE OF CONTAMINATION
Does this potential source of contamination
exist within 2 miles of your water supply?
1. Industrial or manufacturing facilities
2. Agricultural runoff.
3. Animal feed lots
4. Urban runoff
5. Sewage discharge
6. Hazardous waste site
7. Solid waste disposal
8. Nitrates
9. Pesticides, rodenticides, fungicides
(e.g., mixing or storage facilities)
10. Mining, oil, or gas activities
11. Petroleum products (e.g., auto repair shops)
12. Solvents (e.g., dry cleaners)
13. Septic systems or other sewage discharges.
14. Other (Please specify)
YES
1
1
1
1
1
1
1
NO
2
2
2
2
2
2
2
2
2
2
2
2
2
2
11
-------
26. Who performs laboratory analysis on your drinking water?
LAB ANALYSIS PROVIDER
The state
A private firm. . . .
In-house employees
Other (Specify)
Does this provider perform your lab analysis for...
Metals/ other
inorganics? Microbials? VOCs*? Organics?
YES NO YES NO YES NO YES NO
*VOCs = Volatile organic compounds (e.g., carbon
tetrachloride, benzene, THMs, etc.)
27. How do you pay for your laboratory analysis?
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
2
PAYMENT METHOD
Direct payment for tests to state or private lab.
Included as part of state permit
Don't pay
Other (Please specify)
Do you use this payment method?
YES NO
2
2
2
2
PART II - FINANCIAL INFORMATION
REVENUES AND EXPENSES
28. Are your financial reports or income and expense statements for your drinking water system
completed in accordance with Generally Accepted Accounting Principles (GAAP)?
(Circle one number)
1 Yes
2 No
3 Don't have separate income and expense
statements for our drinking water system
4 Don't know
12
-------
To simplify your task of providing financial information, please follow the guidelines
below when filling out the remainder of the questionnaire.
PROVIDING ESTIMATES:
The following questions ask for information on drinking water supply operations, exclusive of other
activities with other types of operations. Where possible, please provide exact information from
your system's records. Otherwise provide your best estimate of financial information that is
applicable to your drinking water system only.
ROUNDING:
Please record your dollar amounts to the nearest dollar. DO NOT record fractional
dollars (i.e., dollars and cents).
13
-------
29. During the last year [as defined in your response to Question 2(B)] what were your drinking
water system's revenues from water sales for each of the following customer categories:
(If zero, enter *0")
1.
2.
3.
4.
5.
6.
7.
8.
9.
WATER SALES
CUSTOMER CATEGORIES
Residential customers $_
Commercial customers $
Industrial customers $_
Wholesale customers (i.e., those
who redistribute your water
to other users) $_
Local municipal government $_
Other government customers $_
Agricultural customers $_
Other (Specify) $_
T
Water Sales
Revenues
T
Gallons delivered
(in millions)
TOTAL $_
30. Please indicate your drinking water system's revenues during the last year from the other water-
related revenue sources listed below.
(If zero, enter "0")
WATER RELATED REVENUE SOURCE
(EXCLUDING WATER SALES)
1. Connection fees $_
2. Inspection fees $_
3. Developer fees $_
4. Other fees $_
s. Interest earnings (on water fund, etc.) $_
Please specify other water system
revenues (not elsewhere reported)
6. $_
7. $_
Revenues
14
-------
31. For each customer category listed below, please identify your drinking water system's billing
structure, indicate the year and percent of the two most recent rate increases, and provide the
number of metered and unmetered active connections.
(If zero, enter "0")
T Y T
1.
2.
3.
4.
5.
6.
7.
8.
CUSTOMER CATEGORY
Residential customers
Commercial customers
Industrial customers
Wholesale customers (i.e., those who
redistribute your water to other users)
Local municipal government
Other government customers
Agricultural customers
Other (Specify)
1
1
1
Billing
structure
(Circle all
code(s)
from Box 2
that apply)
23456
3 4 5 6
3 4 5 6
2 3 4 5 6
3 4 5 6
3 4 5 6
2 3 4 5 6
3 4 5 6
23456
7
7
7
Year and percent of
two most recent rate
increases
YR. % YR. %
Number
of active
connections
Metered/Unmetered
/
/
/
/
/
/
Note: The total of all metered and unmetered connections should be the same as the
current active connections reported in question 11 (B).
Metered Charges
CODE Billing Structure
1 Uniform rate
2 Declining block rate
3 Increasing block rate
4 Peak period rate
(e.g., seasonal)
BOX 2 - BILLING STRUCTURE
Unmetered Charges
CODE Billing Structure
5 Separate flat rate for water
6 Combined flat rate for water and other services
(e.g., rental fees, association fees, pad fees)
Other Type of Charges
CODE Billing structure
7 Other (Specify)
15
-------
32. How many gallons (or dollar equivalents) of uncompensated usage did your water system have
in the last year for each of the usage categories listed below:
UNCOMPENSATED USAGE CATEGORY
1. Free service to municipal buildings and parks.
2. Fire protection, street cleaning,
hydrant flushing
3. Leaks, breaks, failed meters
4. Uncollected bills
s. Other (Specify)
Uncompensated usage
(Enter either millions of gallons
or dollar equivalent, if gallons unknown)
million gals, or $
million gals, or $_
million gals, or $_
million gals, or $_
million gals, or $_
The next question is intended to account for all of your drinking water system expenses.
Please list your:
• Routine operating expenses in Part A;
• Capital-related expenses (including interest or
principal repayment) in Part B; and
• Other expenses in Part C.
16
-------
33A. Please enter the routine operating expenses of your drinking water system in the last year, according
to the operating expense categories listed below:
PART A
OPERATING EXPENSES
Last year's expenses
DIRECT COMPENSATION (wages, salaries, bonuses, etc.):
1. Managers $
2. Operators $
3. Others $
4. Benefits (health & insurance premiums, FICA,
FUTA, and pension contributions) $
ENERGY COSTS:
5. Electricity $
6. Other energy (gas, oil, etc.) . '. $
CHEMICALS:
7. Disinfectants $
8. Precipitant chemicals $
9. Other chemicals $
10. Materials and supplies $
11. Outside analytical lab services $
12. Other outside contractor services $
13. Depreciation expenses $
14. Water purchase expense
□ raw water   □ treated water $
15. Other operating expenses (general and administra-
tive expenses not reported elsewhere) $
ALL TAXES: (income, property, etc.)
16. Federal taxes $
17. State taxes $
18. Local taxes $
19. TOTAL ALL OPERATING EXPENSES $
B. Please enter the amount of debt service expenditures for your drinking water system in the last year.
PART B
DEBT SERVICE EXPENDITURES
20. Interest payments $
21. Principal payments $
22. Other debt service
expenditures (Specify) $
23. TOTAL ALL DEBT SERVICE
EXPENDITURES $
C. Please enter the amount of other expenses (excluding operating and debt service expenses reported
in Parts A and B) for your drinking water system in the last year.
PART C
OTHER EXPENSES
24. Capital improvements (e.g.,
expansion, new treatment) $
25. Advance contributions to sinking funds $.
26. Other (Specify) $
27. TOTAL OTHER EXPENSES $
28. TOTAL ALL EXPENSES (FROM PARTS A - C) . . . . $
17
-------
ASSETS & DEBT
34. Please provide the following information on your drinking water system's total assets and
liabilities, outstanding debt, and total capital reserve fund.
Amount at end
of last year
1. TOTAL ASSETS $_
2. TOTAL LIABILITIES $_
TOTAL DEBT OUTSTANDING:
3. Due within 5 years $_
4. Longer than 5 years $_
5. TOTAL CAPITAL RESERVE FUND $
35.
Have you paid for major capital improvements, repairs or expansion since January 1, 1987?
V
1 Yes
2 No —>
Go to Question 37
18
-------
36. What sources of funds did you use to pay for these major capital improvements, repairs, or
expansion?
SOURCE OF FUNDS FOR
CAPITAL INVESTMENT
Debt Financing
1. Revenue or industrial development
bond
2. Company bond
3. Bank loan
STATE OR FEDERAL SUBSIDIZED LOAN:
4. Small Business Administration .
5. Rural Development
Administration (RDA)
6. Farmers Home
Administration (FmHA)
7. State Agencies (Specify)
Other Sources of Funds
8. Payment from capital reserve fund .
9. Special assessment
10. Stock issue
11. Cash flow from current revenues. .
STATE OR FEDERAL SUBSIDIZED GRANT:
12. Rural Development
Administration (RDA)
13. Farmers Home
Administration (FmHA)
14. Other (Please specify)
Was this source of
funds used since
1/1/87?
YES NO
2
2
2
2
2
2
2
2
2
2
2
2
2
If YES, how much was secured or provided for
each of the following?
Water quality Replacement System
improvement or major repairs expansion
$_
$.
$_
$_
$_
$_
$_
$_
$_
$_
$_
$.
$_
$_
$_
$_
$_
$_
$_
$_
$_
$_
$_
$_
$_
$_
$_
$_
$_
$_
$_
$_
$_
$_
$_
$_
$_
$_
37. Have you ever had to reduce or cancel plans for major capital improvements, repairs, or
expansion of your drinking water system because you were unable to secure an adequate loan
from any source; and if so, what was the amount of the loan sought?
1
2
Yes —»
No
Amount of Loan
$
Reason for Loan Denial (if known)
IF YOU HAVE NOT USED BONDS FOR FINANCING, GO TO QUESTION 40.
19
-------
38. Have your bonds ever been rated by a rating service?
T
1 Yes
2 No —»
Go to Question 39C
39A. What was your system's latest bond rating?
RATING SERVICE
T
Rating
Moody's
Standard and Poor's
Other (Specify)
(e.g., Baa1)
(e.g., BBB+)
39B. What was the year of your system's latest bond rating?
T
19
39C. What was the type of bond that was last issued by your system?
(Circle one number)
1 Revenue or industrial development bond
3 Company bond
4 Other (Specify)
40. Please enter any additional comments (optional):
THANK YOU FOR COMPLETING THIS QUESTIONNAIRE. YOUR
TIME AND EFFORT ARE GREATLY APPRECIATED.
MAILING INSTRUCTIONS ARE INSIDE THE FRONT COVER.
20
-------
APPENDIX E
Ancillary CWSS Mail Questionnaire
E-1
-------
-------
OMB No.: 2040-0173
Expires: 7/31/97
United States
Environmental Protection Agency
SURVEY OF COMMUNITY WATER
SYSTEMS OPERATED IN
CONJUNCTION WITH MOBILE HOME
PARKS OR OTHER BUSINESSES
12/28/94
-------
Please return this questionnaire in the enclosed postage-paid envelope
or mail to:
EPA Community Water Systems Survey
1650 Research Boulevard
Room GA 45
Rockville, MD 20850-9973
The following questionnaire is estimated to require 45 minutes to an hour to complete.
This includes time for reviewing instructions, gathering and reporting the requested data, and reviewing the questionnaire.
Send comments regarding the burden estimate or any other aspect of this survey, indicating suggestions for reducing this burden, to:
Chief, Information Policy Branch, 2136 • U.S. Environmental Protection Agency • 401 M Street, S.W. • Washington, DC 20460, and
Desk Officer for EPA • Office of Information and Regulatory Affairs • Office of Management and Budget • Washington, DC 20503.
-------
Please respond about:
If you have any questions about the survey or how to
complete the questionnaire, please call:
Please return your completed questionnaire in the
enclosed postage-paid envelope by March 10, 1995.
-------
-------
1994 Community Water Systems Survey:
Questionnaire for Systems Operated in Conjunction
With Mobile Home Parks Or Other Businesses
GENERAL INSTRUCTIONS
This questionnaire asks three preliminary questions and then is divided into two major parts:
PART I - OPERATING CHARACTERISTICS (Questions 4-27); and
PART II - FINANCIAL CHARACTERISTICS (Questions 28-40).
Please complete the questionnaire as follows:
• In Question 1, provide the best contact person for each part (I and II);
• In Question 2, indicate the latest full-year reporting periods for which your operating information,
and financial information are available;
• In Part I of the questionnaire, use the period indicated in Question 2(A) to report "last year's"
operating data; and in Part II, use the period indicated in Question 2(B) to report "last year's"
financial data;
• In Part II of the questionnaire, record dollar amounts as whole dollars;
• Please record your answers for the questionnaire by filling in the blank(s)
or circling the appropriate number(s) for each item; and
• Make a copy of the completed questionnaire for your records before sealing it in the enclosed
envelope.
1. Please provide the name, title and telephone number of the most knowledgeable person to
contact for information on:
T
(A) PART I - OPERATING CHARACTERISTICS:
Name: Title:
Tel. No. ( ) - Fax No..
T
(B) PART II - FINANCIAL CHARACTERISTICS
(Write "SAME" if same as above)
Name:
Tel. No. (__ )_
Title:
Fax No..
-------
2. Please specify the end date of the most recent 12-month reporting period for which your
drinking water system can provide operating and financial information.
                                         Can be reported for the
                                         12 months ending
(A) Operating information                ____ / ____ / ____
(B) Financial information                ____ / ____ / ____
3. Please indicate, by circling the appropriate numbers in columns A, B, and C, whether the
organizations or people listed below provide your drinking water system with:
(A) Information on drinking water requirements and guidance;
(B) Operator training; and
(C) Technical assistance.
(Circle all numbers that apply for each information source)
INFORMATION SOURCE                                                   (A)   (B)   (C)
1.  State Department of Natural Resources, state Health
    Department, or state EPA                                          1     2     3
2.  Other state government departments or extension services          1     2     3
3.  U.S. Environmental Protection Agency                              1     2     3
4.  Other federal agencies or extension services
    (e.g., FmHA, Rural Development Administration)                    1     2     3
5.  County government                                                 1     2     3
6.  Local government                                                  1     2     3
7.  State rural water associations                                    1     2     3
8.  Other associations                                                1     2     3
9.  Rural community assistance program                                1     2     3
10. Contracted engineering services                                   1     2     3
11. Citizen volunteers                                                1     2     3
12. Electronic bulletin boards                                        1     2     3
13. Technical publications                                            1     2     3
14. Radio or television                                               1     2     3
15. Local newspapers                                                  1     2     3
16. Federal Register                                                  1     2     3
17. Other (Please specify) ____________________                       1     2     3
18. ____________________                                              1     2     3
-------
PART I - OPERATING CHARACTERISTICS
PRODUCTION AND STORAGE
4. For each type of water source listed below, please indicate which ones you use and:
(A) the number of gallons (in millions of gallons) produced in the last year (i.e., the amount
of water going into the distribution system); and
(B) the number of water intake points with disinfection.
WATER SOURCE                        Do you obtain water       If YES, enter the number of:
                                    from this source?         (A) Gallons produced in        (B) Intake points
                                     YES        NO            the last year (in millions)    with disinfection
Ground water                           1         2            ______________                 ______________
Surface water                          1         2            ______________                 ______________
Water purchased from
other systems                          1         2            ______________                 ______________
5. What was your system's peak daily production of non-purchased drinking water during the past
   year, and what is the system's maximum daily treatment design capacity?
                                                                   Gallons per day
   (A) Peak daily production                                       ________________
   (B) Maximum daily treatment design capacity                     ________________
6. You reported your system's maximum daily treatment design capacity in Part B of Question 5.
   There are several possible factors that may have resulted in a maximum capacity of this size.
   Some possibilities are listed below. Please circle the number to the right of each factor that
   indicates how important that factor was in determining your system's maximum design capacity.

                                                             How important was this factor?
   Factor Determining Maximum Design Capacity                Very important <------> Not important at all
   1. Current peak needs (beyond average daily flow)             1     2     3     4     5
   2. Seasonal demand (e.g., irrigation)                         1     2     3     4     5
   3. Emergency flows (e.g., fire, drought)                      1     2     3     4     5
   4. Expected growth                                            1     2     3     4     5
   5. Limited choice in package plant sizes                      1     2     3     4     5
   6. Other (Please specify) ____________________                1     2     3     4     5
-------
7. Do you have treated water storage?
   1 Yes
   2 No —> Go to Question 9
8. Please indicate whether you have the types of treated water storage listed below; and
   if so, for each type of storage:
   (A) how many tanks do you have;
   (B) what is their storage capacity (in millions of gallons); and
   (C) do you disinfect water in these tanks after storage?
TYPE OF TREATED                     Does your water system      If YES, complete the following:
WATER STORAGE                       have this type of           (A) Number      (B) Total storage       (C) Disinfect
                                    treated water storage?      of tanks        capacity (in millions   after storage?
                                     YES        NO                              of gallons)              YES     NO

GROUND LEVEL OR SUB-SURFACE STORAGE
Natural materials (e.g., wood, earth):
  1. Uncovered                         1         2              ________        ________                  1       2
  2. Covered                           1         2              ________        ________                  1       2
Synthetic materials (e.g., steel, concrete):
  3. Uncovered                         1         2              ________        ________                  1       2
  4. Covered                           1         2              ________        ________                  1       2

ELEVATED STORAGE
Natural materials (e.g., wood):
  5. Uncovered                         1         2              ________        ________                  1       2
  6. Covered                           1         2              ________        ________                  1       2
Synthetic materials (e.g., steel):
  7. Uncovered                         1         2              ________        ________                  1       2
  8. Covered                           1         2              ________        ________                  1       2
-------
9. Please indicate the types of pipe used in your distribution system. For each type of pipe, what
is the number of:
(A) miles (or feet) of existing pipe;
(B) miles (or feet) of pipe replaced in the last year;
(C) water main repairs in the last year; and
(D) months between flushes for that type of pipe.
TYPE OF PIPE                   Does your             If YES, enter the number of:
                               distribution          (A) Miles or feet        (B) Miles or feet          (C) Water main     (D) Months
                               system have this      (circle M or F) of       (circle M or F) of pipe    repairs in the     between flushes
                               type of pipe?         existing pipe            replaced in the last       last year for      for this type
                                YES       NO                                  year                       this type of pipe  of pipe
IRON:
  1. w/ Cement Lining             1        2         ______  M  F             ______  M  F               ______             ______
  2. w/o Cement Lining            1        2         ______  M  F             ______  M  F               ______             ______
ASBESTOS CEMENT:
  3. w/ Vinyl                     1        2         ______  M  F             ______  M  F               ______             ______
  4. w/o Vinyl                    1        2         ______  M  F             ______  M  F               ______             ______
  5. PVC                          1        2         ______  M  F             ______  M  F               ______             ______
  6. Other plastic                1        2         ______  M  F             ______  M  F               ______             ______
  7. Other (Please specify)
     ____________________        1        2         ______  M  F             ______  M  F               ______             ______
10. How many miles (or feet) of new pipe (for expansion purposes) have you installed in the last 5
    years? Enter a response for either miles or feet, but not both.
    (If zero, enter "0")
    ________ MILES OF NEW PIPE   OR   ________ FEET OF NEW PIPE
-------
11.
How many people and connections does your system currently serve with piped drinking water,
and how many did it serve 5 years ago?
(Please estimate if you don't know the exact number.)
NOTE:
If your system serves a population that changes on a seasonal basis (for example, a winter
or summer resort area), please indicate the highest seasonal number of people served or
active connections.
                                                     (A) Currently        (B) 5 years ago
PEOPLE SERVED WITH PIPED DRINKING WATER              ______________       ______________
ACTIVE CONNECTIONS WITH PIPED DRINKING WATER         ______________       ______________
12. What are the ZIP codes of your service area? (If your system's service area covers more than
10 ZIP codes, record only the first 3 digits of the ZIP code(s), i.e., all ZIP codes covered by the
same 3 starting digits can be summarized as one ZIP code by recording the first 3 digits.)
OR
FIRST 3 DIGITS OF ZIP CODES (For systems whose service area covers more than 10 ZIP codes)
XX XX XX XX XX
OPERATOR TRAINING
13. Do you have any drinking water treatment plant operators currently employed by your system?
1 Yes
2 No —> Go to Question 15
-------
14. Please indicate whether the treatment plant operators you employ have attained any of the
training level categories listed below. Provide the number of operators and average operator
work week (in hours) for each applicable training level category:
TRAINING LEVEL CATEGORY                                  Do you employ drinking       If YES:
                                                         water treatment operators    How many        Average hours per week per operator:
                                                         who have attained this       operators do    Treatment      Other Drinking
                                                         training level category?     you have?       Duties         Water Duties
                                                          YES        NO               (Number)        (Hrs)          (Hrs)

STATE CERTIFIED (i.e., with state-approved
certified training for drinking water)
  - Full time operator(s) [Definition: works at
    least 35 hours a week]                                  1          2              ________        ________       ________
  - Part time operator(s) who also operate other
    drinking water plants (e.g., "circuit riders")          1          2              ________        ________       ________
  - Other part time, state certified operators              1          2              ________        ________       ________

TRAINED THROUGH A NATIONAL OR STATE
PROGRAM, BUT NOT STATE CERTIFIED
  - Full time operator(s) (see definition above)            1          2              ________        ________       ________
  - Part time operator(s) who also operate other
    drinking water plants (e.g., "circuit riders")          1          2              ________        ________       ________
  - Other part time, trained operators                      1          2              ________        ________       ________

OTHER TRAINING LEVEL (e.g., on-the-job training)
  - Full time operator(s) (see definition above)            1          2              ________        ________       ________
  - Part time operator(s) who also operate other
    drinking water plants (e.g., "circuit riders")          1          2              ________        ________       ________
  - Other part time operators not classified above          1          2              ________        ________       ________
WATER SOURCES AND TREATMENT
15. Is your water system interconnected to another system that you can use for emergency
purposes (e.g., hot summers)?
1 Yes
2 No
-------
16. If your primary source of drinking water became permanently unusable due to contamination,
please indicate whether or not you would adopt any of the solutions listed below:
SOLUTION                                                               If primary water sources became unusable,
                                                                       would you adopt this solution?
                                                                        YES        NO
1. Draw more heavily upon other sources on the present system             1          2
2. Draw upon another system to which you are now connected                1          2
3. Draw upon alternative sources (e.g., hook up to another system)        1          2
4. Implement a water management plan (e.g., rationing)                    1          2
5. Drill new well(s)                                                      1          2
6. Curtail service                                                        1          2
7. Other (Please specify) ____________________                            1          2
17. If you are currently interconnected to your long term alternate water source, check the box
indicated and go to Question 18.
   ☐ Currently interconnected —> Go to Question 18

   If you have no long term alternate water source(s), check the box indicated and go to Question 18.

   ☐ No long term alternate water source(s) —> Go to Question 18
What is the name of your long term alternate water source(s) and how many miles is it from the
nearest connection point on your current system?
   Name of long term                        Distance from system      If distance is under one mile,
   alternate water source(s)                (to nearest mile)         please estimate distance in feet
   1. ____________________                  ______________            ______________
   2. ____________________                  ______________            ______________
   3. ____________________                  ______________            ______________
-------
18. [Treatment process checklist: this page of the questionnaire is not legible in this copy. The recoverable row labels include PAC addition, ion exchange, air stripping, post-disinfection (chlorine, chloramines, chlorine dioxide), lime/soda ash softening, and recarbonation.]
-------
19. Are there places in your distribution system other than those reported in your answer to
question 8C (storage) or question 18 (treatment facilities) where you boost disinfectant
residuals?
1 Yes
2 No
20. Please supply the following information for each well or surface water intake not receiving
treatment (include only sources that were active in 1994). If you have more than five wells and
intakes, please check here ☐. (Record the information about the additional wells and intakes
on a photocopy of this page or use a blank sheet of your own.)
AQUIFER OR SURFACE       SOURCE TYPE        LOCATION OF WELL OR INTAKE                     Average      If well, please list:
WATER SOURCE NAME        (enter G for       (enter latitude and longitude from             flow         Potential flow    Well depth
(e.g., Ogallala Aquifer  Ground or S        local plat map or permit)                      (Gal/day)    (Gal/day)         (feet)
or Ohio River)           for Surface)       Latitude               Longitude
                                            (Degrees Min. Sec.)    (Degrees Min. Sec.)
1. ___________________   _____              ____ ____ ____         ____ ____ ____          ________     ________          ________
2. ___________________   _____              ____ ____ ____         ____ ____ ____          ________     ________          ________
3. ___________________   _____              ____ ____ ____         ____ ____ ____          ________     ________          ________
4. ___________________   _____              ____ ____ ____         ____ ____ ____          ________     ________          ________
5. ___________________   _____              ____ ____ ____         ____ ____ ____          ________     ________          ________
SOURCE WATER PROTECTION
21. Does your drinking water system participate in a source-water or wellhead protection program?
1 Yes
2 No —> Go to Question 25
22. Please indicate whether or not the following measures are being adopted in your source-water
or wellhead protection program:
MEASURE                                                      Is this measure adopted in your source-water
                                                             or wellhead protection program?
                                                              YES        NO
1. Education on land use impacts                                1          2
2. Ownership of a watershed                                     1          2
3. Zoning or land use controls                                  1          2
4. Best Management Practices (such as run-off controls,
   fertilizer scheduling, less toxic road maintenance
   materials)                                                   1          2
5. Other (Please specify) ____________________                  1          2
-------
23. Who leads or manages this program?
    (Circle only one number)
    1  Local government
    2  Regional authority (e.g., Section 208 Agency)
    3  State agency
    4  Other (Please specify) ____________________
24. How is the management area delineated?
(Circle all numbers that apply and fill in the blanks if 3, 4 or 5 is circled)
    1  By watershed boundaries
    2  By aquifer boundaries
    3  By a fixed radius around well of ________ feet
    4  By a fixed distance from a surface water body of ________ feet
    5  Other (Please specify) ____________________
25. Please indicate if any of the potential sources of contamination listed below exist within 2 miles
of your water supply intakes:
POTENTIAL SOURCE OF CONTAMINATION                         Does this potential source of contamination
                                                          exist within 2 miles of your water supply?
                                                           YES        NO
1.  Industrial or manufacturing facilities                   1          2
2.  Agricultural runoff                                      1          2
3.  Animal feed lots                                         1          2
4.  Urban runoff                                             1          2
5.  Sewage discharge                                         1          2
6.  Hazardous waste site                                     1          2
7.  Solid waste disposal                                     1          2
8.  Nitrates                                                 1          2
9.  Pesticides, rodenticides, fungicides
    (e.g., mixing or storage facilities)                     1          2
10. Mining, oil, or gas activities                           1          2
11. Petroleum products (e.g., auto repair shops)             1          2
12. Solvents (e.g., dry cleaners)                            1          2
13. Septic systems or other sewage discharges                1          2
14. Other (Please specify) ____________________              1          2
-------
26. Who performs laboratory analysis on your drinking water?
LAB ANALYSIS PROVIDER             Does this provider perform your lab analysis for ...
                                  Metals/other           Microbials?          VOCs*?              Organics?
                                  Inorganics?
                                   YES     NO             YES     NO           YES     NO          YES     NO
The state                            1      2               1      2             1      2            1      2
A private firm                       1      2               1      2             1      2            1      2
In-house employees                   1      2               1      2             1      2            1      2
Other (Specify) ____________         1      2               1      2             1      2            1      2

*VOCs = Volatile organic compounds (e.g., carbon tetrachloride, benzene, THMs, etc.)
27. How do you pay for your laboratory analysis?
PAYMENT METHOD                                                Do you use this payment method?
                                                               YES        NO
Direct payment for tests to state or private lab                 1          2
Included as part of state permit                                 1          2
Don't pay                                                        1          2
Other (Please specify) ____________________                      1          2
PART II - FINANCIAL INFORMATION
REVENUES AND EXPENSES
28. Are your financial reports or income and expense statements for your drinking water system
completed in accordance with Generally Accepted Accounting Principles (GAAP)?
(Circle one number)
1 Yes
2 No
3 Don't have separate income and expense
statements for our drinking water system
4 Don't know
-------
To simplify your task of providing financial information, please follow the guidelines
below when filling out the remainder of the questionnaire.
PROVIDING ESTIMATES:
The following questions ask for information on drinking water supply operations only, excluding your
other business activities. Where possible, please provide exact information from your system's
records. Otherwise, provide your best estimate of financial information that is applicable to your
drinking water system only.
Unless specifically requested, exclude financial information relating to your primary business.
ROUNDING:
Please record your dollar amounts to the nearest dollar. DO NOT record fractional
dollars (i.e., dollars and cents).
-------
29. If your business billed separately for water supplied, what were the water revenues during the
last year [as defined in your response to Question 2(B)] for each of the following customer
categories? (If your business did not bill separately for water, report only the gallons delivered
(in millions) and please check here: ☐)
(If zero, enter "0")
WATER SALES                                           Water Sales          Gallons delivered
CUSTOMER CATEGORIES                                   Revenues             (in millions)
1. Residential customers                              $____________        ____________
2. Commercial customers                               $____________        ____________
3. Industrial customers                               $____________        ____________
4. Wholesale customers (i.e., those who
   redistribute your water to other users)            $____________        ____________
5. Local municipal government                         $____________        ____________
6. Other government customers                         $____________        ____________
7. Agricultural customers                             $____________        ____________
8. Other (Specify) ____________________               $____________        ____________
   TOTAL                                              $____________        ____________
30. Please indicate your drinking water system's revenues during the last year from the other water-
related revenue sources listed below; and what were the total revenues from your primary
business, excluding water-related revenues?
(If zero, enter "0")
WATER RELATED REVENUE SOURCE                                              Revenues
(EXCLUDING WATER SALES)

WATER-RELATED REVENUES:
1. Connection fees                                                        $____________
2. Inspection fees                                                        $____________
3. Usage fees                                                             $____________
4. Please specify other water system revenues (not elsewhere reported):
   ______________________________                                         $____________
   ______________________________                                         $____________

PRIMARY BUSINESS REVENUES (excluding water-related revenues):             $____________
-------
31. For each customer category listed below, please identify your drinking water system's billing
structure, indicate the year and percent of the two most recent rate increases, and provide the
number of metered and unmetered active connections.
(If zero, enter "0")
                                 Billing structure          Year and percent of              Number of active
                                 (Circle all codes          two most recent rate             connections:
CUSTOMER CATEGORY                from Box 2 that apply)     increases                        Metered / Unmetered
                                                            YR.    %       YR.    %
1. Residential customers         1 2 3 4 5 6 7              ____   ____    ____   ____       ______ / ______
2. Commercial customers          1 2 3 4 5 6 7              ____   ____    ____   ____       ______ / ______
3. Other (Specify) ________      1 2 3 4 5 6 7              ____   ____    ____   ____       ______ / ______

Note: The total of all metered and unmetered connections should be the same as the
      current active connections reported in Question 11(B).

BOX 2 - BILLING STRUCTURE
Metered Charges:
  CODE 1  Uniform rate
  CODE 2  Declining block rate
  CODE 3  Increasing block rate
  CODE 4  Peak period rate (e.g., seasonal)
Unmetered Charges:
  CODE 5  Separate flat rate for water
  CODE 6  Combined flat rate for water and other services
          (e.g., rental fees, association fees, pad fees)
Other Type of Charges:
  CODE 7  Other (Specify) ____________________
-------
32. How many gallons (or dollar equivalents) of uncompensated usage did your water system have
in the last year for each of the usage categories listed below:
UNCOMPENSATED USAGE CATEGORY               Uncompensated usage
                                           (Enter either millions of gallons or dollar
                                           equivalent, if gallons unknown)
Leaks, breaks, failed meters               ________ million gals. or $________
Uncollected bills                          ________ million gals. or $________
Other (Specify) ________________           ________ million gals. or $________
The next question is intended to account for all of your drinking water system expenses.
Please list your:
• Routine operating expenses in Part A;
• Capital-related expenses (including interest or
principal repayment) in Part B; and
• Other expenses in Part C.
-------
33A.
Please enter the routine operating expenses of your drinking water system in the last year, according
to the operating expense categories listed below:
PART A
OPERATING EXPENSES
WATER SYSTEM EXPENSES:
Last year's expenses
DIRECT COMPENSATION (wages, salaries, bonuses, etc.):
1. Managers $_
2. Operators $
3. Others $
4. Benefits (health & insurance premiums, FICA,
FUTA, and pension contributions) $
ENERGY COSTS:
5. Electricity $
6. Other energy (gas, oil, etc.) $
CHEMICALS:
7. Disinfectants $
8. Precipitant chemicals $
9. Other chemicals $
10. Materials and supplies $
11. Outside analytical lab services $
12. Other outside contractor services $
13. Depreciation expenses $
14. Water purchase expense:
    ☐ raw water   ☐ treated water $
15. Other water system operating expenses (general and
administrative expenses not reported elsewhere) . . . $
16. TOTAL ALL WATER SYSTEM OPERATING
EXPENSES $
17. EXPENSES FOR PRIMARY BUSINESS
(excluding taxes) $
ALL TAXES ON PRIMARY BUSINESS
(income taxes, property taxes, etc.)
18. Federal taxes $
19. State taxes $
20. Local taxes $
B. Please enter the amount of debt service expenditures for your drinking water system in the last year.
PART B
DEBT SERVICE EXPENDITURES
21. Interest payments $
22. Principal payments $
23. Other debt service
expenditures (Specify) $
24. TOTAL ALL WATER SYSTEM DEBT
SERVICE EXPENDITURES $
-------
C. Please enter the amount of other expenses (excluding operating and debt service expenses
reported in Parts A and B) for your drinking water system in the last year.
PART C
OTHER EXPENSES
Last year's expenses
25. Capital improvements (e.g.,
expansion, new treatment) $_
26. Advance contributions to sinking funds $_
27. Other (Specify) $_
28. TOTAL OTHER WATER SYSTEM EXPENSES ....$_
29. TOTAL ALL WATER SYSTEM EXPENSES
(FROM PARTS A - C) $_
ASSETS
34. Please provide the following information on your drinking water system's total assets and
liabilities, outstanding debt, and total capital reserve fund.
Amount at end
of last year
1. TOTAL ASSETS $
2. TOTAL LIABILITIES $
TOTAL DEBT OUTSTANDING:
3. Due within 5 years $
4. Longer than 5 years $
5. TOTAL CAPITAL RESERVE FUND $
CAPITAL INVESTMENT
35. Have you paid for major capital improvements, repairs or expansion since January 1, 1987?
1 Yes
2 No —> Go to Question 37
-------
36. What sources of funds did you use to pay for these major capital improvements, repairs, or
expansion?
SOURCE OF FUNDS FOR                         Was this source of      If YES, how much was secured or provided for
CAPITAL INVESTMENT                          funds used since        each of the following?
                                            1/1/87?                 Water quality       Replacement or      System
                                             YES       NO           improvement         major repairs       expansion

Debt Financing
1.  Revenue or industrial development
    bond                                       1        2           $________           $________           $________
2.  Company bond                               1        2           $________           $________           $________
3.  Bank loan                                  1        2           $________           $________           $________

STATE OR FEDERAL SUBSIDIZED LOAN:
4.  Small Business Administration              1        2           $________           $________           $________
5.  Rural Development
    Administration (RDA)                       1        2           $________           $________           $________
6.  Farmers Home
    Administration (FmHA)                      1        2           $________           $________           $________
7.  State Agencies (Specify) ________          1        2           $________           $________           $________

Other Sources of Funds
8.  Payment from capital reserve fund          1        2           $________           $________           $________
9.  Special assessment                         1        2           $________           $________           $________
10. Stock issue                                1        2           $________           $________           $________
11. Cash flow from current revenues            1        2           $________           $________           $________

STATE OR FEDERAL SUBSIDIZED GRANT:
12. Rural Development
    Administration (RDA)                       1        2           $________           $________           $________
13. Farmers Home
    Administration (FmHA)                      1        2           $________           $________           $________
14. Other (Specify) ____________               1        2           $________           $________           $________
37. Have you ever had to reduce or cancel plans for major capital improvements, repairs, or
expansion of your drinking water system because you were unable to secure an adequate loan
from any source; and if so, what was the amount of the loan sought?
1 Yes —>   Amount of Loan: $____________
           Reason for Loan Denial (if known): ________________________________
2 No
IF YOU HAVE NOT USED BONDS FOR FINANCING, GO TO QUESTION 40.
-------
38. Have your bonds ever been rated by a rating service?
1 Yes
2 No —> Go to Question 39C
39A. What was your system's latest bond rating?
RATING SERVICE                               Rating
Moody's (e.g., Baa1)                         ______________
Standard and Poor's (e.g., BBB+)             ______________
Other (Specify) ____________________         ______________
39B. What was the year of your system's latest bond rating?
19
39C. What was the type of bond that was last issued by your system?
(Circle one number)
1 Revenue or industrial development bond
3 Company bond
4 Other (Specify)
40. Please enter any additional comments (optional):
THANK YOU FOR COMPLETING THIS QUESTIONNAIRE. YOUR
TIME AND EFFORT ARE GREATLY APPRECIATED.
MAILING INSTRUCTIONS ARE INSIDE THE FRONT COVER.
-------