EPA Procedures to Ensure Drinking Water Data
Integrity

TABLE OF CONTENTS

Executive Summary
Chapters

1.	Introduction
Purpose
Background

Scope and Methodology
Prior Audit Coverage

2.	Data Integrity Enhancements Needed

Twelve Percent Of PWSs Reported Erroneous Data

Site Visits Confirmed Problems With Some PWSs

Data Falsification Issues Not A Priority For EPA Or States

Conclusion

Recommendations

Agency Comments and Actions

OIG Evaluation

Exhibits

Exhibit 1 Region 1 Surface Water Systems the OIG Reviewed That Submitted Erroneous Data
Exhibit 2 Region 2 Surface Water Systems the OIG Reviewed That Submitted Erroneous Data
Exhibit 3 Region 3 Surface Water Systems the OIG Reviewed That Submitted Erroneous Data
Exhibit 4 Region 4 Surface Water Systems the OIG Reviewed That Submitted Erroneous Data
Exhibit 5 Region 5 Surface Water Systems the OIG Reviewed That Submitted Erroneous Data
Exhibit 6 Region 6 Surface Water Systems the OIG Reviewed That Submitted Erroneous Data


Exhibit 7 Region 7 Surface Water Systems the OIG Reviewed That Submitted Erroneous Data
Exhibit 8 Region 8 Surface Water Systems the OIG Reviewed That Submitted Erroneous Data
Exhibit 9 Region 9 Surface Water Systems the OIG Reviewed That Submitted Erroneous Data
Exhibit 10 Region 10 Surface Water Systems the OIG Reviewed That Submitted Erroneous Data
Exhibit 11 Calculation of Sample Results Projected to the PWS Universe
Exhibit 12 Site Visits Summary
Appendices

Appendix 1 Office of Water Response to Draft Report (Available in hard copy only)

Appendix 2 Abbreviations
Appendix 3 Distribution

Chapter 1: Introduction

PURPOSE

U.S. Environmental Protection Agency (EPA) Office of Ground Water and Drinking Water (OGWDW)
officials requested a nationwide audit of drinking water data integrity issues: (1) because of their concern that
falsified drinking water data may be going undetected, resulting in potential health threats; and (2) to aid them in
planning how to use limited resources efficiently and effectively. If water system operators report invalid or
falsified test results, serious health risks could go undetected. Moreover, if a state does not detect invalid or
falsified test data and investigate the situation, the state's ability to take proactive measures to prevent
contamination of the water supply is negated.

Americans drink over one billion glasses of water a day. The Centers for Disease Control and Prevention (CDC)
estimated that over 940,000 people become ill every year from drinking contaminated water. Of those, CDC
estimated 900 people die each year. Moreover, the overall quality of our nation's drinking water supply and the
technology of the systems that provide it have recently generated public controversy.

The objectives of this audit were to determine:

the frequency with which operators of community, surface public water systems (PWSs) throughout the nation
reported invalid or potentially falsified drinking water test data, and

what procedures EPA regions and states used to detect invalid or potentially falsified data.

This is a consolidated report that includes the results of our 1994 pilot audit of Region 5, as well as our most
recent review of the other nine EPA regions. During our 1994 audit of Region 5's program (footnote 1), we
found that operators at about four percent of the surface water systems in Region 5 reported erroneous drinking
water test data that the states did not detect.

BACKGROUND


Responsibilities Of EPA And States

The Safe Drinking Water Act of 1974 (the Act), as amended, required that EPA ensure that PWSs monitor and
test the quality of drinking water and distribute water to consumers that is safe to drink. EPA's OGWDW, the
regions, and the states are all responsible for achieving the goals of the Act (footnote 2). Over the last decade
statutory monitoring and testing requirements increased, but, according to EPA officials, only limited EPA and
state resources were available for reviewing the validity of reported drinking water test data.

Under the current safe drinking water program, all states and Federal territories, except Wyoming and the
District of Columbia, are responsible for ensuring that the water within their jurisdiction is safe for
consumption. EPA delegates this authority, called "primacy," to states that meet the requirements of the Act and
EPA's implementing regulations. To maintain primacy, states must adopt regulations at least as stringent as the Federal
Government's, and demonstrate that they can effectively execute program requirements.

In 1991, OGWDW's Enforcement and Program Implementation Division published its Public Water System
Supervision Data Verification Guidance. Appendix G of this guidance, entitled "Guidance on Detecting Invalid
or Fraudulent Data" presents ways to identify systems that may be submitting invalid or falsified data and
outlines the correct procedures for investigating and prosecuting systems suspected of falsifying data. This
guidance states that EPA regions and states with primacy should identify and prosecute systems suspected of
falsifying data, as part of their oversight of drinking water programs.

According to the guidance, the problem of data falsification may be larger than originally suspected. It explains
that:

...with the promulgation of new regulations under the Safe Drinking Water Act, there is an increased incentive
for water systems to purposely submit invalid compliance data. This is due to the cost of compliance with the
new regulations and an increased complexity in the monitoring and reporting requirements.

OGWDW maintained an automated database called the Federal Reporting Data System (FRDS) as a centralized
repository for information about PWSs nationwide and their compliance with monitoring requirements,
maximum contaminant levels, and other requirements of the Act. State officials are required to assess
monitoring results and submit quarterly reports to OGWDW for input to FRDS. FRDS was an "exception-
based" database with nearly all data being provided by states.

Responsibilities Of Public Water Systems

The majority of community PWSs treat the drinking water they distribute to the public to ensure that the water
customers drink is safe. Some sources of drinking water contamination include turbidity (or "cloudiness") of
water due to suspended matter such as clay, silt, and microscopic organisms; bacteria from human or animal
sources; pesticide run-off; underground injections of hazardous wastes; disinfection by-products; and lead,
copper, and asbestos from corroding pipes.

PWSs must test their water supply for various contaminants and report the results to their states. Generally, the
larger the population a system serves, the more often that system is required to test and report. The type of PWS
(community or noncommunity) and the water source (surface or groundwater) determine what contaminants
must be monitored. The four basic contaminant groups PWSs must test for and report on are: (1) inorganic
chemicals, (2) organic chemicals, (3) radionuclides, and (4) microbiological contaminants.

For complex tests, such as those for bacteria, most inorganic and organic chemicals, and radionuclides, PWSs
generally contract with either an independent certified laboratory or a state-owned laboratory to analyze water
samples from their systems. Many states require independent labs to report the results of their tests
directly to the state before reporting the results to the water system, their client.


While PWSs generally contract out their complex tests, they conduct other tests in-house. Water system
personnel conducting these operational tests are normally required to report the results of these tests to their
state once per month. Water operators submit this data on monthly operational report (MOR) forms.

At least three measures are important indicators of water quality: turbidity, chlorine residual, and pH.

Turbidity measurements indicate how well filtration systems are removing particulates from drinking water. As
turbidity levels increase, the efficiency of a water treatment plant in filtering particulates in water decreases. If
turbidity levels are too high, health risks from microbes increase.

Chlorine measurements, including the time chlorine is in contact with water in the distribution system, are also
very important. Although chlorine can kill microbes and prevent the regrowth of any bacteria that make it
through the filtration system, microbes may not be inactivated by chlorine if the time chlorine is in contact with
the microbes is too short.

A measure of the acidity or alkalinity of a liquid is referred to as its pH. Improper pH levels affect the efficiency
of disinfectants and also affect the types and levels of disinfection by-products generated. In addition, pH levels
that are too acidic increase the leaching of metals, such as lead, from within the distribution system into the
water supply.

Many PWS operators are certified to perform operational-type tests on drinking water samples, including those
for turbidity, chlorine residual, and pH. Each state has an operator certification program, which is responsible
for ensuring that operators are properly trained on PWS monitoring requirements. According to regional
officials, state training is available for all operators, certified or not. The requirements for certified operators
vary by state. These requirements are based on the type and size of the PWS, its water source and quality, and
the population affected. If operators do not continue to meet these requirements, their certifications can be
revoked. This could result in the loss of their jobs. Falsification of test data is also grounds for revocation of an
operator's certification.

Potential For Invalid Or Falsified Data

PWS operators are in the business of providing drinking water to the public. Thus, it is in the best interest of
operators to provide a safe, reliable, and high quality drinking water product to their customers. However, the
increased cost of new regulations and the increased complexity in monitoring and reporting requirements might
be an incentive to submit falsified data.

Because PWS operators and staff are directly involved with recording and reporting results of operational tests,
there is an increased chance that these results might be erroneous (either invalid or falsified). Operators might
report invalid test data because of malfunctioning equipment, improper testing procedures, or a
misunderstanding of how to properly record and report test readings. Also, PWS personnel might falsify test
data in several different ways. For example, PWS personnel could: (1) report fabricated data without taking
required tests; (2) boil or microwave samples, thereby killing harmful organisms; (3) submit samples from
sources outside the PWS; or (4) collect all samples from one location within the PWS.

Potential Health Risks

If PWS operators report invalid or falsified test results, serious health threats could go undetected. Moreover, if
a state does not detect and investigate questionable test data, the state's ability to take proactive measures to
prevent contamination of the water supply is negated.

Drinking water regulations combined with improved water treatment practices over the past twenty years have
significantly reduced the occurrence of many waterborne diseases. As a result, water in the U.S. is considered
very safe to drink. However, despite increased monitoring and testing requirements, contaminated drinking
water still remains a serious health threat. The CDC estimated that over 940,000 people in the U.S. become ill
every year from drinking contaminated water. Of those, they estimated 900 people die each year.

According to the Associate Director of EPA's Health Effects Research Laboratory in Research Triangle Park,
North Carolina, testing for the presence of microbes in drinking water is very important in ensuring harmful
microbes that can cause sickness are not present. Microbes include parasites, bacteria, and viruses. Such
organisms, if ingested, can result in varying degrees of sickness from mild gastrointestinal illness to death. He
further stated that a recent outbreak of cryptosporidiosis in Milwaukee, Wisconsin, shows the significance of
turbidity measurements (footnote 3). He said that the high turbidity measurements that were reported were good
indicators that microbes may have passed through the filtration system and into the water supply.

Types, Sizes, And Number Of PWSs

A PWS, as defined by EPA, provides piped water for human consumption to at least 15 service connections or
serves an average of at least 25 people for at least 60 days each year. PWSs are classified as either community
or noncommunity systems. Community systems provide water generally to the same population year-round,
such as in residential areas (footnote 4).

OGWDW categorizes the size of PWSs by the number of people each serves. There are five size categories as
shown in figure 1.

Figure 1: PWS Size Categories

Population Served      Size Category
25 - 500               Very Small
501 - 3,300            Small
3,301 - 10,000         Medium
10,001 - 100,000       Large
More than 100,000      Very Large
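
The size classification is a simple lookup on population served. As an illustration only (not an EPA tool), the
boundaries from figure 1 can be expressed as follows:

    def pws_size_category(population_served):
        # Return the OGWDW size category for a PWS, using the boundaries
        # from figure 1. Assumes population_served is at least 25, the
        # minimum for a PWS.
        if population_served <= 500:
            return "Very Small"
        if population_served <= 3300:
            return "Small"
        if population_served <= 10000:
            return "Medium"
        if population_served <= 100000:
            return "Large"
        return "Very Large"

    print(pws_size_category(4500))  # -> Medium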

As of January 1995, there were about 186,800 PWSs in the United States and its territories. An estimated
56,700 PWSs of the total number of PWSs were community systems—about 46,100 were ground water PWSs
while the remaining 10,600 were surface PWSs. Although surface community PWSs comprised only about 19
percent of all community systems, those PWSs served about 63 percent of the population which received their
drinking water from community PWSs.

Figure 2: Community PWSs

(Figure 2 is only available in hard copy.)

PWSs obtain the water that they distribute to the public from either: (1) groundwater sources—such as springs
and aquifers (footnote 5), or (2) surface water sources—such as rivers, lakes, and reservoirs. Both types of water
sources are subject to contamination. However, because surface water sources are exposed to the air and are
accessible to human and animal use, these water sources require more extensive treatment before they can be
used for human consumption.


SCOPE AND METHODOLOGY

Our audit was national in scope, intended to provide OGWDW officials with the comprehensive observations
and projections that they requested. As a result, although our audit work focused primarily on the efforts of each
EPA region and its states to detect questionable test results reported by community surface water systems, we
did not draw any conclusions regarding specific regions or states. We also interviewed OGWDW officials in
Washington, D.C., and obtained information regarding OGWDW's efforts in the data integrity area. We focused
only on community surface systems. Therefore, throughout this report "PWS" refers only to those community
systems with surface water sources.

OGWDW officials requested that we concentrate our efforts on community systems because those PWSs
distribute drinking water to most of the nation's population. OGWDW provided the Office of Inspector General
(OIG) with a random sample of 271 community surface PWSs from FRDS for review. OGWDW used the
automated random sampling capability of FRDS to select the 271 PWSs (footnote 6). The sample was selected
from the total of 4,417 community surface PWSs, located in the continental United States, excluding Region 5
states (footnote 7). To determine the sample size, we used a 95 percent confidence interval, an error discrepancy
rate of 25 percent, and an error tolerance level of 5 percent. FRDS randomly generated PWSs for every state
except Delaware, Mississippi, Nebraska, and Rhode Island.
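
For illustration, a sample size of this kind can be reconstructed with the standard formula for estimating a
proportion, adjusted for a finite universe. The sketch below is our own illustration of that arithmetic, not a
reproduction of the FRDS sampling routine:

    import math

    def proportion_sample_size(universe, expected_rate, tolerance, z=1.96):
        # Sample size for estimating a proportion at a given confidence
        # level (z = 1.96 for 95 percent), before the finite population
        # correction.
        n0 = (z ** 2) * expected_rate * (1 - expected_rate) / tolerance ** 2
        # The finite population correction reduces the requirement when the
        # universe is not much larger than n0.
        return math.ceil(n0 / (1 + (n0 - 1) / universe))

    # Universe of 4,417 PWSs, 25 percent expected error rate, 5 percent tolerance:
    print(proportion_sample_size(4417, 0.25, 0.05))  # -> 271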

We did not conduct a separate review of FRDS to validate the accuracy of its data. However, we did identify 19
PWSs in our original sample that were incorrectly coded in FRDS and, therefore, needed to be replaced with
systems meeting our criteria. For example, some systems were categorized in FRDS as surface systems when,
in fact, they were groundwater systems. Other PWSs in the sample were either not active, served fewer than 25
people, or purchased water from another PWS and were, therefore, dropped from our original sample.

We replaced all PWSs discarded from the original sample with substitute PWSs. OGWDW provided an
additional list of randomly selected PWSs for our use in case of coding errors in FRDS. We used this list and
replaced PWSs from the state from which they were dropped (footnote 8). For example, we replaced a PWS in
North Dakota which had been a groundwater system since 1991—but was coded in FRDS as a surface system—
with another North Dakota PWS from the substitute list of randomly selected systems.

We performed audit work within every EPA region and all 44 states within our sample. We reviewed PWS
files and interviewed officials at each state's environmental or public health agency responsible for oversight of
its drinking water program. Because some states maintained PWS files at several different district offices
throughout their states, and because of limited OIG resources, we were unable to travel to every state. As a
result, for the states of Montana, North Dakota, South Dakota, Utah, and Virginia, state officials sent either the
original files or copies of the files to us to review and certified to the authenticity of any copies.

We reviewed MORs, where available, for a four-year period—January 1991 through December 1994. While
reviewing these MORs, we attempted to answer the following question:

How probable was it that the test values reported by this surface water system were valid and based on actual
tests?

We looked specifically for operational test readings—such as those for turbidity, chlorine residual, and pH—that
did not fluctuate at all or fluctuated very little over the course of several consecutive months. We also looked
for any other obvious patterns in reported data over time. We considered such data to be questionable and
suspect, in accordance with OGWDW's Appendix G. We also looked for additional indicators of potential data
falsification, such as correction fluid on MORs or the use of pre-printed forms to record results. Because we
looked for obvious cases of invalid or potentially falsified data, there may be other cases of such data that we
did not identify.
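
As an illustration of this screening step (our sketch, not an OIG or OGWDW tool), a reported series can be
flagged when consecutive monthly values barely fluctuate:

    def flag_flat_pattern(monthly_readings, window=6, min_distinct=3):
        # Flag a reported series when any `window` consecutive monthly
        # readings contain fewer than `min_distinct` distinct values -- the
        # kind of unchanging pattern Appendix G treats as questionable.
        # The window and threshold here are illustrative assumptions.
        for start in range(len(monthly_readings) - window + 1):
            if len(set(monthly_readings[start:start + window])) < min_distinct:
                return True
        return False

    # Six straight months of an identical 0.30 turbidity reading would be flagged:
    print(flag_flat_pattern([0.30, 0.30, 0.30, 0.30, 0.30, 0.30]))  # -> True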


Of the 94 PWSs we identified as having reported suspect test data, we conducted unannounced site visits at 14
PWSs and an announced visit at 1 PWS. An OIG engineer accompanied us on each site visit, while state
officials accompanied us on all but two of the visits. We judgmentally selected these 15 PWSs, based on the
size and location of the systems, and the data patterns reported. We visited three PWSs in Ohio; two each in
Colorado, Illinois, and Indiana; and one each in California, Louisiana, New Mexico, North Carolina, Oregon, and
Tennessee. We visited 9 small PWSs, 4 medium-sized PWSs, 1 large PWS, and 1 very large PWS. During the
site visits, we: (1) interviewed PWS personnel, (2) examined original test records, (3) inspected the equipment
and overall condition of the facility, and (4) observed water plant operators conduct certain water tests we
requested.

We conducted our national fieldwork from November 4, 1994, to July 14, 1995. Each OIG audit division that
performed work provided its respective region or regions with a written summary document at the completion
of its work. Regional comments on those summaries have been provided to OGWDW.

We performed our audit in accordance with the Government Auditing Standards issued by the Comptroller
General (1994 revision). As part of our audit, we also reviewed OGWDW's and regions' recent Federal
Managers' Financial Integrity Act reports. Neither OGWDW nor any of the regions cited data falsification as a
material weakness. During our audit, we did not detect any internal control weaknesses significant enough to be
reported as material risks to program operations. However, we have identified opportunities to strengthen the
management controls over the reporting and review processes for drinking water data.

PRIOR AUDIT COVERAGE

The OIG issued a report on September 30, 1994, which addressed Region 5's procedures to ensure the integrity
and validity of drinking water test data (EPA Report No. 4100540). The report stated that Region 5 and state
officials placed a low priority on reviewing reported data for falsification, and states within that Region did not
have formal procedures to do so. Operators at about four percent of all Region 5 surface water systems reported
invalid or potentially falsified test data.

The OIG issued a report on July 30, 1993, which addressed Region 1's enforcement of the Act, including its
data falsification efforts. Region 1 and its states did not routinely conduct data falsification reviews or follow-up
on cases of questionable data that were identified (footnote 9).

The U.S. General Accounting Office (GAO) issued two reports related to drinking water quality and data
integrity. An April 1993 report focused on the importance of sanitary surveys as a way to ensure the quality of
drinking water distributed to the public. The report stated that: (1) the frequency of sanitary surveys by the
states had declined and that many states were not conducting surveys as often as EPA recommended, (2)
inadequate training of state inspectors might have contributed to survey flaws, and (3) EPA placed limited
emphasis on sanitary surveys (footnote 10).

According to a June 1990 GAO report, most EPA and state officials GAO interviewed did not believe data
falsification was extensive. These officials also told GAO that falsifying test results was relatively easy and
incentives for doing so would increase in the future (footnote 11).

Chapter 2: Data Integrity
Enhancements Needed

Primacy states are responsible, under 40 Code of Federal Regulations (CFR) Part 142, for administering and
enforcing drinking water regulations, and reviewing PWS-reported test data. In turn, all PWS operators are
required, under the Act and 40 CFR Part 141, to ensure that all appropriate drinking water tests are conducted
and test results are accurately reported. Accurate test data are necessary to assure that the quality of drinking
water provided to the public meets all drinking water standards. According to OGWDW's Appendix G
guidance, sanitary surveys are likely the best vehicle for detecting instances of invalid or falsified data.

Overall, the results of our nationwide review showed that operators at community, surface PWSs generally
reported valid data. Based on our statistical sample review, we projected nationwide that 11.6 percent (566 of
4,869) of PWSs reported erroneous data one or more times from 1991 through 1994, with 95 percent confidence
and a tolerance level of 5.0 percent. These PWSs served about 0.1 percent of the population. Very small and
small-sized PWSs most often reported erroneous data, and about 58 percent of the erroneous data cases
involved invalid data, rather than data which might have been deliberately falsified.

Operators at small PWSs most often reported invalid data because of: (1) a lack of training and knowledge on
how to properly test water samples and record and report results, and (2) improperly functioning equipment.
According to OGWDW, regional, and state officials, data falsification issues were not a priority because of
limited resources and other public health-related priorities. State officials generally did not believe data
falsification was a widespread problem. In addition, states' sanitary survey procedures generally did not include
data quality review steps, nor did state officials require PWS operators to certify to the validity of reported data.
Because most states did not use sanitary surveys to review the quality of reported operational test data, officials
missed opportunities to identify and correct testing and reporting problems.

TWELVE PERCENT OF PWSs REPORTED ERRONEOUS DATA

Based on our statistical sample review, we projected nationwide that 18.3 percent (890 of 4,869) of PWSs
reported data that was questionable in accordance with EPA guidance, with 95 percent confidence and a
tolerance level of 5.0 percent. (For further discussion of this projected percentage, see exhibit 11.) Files for 94
of the 723 PWSs we reviewed nationwide contained test data that were questionable in accordance with
OGWDW's Appendix G guidance. We referred these cases to the regions which, in turn, required the states to
follow-up. We accepted PWS data as valid if state officials: (1) visited or contacted the PWSs and (2) provided
reasonable explanations for the data patterns we identified. Based on this follow-up, we determined that 46 of
the 94 PWSs that we originally questioned appeared to have reported accurate test readings.

Data for the remaining 48 PWSs, which statistically represented 11.6 percent of the total number of community,
surface PWSs nationwide (with a tolerance level of 5.0 percent), continued to be questionable. Of these 48, operators at 28 PWSs
reported invalid data, while operators at 20 PWSs reported data that we believe might have been deliberately
falsified (footnote 12). According to FRDS, these 48 PWSs served about 140,000 people. Very small and small
PWSs reported erroneous data in 34 of the 48 cases. However, several medium-sized and large systems also
reported erroneous data (footnote 13).

The following table shows details about the projected 11.6 percent of PWSs that reported invalid or potentially
falsified data. We identified PWSs within each EPA region—except Regions 1, 7, and 8—that submitted
erroneous data. Among the regions, the percentages ranged from a low of 3.1 percent in Region 5 to a high of
28.6 percent in Region 3. These results should not be used for statistical projection purposes by region/state
because the statistical sample we used for our review was randomly selected on a national basis from the
universe of 4,417 community surface PWSs located in the continental United States. The sample excluded
Region 5 PWSs, which we previously reviewed. For more information regarding the percentages among the
states in each region, see exhibits 1 through 10.

Table 1

Percentage Of Community Surface Water Systems, By Region, That Submitted Erroneous Data

Region   Reviewed             Invalid   Potentially Falsified   Total               Percent
I        22                   0         0                       0                   0.0
II       18                   2         0                       2                   11.1
III      42                   6         6                       12                  28.6
IV       46                   3         1                       4                   8.7
V        452 (footnote 14)    8         6                       14                  3.1
VI       49                   4         6                       10                  20.4
VII      16                   0         0                       0                   0.0
VIII     22                   0         0                       0                   0.0
IX       40                   3         0                       3                   7.5
X        16                   2         1                       3                   18.8
Total    723                  28        20                      48 (footnote 15)    11.6 (footnote 16)

SITE VISITS CONFIRMED PROBLEMS WITH SOME PWSs

We visited 15 of the 94 PWSs that we identified as having reported questionable data. Six of the 15 PWSs
seemed to be conducting and reporting test results accurately. At the remaining 9 PWSs (60 percent), we found
indicators that data were either invalid (40 percent) or potentially falsified (20 percent). Of the nine PWSs at
which we found problems, one each was in California, Illinois, Indiana, Louisiana, New Mexico, North
Carolina, Ohio, Oregon, and Tennessee. Operators at these PWSs: (1) recorded and reported test readings
improperly, (2) obtained readings using improper procedures or malfunctioning equipment, or (3) lacked
documentation regarding tests taken or recorded test readings before tests were taken. Our site visits also
disclosed some weaknesses in states' oversight of data review and reporting. In all cases except our visits to the
systems in California and New Mexico, state inspectors or EPA regional officials accompanied us and verified
our observations. For specific examples of issues we identified during our site visits to water treatment plants,
see exhibit 12.

DATA FALSIFICATION ISSUES NOT A PRIORITY FOR EPA OR STATES

EPA and state officials placed a low priority on reviewing records for invalid or falsified data primarily because
they did not believe that falsification was widespread. State officials generally trusted that data reported to them
was valid. According to OGWDW guidance, states should try to identify and prosecute PWSs suspected of
falsifying data.

States Did Not Review Data To Identify Invalid Or Falsified Results

Although some states might review data during field inspections, only four states (Alabama, Missouri, South
Carolina, and Tennessee) out of the 44 we reviewed had formal, documented procedures for reviewing reported
test data for invalid or falsified results. Also, when state officials reviewed data for reported compliance
violations, in most cases they did not consider that the data itself could be invalid or falsified. State officials generally
agreed that unusual or repetitive patterns in reported data should be questioned. However, according to state
officials, they: (1) generally did not believe data falsification was widespread, and (2) had limited resources to
review reported data even if falsification was occurring.


-------
Further, state officials generally were not aware that Federal guidance (OGWDW's Appendix G) existed which:
(1) detailed ways to detect falsified data, and (2) discussed the importance of reviewing data for falsification.
Also, they had not received any training on how to detect falsified data or how to recognize basic fraud
indicators. As a result, state officials were generally unaware of the best indicators to look for, and techniques to
use, to detect erroneous data.

According to OGWDW's Appendix G, "A sanitary survey is likely our [EPA and states] best vehicle for
detecting instances of data falsification. Every sanitary survey should include an investigation of data quality."
However, states did not have specific procedures to identify falsified data while conducting sanitary surveys. In
addition, the frequency with which states conducted sanitary surveys varied greatly.

According to state officials, only two states—South Carolina and Missouri—had formalized sanitary survey
procedures which included reviews of data quality (footnote 17). A third state, Tennessee, is in the process of
revising its survey procedures to include data quality review steps. Generally, states' sanitary survey procedures
did not include reviews of test data to identify potential falsification. Most states did not review monthly reports
for data falsification prior to conducting sanitary surveys. Moreover, they did not review on-site PWS records
during surveys to verify the validity of reported data. For example, Arizona officials maintained that field
inspectors reviewed and compared data during inspections of PWSs. However, we reviewed their inspection
procedures and found that inspectors were not required to: (1) reconcile backup PWS documentation with the
summary data sheets sent to the State, or (2) observe PWS operators taking routine water samples and tests.
When we pointed this out to State officials and explained that the normal inspection process might not allow an
inspector to detect invalid or falsified data, they agreed that their procedures needed revision.

Based on our audit, we believe reviewing MORs and on-site records would enable state officials to focus on
suspect areas during sanitary surveys. In addition, state field inspectors should analyze water samples during the
surveys to determine if readings are similar to historical data (or data recorded earlier in the day). Operators
should also be required to conduct tests to demonstrate their competency in testing samples accurately. We believe
these tasks would require minimal use of additional resources.

According to current Federal requirements in 40 CFR 141.21, states must conduct a sanitary survey once every
five years at each surface PWS that does not collect five or more bacteriological samples each month. However,
EPA has issued guidance recommending that states conduct comprehensive sanitary surveys of PWSs at least
once every three years. We found that the frequency with which states conducted sanitary surveys varied.
Although states had to perform surveys only once every five years to meet the minimum Federal requirements,
more frequent and complete sanitary surveys may be useful to ensure effective protection against data
falsification.

OGWDW and state officials met in June 1994 to discuss the need for national guidance on conducting sanitary
surveys. Their meeting included a discussion on how often states should conduct sanitary surveys. According to
an OGWDW Education Specialist, OGWDW is in the process of preparing a document called "EPA/State Joint
Guidance on Sanitary Surveys." It is intended to be used primarily by state officials who conduct sanitary
surveys. According to the specialist, EPA and state officials established eight elements, at a minimum, that
should be addressed during sanitary surveys. He stated that one of the eight elements is entitled, "Monitoring,
Reporting, and Data Verification." According to OGWDW's Safe Drinking Water Branch Chief, the guidance
will not include specific data quality review steps; however, it will recommend that states conduct a data
quality/falsification review as part of their sanitary survey programs. OGWDW intends to formally issue this
guidance during the first quarter of fiscal 1996. OGWDW also accepted the OIG's offer to help rewrite and
supplement the information in Appendix G.

As a result of our 1994 audit of Region 5, OGWDW also added a one-hour data falsification module to the 4-day
sanitary survey training course that it and the regions provide to the states. In addition, OGWDW has included its
Appendix G guidance as an attachment to the sanitary survey training manual they distribute to course
participants. However, according to the Safe Drinking Water Branch Chief, OGWDW did not conduct or
sponsor any sanitary survey training in fiscal year 1995. Instead, funds were spent on developing: (1) a
"learning video" for inspecting wells, and (2) a how-to booklet for states to use when conducting sanitary
surveys of small PWSs.

Regions Did Not Closely Monitor States' Efforts

Most Federal environmental statutes, including the Safe Drinking Water Act, embrace the concept that states
should have primary responsibility for operating regulatory and enforcement programs. In 1994, EPA and the
states issued a policy statement describing a new framework for their partnership (footnote 18). According to
the statement, an effective relationship between EPA and the states depends on mutual dedication to shared
responsibility and accountability for implementing environmental programs. While recognizing this concept,
the statement continues to call for EPA to perform its mandated statutory mission, including constructive
program review.

Regional oversight of drinking water data falsification issues was minimal. According to regional officials,
reviewing data for falsification was not a high priority because they: (1) had limited program resources to
manage increasing drinking water program responsibilities, (2) did not believe data falsification was
widespread, and (3) relied on their states to detect questionable data.

Regional officials did not closely monitor states' efforts to detect invalid or falsified data. Moreover, officials in
most regions did not evaluate states' efforts in detecting such data during their annual review of each state's
drinking water program. For example, Region 6 had developed its own specific "protocol" (guidance) related to
data falsification as a result of a material internal control weakness identified in 1990. However, we found that
Region 6 did not implement the guidance because officials believed that taking enforcement actions against
PWS operators on potential data falsification cases was not a productive use of resources.

In 1991, Region 8 conducted a project related to identifying falsified data. Region 8—which had primacy of
Wyoming's drinking water program—analyzed turbidity reporting trends of 41 Wyoming community, surface
PWSs from 1987-1990. Using a computer model, Regional officials evaluated three factors in predicting the
potential for falsified reporting: (1) violation of precipitation-based turbidity limit predictions, (2) seasonal
turbidity pattern abnormalities, and (3) lack of variation in daily turbidity values. Officials found that 4 of the 41
PWSs reported suspect turbidity data based on two of three factors. Region 8 officials believed that this project,
if used by other regions and states, could be a successful and reliable enforcement tool.
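
We did not obtain the model itself; as a rough illustration of the third factor only, a screen for insufficient daily
variation might resemble the following sketch, in which the coefficient-of-variation cutoff is our assumption
rather than Region 8's:

    import statistics

    def suspiciously_flat_turbidity(daily_ntu, cv_cutoff=0.05):
        # Flag a month of daily turbidity values (NTU) whose coefficient of
        # variation (standard deviation / mean) falls below an assumed cutoff.
        mean = statistics.mean(daily_ntu)
        if mean == 0:
            return True  # a month of all-zero readings is itself suspect
        return statistics.stdev(daily_ntu) / mean < cv_cutoff

    # Thirty days alternating between 0.29 and 0.30 NTU would be flagged:
    print(suspiciously_flat_turbidity([0.29, 0.30] * 15))  # -> True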

In 1992, OGWDW formulated its plan to automate PWS information and began developing the Safe Drinking
Water Information System (SDWIS). SDWIS provides a comprehensive automated data system for EPA and
states to manage public drinking water programs, and is intended to replace FRDS. The objective of the SDWIS
modernization effort is to make quality data accessible to managers at both the state and Federal levels. The
data requirements and systems design for each component of SDWIS are based on a facilitated series of
meetings with the states, regions, and OGWDW. The end product is a state-level data system (called
SDWIS/LAN) that has the same data model as the counterpart Federal system (SDWIS/FED), thus ensuring that
the data that EPA has access to is of the same quality as the counterpart state data. According to the SDWIS
Project Manager, the conversion to SDWIS from FRDS was completed, and SDWIS became OGWDW's
official database on August 15, 1995.

As of August 28, 1995, the SDWIS Project Manager stated that nine states and two Regional Indian land
programs had installed SDWIS components. He projected that by the end of fiscal 1996, 25 states will have
installed SDWIS.

Based on this review, we believe the computer model developed by Region 8, or a comparable model, should be
considered in the design and implementation of SDWIS to assist regions and states in identifying PWSs that
report erroneous data. According to the SDWIS Project Manager, states will be able to download data from
SDWIS and use a report generator capability to evaluate PWS data integrity. Considering the modular design of
SDWIS for use by states, we believe that a computer model is a more cost-effective approach for evaluating
PWS data integrity than states expending limited resources to develop alternative techniques.

Regions Need To Reemphasize Appendix G's Importance

Regions needed to reemphasize to their states the importance of OGWDW's Appendix G guidance on detecting
invalid and falsified data, and redistribute copies of the guidance. Most state officials we spoke with were not aware
that Federal guidance existed addressing data falsification. In fact, according to both Region 7 and Region 8
officials, those two regions had not distributed Appendix G to their respective states. In Appendix G, OGWDW
advised the regions to work with their states to identify and prosecute PWSs suspected of falsifying data.

According to officials in some regions, they have not conducted any training for regional staff on how to detect
potentially falsified data using the guidance in Appendix G. Reemphasizing the Appendix G guidance would
require minimal resources.

PWS Operator Certification Statements Might Deter Some False Data Reporting

Even though the Act does not address the issue of falsified test data or potential penalties for those caught
reporting it, any deliberate submission of false information to the Federal government is a crime under 18
United States Code (USC) 1001. Section 1432 of the amended Act addresses potential penalties for anyone
found tampering with a PWS to intentionally harm the public. This would cover, for instance, a terrorist who
purposely added harmful contaminants to the water supply. EPA's regulations which implement the Act do not
contain provisions addressing data falsification.

Although some states had specific drinking water regulations prohibiting the submission of falsified test results,
and most states had general statutes outlawing false claims, only six states (Arizona, Illinois, Maryland,
Oklahoma, Tennessee, and Utah) required PWS operators to certify by signature that reported MOR data were
accurate and true. However, we also found that Oklahoma accepted unsigned, uncertified MORs.

We believe that PWS operators should be held accountable in cases when they intentionally report erroneous
data, and be required to certify by signature to the validity and authenticity of the data they report. Such a
certification requirement would reinforce the fact that deliberate reporting of false data is a crime.

Operational Reporting Requirements Could Be Reduced

Many states required PWS operators to report more operational test data to them than EPA required. We
identified some states that required monthly reporting of daily test results for tests including, but not limited to,
alkalinity, hardness, iron, manganese, and polyphosphate. Although operators reported this additional data, state
officials generally reviewed operational data almost exclusively to identify compliance related violations, and
did not review such data to identify unusual reporting patterns.

On March 16, 1995, the President announced the establishment of ten principles for reinventing environmental
regulation. Under these principles, Federal, state, tribal, and local governments must work as partners to achieve
common environmental goals, with non-Federal partners taking the lead when appropriate. Currently, EPA is
reviewing its priorities for drinking water regulations based on an analysis of public health risks and discussions
with stakeholders. OGWDW initiated a 25 percent regulatory paperwork burden reduction project and held state
and regional workgroup meetings to identify candidates in CFR Parts 141 and 142 for elimination or
simplification. Among ideas being considered are the streamlining of: (1) non-violation information that states
are currently required to report to OGWDW, and (2) chemical monitoring requirements.


CONCLUSION

Improvements are needed to prevent and detect erroneously reported drinking water test data. Based on our
statistical sample review, we projected nationwide that 18.3 percent of the PWSs reported questionable test
data, and that 11.6 percent of the PWSs reported erroneous test results. Although the percentage of the
population served by those PWSs that reported erroneous data was extremely small, some improvements would
provide further assurance that operators are accurately reporting test data. The reporting of invalid data could be
corrected through increased operator training or the purchasing of better test equipment. The regions should
take necessary corrective actions, such as recommending that the states provide training to those operators in
need, or take enforcement actions against those who intentionally falsified data.

OGWDW should consider the computer model developed by Region 8, or a comparable model, in the
development and implementation of SDWIS. Such a model, we believe, would enable regions and states to
efficiently identify PWSs that reported erroneous data.

OGWDW needs to reemphasize the importance of its Appendix G guidance and the role sanitary surveys can
play in detecting erroneous data. State officials need to be made more aware that data falsification does, in some
cases, occur and that they need to improve their capabilities for detecting questionable data. State officials told
us that they rely on PWS operators to report valid test results and, therefore, do not believe they need to review
reported data for potential falsification. Our audit has shown, however, that the data reported to them was not
always valid. Even though the data reporting process is based on self-monitoring, states should not presume that
oversight or review of the data for reasonableness is unnecessary.

Although there is a movement towards less Federal oversight of state programs, regions should be aware of
what each state is doing to detect or prevent invalid and falsified data. The regions should also, as a good
management practice, continue to distribute relevant guidance to their states to heighten awareness of the
potential for data falsification and to aid them in detecting invalid and falsified data.

Small and very small PWSs most often reported invalid data. States could detect or prevent the reporting of
invalid data by these small systems by conducting more thorough sanitary surveys, including comparing on-site
records with MORs sent to states. States should examine water system records during sanitary surveys to verify
that previously reported data were based on actual tests. OGWDW sanitary survey training materials should
include data quality/falsification review steps.

Submission of falsified data is a crime under 18 USC 1001. However, because the Act itself does not provide
criminal penalties for falsifying data, many operators may not realize that there are Federal criminal penalties
for such an offense. Many states have either specific drinking water regulations or, at a minimum, more general
false claims statutes in place as legal authority for taking enforcement actions against those who falsified data.
However, most states do not currently require PWS operators to certify on MORs that the data they are
submitting is accurate and true. We believe that state reporting forms should include a certification block to be
signed by PWS operators. This certification requirement would make it clear that deliberate false reporting is a
crime. For example, operators could certify to the following:

"I certify that this report is true. I also certify that the statements I have made on this form and all attachments
thereto are true, accurate, and complete. I acknowledge that any knowingly false or misleading statement may
be punishable by fine or imprisonment or both under 18 USC 1001 and other applicable laws."

Finally, given the resource constraints that states are experiencing, we believe OGWDW, through its regulatory
reduction project, should encourage states to determine the testing and data reporting requirements that are most
critical to protecting public health and adjust the reporting requirements accordingly.

RECOMMENDATIONS


We recommend that the Assistant Administrator for Water require the Director of EPA's Office of Ground
Water and Drinking Water to:

1.	request that the regions coordinate with the states to follow up on the 48 PWSs, or operators, which we found
reported invalid or potentially falsified test data and consider providing training to, or taking enforcement
actions against, them;

2.	consider incorporating the computer model developed by Region 8, or a comparable model, in the design and
implementation of SDWIS to assist regions and states in identifying PWSs that report erroneous data;

3.	revise EPA's Appendix G guidance to include additional steps to identify erroneous data and provide updated
examples to illustrate the types of data patterns indicative of erroneous data, and redistribute this revised
guidance to the regions;

4.	request that the regions distribute the revised Appendix G guidance to the states;

5.	update EPA's sanitary survey training materials to include reviews of data quality (for example, comparison
of MORs with on-site logs and bench sheets);

6.	establish schedules, with the regions, for providing sanitary survey training to state officials;

7.	discuss with EPA regions and state officials the feasibility of including a certification block on MOR forms
for PWS operators to sign and certify to the accuracy of the reported data; and,

8.	continue to work with regional and state stakeholders to minimize operational data reporting in streamlining
the regulatory paperwork requirements.

AGENCY COMMENTS AND ACTIONS

The Assistant Administrator (AA) for Water generally agreed with the recommendations, but stated that his office
could not commit to implementing all of them due to the significant resource constraints it is facing. The AA
provided the following comments:

...Recommendation 1. We agree that follow up is needed on the 48 systems which the Inspector General has
identified as reporting invalid or potentially falsified data and we will ask the regions to encourage States to
follow up on these cases and take appropriate action. Some states have already begun investigating these
systems.

We would like to point out that training and enforcement may not be the only types of appropriate response. In
some instances, the operators that were involved are gone, so the appropriate response may be no action.
Further, it is difficult to take an enforcement action against systems which may have falsified data. Our regions
are concerned that, even where they have evidence of potential criminal conduct, the criminal investigators
often decline to investigate or prosecute because the case is not a high priority. This is a clear disincentive for
addressing cases of potentially falsified data.

...Recommendation 2. We would like to explore the feasibility of this recommendation, however, we cannot
commit to this activity at this time given our budget constraints.

...Recommendation 3. Assuming we receive sufficient resources in next year's budget, we will review the
Inspector General's specific suggestions for revising the Appendix G guidance and make any necessary
revisions. If we revise the guidance, we will distribute it to our Regional Offices.


...Recommendation 4. If we revise the guidance, we will ask the Regions to distribute it to their states.

...Recommendation 5. We have already added a one-hour data falsification module to our 4 day sanitary survey
training course which addresses the need to compare monthly operational reports (MORs) with on-site logs and
bench sheets. We have also drafted "EPA/State Joint Guidance on Sanitary Surveys" that identifies eight
recommended elements of a sanitary survey. One of these elements, "Monitoring, Reporting, and Data
Verification" includes reviewing the validity of data reported to the State, reviewing bench sheets, on-site logs,
and monthly operational reports, and the calibration of process control and compliance equipment. This
guidance will be finalized in November.

...Recommendation 6. Because of significant travel and resource constraints, we cannot commit at this time to
establishing schedules for providing sanitary survey training to States.

...Recommendation 7. We have discussed this recommendation with our regional counterparts at a national
meeting that was held on September 19-21, 1995. Based on this discussion, we will send a memorandum to the
Regions requesting them to inform their States of your findings and suggest that States consider including a
certification block on the forms. Unfortunately, resource limits preclude us from following up with the States to
track their progress in implementing this suggestion.

...Recommendation 8. We plan to continue this effort and appreciate the Inspector General's support.

In addition, the AA for Water stated that he and his staff were very surprised to learn that the Inspector General
nominated drinking water data integrity as an Agency level weakness after the issuance of the draft report. He
stated that he did not believe that data integrity was an appropriate candidate for such a nomination, and that
this report does not support the nomination.

OIG EVALUATION

The proposed actions generally meet the intent of the recommendations. Although the future of drinking water
funding levels is uncertain, the program office needs to provide specific target dates for implementing
recommendations 1 through 4, and 6, once EPA's fiscal 1996 budget is approved. Regarding the difficulty, raised
by the AA, of taking enforcement action against systems which may have falsified data, EPA's Office of
Enforcement and Compliance Assurance (OECA) can direct EPA regional Offices of Criminal Investigations to
assign data falsification cases a higher priority for investigation, if OECA believes that such cases are high
priority items, relative to their entire workload. To deter future data falsification, OECA management could
work with Regional management to determine the priority necessary to ensure that enough cases are pursued to
demonstrate the severity and consequences of such actions.

As stated above, the AA did not believe that data integrity was an appropriate candidate as a 1995 Agency level
weakness. In response, the Office of Inspector General believes drinking water data quality is an appropriate
candidate based on the results of this review, along with the results of another OIG review titled "Region 7 and
States Improved Drinking Water Programs Through Alternative Measures" (Report No. 5100226, dated March
24, 1995). In that review, we found that states in EPA Region 7 needed automated data management systems
for more consistent and efficient data management and that EPA relied upon disjointed and manual or partially
automated state systems to update FRDS, its primary water management automated database. Ineffective data
management systems have impacted States' abilities to provide Region 7 with accurate enforcement data and to
obtain small and very small systems' timely compliance with Safe Drinking Water Act requirements. We
believe that this adverse condition is representative of states in other regions based on our nationwide review of
data integrity. Also, as of August 28, 1995, only nine states and two Regional Indian land programs had
installed components of SDWIS, which replaced FRDS on August 15, 1995.


We have concluded that the Agency should consider drinking water data quality as a 1995 weakness candidate
for its annual assurance letter because: (1) SDWIS is not fully implemented by the Agency, (2) the 25 states that
have agreed to install SDWIS have not done so, and (3) we identified data integrity weaknesses in this review.

In response to the Agency's request, the Inspector General provided input on the issues the OIG believed to be
weakness candidates. The Inspector General included drinking water data quality as one of the candidates for
consideration as an Agency-level weakness. The Assistant Administrator for Water's response indicates they
considered the OIG's input, but instead nominated other weaknesses they believed to be better candidates.

Exhibit 1

REGION 1 SURFACE WATER SYSTEMS THE OIG REVIEWED
THAT SUBMITTED ERRONEOUS DATA

State    Reviewed           Invalid   Potentially Falsified   Total   Percent
CT       4                  0         0                       0       0.0
MA       8                  0         0                       0       0.0
ME       6                  0         0                       0       0.0
NH       1                  0         0                       0       0.0
RI       0 (footnote 19)    N/A       N/A                     N/A     N/A
VT       3                  0         0                       0       0.0
Total    22                 0         0                       0       0.0 (footnote 20)

This exhibit shows that of the 22 Region 1 PWSs reviewed, none reported invalid or potentially falsified data.
These results should not be used for statistical projection purposes by region/state because the statistical sample
we used for our review was randomly selected on a national basis from the universe of 4,417 community
surface PWSs located in the continental United States. The sample excluded Region 5 PWSs, which we
previously reviewed.

Exhibit 2

REGION 2 SURFACE WATER SYSTEMS THE OIG REVIEWED
THAT SUBMITTED ERRONEOUS DATA

State    Reviewed   Invalid   Potentially Falsified   Total   Percent
NJ       2          0         0                       0       0.0
NY       16         2         0                       2       12.5
Total    18         2         0                       2       11.1

This exhibit shows that of the 18 Region 2 PWSs reviewed, two systems reported erroneous data. Both of the
PWSs were located in the State of New York and reported invalid data. These results should not be used for
statistical projection purposes by region/state because the statistical sample we used for our review was
randomly selected on a national basis from the universe of 4,417 community surface PWSs located in the
continental United States. The sample excluded Region 5 PWSs, which we previously reviewed.


Exhibit 3

REGION 3 SURFACE WATER SYSTEMS THE OIG REVIEWED
THAT SUBMITTED ERRONEOUS DATA

State    Reviewed           Invalid   Potentially Falsified   Total   Percent
DE       0 (footnote 21)    N/A       N/A                     N/A     N/A
MD       2                  0         0                       0       0.0
PA       25                 3         5                       8       32.0
VA       6                  0         0                       0       0.0
WV       9                  3         1                       4       44.4
Total    42                 6         6                       12      28.6

This exhibit shows that of the 42 Region 3 PWSs reviewed, 12 reported erroneous data. Six reported invalid
data and six reported potentially falsified data. Eight of the PWSs were located in Pennsylvania, while the
remaining four were in West Virginia. These results should not be used for statistical projection purposes by
region/state because the statistical sample we used for our review was randomly selected on a national basis
from the universe of 4,417 community surface PWSs located in the continental United States. The sample
excluded Region 5 PWSs, which we previously reviewed.

Exhibit 4

REGION 4 SURFACE WATER SYSTEMS THE OIG REVIEWED
THAT SUBMITTED ERRONEOUS DATA

State   Reviewed          Invalid   Potentially       Total   Percent
                                    Falsified
AL      4                 0         0                 0       0.0
FL      1                 0         0                 0       0.0
GA      7                 0         0                 0       0.0
KY      7                 0         0                 0       0.0
MS      0 (footnote 22)   N/A       N/A               N/A     N/A
NC      9                 1         0                 1       11.1
SC      7                 1         0                 1       14.3
TN      11                1         1 (footnote 23)   2       18.2
Total   46                3         1                 4       8.7

This exhibit shows that four Region 4 systems reported invalid or potentially falsified data. Of the four, two
were in Tennessee, one was in North Carolina, and one was in South Carolina. These results should not be used
for statistical projection purposes by region/state because the statistical sample we used for our review was
randomly selected on a national basis from the universe of 4,417 community surface PWSs located in the
continental United States. The sample excluded Region 5 PWSs, which we previously reviewed.

Exhibit 5

REGION 5 SURFACE WATER SYSTEMS THE OIG REVIEWED
THAT SUBMITTED ERRONEOUS DATA

State   Reviewed          Invalid   Potentially   Total   Percent
                                    Falsified
IL      129               3         1             4       3.1
IN      57                1         3             4       7.0
MI      70                1         0             1       1.4
MN      26                0         0             0       0.0
OH      149               2         2             4       2.7
WI      20                1         0             1       5.0
R5      1 (footnote 24)   0         0             0       0.0
Total   452               8         6             14      3.1

This exhibit shows that of the 452 Region 5 PWSs reviewed, 14 PWSs reported either invalid or potentially
falsified test data. Of the 14, eight reported invalid data, while the remaining six PWSs reported potentially
falsified data. We reviewed the entire population of community, surface PWSs in Region 5, which was not a
part of the national random sample selection.

Exhibit 6

REGION 6 SURFACE WATER SYSTEMS THE OIG REVIEWED
THAT SUBMITTED ERRONEOUS DATA

State   Reviewed          Invalid   Potentially   Total   Percent
                                    Falsified
AR      7                 1         0             1       14.3
LA      3                 1         1             2       66.7
NM      1                 1         0             1       100.0
OK      19                0         4             4       21.1
TX      18                1         1             2       11.1
R6      1 (footnote 25)   0         0             0       0.0
Total   49                4         6             10      20.4

This exhibit shows that of the 49 Region 6 PWSs reviewed, 10 PWSs reported either invalid or potentially
falsified test data. These results should not be used for statistical projection purposes by region/state because the
statistical sample we used for our review was randomly selected on a national basis from the universe of 4,417
community surface PWSs located in the continental United States. The sample excluded Region 5 PWSs, which
we previously reviewed.

Exhibit 7

REGION 7 SURFACE WATER SYSTEMS THE OIG REVIEWED
THAT SUBMITTED ERRONEOUS DATA

State   Reviewed          Invalid   Potentially   Total   Percent
                                    Falsified
IA      4                 0         0             0       0.0
KS      6                 0         0             0       0.0
MO      5                 0         0             0       0.0
NE      0 (footnote 26)   N/A       N/A           N/A     N/A
R7      1 (footnote 27)   0         0             0       0.0
Total   16                0         0             0       0.0

This exhibit shows that of the 16 Region 7 PWSs reviewed, none reported invalid or potentially falsified data.
These results should not be used for statistical projection purposes by region/state because the statistical sample
we used for our review was randomly selected on a national basis from the universe of 4,417 community
surface PWSs located in the continental United States. The sample excluded Region 5 PWSs, which we
previously reviewed.

Exhibit 8

REGION 8 SURFACE WATER SYSTEMS THE OIG REVIEWED
THAT SUBMITTED ERRONEOUS DATA

State   Reviewed   Invalid   Potentially   Total   Percent
                             Falsified
CO      10         0         0             0       0.0
MT      3          0         0             0       0.0
ND      4          0         0             0       0.0
SD      1          0         0             0       0.0
UT      1          0         0             0       0.0
WY      3          0         0             0       0.0
Total   22         0         0             0       0.0

This exhibit shows that of the 22 Region 8 PWSs reviewed, none reported invalid or potentially falsified data.
These results should not be used for statistical projection purposes by region/state because the statistical sample
we used for our review was randomly selected on a national basis from the universe of 4,417 community
surface PWSs located in the continental United States. The sample excluded Region 5 PWSs, which we
previously reviewed.


Exhibit 9

REGION 9 SURFACE WATER SYSTEMS THE OIG REVIEWED
THAT SUBMITTED ERRONEOUS DATA

State   Reviewed          Invalid   Potentially   Total   Percent
                                    Falsified
AZ      4                 0         0             0       0.0
CA      33                2         0             2       6.1
NV      1                 0         0             0       0.0
R9      2 (footnote 28)   1         0             1       50.0
Total   40                3         0             3       7.5

This exhibit shows that of the 40 Region 9 PWSs reviewed, three reported erroneous data. Two were located in
the State of California, while the third was an Indian land system in New Mexico. These results should not be
used for statistical projection purposes by region/state because the statistical sample we used for our review was
randomly selected on a national basis from the universe of 4,417 community surface PWSs located in the
continental United States. The sample excluded Region 5 PWSs, which we previously reviewed.

Exhibit 10

REGION 10 SURFACE WATER SYSTEMS THE OIG REVIEWED
THAT SUBMITTED ERRONEOUS DATA

State   Reviewed   Invalid   Potentially   Total   Percent
                             Falsified
ID      2          1         0             1       50.0
OR      9          1         0             1       11.1
WA      5          0         1             1       20.0
Total   16         2         1             3       18.8

This exhibit shows that of the 16 Region 10 PWSs reviewed, three reported invalid or potentially falsified data.
These results should not be used for statistical projection purposes by region/state because the statistical sample
we used for our review was randomly selected on a national basis from the universe of 4,417 community
surface PWSs located in the continental United States. The sample excluded Region 5 PWSs, which we
previously reviewed.

EXHIBIT 11

CALCULATION OF PWSs WITH QUESTIONABLE
DATA PROJECTED TO THE PWS UNIVERSE

The 18.3 percent figure was derived by aggregating the results of our prior audit of Region 5 with the results of
our national audit of the remaining regions. We calculated this percentage as follows:

Step 1. Number of PWSs with Data that Appeared Questionable (National) / Number of PWSs in Sample
(National) = 52 / 271 = 0.192

Step 2. 0.192 x 4,417 (Universe of National Sample) = 848

Step 3. 848 + 42 (Region 5 PWSs with Questionable Data) = 890

Step 4. 4,417 (Universe of National Sample) + 452 (Universe of Region 5 PWSs) = 4,869

Step 5. 890 / 4,869 (Total Universe) = 0.183, or 18.3 percent
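
To make the arithmetic easier to follow, the short Python sketch below reproduces the five steps. It is
illustrative only: the function and its names are ours, not part of the audit methodology, though the figures are
those shown above.

# Illustrative sketch only: reproduces the Exhibit 11 projection arithmetic.
# The function name and structure are ours; the figures come from this report.

def project_to_universe(sample_hits, sample_size, sample_universe,
                        census_hits, census_universe):
    """Combine a national random-sample projection with the Region 5
    census (100-percent review) to estimate a rate for the total universe."""
    rate = round(sample_hits / sample_size, 3)    # Step 1: 52/271 = 0.192
    projected = round(rate * sample_universe)     # Step 2: 0.192 x 4,417 = 848
    combined = projected + census_hits            # Step 3: 848 + 42 = 890
    total = sample_universe + census_universe     # Step 4: 4,417 + 452 = 4,869
    return combined / total                       # Step 5: 890 / 4,869 = 0.183

# 52 questionable PWSs in the 271-system national sample (universe of 4,417);
# the Region 5 review found 42 questionable PWSs among all 452 systems.
print(round(project_to_universe(52, 271, 4417, 42, 452), 3))  # prints 0.183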

EXHIBIT 12

SITE VISITS SUMMARY

The following information relates to some of the 15 site visits we conducted. We visited these PWSs to further
evaluate questionable test data reported by certain PWS operators we identified during our file reviews. This
information includes two examples where we determined, upon completion of the site visit, that problems did
not exist.

An Oregon PWS Operator Used Improper Procedures to Record Test Results

A PWS operator at a small-sized system in Oregon used improper testing procedures and equipment to record
some test results. For example, he used his personal hot tub testing equipment to determine pH levels. The
operator reported invalid test data for turbidity, pH, and water temperature.

The operator reported data to the state on MORs that showed little variation in turbidity values and no variation
at all in pH and water temperature readings from September 1993 through July 1994. Our site visit showed
that: (1) turbidity sampling and analysis were not done in accordance with proper laboratory techniques, and (2)
the instruments, methodology, and monitoring practices for pH and temperature were unacceptable. Moreover,
we found that the State of Oregon had not conducted a sanitary survey of this system since 1989.
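
The reporting pattern described above, a long run of identical daily readings, lends itself to a simple automated
screen. The Python sketch below is a hypothetical illustration of such a check, not a tool used by the OIG or the
states; the 30-day threshold is our assumption.

# Hypothetical screening sketch (not an OIG or state tool): flag an MOR
# series in which a month or more of daily readings never varies at all.

def flag_constant_series(readings, min_days=30):
    """Return True when at least min_days readings are all identical."""
    return len(readings) >= min_days and len(set(readings)) == 1

# Patterned on the Oregon system: pH reported as 6.5 nearly every day
# from September 1993 through July 1994 (about 11 months of reports).
daily_ph = [6.5] * 334
print(flag_constant_series(daily_ph))  # True: the pattern warrants follow-up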

Turbidity Testing Conducted Improperly

We found that turbidity sampling and analysis were not conducted in accordance with proper laboratory
techniques (footnote 29). According to the operator, a 1.0 nephelometric turbidity unit (NTU) primary standard
was used once a week to standardize the machine. However, secondary standards were never used, and the
machine was never calibrated prior to turbidity measurements. In addition, the operator collected and tested
only one sample for turbidity each day, instead of every four hours, as required under the Surface Water
Treatment Rule.

Testing for pH Was Also Conducted Improperly

For a period of eleven consecutive months, from September 1993 through July 1994, the operator reported a pH
of 6.5 every day but one. Based on our site visit, we found that: (1) a pH measurement was taken on the first
day of each month and the operator recorded and reported that same value for the rest of the days in the month;
(2) pH was measured colorimetrically, which is not an approved method; and (3) the pH instrument at the
system had a range of only 6.8 to 8.2, so a reading of 6.5 was not possible. According to the operator, he
obtained the 6.5 readings with the colorimeter from his personal hot tub, whose scale included 6.5, by
comparing the color of each sample with the colors on the meter. Region 10 officials agreed that this PWS's
reported pH readings outside the 6.8 to 8.2 range were unacceptable. According to Regional officials, the PWS
recently purchased an EPA-approved pH meter to obtain more accurate readings.
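
A second simple plausibility test follows from the instrument itself: a reported value the meter cannot
physically display is suspect. The sketch below is again only an illustration; the 6.8 to 8.2 bounds are those of
the meter described above.

# Hypothetical plausibility check: flag readings outside the range the
# reporting instrument can display (6.8 to 8.2 for the meter above).

def outside_instrument_range(reading, low=6.8, high=8.2):
    """Return True if the instrument could not have produced the reading."""
    return reading < low or reading > high

print(outside_instrument_range(6.5))  # True: 6.5 is below the meter's range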

Temperature Testing Conducted Improperly

The operator also reported water temperature at 5 degrees Centigrade every day from October 1993 through
July 1994. Based on our site visit, we found that: (1) a temperature measurement was taken on the first day of
each month and the operator then recorded and reported that reading for the rest of the days in the month, and
(2) temperature was measured using a non-scientific thermometer. According to the operator, when he was
hired and trained, he was told to test for pH and temperature only on the first day of the month and to record
that same value for the rest of the month.

Problems with State Oversight

According to Region 10's Drinking Water Program Section Chief, the State of Oregon had not performed a
sanitary survey at this system since October 1989, more than five years earlier. Under current Federal
requirements in 40 CFR 141.21, states generally must conduct a sanitary survey at least once every five years at
each surface PWS. The Section Chief also told us that this PWS will soon merge with other PWSs and will
cease to be a separately regulated PWS.

An Indian Land PWS in New Mexico Reported Unreliable Turbidity Results

A PWS operator at a medium-sized Indian land system in New Mexico (footnote 30) reported unreliable
effluent turbidity values due to the use of improper laboratory techniques. Beginning in July 1993, we noted
that this system's reported effluent turbidity levels dropped from around 1.0 NTU to around 0.5 NTU or below
and stayed at this new level in the ensuing months (footnote 31). Based on our site visit, we determined that: (1)
the secondary standards used by the operator were over a year old, and a primary standard was not used to
check their accuracy; (2) the secondary standards were not within the range of turbidity values expected to be
achieved (that is, 0 to 1.0 NTU); and (3) the operator did not calibrate the bench-model turbidimeter prior to
sample measurements.

A North Carolina PWS Reported Invalid pH Test Results

A PWS operator at a medium-sized system in North Carolina reported invalid data for pH for over three years
due to improper testing procedures and poor equipment. The operator reported identical pH readings of 8.4 for
filtered and finished water to its state nearly every day from January 1991 through May 1994. During our site
visit, we determined that operators used: (1) improper procedures to read the pH levels and (2) a crude
colorimetric wheel to measure pH. For example, we found that operators read their results by holding the color
wheel up to a window with evergreen trees in the background. Because the color wheel used varying shades of
green to measure pH values, light filtered through the green background of the trees would have made it
difficult to obtain accurate readings.

This PWS obtained a new and more accurate pH meter in March 1995, and recorded pH values in the PWS's
daily logs began to fluctuate that month. Although erroneous readings were reported in the past, there did not
appear to be an ongoing reporting problem as of March 1995. The system's purchase of a new pH meter, along
with training for the operators, should result in more accurate readings that vary naturally.

A Tennessee PWS Operator Reported Invalid Chlorine Residual Results

A PWS operator at a small-sized system in Tennessee reported invalid chlorine residual data for over two
years. The operator reported identical chlorine residual readings of 2.0 mg/L for "on top of filter," "plant
effluent," and "distribution system" from September 1991 through November 1993. A state official told us that:
(1) the city that operated the PWS had difficulty keeping a certified operator on-site because the city was
unable to pay much in wages, and (2) he had visited the treatment plant in the past and found operators either
asleep or watching television. The city, however, has taken steps to correct deficiencies, including the recent
hiring of a State-certified operator. It was evident during our site visit that the newly hired operator was making
a serious effort to provide a safe product to his customers and to meet State and Federal requirements. The
operator's daily logs showed that he was recording more accurate data than was reported in the past.

Six Site Visits Showed Accurate Testing

Six of the 15 PWSs we visited seemed to be operating satisfactorily. Our inspections of these PWSs' facilities,
equipment, and records, and discussions with water plant officials, showed that the reported data we initially
questioned were likely to be valid.

For example, one PWS consistently reported finished-water turbidity readings of 0.02 NTU from September
1991 through March 1992. Turbidity measurements at that level are very low; the water would have to be
extremely clear and free of particulate matter. During our on-site visit, however, we found that the PWS used a
continuous monitoring device with an automatic alarm system that shut the plant down when turbidity or
chlorine residuals reached a pre-set limit, which prevented the system from exceeding those limits. Independent
tests we took during our visit did not differ significantly from the reported turbidity results.

Another PWS consistently reported chlorine residuals in the distribution system at 0.8 mg/L for 20 consecutive
months. The operator at the treatment plant stated that he began using a manual Hach color wheel to measure
chlorine residuals because the in-line chlorine monitor was not working properly. The operator rounded
chlorine residual measurements to the nearest tenth. Independent tests we took during our visit did not differ
significantly from the reported chlorine residual readings.

APPENDIX 1

OFFICE OF WATER RESPONSE TO DRAFT REPORT
(Available in hard copy only)

APPENDIX 2

ABBREVIATIONS

AA Assistant Administrator

CDC Centers for Disease Control and Prevention

CFR Code of Federal Regulations

EPA Environmental Protection Agency

FRDS Federal Reporting Data System

GAO General Accounting Office

MOR Monthly Operational Report

NTU Nephelometric Turbidity Units

OECA Office of Enforcement and Compliance Assurance

OGWDW Office of Ground Water and Drinking Water


OIG Office of Inspector General

PWS Public Water System

SDWIS Safe Drinking Water Information System

USC United States Code

APPENDIX 3

DISTRIBUTION

Inspector General (2410)

Assistant Administrator for Water (4101)

Assistant Administrator for Enforcement and Compliance Assurance (2201)
Director, Office of Ground Water & Drinking Water (4601)

Director, Region 1 Water Management Division

Director, Region 2 Water Management Division

Director, Region 3 Water Management Division

Director, Region 4 Water Management Division

Director, Region 5 Water Division (W-15J)

Director, Region 6 Water Management Division

Director, Region 7 Water Management Division

Director, Region 8 Water Management Division

Director, Region 9 Water Management Division

Director, Region 10 Water Division

OIG Office of Investigations (11-13J)

Agency Follow-up Official (3304)

Attention: Assistant Administrator for the Office of Administration and Resources Management

Agency Follow-up Coordinator (3304)

Attention: Director, Resources Management Division


Audit Follow-up Coordinator (3304)

Attention: Management Controls Branch

Audit Follow-up Coordinator (3802F)

Attention: Office of Policy, Training, and Oversight Division

Audit Follow-up Coordinator (1104)

Attention: Executive Support Office

Region 1 Office of External Programs

Region 2 Public Affairs Branch

Region 2 Intergovernmental Relations Branch

Region 3 Office of External Affairs

Region 4 Public Affairs

Region 5 Public Affairs (P-19J)

Region 5 Intergovernmental Relations Office (R-19J)

Region 6 Office of External Affairs

Region 7 Public Affairs

Region 7 Intergovernmental Liaison Office

Region 8 Public Affairs

Region 9 Public Affairs

Region 10 External Affairs Office

Headquarters Library (3404)

FOOTNOTES:

1.	Region 5 Procedures to Ensure Drinking Water Data Integrity (EPA Report No. 4100540, issued September
30, 1994).

2.	The Act applies to all 50 states, the District of Columbia, Puerto Rico, Indian lands, the Virgin Islands,
American Samoa, Guam, the Commonwealth of the Northern Mariana Islands, and the Republic of Palau.
Currently, EPA administers public water system supervision programs on all Indian lands.

3. According to EPA, in 1993, over 370,000 people in the Milwaukee area were affected by the parasitic
microorganism Cryptosporidium in the water supply. Reports indicated that at least 100 people in the
Milwaukee area died as a result of the parasite. Although the Milwaukee PWS operators did not report invalid
or falsified data, the outbreak highlighted the potential health effects associated with turbidity violations.

4.	Noncommunity systems are either institutions such as schools, hospitals, and work places that regularly serve
at least 25 of the same people at least 6 months a year, or establishments such as campgrounds and motels that
supply water to people at non-residential facilities.

5.	An aquifer is an underground geological formation containing usable amounts of groundwater that can supply
wells and springs.

6.	The statistical sampling technique used for this review is the same approach—random sampling without
replacement—used by data verification teams to make inferences about the number of discrepancies that exist
between the FRDS database and PWS records. This approach is described in the PWSS Data Verification
Guidance, Appendix B. The sample size calculation formulae selected for this audit were based on recent
OGWDW data verification audits.

7.	We reviewed all 452 community surface PWSs in Region 5 during our 1994 audit of that Region.

8.	For Arkansas PWSs, we randomly selected seven PWSs from a list supplied directly by Arkansas officials,
rather than the additional list provided by OGWDW. All seven Arkansas PWSs on the original sample list were
miscoded, as were five of the seven Arkansas PWSs on OGWDW's replacement list.

9.	Audit Report of Region I's Enforcement of The Safe Drinking Water Act (SDWA) (EPA Report No.
3100291).

10.	Audit Report of Region I's Enforcement of The Safe Drinking Water Act (SDWA) (EPA Report No.
3100291).

11. Drinking Water: Compliance Problems Undermine EPA Program as New Challenges Emerge
(GAO/RCED-90-127).

12.	We classified cases as invalid if we obtained evidence from state officials, or concluded via our own site
visits, that data were improperly measured or reported due to equipment malfunction or a lack of operator
training. We classified cases as potentially falsified if state officials provided information that was insufficient
to convince us that the reported data were accurate and the cases were not classified as invalid. We did not find
any instances where operators knowingly reported test measurements from malfunctioning equipment. Had this
occurred, we would have considered such cases as potentially falsified.

13. There were no very large systems which reported invalid or potentially falsified data.

14. We reviewed these community, surface PWSs during our 1994 audit of Region 5.

15. We do not contend that this number includes all possible PWSs reporting erroneous data. Rather, it is the
number of PWSs we identified based on our review criteria.

16. Total percent refers to the statistically projected percentage of PWSs reviewed which reported erroneous
data. We calculated this percentage as follows:

Step 1. Number of PWSs with Erroneous Data (National) / Number of PWSs in Sample (National) =
34 / 271 = 0.125

Step 2. 0.125 x 4,417 (Universe of National Sample) = 552

Step 3. 552 + 14 (Region 5's PWSs With Erroneous Data) = 566

Step 4. 566 / 4,869 (Total Universe) = 0.1162, or 11.6 percent
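
For comparison, the illustrative project_to_universe helper sketched after Exhibit 11 reproduces this footnote's
figure with the same rounding:

# Reusing the illustrative helper defined after Exhibit 11 with footnote
# 16's figures: 34 erroneous PWSs of 271 sampled, plus Region 5's 14 of 452.
print(round(project_to_universe(34, 271, 4417, 14, 452), 3))  # prints 0.116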

17.	Alabama's written policy to ensure data integrity was a general memo and was not specific to its sanitary
survey procedures.

18.	Joint Policy Statement on State/EPA Relations. State/EPA Capacity Steering Committee, July 14, 1994.

19. We did not review data for any Rhode Island PWSs because none were selected in the random sample.

20.	Total percent refers to the percentage of PWSs reviewed which reported invalid or potentially falsified data.
This percentage was derived by adding the number of PWSs that reported invalid data to the number that
reported potentially falsified data and dividing by the number of PWSs reviewed.

21. We did not review data for any Delaware PWSs because none were selected in the random sample.

22. We did not review data for any Mississippi PWSs because none were selected in the random sample.

23. This Tennessee PWS also reported some invalid data. We classified this PWS within the potentially falsified
category to prevent the system from being counted twice in the total and because actions to correct falsified data
are typically more serious and resource-intensive.

24. Region 5 was directly responsible for oversight of PWSs on Indian lands. This one PWS was an Indian land
system in Michigan.

25. Region 6 was directly responsible for oversight of PWSs on Indian lands. This one system was an Indian
land system in New Mexico.

26. We did not review data for any Nebraska PWSs because none were selected in the random sample.

27. Region 7 was directly responsible for oversight of PWSs on Indian lands. This one system was an Indian
land system in Kansas.

28. Region 9 was directly responsible for oversight of PWSs on Indian lands. One of these two systems was in
California, while the other was in New Mexico. Region 9 officials had an agreement with Region 6 officials to
oversee this New Mexico system, which normally would fall within Region 6's geographical coverage.

29. EPA Method 180.1.

30. Region 9 is responsible for direct oversight of this system. Although the system is in New Mexico, which
normally falls within the coverage of Region 6, Region 9 officials have an agreement with Region 6 to monitor
the system.

31. In July 1993, the Surface Water Treatment Rule became effective. Under the Rule, a PWS using
conventional or direct filtration was deemed in violation of the turbidity standard if more than 5 percent of its
test readings for the month were over 0.5 NTU. Prior to the Rule, the violation standard was 1.0 NTU.
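
As an illustration of how the standard operates, the hypothetical Python sketch below applies the post-July
1993 test to one month of readings; the sample data are invented.

# Illustrative check of the turbidity standard described in this footnote:
# a monthly violation occurs when more than 5 percent of the readings
# exceed 0.5 NTU (the pre-July 1993 limit was 1.0 NTU).

def swtr_turbidity_violation(readings_ntu, limit=0.5, allowed_fraction=0.05):
    """Return True if too large a share of readings exceeds the limit."""
    over = sum(1 for r in readings_ntu if r > limit)
    return over / len(readings_ntu) > allowed_fraction

# Invented example: 120 readings in a month, 8 of them above 0.5 NTU.
month = [0.3] * 112 + [0.7] * 8
print(swtr_turbidity_violation(month))  # True: 8/120 is about 6.7 percent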

