UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
              WASHINGTON, D.C. 20460
                                      September 29, 1995
                                                          OFFICE OF
                                                     THE INSPECTOR GENERAL
     MEMORANDUM
     SUBJECT:    EPA Procedures to Ensure
                  Drinking Water Data Integrity
                  Audit Report No. E1HWE5-23-0001-5100516

     FROM:       Michael Simmons
                  Deputy Assistant Inspector General
                   for Internal and Performance Audits

     TO:          Robert Perciasepe
                  Assistant Administrator for Water

            Attached is our final report titled EPA Procedures to Ensure Drinking Water Data
     Integrity. This audit report contains findings that describe the results of our nationwide audit
     of regions' and states' procedures for reviewing the validity of drinking water test data.

            We appreciate the opportunity to respond to the Office of Water's request for a
     nationwide review of drinking water data integrity issues.  We hope that the results of our
     review will be useful to your office as you continue your work to assure a safe drinking
     water supply throughout the nation.

            This report represents  the opinion of the Office of the Inspector General.  Final
     determinations on matters in this report will be made by EPA managers in accordance with
     established EPA audit resolution procedures.  Accordingly, the findings described in this
     audit report do not necessarily represent the final EPA position and are not binding upon
     EPA in any enforcement proceeding brought by EPA or the Department of Justice.

            Your response to our draft report is included as Appendix 1.  Based on the Office of
     Water's response and discussions at the exit conference, we made appropriate changes to this
     final report, including the revision of text and recommendations contained in the draft report.

     Action Required

            In accordance with EPA Order 2750, we have designated you as the Action  Official
     for this report. The Action Official is required to provide this office with a written response
     to the audit report within 90 days of the final audit report date. The response should address

all recommendations.  For corrective actions planned but not completed by the response date,
reference to the specific milestone dates will assist us in deciding whether to close this
report.  We have no objection to the release of this report to the public.

       Should you or your staff have any questions regarding this report, please contact Leah
Nikaidoh, Audit Manager, Northern Audit Division, at (513) 366-4365.

Attachment

                        Executive Summary
PURPOSE
The Environmental Protection Agency's (EPA) Office of
Ground Water and Drinking Water (OGWDW) officials
requested a nationwide audit of drinking water data
integrity issues:  (1) because of their concern that falsified
data might be going undetected and could result in
potential threats to human health, and (2) "to aid them in
planning how to use limited resources efficiently and
effectively.              .

The Centers for Disease Control and Prevention (CDC)
estimated that over 940,000 people become ill every year
from drinking contaminated water. Of those,  CDC
estimated 900 people die each year.

The objectives of the audit were to determine:

   •   the frequency with which operators of community,
       surface public water systems (PWSs) throughout
       the nation reported invalid or potentially  falsified
       drinking water test data, and

   •   what procedures EPA regions and states used to
       detect invalid or potentially falsified data.
BACKGROUND
OGWDW, the regions, and the primacy states are
responsible for ensuring that the water the public drinks is
safe and free of harmful contaminants.  EPA delegates
"primacy" authority to states meeting the Safe Drinking
Water Act's and EPA's implementing regulations.

OGWDW guidance published in 1991 to assist regions and
states in detecting falsified data stated that the problem of
data falsification may be larger than originally suspected.
It also stated that:
                                 ...with the promulgation of new regulations
                                 under the Safe Drinking Water Act, there is
                                 an increased incentive for water systems to
                                 purposely submit invalid compliance data.
                                 This is due to the cost of compliance with
                                 the new regulations and an increased
                                 complexity in the monitoring and reporting
                                 requirements.

                           According to the guidance, sanitary surveys are likely the
                           best vehicle for detecting erroneous test data.  A sanitary
                           survey is an on-site review of the water sources, facilities,
                           equipment, and operation and maintenance of a PWS to
                           evaluate the adequacy of those elements for producing and
                           distributing safe drinking water.

                           This audit was a follow-on effort to work we conducted in
                           Region 5 from January through June 1994.  During that
                           audit, we found that operators at about 4 percent of all
                           surface water systems within Region 5 reported invalid or
                           potentially falsified data.

RESULTS IN BRIEF
                           Overall, the results of our nationwide review showed
                           that operators at community, surface PWSs generally
                           reported valid data.  Based on our statistical sample
                           review, we projected nationwide that about 12 percent of
                           PWSs reported erroneous data one or more times from
                           1991 through 1994.  Although these PWSs served only
                           about 0.1 percent of the population, some improvements
                           would provide further assurance that operators are
                           accurately reporting test data.  Very small and small-sized
                           PWSs most often reported erroneous data, and about 58
                           percent of the erroneous data cases involved invalid data,
                           rather than data which might have been deliberately
                           falsified.  Operators reported invalid data because of:
                           (1) a lack of training and knowledge on how to properly

                           test water samples, and record and report results; or
                           (2) improperly functioning equipment.

                           According to OGWDW's Appendix G guidance, sanitary
                           surveys are likely the best vehicle for detecting instances
                           of invalid or falsified data.  However, states' sanitary
                           survey procedures generally did not include data quality
                           review steps, nor did state officials require PWS operators
                           to certify to the validity of reported data.  Because most
                           states did not (1) use sanitary surveys to review the quality
                           of reported operational test data and (2) have operators
                           certify to the authenticity of reported data, officials missed
                           opportunities to identify and correct testing and reporting
                           problems.
RECOMMENDATIONS
                           Our detailed recommendations follow the findings in
                           Chapter 2.  However, in summary, we are recommending
                           that the Assistant Administrator for Water require the
                           Director of EPA's Office of Ground Water and Drinking
                           Water to:

                           1.     request that the regions coordinate with the states
                                  to follow up on the 48 PWSs, or operators, which
                                  we found reported invalid or potentially falsified
                                  test data and consider providing training to, or
                                  taking enforcement actions against, them;

                           2.     revise EPA's Appendix G guidance to include
                                  additional steps to identify erroneous data and
                                  provide updated examples to illustrate the types of
                                  data patterns indicative of erroneous data, and
                                  redistribute this revised guidance to the regions;

                           3.     update EPA's sanitary survey training materials to
                                  include reviews of data quality [for example,
                                  comparison of monthly operational reports (MORs)
                                  with on-site logs and bench sheets];

                          4.     establish schedules, with the regions, for providing
                                sanitary survey training to state officials; and,

                          5.     discuss with EPA regions and state officials the
                                 feasibility of including a certification block on
                                 MOR forms for PWS operators to sign and certify
                                 to the accuracy of the reported data.
AGENCY COMMENTS
AND ACTIONS
                          We provided the draft report to the Office of Water on
                          August 7, 1995.  The Assistant Administrator for Water
                          provided a written response to our draft report on
                          September 27, 1995.  This response is included as
                          Appendix 1 of the report.  The audit report has been
                          modified to incorporate the Office of Water's comments,
                          concerns, and perspectives, where appropriate.

                          The Office of Water generally agreed with the audit report
                          recommendations.  However, the Assistant Administrator
                          stated that his office could not commit to implementing all
                          of our recommendations due to significant resource
                          constraints they are facing.  The Assistant Administrator's
                          responses to our recommendations are included following
                          the detailed recommendations in Chapter 2, along with
                          additional comments from the Office of Inspector General.

OIG EVALUATION
                          Although the future of drinking water funding levels is
                          uncertain, the program office needs to provide specific
                          target dates for implementing recommendations 1 through
                          4, and 6, once EPA's fiscal 1996 budget is approved.
                       Table of Contents

Executive Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .    i

Chapters

1  Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .    1

      Purpose . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .    1

      Background . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .    2

      Scope and Methodology . . . . . . . . . . . . . . . . . . . . . . . . . . . .    9

      Prior Audit Coverage . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .   12

2  Data Integrity Enhancements Needed . . . . . . . . . . . . . . . . . . . . .   15

      Twelve Percent Of PWSs Reported Erroneous Data . . . . . . . . . .   16

      Site Visits Confirmed Problems With Some PWSs . . . . . . . . . . .   18

      Data Falsification Issues Not A Priority For EPA Or States . . . . .   19

      Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .   26

      Recommendations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .   29

      Agency Comments and Actions . . . . . . . . . . . . . . . . . . . . . . .   30

      OIG Evaluation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .   32

Exhibits

  Exhibit 1  Region 1 Surface Water Systems the OIG Reviewed That
             Submitted Erroneous Data . . . . . . . . . . . . . . . . . . . . . .   35

  Exhibit 2  Region 2 Surface Water Systems the OIG Reviewed That
             Submitted Erroneous Data . . . . . . . . . . . . . . . . . . . . . .   36
  Exhibit 3  Region 3 Surface Water Systems the OIG Reviewed That
             Submitted Erroneous Data . . . . . . . . . . . . . . . . . . . . . .   37

  Exhibit 4  Region 4 Surface Water Systems the OIG Reviewed That
             Submitted Erroneous Data . . . . . . . . . . . . . . . . . . . . . .   38

  Exhibit 5  Region 5 Surface Water Systems the OIG Reviewed That
             Submitted Erroneous Data . . . . . . . . . . . . . . . . . . . . . .   39

  Exhibit 6  Region 6 Surface Water Systems the OIG Reviewed That
             Submitted Erroneous Data . . . . . . . . . . . . . . . . . . . . . .   40

  Exhibit 7  Region 7 Surface Water Systems the OIG Reviewed That
             Submitted Erroneous Data . . . . . . . . . . . . . . . . . . . . . .   41

  Exhibit 8  Region 8 Surface Water Systems the OIG Reviewed That
             Submitted Erroneous Data . . . . . . . . . . . . . . . . . . . . . .   42
  Exhibit 9  Region 9 Surface Water Systems the OIG Reviewed That
            Submitted Erroneous Data	   43

  Exhibit 10 Region 10 Surface Water Systems the OIG Reviewed That
            Submitted Erroneous Data	   44

  Exhibit 11 Calculation of Sample Results
            Projected to the PWS Universe	   45

  Exhibit 12 Site Visits Summary . . . . . . . . . . . . . . . . . . . . . . . . .   47

Appendices

  Appendix 1       Office of Water Response to Draft Report . . . . . . . .   51

  Appendix 2       Abbreviations . . . . . . . . . . . . . . . . . . . . . . . . . .   55

  Appendix 3      Distribution	   57
                     Chapter 1:  Introduction
PURPOSE
U.S. Environmental Protection Agency (EPA) Office of
Ground Water and Drinking Water (OGWDW) officials
requested a nationwide audit of drinking water data
integrity issues:  (1) because of their concern that falsified
drinking water data may be going undetected, resulting in
potential health threats, and (2) to aid them in planning
how to use limited resources efficiently and effectively.  If
water system operators report invalid or falsified test
results, serious health risks could go undetected.
Moreover, if a state does not detect invalid or falsified test
data and investigate the situation, the state's ability to take
proactive measures to prevent contamination of the water
supply is negated.

Americans drink over one billion glasses of water a day.
The Centers for Disease Control and Prevention (CDC)
estimated that over 940,000 people become ill every year
from drinking contaminated water. Of those, CDC
estimated 900 people die each year. Moreover,
controversy has recently arisen among the public
regarding the overall quality of our nation's drinking
water supply and the technology of the systems
that provide it.

The objectives of this audit were to determine:

   •   the frequency  with which operators of community,
       surface public water systems (PWSs) throughout
       the nation reported invalid  or potentially falsified
       drinking water test data, and

   •   what procedures EPA regions and states used to
       detect invalid  or potentially falsified data.

This is a consolidated report which includes the results of
our 1994 pilot audit of Region 5, as well as our most
recent review of the other nine EPA regions.  During our
                           1994 audit of Region 5's program1, we found that
                           operators at about four percent of the surface water
                           systems in Region 5 reported erroneous drinking water test
                           data that the states did not detect.
BACKGROUND

Responsibilities Of
EPA And States
The Safe Drinking Water Act of 1974 (the Act), as
amended, required that EPA ensure that PWSs monitor
and test the quality of drinking  water and distribute water
to consumers that is safe to drink.  EPA's OGWDW, the
regions, and the states are all responsible for achieving the
goals of the Act.2  Over the last decade statutory
monitoring and testing requirements increased, but,
according to EPA officials, only limited EPA and state
resources were available for reviewing the validity of
reported drinking water test data.

Under the current safe drinking water program, all states
and Federal territories, except Wyoming and the District
of Columbia, are responsible  for ensuring that the water
within their jurisdiction is safe for consumption.  EPA
delegates this authority, called "primacy", to states
meeting the Act's and EPA's  implementing regulations.
To maintain primacy, states must adopt regulations at least
as stringent as the Federal Government's, and demonstrate
that they can effectively execute program requirements.

In 1991,  OGWDW's Enforcement and Program
Implementation Division published its Public Water
System Supervision Data Verification Guidance.
    1 Region 5 Procedures to Ensure Drinking Water Data Integrity (EPA Report No. 4100540, issued
September 30, 1994).

    2 The Act applies to all 50 states, the District of Columbia, Puerto Rico, Indian lands, the Virgin Islands,
American Samoa, Guam, the Commonwealth of the Northern Mariana Islands, and the Republic of Palau.
Currently, EPA administers public water system supervision programs on all Indian lands.
Appendix G of this guidance, entitled "Guidance on
Detecting Invalid or Fraudulent Data," presents ways to
identify systems that may be submitting invalid or falsified
data and outlines the correct procedures for investigating
and prosecuting systems suspected of falsifying data.  This
guidance states that EPA regions and states with primacy
should identify and prosecute systems suspected of
falsifying data, as part of their oversight of drinking water
programs.

According to the guidance, the problem of data
falsification may be larger than originally suspected.  It
explains that:

       ...with the promulgation of new regulations
       under the Safe Drinking Water Act, there is
       an increased incentive for water systems to
       purposely submit invalid compliance data.
       This is due to the cost of compliance with
       the new regulations and an increased
       complexity in the monitoring and reporting
       requirements.

OGWDW maintained an automated database called the
Federal Reporting Data System (FRDS) as a centralized
repository for information about PWSs nationwide and
their compliance with monitoring requirements, maximum
contaminant levels, and other requirements of the Act.
State officials are required to assess monitoring results and
submit quarterly reports to OGWDW for input to FRDS.
FRDS was an "exception-based" database with nearly all
data being provided by states.

Responsibilities
Of Public Water
Systems

The majority of community PWSs treat the drinking
water they distribute to the public to ensure that the
water customers  drink is safe.  Some sources of drinking
water contamination include turbidity (or "cloudiness") of
water due to suspended matter such as clay, silt, and
microscopic  organisms; bacteria from human or animal
sources; pesticide run-off; underground injections of
                                         Report No. E1HWE5-23-0001-5100516

hazardous wastes; disinfection by-products; and lead,
copper, and asbestos from corroding pipes.

PWSs must test their water supply for various
contaminants and report the results to their states.
Generally,  the larger the population a system serves, the
more often that system is required to test and report.  The
type of PWS (community or noncommunity) and the water
source (surface or groundwater) determine what
contaminants must be monitored.  The four basic
contaminant groups PWSs must test for and report on are:
(1) inorganic chemicals, (2) organic chemicals, (3)
radionuclides, and (4) microbiological contaminants.

PWSs generally either contract with an independent
certified laboratory or a state-owned laboratory to analyze
water samples from their systems involving complex tests,
such as those for bacteria, most inorganic and organic
chemicals, and radionuclides.  Many states require
independent labs to report the results of their tests directly
to the state before reporting the results to the water
system, their client.

While PWSs generally contract out their complex tests,
they conduct other tests  in-house.  Water system personnel
conducting these operational tests are normally required to
report the results of these tests to their state once per
month.  Water operators submit this data on monthly
operational report (MOR) forms.

At least three measures are important indicators of water
quality: turbidity, chlorine residual, and  pH.

  •    Turbidity measurements indicate how well filtration
       systems are removing particulates  from drinking
       water.  As turbidity levels increase, the efficiency
       of a water treatment plant in filtering particulates in
       water decreases.  If turbidity levels are too high,
       health risks from microbes increase.
Potential For Invalid
Or Falsified Data
   •   Chlorine measurements, including the time chlorine
       is in contact with water in the distribution system,
        are also very important.  Although chlorine can kill
       microbes and prevent the regrowth of any bacteria
       that make it through the filtration system, microbes
       may not be inactivated by chlorine if the time
       chlorine is in contact with the microbes is too
       short.

   •   A measure of the acidity or alkalinity of a liquid  is
       referred to as its pH. Improper pH levels affect
       the efficiency of disinfectants and  also affect the
       types and levels of disinfection by-products
        generated.  In addition, pH levels that are too
       acidic increase the leaching  of metals, such as lead,
       from within the distribution system into the water
        supply.
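The three operational measures above lend themselves to a simple automated range check when MOR data are reviewed. The sketch below is illustrative only: the numeric limits and field names are assumptions chosen for demonstration, not values drawn from this report or from EPA regulations.

```python
# Illustrative sketch of a range check on operational readings.
# The plausible ranges below are assumed for demonstration only;
# they are not regulatory limits.

def flag_reading(measure: str, value: float) -> bool:
    """Return True when a reading falls outside an assumed plausible range."""
    plausible = {
        "turbidity_ntu": (0.0, 5.0),   # assumed range, nephelometric turbidity units
        "chlorine_mg_l": (0.2, 4.0),   # assumed residual chlorine range
        "ph": (6.5, 8.5),              # assumed acceptable pH window
    }
    low, high = plausible[measure]
    return not (low <= value <= high)

# One hypothetical day's readings from an MOR.
readings = {"turbidity_ntu": 0.3, "chlorine_mg_l": 1.1, "ph": 5.9}
flagged = [m for m, v in readings.items() if flag_reading(m, v)]
print(flagged)  # only the acidic pH reading falls outside its range
```

A reviewer would investigate a flagged reading rather than treat it as a violation, since a value outside a plausible range may reflect a transcription error as easily as a water quality problem.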

Many PWS operators are certified to perform operational-
type tests on drinking water samples, including those for
turbidity, chlorine residual,  and pH.  Each state has an
operator certification program, which is responsible for
ensuring that operators are properly trained on PWS
monitoring requirements.  According to regional officials,
state training is available for all operators, certified or not.
The requirements for certified operators vary by state.
These requirements are based on the type and size of the
PWS, its water source and quality,  and the population
affected.  If operators do not continue to meet these
requirements, their certifications can be revoked. This
could result in  the loss of their jobs.  Falsification of test
data is also grounds for revocation  of an operator's
certification.

PWS operators are in the business of providing drinking
water to the public.  Thus, it is in the best interest of
operators to provide a safe,  reliable, and high quality
drinking water  product to their customers. However, the
increased cost of new regulations and the  increased
complexity in monitoring and reporting requirements
might be an incentive to submit falsified data.
Potential Health
Risks
Because PWS operators and staff are directly involved
with recording and reporting results of operational tests,
there is an increased chance that these results might be
erroneous (either invalid or falsified).  Operators might
report invalid test data because of malfunctioning
equipment, improper testing procedures, or a
misunderstanding of how to properly record and report test
readings.  Also, PWS personnel might  falsify test data in  ,
several different ways.  For example, PWS personnel
could:  (1) report fabricated data without taking required
tests; (2) boil or microwave samples, thereby killing
harmful organisms; (3) submit samples from sources
outside the PWS; or (4) collect all samples from one
location within the PWS.
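Several of the falsification patterns described above, such as fabricated readings reported without testing, tend to produce implausibly uniform data. As an illustration of the kind of data-pattern screen this report discusses, the sketch below flags long runs of identical consecutive readings; the cutoff value is an arbitrary assumption for demonstration, not a threshold from EPA guidance.

```python
# Illustrative sketch: fabricated readings often show implausibly little
# variation, such as the same value reported day after day. This screen
# flags any run of identical consecutive values longer than a cutoff.

def longest_identical_run(values):
    """Length of the longest run of consecutive identical readings."""
    longest = run = 1
    for prev, cur in zip(values, values[1:]):
        run = run + 1 if cur == prev else 1
        longest = max(longest, run)
    return longest

def suspicious(values, max_run=7):
    # max_run is an arbitrary illustrative cutoff (a week of identical
    # daily readings), not a threshold taken from the report.
    return longest_identical_run(values) > max_run

normal = [0.3, 0.4, 0.3, 0.5, 0.4, 0.3, 0.4, 0.5]
flat = [0.3] * 30  # thirty identical daily turbidity readings
print(suspicious(normal), suspicious(flat))  # False True
```

As with any screen, a flagged series is only a lead for follow-up, for example by comparing the reported values against on-site logs and bench sheets during a sanitary survey.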

If PWS operators report invalid or falsified test results,
serious health threats could go undetected.  Moreover, if a
state does not detect and investigate questionable test data,
the state's ability to take proactive measures to prevent
contamination of the water supply is negated.

Drinking water regulations combined with improved water
treatment practices over the past twenty years have
significantly reduced the occurrence of many waterborne
diseases.   As a result, water in the U.S. is considered very
safe to drink. However, despite increased monitoring and
testing requirements, contaminated drinking water still
remains a serious health threat.  The CDC estimated that
over 940,000 people in the U.S. become ill every year
from drinking contaminated water.  Of those, they
estimated 900 people die each year.

According to the Associate Director of EPA's Health
Effects Research Laboratory in Research Triangle Park,
North Carolina, testing for the presence of microbes in
drinking water is very important in ensuring harmful
microbes that can cause sickness are not present.
Microbes include parasites, bacteria, and viruses.  Such
organisms, if ingested, can result in varying degrees of
sickness from mild gastrointestinal illness to death.  He
further stated that a recent outbreak of cryptosporidiosis in
Types, Sizes, And
Number Of PWSs
Milwaukee, Wisconsin, shows the significance of turbidity
measurements.3  He said that the high turbidity
measurements that were reported were good indicators that
microbes may have passed through the filtration system
and into the water supply.

A PWS, as defined by EPA, provides piped water for
human consumption to at least 15 service connections or
serves an average of at least 25 people for at least 60 days
each year.  PWSs are classified as either community or
noncommunity systems.  Community systems provide
water generally to the same population year-round, such as
in residential areas.4

OGWDW categorizes the size of PWSs by the number of
people each serves. There are five size categories as
shown in figure 1.

              Figure 1:  PWS Size Categories
              [size category table not legible in source]
    3 According to EPA, in 1993, over 370,000 people in the Milwaukee area were affected by the parasitic
microorganism Cryptosporidium in the water supply. Reports indicated that at least 100 people in the
Milwaukee area died as a result of the parasite. Although the Milwaukee PWS operators did not report
invalid or falsified data, the outbreak highlighted the potential health effects associated with turbidity
violations.

    4 Noncommunity systems are either institutions such as schools, hospitals, and work places that regularly
serve at least 25 of the same people at least 6 months a year, or establishments such as campgrounds and
motels that supply water to people at non-residential facilities.

                          As of January 1995, there were about 186,800 PWSs in
                         the United States and its territories.  An estimated 56,700
                          of these PWSs were community
                          systems: about 46,100 were ground water PWSs, while the
                         remaining 10,600 were surface PWSs.  Although surface
                         community PWSs comprised only about 19 percent of all
                         community systems, those PWSs served about 63 percent
                         of the population which received their drinking water from
                         community PWSs.

                                          Figure 2: Community PWSs
                                          By number of PWSs:  surface water 19 percent,
                                          ground water 81 percent.  By population served:
                                          surface water 63 percent, ground water 37 percent.
                         PWSs obtain the water that they distribute to the public
                         from either:  (1) groundwater sources—such as springs and
                          aquifers,5 or (2) surface water sources—such as rivers,
                         lakes, and reservoirs.  Both types of water sources are
                         subject to contamination.  However, because surface water
                         sources are exposed to the air and are accessible to human
                         and animal use, these water sources require more
                         extensive treatment before they can be used for human
                         consumption.
    5 An aquifer is an underground geological formation containing usable amounts of groundwater that can
supply wells and springs.

SCOPE AND METHODOLOGY
                           Our audit was national in scope, intended to provide
                           OGWDW officials with the comprehensive observations
                           and projections that they requested.  As a result, although
                           our audit work focused primarily on the efforts of each
                           EPA region and its states to detect questionable test results
                           reported by community surface water systems, we did not
                           draw any conclusions regarding specific regions or states.
                           We also interviewed OGWDW officials in Washington,
                           D.C., and obtained information regarding OGWDW's
                           efforts in the data integrity area.  We focused only on
                           community surface systems.  Therefore, throughout this
                           report "PWS" refers only to those community systems
                           with surface water sources.

                           OGWDW officials requested that we concentrate our
                           efforts on community systems because those PWSs
                           distribute drinking water to most of the nation's
                           population.  OGWDW provided the Office of Inspector
                           General (OIG) with a random sample of 271 community
                           surface PWSs from FRDS for review.  OGWDW used the
                           automated random sampling capability of FRDS to select
                           the 271 PWSs.6  The sample was selected from the total
                           of 4,417 community surface PWSs located in the
                           continental United States, excluding Region 5 states.7  To
                           determine the sample size, we used a 95 percent
                           confidence interval, an error discrepancy rate of 25
                           percent, and an error tolerance level of 5 percent.  FRDS
                           randomly generated PWSs for every state except
                           Delaware, Mississippi, Nebraska, and Rhode Island.
    6The statistical sampling technique used for this review is the same approach--random sampling without
replacement--used by data verification teams to make inferences about the number of discrepancies that exist
between the FRDS database and PWS records.  This approach is described in the PWSS Data Verification
Guidance, Appendix B.  The sample size calculation formulae selected for this audit were based on recent
OGWDW data verification audits.

    7We reviewed all 452 community surface PWSs in Region 5 during our 1994 audit of that Region.
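The report does not reproduce the sample size calculation itself. A minimal sketch, assuming the standard formula for estimating a proportion with a finite-population correction (the function and parameter names below are ours, not the audit's), reproduces the 271-system sample from the stated parameters:

```python
import math

def sample_size(z: float, p: float, e: float, population: int) -> int:
    """Sample size for estimating a proportion, with a finite-population correction."""
    n0 = (z ** 2) * p * (1 - p) / (e ** 2)        # size for an unlimited population
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# 95 percent confidence (z = 1.96), expected discrepancy rate of 25 percent,
# 5 percent tolerance, universe of 4,417 community surface PWSs
print(sample_size(1.96, 0.25, 0.05, 4417))        # prints 271
```

Under these assumptions the unadjusted size is about 288 systems, which the finite-population correction reduces to 271.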

                           We did not conduct a separate review of FRDS to validate
                           the accuracy of its data.  However, we did identify 19
                           PWSs in our original sample that were incorrectly coded
                           in FRDS and, therefore, needed to be replaced with
                           systems meeting our criteria.  For example, some systems
                           were categorized in FRDS as surface systems when, in
                           fact, they were groundwater systems.  Other PWSs in the
                           sample were either not active, served fewer than 25 people,
                           or purchased water from another PWS and were,
                           therefore, dropped from our original sample.

                           We replaced all PWSs discarded from the original sample
                           with substitute PWSs.  OGWDW provided an additional
                           list of randomly selected PWSs for our use in case of
                           coding errors in FRDS.  We used this list and replaced
                           PWSs from the state from which they were dropped.8
                           For example, we replaced a PWS in North Dakota--which
                           had been a groundwater system since 1991 but was coded
                           in FRDS as a surface system--with another North Dakota
                           PWS from the substitute list of randomly selected systems.

                           We performed audit work within every EPA region and all
                           44 states within our sample.  We reviewed PWS files and
                           interviewed officials at each state's environmental or
                           public health agency responsible for oversight of its
                           drinking water program.  Because some states maintained
                           PWS files at several different district offices throughout
                           their states, and because of limited OIG resources, we
                           were unable to travel to every state.  As a result, for the
                           states of Montana, North Dakota, South Dakota, Utah, and
                           Virginia, state officials sent either the original files or
                           copies of the files to us to review and certified to the
                           authenticity of any copies.

                           We reviewed MORs, where available, for a four-year
                           period--January 1991 through December 1994.  While
    8For Arkansas PWSs, we randomly selected seven PWSs from a list supplied directly by Arkansas
officials, rather than the additional list provided by OGWDW.  All seven Arkansas PWSs on the original
sample list were miscoded, as were five of the seven Arkansas PWSs on OGWDW's replacement list.

reviewing these MORs, we attempted to answer the
following question:

       How probable was it that the test values
       reported by this surface water system
       appeared valid and were based on actual
       tests?

We looked specifically for operational test readings--such
as those for turbidity, chlorine residual, and pH--that did
not fluctuate at all or fluctuated very little over the course
of several consecutive months.  We also looked for any
other obvious patterns in reported data over time.  We
considered such data to be questionable and suspect, in
accordance with OGWDW's Appendix G.  We also looked
for additional indicators of potential data falsification, such
as correction fluid on MORs or the use of pre-printed
forms to record results.  Because we looked for obvious
cases of invalid or potentially falsified data, there may be
other cases of such data that we did not identify.
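Neither the report nor Appendix G prescribes a numeric test for "little or no fluctuation." The sketch below shows one way such a screen could be automated; the window length and tolerance are hypothetical parameters chosen for illustration, not audit criteria:

```python
from statistics import pstdev

def low_variation(readings, window=3, tolerance=0.0):
    """Flag a series of reported monthly values that stays flat
    (standard deviation <= tolerance) over `window` consecutive reports."""
    return any(
        pstdev(readings[i:i + window]) <= tolerance
        for i in range(len(readings) - window + 1)
    )

low_variation([0.30, 0.30, 0.30, 0.30])   # identical readings -> questionable
low_variation([0.28, 0.45, 0.33, 0.61])   # ordinary fluctuation -> not flagged
```

A screen of this kind only surfaces candidates for follow-up; as the audit notes, a flat series may still have a reasonable explanation.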

Of the 94 PWSs we identified as having reported suspect
test data, we conducted unannounced site visits at 14
PWSs and an announced visit at 1 PWS.  An OIG
engineer accompanied us on each site visit, while state
officials accompanied us on all but two of the visits.  We
judgmentally selected these 15 PWSs based on the size
and location of the systems, and the data patterns reported.
We visited three PWSs in Ohio; two each in Colorado,
Illinois, and Indiana; and one each in California, Louisiana,
New Mexico, North Carolina, Oregon, and Tennessee.  We
visited 9 small PWSs, 4 medium-sized PWSs, 1 large
PWS, and 1 very large PWS.  During the site visits, we:
(1) interviewed PWS personnel, (2) examined original test
records, (3) inspected the equipment and overall condition
of the facility, and (4) observed water plant operators
conduct certain water tests we requested.

We conducted our national fieldwork from November 4,
1994 to July 14, 1995.  Each OIG audit division which
performed work provided its respective region or regions
                           with a written summary document at the completion of
                           its work.  Regional comments on those summaries have
                           been provided to OGWDW.

                           We performed our audit in accordance with the
                           Government Auditing Standards issued by the Comptroller
                           General (1994 revision).  As part of our audit, we also
                           reviewed OGWDW's and regions' recent Federal
                           Managers' Financial Integrity Act  reports.  Neither
                           OGWDW nor any of the regions cited data falsification as
                           a material weakness.  During our audit, we did not detect
                           any internal control weaknesses significant enough to be
                           reported as material risks  to program operations.
                           However, we have identified opportunities to strengthen
                           the management controls over the reporting and review
                           processes for  drinking water data.
PRIOR AUDIT COVERAGE

                           The OIG issued a report on September 30, 1994, which
                           addressed Region 5's procedures to ensure the integrity
                           and validity of drinking water test data (EPA Report No.
                           4100540).  The report stated that Region 5 and state
                           officials placed a low priority on reviewing reported data
                           for falsification, and states within that Region did not have
                           formal procedures to do so.  Operators at about four
                           percent of all Region 5 surface water systems reported
                           invalid or potentially falsified test data.

                           The OIG issued a report on July 30, 1993, which
                           addressed Region 1's enforcement of the Act, including its
                           data falsification efforts.  Region 1 and its states did not
                           routinely conduct data falsification reviews or follow up
                           on cases of questionable data that were identified.9
    9Audit Report of Region 1's Enforcement of The Safe Drinking Water Act (SDWA) (EPA Report No.
3100291).
                           The U.S. General Accounting Office (GAO) issued two
                           reports related to drinking water quality and data integrity.
                           An April 1993 report focused on the importance of
                           sanitary surveys as a way to ensure the quality of drinking
                           water distributed to the public.10  The report stated that: (1)
                           the frequency of sanitary surveys by the states had
                           declined and that many states were not conducting surveys
                           as often as EPA recommended, (2) inadequate training of
                           state inspectors might have contributed to survey flaws,
                           and (3) EPA placed limited emphasis on sanitary
                           surveys.

                           According to a June 1990 GAO report, most EPA and
                           state officials GAO interviewed did not believe data
                           falsification was extensive.  These officials also told GAO
                           that falsifying test results was relatively easy and that
                           incentives for doing so would increase in the future.11
    10Drinking Water: Key Quality Assurance Program Is Flawed and Underfunded (GAO/RCED-93-97).

    11Drinking Water: Compliance Problems Undermine EPA Program as New Challenges Emerge
(GAO/RCED-90-127).
Chapter 2: Data Integrity
   Enhancements Needed
       Primacy states are responsible, under 40 Code of Federal
       Regulations (CFR) Part 142, for administering and
       enforcing drinking water regulations, and reviewing PWS-
       reported test data.  In turn, all PWS operators are
       required, under the Act and 40 CFR Part 141, to ensure
       that all appropriate drinking water tests are conducted and
       test results are accurately reported.  Accurate test data are
       necessary to assure that the quality of drinking water
       provided to the public meets all drinking water standards.
       According to OGWDW's Appendix G guidance, sanitary
       surveys are likely the best vehicle for detecting instances
       of invalid or falsified data.

       Overall, the results of our nationwide review showed that
       operators at community surface PWSs generally reported
       valid data.  Based on our statistical sample review, we
       projected nationwide that 11.6 percent (566 of 4,869) of
       PWSs reported erroneous data one or more times from
       1991 through 1994, with 95 percent confidence and a
       tolerance level of ±5.0 percent.  These PWSs served
       about 0.1 percent of the population.  Very small and
       small-sized PWSs most often reported erroneous data, and
       about 58 percent of the erroneous data cases involved
       invalid data, rather than data which might have been
       deliberately falsified.

       Operators at small PWSs most often reported invalid data
       because of: (1) a lack of training and knowledge on how
       to properly test water samples, and record and report
       results, and (2) improperly functioning equipment.
       According to OGWDW, regional, and state officials, data
       falsification issues were not a priority because of limited
       resources and other public health-related priorities.  State
       officials generally did not believe data falsification was a
       widespread problem.  In addition, states' sanitary survey
       procedures generally did not include data quality review
       steps, nor did state officials require PWS operators to
       certify to the validity of reported data.  Because most
                         states did not use sanitary surveys to review the quality of
                         reported operational test data, officials missed
                         opportunities to identify and correct testing and reporting
                         problems.
TWELVE PERCENT OF PWSs
REPORTED ERRONEOUS DATA

                         Based on our statistical sample review, we projected
                         nationwide that 18.3 percent (890 of 4,869) of PWSs
                         reported data that was questionable in accordance with
                         EPA guidance, with 95 percent confidence and a tolerance
                         level of ±5.0 percent.  (For further discussion of this
                         projected percentage, see exhibit 11.)  Files for 94 of the
                         723 PWSs we reviewed nationwide contained test data that
                         were questionable in accordance with OGWDW's
                         Appendix G guidance.  We referred these cases to the
                         regions which, in turn, required the states to follow up.
                         We accepted PWS data as valid if state officials:
                         (1) visited or contacted the PWSs and (2) provided
                         reasonable explanations for the data patterns we identified.
                         Based on this follow-up, we determined that 46 of the 94
                         PWSs which we originally questioned appeared to have
                         reported accurate test readings.

                         Data for the remaining 48 PWSs, which statistically
                         represented 11.6 percent (±5.0 percent) of the total
                         number of community surface PWSs nationwide,
                         continued to be questionable.  Of these 48, operators at 28
                         PWSs reported invalid data, while operators at 20 PWSs
                         reported data that we believe might have been deliberately
                             falsified.12  According to FRDS, these 48 PWSs served
                             about 140,000 people.  Very small and small PWSs
                             reported erroneous data in 34 of the 48 cases.  However,
                             several medium-sized and large systems also reported
                             erroneous data.13

                             The following table shows details about the projected 11.6
                             percent of PWSs that reported invalid or potentially
                             falsified data.  We identified PWSs within each EPA
                             region--except Regions 1, 7, and 8--that submitted
                             erroneous data.  Among the regions, the percentages
                             ranged from a low of 3.1 percent in Region 5 to a high of
                             28.6 percent in Region 3.  These results should not be
                             used for statistical projection purposes by region/state
                             because the statistical sample we used for our review was
                             randomly selected on a national basis from the universe of
                             4,417 community surface PWSs located in the continental
                             United States.  The sample excluded Region 5 PWSs,
                             which we previously reviewed.  For more information
                             regarding the percentages among the states in each region,
                             see exhibits 1 through 10.
    12We classified cases as invalid if we obtained evidence from state officials, or concluded via our own
site visits, that data were improperly measured or reported, due to equipment malfunction or a lack of operator
training.  We classified cases as potentially falsified if state officials provided information that was insufficient
to convince us that the reported data were accurate and the cases were not classified as invalid.  We did not
find any instances where operators knowingly reported test measurements from malfunctioning equipment.
Had this occurred, we would have considered such cases as potentially falsified.

    13There were no very large systems which reported invalid or potentially falsified data.

                                    Table 1
                 Percentage Of Community Surface Water Systems,
                    By Region, That Submitted Erroneous Data

                                         Potentially
      Region   Reviewed      Invalid      Falsified      Total      Percent
      I            22           0             0             0          0.0
      II           18           2             0             2         11.1
      III          42           6             6            12         28.6
      IV           46           3             1             4          8.7
      V           452 (14)      8             6            14          3.1
      VI           49           4             6            10         20.4
      VII          16           0             0             0          0.0
      VIII         22           0             0             0          0.0
      IX           40           3             0             3          7.5
      X            16           2             1             3         18.8
      Total       723          28            20            48 (15)    11.6 (16)
SITE VISITS CONFIRMED
PROBLEMS WITH SOME PWSs
                     We visited 15 of the 94 PWSs that we identified as
                     having reported questionable data. Six of the 15 PWSs
    14We reviewed these community surface PWSs during our 1994 audit of Region 5.

    15We do not contend that this number includes all possible PWSs reporting erroneous data.  Rather, it is
the number of PWSs we identified based on our review criteria.

    16Total percent refers to the statistically projected percentage of PWSs reviewed which reported
erroneous data.  We calculated this percentage as follows:

     Step 1.  Number of PWSs with Erroneous Data (National)  =  34  = 0.125
              Number of PWSs in Sample (National)               271
     Step 2.  0.125 x 4,417 (Universe of National Sample) = 552
     Step 3.  552 + 14 (Region 5's PWSs With Erroneous Data) = 566
     Step 4.  566 ÷ 4,869 (Total Universe) = 0.1162, or 11.6 percent
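The footnote's four-step arithmetic can be checked directly; the variable names below are ours, and the values are those stated in the report:

```python
errors_national   = 34     # PWSs with erroneous data in the national sample
sample_national   = 271    # size of the national sample
universe_national = 4417   # universe the national sample was drawn from
region5_errors    = 14     # Region 5 PWSs with erroneous data (1994 audit)
total_universe    = 4869   # all community surface PWSs (4,417 + 452)

rate      = round(errors_national / sample_national, 3)  # Step 1 -> 0.125
projected = round(rate * universe_national)              # Step 2 -> 552
combined  = projected + region5_errors                   # Step 3 -> 566
percent   = round(100 * combined / total_universe, 1)    # Step 4 -> 11.6
print(combined, percent)                                 # prints 566 11.6
```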

                    seemed to be conducting and reporting test results accurately.  At
                    the remaining 9 PWSs (60 percent), we found indicators that data
                    were either invalid (40 percent) or potentially falsified (20
                    percent).  Of the nine PWSs where we visited and found
                    problems, one each was in California, Illinois, Indiana, Louisiana,
                    New Mexico, North Carolina, Ohio, Oregon, and Tennessee.
                    Operators at these PWSs:  (1) recorded and reported test readings
                    improperly, (2) obtained readings using improper procedures or
                    malfunctioning equipment, or (3) lacked documentation regarding
                    tests taken or recorded test readings before tests were taken.  Our
                    site visits also disclosed some weaknesses in states' oversight of
                    data review and reporting.  In all cases except our visits to the
                    systems in California and New Mexico, state inspectors or EPA
                    regional officials accompanied us and verified our observations.
                    For specific examples of issues we identified during our site visits
                    to water treatment plants, see exhibit 12.
DATA FALSIFICATION ISSUES NOT
A PRIORITY FOR EPA OR STATES
States Did Not
Review Data To
Identify Invalid
Or Falsified Results
EPA and state officials placed low priority on reviewing
records for invalid or falsified data primarily because
they did not believe that falsification was widespread.
State officials generally trusted that data reported to them
were valid.  According to OGWDW guidance, states should
try to identify and prosecute PWSs suspected of falsifying
data.

Although some states might review data during field
inspections, only four states (Alabama, Missouri, South
Carolina, and Tennessee) out of the 44 we reviewed had
formal, documented procedures for reviewing reported test
data for invalid or falsified data.  Also, when state
officials reviewed data for reported compliance violations,
in most cases they did not consider that the data
could be invalid or falsified.  State officials generally
agreed that unusual or repetitive patterns in reported data
                            should be questioned.  However, according to state
                            officials, they:  (1) generally did not believe data
                            falsification was widespread, and (2) had limited resources
                            to review reported data even if falsification was occurring.

                            Further, state officials generally were not aware that
                            Federal guidance (OGWDW's Appendix G) existed which:
                            (1) detailed ways to detect falsified data, and (2) discussed
                            the importance of reviewing data for falsification.  Also,
                            they had not received any training on how to detect
                            falsified data or how to recognize basic fraud indicators.
                            As a result, state officials were generally unaware of the
                            best indicators to look for, and techniques to use, to detect
                            erroneous data.

                            According to OGWDW's Appendix G, "A sanitary survey
                            is likely our [EPA and states] best vehicle for detecting
                            instances of data falsification.  Every sanitary survey
                            should include an investigation of data quality."  However,
                            states did not have specific procedures to identify falsified
                            data while conducting sanitary surveys.  In addition, the
                            frequency with which states conducted sanitary surveys
                            varied greatly.

                            According to state officials, only two states--South
                            Carolina and Missouri--had formalized sanitary survey
                            procedures which included reviews of data quality.17  A
                            third state, Tennessee, is in the process of revising its
                            survey procedures to include data quality review steps.
                            Generally, states' sanitary survey procedures did not
                            include reviews of test data to identify potential
                            falsification.  Most states did not review monthly reports
                            for data falsification prior to conducting sanitary surveys.
                            Moreover, they did not review on-site PWS records during
                            surveys to verify the validity of reported data.  For
                            example, Arizona officials maintained that field inspectors
                            reviewed and compared data during inspections of PWSs.
                            However, we reviewed their inspection procedures and
    17Alabama's written policy to ensure data integrity was a general memo and was not specific to its
sanitary survey procedures.
found that inspectors were not required to:  (1) reconcile
backup PWS documentation with the summary data sheets
sent to the State, or (2) observe PWS operators taking
routine water samples and tests.  When we pointed this out
to State officials and explained that the normal inspection
process might not allow an inspector to detect invalid or
falsified data, they agreed that their procedures needed
revision.

Based on our audit, we believe reviewing MORs and on-
site records would enable state officials to focus on suspect
areas during sanitary surveys.  In addition, state field
inspectors should analyze water samples during the
surveys to determine if readings are similar to historical
data (or data recorded earlier in the day).  Operators
should also be made to conduct tests to demonstrate their
competency to accurately test samples.  We believe these
tasks would require minimal use of additional resources.

According to current Federal requirements in 40 CFR
141.21, states must conduct a sanitary survey once every
five years at each surface PWS that does not collect five
or more bacteriological samples each month.  However,
EPA has issued guidance recommending that states
conduct comprehensive sanitary surveys of PWSs at least
once every three years.  We found that the frequency with
which states conducted sanitary surveys varied.  Although
states had to perform surveys only once every five years to
meet the minimum Federal requirements, more frequent
and complete sanitary surveys may be useful to ensure
effective protection against data falsification.

OGWDW and state officials met in June 1994 to discuss
the need for national guidance on conducting sanitary
surveys.  Their meeting included a discussion on how
often states should conduct sanitary surveys.  According to
an OGWDW Education Specialist, OGWDW is in the
process of preparing a document called "EPA/State Joint
Guidance on Sanitary Surveys."  It is intended to be used
primarily by state officials who conduct sanitary surveys.
According to the specialist, EPA and state officials
Regions Did Not
Closely Monitor
States' Efforts
 established eight elements, at a minimum, that should be
 addressed during sanitary surveys.  He stated that one of
 the eight elements is entitled "Monitoring, Reporting, and
 Data Verification."  According to OGWDW's Safe
 Drinking Water Branch Chief, the guidance will not
 include specific data quality review steps; however, it will
 recommend that states conduct a data quality/falsification
 review as part of their sanitary survey programs.  OGWDW
 intends to formally issue this guidance during the first
 quarter of fiscal 1996.  OGWDW also accepted the OIG's
 offer to help rewrite and supplement the information in
 Appendix G.

 As a result of our 1994 audit of Region 5, OGWDW also
 added a one-hour data falsification module to the 4-day
 sanitary survey training course that it and the regions
 provide to the states.  In addition, OGWDW has included its
 Appendix G guidance as an attachment to the sanitary
 survey training manual distributed to course
 participants.  However, according to the Safe Drinking
 Water Branch Chief, OGWDW did not conduct or sponsor
 any sanitary survey training in fiscal year 1995.  Instead,
 funds were spent on developing:  (1) a "learning video"
 for inspecting wells, and (2) a how-to booklet for states to
 use when conducting sanitary surveys of small PWSs.

 Most Federal environmental statutes, including the Safe
 Drinking Water Act, embrace the concept that states
 should have primary responsibility for operating regulatory
 and enforcement programs.  In 1994, EPA and the states
 issued a policy statement describing a new framework for
 their partnership.18  According to the statement, an
 effective relationship between EPA and the states depends
 on mutual dedication to shared responsibility and
 accountability for implementing environmental programs.
 While recognizing this concept, the statement continues to
    18Joint Policy Statement on State/EPA Relations, State/EPA Capacity Steering Committee, July 14,
1994.
                                       22
                                         Report No. E1HWE5-23-0001-5100516

-------
Chapter 2: Data Integrity Enhancements Needed
call for EPA to perform its mandated statutory mission,
including constructive program review.
Regional oversight of drinking water data falsification
issues was minimal.   According to regional officials,
reviewing data for falsification was not a high priority
because they:  (1) had limited program resources to
manage increasing drinking water program responsibilities,
(2) did not believe data falsification was widespread, and
(3) relied on their states to detect questionable data.

Regional officials did not closely monitor states' efforts to
detect invalid or falsified data. Moreover, officials in
most  regions did  not evaluate  states' efforts in detecting
such data during their annual review of each  state's
drinking water program. For example, Region 6 had
developed its own specific  "protocol" (guidance) related to
data falsification as a result of a material internal control
weakness identified in 1990.   However, we found that
Region 6 did not implement the guidance because officials
believed that taking enforcement actions against PWS
operators on potential data  falsification cases  was not a
productive use of resources.

In 1991, Region 8 conducted a project related to
identifying falsified data.  Region 8, which had primacy of
Wyoming's drinking water program, analyzed turbidity
reporting trends of 41 Wyoming community surface
PWSs from 1987-1990.  Using a computer model,
Regional officials evaluated three factors in predicting the
potential for falsified reporting: (1) violation of
precipitation-based turbidity limit predictions, (2) seasonal
turbidity pattern abnormalities, and (3) lack of variation in
daily turbidity values.  Officials found that 4 of the 41
PWSs reported suspect turbidity data based on two of the
three factors.  Region 8 officials believed that this project,
if used by other regions and states, could be a successful
and reliable enforcement tool.
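The report does not publish the Region 8 model itself, so the following
Python sketch is purely illustrative of one of its three factors: flagging
a series of daily turbidity readings that shows implausibly little
variation.  The statistic (coefficient of variation) and the threshold are
assumptions for illustration, not the Region's actual method.

```python
# Illustrative sketch only: the audit report does not publish Region 8's
# actual model, so this statistic and threshold are assumptions.
from statistics import mean, stdev

def flag_suspect_turbidity(daily_ntu, min_cv=0.05):
    """Flag a series of daily turbidity readings (NTU) whose variation is
    implausibly low -- factor (3) in the Region 8 model.  Real source-water
    turbidity fluctuates; a near-constant series suggests copied values."""
    if len(daily_ntu) < 2 or mean(daily_ntu) == 0:
        return False
    cv = stdev(daily_ntu) / mean(daily_ntu)  # coefficient of variation
    return cv < min_cv

# A flat month of readings is flagged; a naturally varying one is not.
print(flag_suspect_turbidity([0.30] * 30))                      # True
print(flag_suspect_turbidity([0.2, 0.5, 0.9, 0.4, 1.3, 0.7]))   # False
```

A screen like this is cheap to run against a month of MOR values; it only
identifies systems for follow-up review and does not by itself establish
falsification.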

In 1992, OGWDW formulated its plan to automate PWS
information and began developing the Safe Drinking Water

-------
Information System (SDWIS).  SDWIS provides a
comprehensive automated data system for EPA and states
to manage public drinking water programs, and is intended
to replace FRDS.   The objective of the SDWIS
modernization effort is to make quality data accessible to
managers at both the state and Federal levels.  The data
requirements and systems design for each component of
SDWIS are based on a facilitated series of meetings with
the states, regions, and OGWDW.  The end product is a
state-level data system (called SDWIS/LAN) that has the
same data model as the counterpart Federal system
(SDWIS/FED), thus ensuring that the data that EPA has
access to is of the same quality as the counterpart state
data.  According to the SDWIS Project Manager, the
conversion to SDWIS from FRDS was completed, and
SDWIS became OGWDW's official database on August
15, 1995.

As of August 28, 1995, the SDWIS Project Manager
stated that nine states and two Regional Indian land
programs had installed SDWIS components.  He projected
that by the end of fiscal 1996, 25 states would have
installed SDWIS.

Based on this review, we  believe the computer model
developed by Region 8, or a comparable model, should be
considered in the design and implementation of SDWIS to
assist regions and states in identifying PWSs that report
erroneous data.  According to the SDWIS Project
Manager, states will be able to download data from
SDWIS and use a  report generator capability to evaluate
PWS data integrity.  Considering the modular design of
SDWIS for use by states, we believe that a computer
model is a more cost-effective approach for evaluating
PWS data integrity than states expending limited resources
to develop alternative techniques.

-------
Regions Need To
Reemphasize
Appendix G's
Significance

Regions needed to reemphasize to their states the
importance of OGWDW's Appendix G guidance on
detecting invalid and falsified data, and to redistribute
copies of the guidance.  Most state officials we spoke with
were not aware that Federal guidance existed addressing
data falsification.  In fact, according to both Region 7 and
Region 8 officials, those two regions had not distributed
Appendix G to their respective states.  In Appendix G,
OGWDW advised the regions to work with their states to
identify and prosecute PWSs suspected of falsifying data.

According to officials in some regions, they have not
conducted any training for regional staff on how to detect
potentially falsified data using the guidance in Appendix
G.  Reemphasizing the Appendix G guidance would
require minimal resources.

PWS Operator
Certification
Statements Might
Deter Some False
Data Reporting

Even though the Act does not address the issue of falsified
test data or potential penalties for those caught
reporting it, any deliberate submission  of false information
to the Federal government is a crime under 18 United
States Code (USC) 1001.   Section 1432 of the amended
Act addresses potential penalties for anyone found
tampering with a PWS to intentionally  harm the public.
This would cover, for instance, a terrorist who purposely
added harmful contaminants to the water supply.  EPA's
regulations which implement the Act do not contain
provisions addressing data falsification.

Although some states had specific drinking water
regulations prohibiting the submission of falsified test
results, and most states had general statutes outlawing
false claims, only six states (Arizona, Illinois, Maryland,
Oklahoma, Tennessee, and Utah) required PWS operators
to certify by signature that reported MOR data were
accurate and true. However, we  also found that
Oklahoma accepted unsigned, uncertified MORs.

We believe that PWS operators should  be held accountable
in cases when they intentionally report  erroneous data, and

-------
be required to certify by signature to the validity and
authenticity of the data they report.  Such a certification
requirement would reinforce the fact that deliberate
reporting of false data is a crime.

Operational
Reporting
Requirements
Could Be Reduced
Many states required PWS operators to report more
operational test data to them than EPA required.  We
identified some states that required monthly reporting
of daily test results for tests including, but not limited to,
alkalinity, hardness, iron, manganese, and polyphosphate.
Although operators reported this additional data, state
officials generally reviewed operational data almost
exclusively to identify compliance-related violations, and
did not review such data to identify unusual reporting
patterns.

On March 16, 1995, the President announced  the
establishment of ten principles for reinventing
environmental regulation.  Under these principles,
Federal, state, tribal, and local governments must work as
partners to achieve common environmental goals, with
non-Federal partners taking the lead when appropriate.
Currently, EPA is reviewing its priorities for drinking
water regulations based on an analysis of public health
risks and discussions with stakeholders. OGWDW
initiated a 25 percent regulatory paperwork burden
reduction project and held state and  regional workgroup
meetings to identify candidates in 40 CFR Parts 141 and 142
for elimination or simplification.  Among ideas being
considered are the streamlining of:  (1) non-violation
information that states are currently required to report to
OGWDW, and (2) chemical monitoring requirements.
CONCLUSION
Improvements are needed to prevent and detect
erroneously reported drinking water test data.  Based on
our statistical sample review, we projected nationwide that
18.3 percent of the PWSs reported questionable test data,
and that 11.6 percent of the PWSs reported erroneous test

-------
results.  Although the percentage of the population served
by those PWSs that reported erroneous data was extremely
small, some improvements would provide further
assurance that operators are accurately reporting test data.
The reporting of invalid data could be corrected through
increased operator training or the purchasing of better test
equipment. The regions should take necessary corrective
actions, such as recommending that the states provide
training to those operators in need, or take enforcement
actions against those who intentionally falsified data.

OGWDW should consider the computer model developed
by Region 8, or a comparable model, in the development
and  implementation of SDWIS.  Such a model, we
believe, would enable regions  and states to efficiently
identify PWSs that reported erroneous data.

OGWDW needs to reemphasize the importance of its
Appendix G guidance and the  role sanitary surveys can
play in detecting erroneous data.  State officials need to be
made more aware  that data falsification does, in some
cases, occur and that they need to improve their
capabilities for detecting questionable data. State officials
told us that they rely on PWS  operators to report valid test
results and, therefore, do not believe they need to review
reported data for potential falsification.  Our audit has
shown, however, that the data reported to them was not
always valid.  Even though the data reporting process is
based on self-monitoring, states should not presume that
oversight or review of the data for reasonableness is
unnecessary.

Although there is a movement towards less Federal
oversight of state programs, regions should be aware of
what each state is doing to detect or prevent invalid and
falsified data.  The regions should also,  as a good
management practice, continue to distribute relevant
guidance to their states to heighten awareness of the
potential for data falsification and to aid them in detecting
invalid and falsified data.

-------
Small and very small PWSs most often reported invalid
data. States could detect or prevent the reporting of
invalid data by these small systems by conducting more
thorough sanitary surveys,  including comparing on-site
records with MORs sent to states.  States should examine
water system records during sanitary surveys to verify that
previously reported data were based on actual tests.
OGWDW sanitary survey training materials should include
data quality/falsification review steps.

Submission of falsified data is a crime under 18 USC
1001.  However, because the Act itself does not provide
criminal penalties for falsifying data, many operators may
not realize that there are Federal criminal penalties for
such an offense.  Many states have either specific drinking
water regulations or, at a minimum, more general false
claims statutes in place as legal authority for taking
enforcement actions against those who falsified data.
However, most states do not currently require PWS
operators to certify on MORs that the data they are
submitting is accurate and true.  We believe that state
reporting forms should include a certification block to be
signed by PWS operators.  This certification requirement
would make it clear that deliberate false reporting is a
crime.   For example, operators could certify to the
following:

       "I certify that this report is true.  I
       also certify that the statements I have
       made on this form and all
       attachments thereto are true,
       accurate, and complete.  I
       acknowledge that any knowingly
       false or misleading statement may be
       punishable by fine or imprisonment
       or both under 18 USC 1001 and   ^
       other applicable laws."

Finally, given the resource constraints that states are
experiencing, we believe OGWDW, through its regulatory
reduction project, should encourage states to determine the

-------
                          testing and data reporting requirements that are most
                          critical to protecting public health and adjust the reporting
                          requirements accordingly.
RECOMMENDATIONS
                          We recommend that the Assistant Administrator for Water
                          require the Director of EPA's Office of Ground Water and
                          Drinking Water to:

                          1.    request that the regions coordinate with the states
                                to follow up on the 48 PWSs, or operators, which
                                we found reported invalid or potentially falsified
                                test data and consider providing training to, or
                                taking enforcement actions against, them;

                          2.    consider incorporating the computer model
                                developed by Region 8, or a comparable model, in
                                the design and implementation of SDWIS to assist
                                regions and states in identifying PWSs that report
                                erroneous data;

                          3.    revise EPA's Appendix G guidance to include
                                additional steps to identify erroneous data and
                                provide updated examples to illustrate the types of
                                data patterns indicative of erroneous data,  and
                                redistribute  this  revised guidance to the regions;

                          4.    request that the regions distribute the revised
                                Appendix G guidance to the states;

                          5.    update EPA's sanitary survey training materials to
                                include reviews  of data quality (for example,
                                comparison of MORs with on-site logs  and bench
                                sheets);

                          6.    establish schedules, with the regions, for providing
                                sanitary survey training to state officials;

-------
                          7.    discuss with EPA regions and state officials the
                                feasibility of including a certification block on
                                MOR forms for PWS operators to sign and certify
                                to the accuracy of the reported data; and

                          8.    continue to work with regional and state
                                stakeholders to minimize operational data reporting
                                in streamlining the regulatory paperwork
                                requirements.
AGENCY COMMENTS
AND ACTIONS
                          The Assistant Administrator (AA) for Water generally
                          agreed with the recommendations, but stated his office
                          could not commit to implementing all of the
                          recommendations due to significant resource constraints
                          they  are facing.  The AA provided the following
                          comments:

                                 ...Recommendation 1.  We agree that follow up is
                                 needed on the 48 systems which the Inspector
                                 General has identified as reporting invalid or
                                 potentially falsified data and we will ask the
                                 regions to encourage States to follow up on these
                                 cases and take appropriate action.  Some states
                                 have already begun investigating these systems.

                                We would like to point out that training and
                                enforcement may not be the only types of
                                appropriate response.  In some instances, the
                                operators that were involved are gone, so the
                                appropriate response may be no action.  Further, it
                                is difficult to take an enforcement action against
                                systems which may have falsified data. Our
                                regions are concerned that, even where they have
                                evidence of potential criminal conduct, the criminal
                                investigators often decline to investigate or

-------


       prosecute because the case is not a high priority.
       This is a clear disincentive for addressing cases of
       potentially falsified data.

        ...Recommendation 2.  We would like to explore
        the feasibility of this recommendation; however,
        we cannot commit to this activity at this time given
        our budget constraints.

       ...Recommendation 3.  Assuming we receive
       sufficient resources in next year's budget, we will
       review the Inspector General's specific suggestions
       for revising the Appendix G guidance and make
       any necessary revisions. If we revise the guidance,
       we will distribute it to our Regional Offices.

        ...Recommendation 4.  If we revise the guidance,
       we will ask the Regions to distribute it to their
       states.

        ...Recommendation 5.  We have already added a
        one-hour data falsification module to our 4-day
       sanitary survey training course which addresses the
       need to compare monthly operational reports
       (MORs) with on-site logs and bench sheets. We
       have also drafted "EPA/State Joint Guidance on
       Sanitary Surveys" that identifies eight
       recommended elements of a sanitary survey.  One
       of these elements, "Monitoring,  Reporting, and
       Data Verification" includes reviewing the validity
       of data reported to the State, reviewing bench
       sheets, on-site logs, and monthly operational
       reports, and the calibration of process control and
       compliance equipment.  This guidance will be
       finalized in November.

       ...Recommendation 6.  Because of significant
       travel and resource constraints, we cannot commit
       at this time to establishing schedules for  providing
       sanitary survey training to States.

-------
                                 ...Recommendation 7.  We have discussed this
                                 recommendation with our regional counterparts at a
                                 national meeting that was held on September 19-21,
                                 1995.  Based on this discussion, we will send a
                                 memorandum to the Regions requesting them to
                                 inform their States of your findings and suggest
                                 that States consider including a certification block
                                 on the forms.  Unfortunately, resource limits
                                 preclude us from following up with the States to
                                 track their progress in implementing this
                                 suggestion.

                                 ...Recommendation 8.  We plan to continue this
                                 effort and appreciate the Inspector General's
                                 support.

                          In addition, the AA for Water stated that he and his staff
                          were very surprised to learn that the Inspector General
                          nominated drinking water data integrity as an Agency-level
                          weakness after the issuance of the draft report.  He stated
                          that he did not believe that data integrity was an
                          appropriate candidate for such a nomination, and that this
                          report does not support the nomination.

OIG EVALUATION
                          The proposed actions generally meet the intent of the
                          recommendations.  Although the future of drinking water
                          funding levels is uncertain, the program office needs to
                          provide specific target dates for implementing
                          recommendations 1 through 4,  and 6, once EPA's fiscal
                           1996 budget is approved.  Regarding the difficulty, raised
                           by the AA, of taking enforcement action against systems
                           which may have falsified data, EPA's Office of
                          Enforcement and Compliance Assurance (OECA) can
                          direct EPA regional Offices of Criminal Investigations to
                          assign data falsification cases a higher priority for
                          investigation, if OECA believes that such cases are high

-------
Chapter 2: Data Integrity Enhancements Needed
priority items, relative to their entire workload.   To deter
future data falsification, OECA management could work
with Regional management to determine the priority
necessary to ensure that enough cases are pursued to
demonstrate the severity and consequences of such actions.

As stated above, the AA did not believe that data integrity
was an appropriate candidate as a 1995 Agency-level
weakness.  In response, the Office of Inspector General
believes drinking water data quality is  an appropriate
candidate based on the results of this review, along with
the results of another OIG review  titled "Region 7 and
States Improved Drinking Water Programs Through
Alternative Measures" (Report No. 5100226, dated March
24, 1995). In that review, we found that states in EPA
Region 7 needed automated data management systems for
more consistent and efficient data  management and that
EPA  relied upon disjointed and manual or partially
automated state systems to update  FRDS, its primary
water management automated database. Ineffective data
management systems have impacted States' abilities to
provide Region 7 with accurate enforcement data and to
obtain small and very small systems' timely compliance
with Safe Drinking Water Act requirements.  We believe
that this adverse condition is representative of states in
other regions based on our nationwide review of data
integrity.  Also, as of August 28,  1995, only nine states
and two Regional Indian land programs had installed
components of SDWIS,  which replaced FRDS on August
15, 1995.

We have concluded that the Agency should consider
drinking water data quality as a 1995 weakness candidate
for its annual assurance  letter because:  (1) SDWIS is not
fully implemented by the Agency, (2) the 25 states that
have agreed to install SDWIS have not done so, and (3)
we identified data integrity weaknesses in this review.

In response to the Agency's request, the Inspector General
provided input on the issues the OIG believed to be
weakness candidates.  The Inspector General included

-------
drinking water data quality as one of the candidates for
consideration as an Agency-level weakness.  The Assistant
Administrator for Water's response indicates they
considered the OIG's input, but instead nominated other
weaknesses they believed to be better candidates.

-------
                                                                       Exhibit 1
                                                                      Page 1 of 1
        REGION 1 SURFACE WATER SYSTEMS THE OIG REVIEWED
                    THAT SUBMITTED ERRONEOUS DATA
State     Reviewed    Invalid    Potentially    Total    Percent
                                 Falsified
CT            4           0           0            0        0.0
MA            8           0           0            0        0.0
ME            6           0           0            0        0.0
NH            1           0           0            0        0.0
RI            0 [19]     N/A         N/A          N/A       N/A
VT            3           0           0            0        0.0
Total        22           0           0            0        0.0 [20]
This exhibit shows that of the 22 Region 1 PWSs reviewed, none reported invalid or
potentially falsified data.  These results should not be used for statistical projection
purposes by region/state because the statistical sample we used for our review was
randomly selected on a national basis from the universe of 4,417 community surface
PWSs located in the continental United States.  The sample excluded Region 5 PWSs,
which we previously reviewed.
    19 We did not review data for any Rhode Island PWSs because none were selected in the random sample.

    20 Total percent refers to the percentage of PWSs reviewed which reported invalid or potentially falsified
data.  This percentage was derived by adding the number of PWSs that reported invalid data to the number
that reported potentially falsified data and dividing by the number of PWSs reviewed.
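The derivation in note 20 is simple arithmetic; as an illustrative sketch
(using the Region 1 and Region 3 totals reported in Exhibits 1 and 3):

```python
# Percentage computation described in note 20: (invalid + potentially
# falsified) divided by the number of PWSs reviewed, expressed as a percent.
def percent_erroneous(invalid, potentially_falsified, reviewed):
    """Percent of reviewed PWSs that reported invalid or potentially
    falsified data, rounded to one decimal as in the exhibits."""
    return round(100 * (invalid + potentially_falsified) / reviewed, 1)

print(percent_erroneous(6, 6, 42))   # Region 3 totals -> 28.6
print(percent_erroneous(0, 0, 22))   # Region 1 totals -> 0.0
```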


-------
                                                                Exhibit 2
                                                                Page 1 of 1
       REGION 2 SURFACE WATER SYSTEMS THE OIG REVIEWED
                  THAT SUBMITTED ERRONEOUS DATA
State     Reviewed    Invalid    Potentially    Total    Percent
                                 Falsified
NJ            2           0           0            0        0.0
NY           16           2           0            2       12.5
Total        18           2           0            2       11.1
This exhibit shows that of the 18 Region 2 PWSs reviewed, two systems reported
erroneous data.  Both of the PWSs were located in the State of New York and reported
invalid data.  These results should not be used for statistical projection purposes by
region/state because the statistical sample we used for our review was  randomly
selected on a national basis from the universe of 4,417 community surface PWSs
located in the continental United States. The sample excluded Region 5 PWSs, which
we previously reviewed.       •

-------
                                                                   Exhibit 3
                                                                   Page 1 of 1
        REGION 3 SURFACE WATER SYSTEMS THE OIG REVIEWED
                   THAT SUBMITTED ERRONEOUS DATA
State     Reviewed    Invalid    Potentially    Total    Percent
                                 Falsified
DE            0 [21]     N/A         N/A          N/A       N/A
MD            2           0           0            0        0.0
PA           25           3           5            8       32.0
VA            6           0           0            0        0.0
WV            9           3           1            4       44.4
Total        42           6           6           12       28.6
This exhibit shows that of the 42 Region 3 PWSs reviewed, 12 reported erroneous
data.  Six reported invalid data and six reported potentially falsified data.  Eight of the
PWSs were located in Pennsylvania, while the remaining four were in West Virginia.
These results should not be used for statistical projection purposes by region/state
because the statistical sample we used  for our review was randomly selected on a
national basis from the universe of 4,417 community surface PWSs located in the
continental United States. The sample excluded Region 5 PWSs, which we previously
reviewed.
    21 We did not review data for any Delaware PWSs because none were selected in the random sample.


-------
                                                                        Exhibit 4
                                                                        Page 1 of 1
        REGION 4 SURFACE WATER SYSTEMS THE OIG REVIEWED
                    THAT SUBMITTED ERRONEOUS DATA
State     Reviewed    Invalid    Potentially    Total    Percent
                                 Falsified
AL            4           0           0            0        0.0
FL            1           0           0            0        0.0
GA            7           0           0            0        0.0
KY            7           0           0            0        0.0
MS            0 [22]     N/A         N/A          N/A       N/A
NC            9           1           0            1       11.1
SC            7           1           0            1       14.3
TN           11           1           1 [23]       2       18.2
Total        46           3           1            4        8.7
This exhibit shows that four Region 4 systems reported invalid or potentially falsified
data.  Of the four, two were in Tennessee, one was in North Carolina, and one was in
South Carolina.  These results should not be used for statistical projection purposes by
region/state because the statistical sample we used for our review was randomly
selected on a national basis from the universe of 4,417 community surface PWSs
located in the continental United States.  The sample excluded Region 5 PWSs, which
we previously reviewed.
    22 We did not review data for any Mississippi PWSs because none were selected in the random sample.

    23 This Tennessee PWS also reported some invalid data.  We classified this PWS within the potentially
falsified category to prevent the system from being counted twice in the total and because actions to correct
falsified data are typically more serious and resource-intensive.


-------
                                                                  Exhibit 5
                                                                  Page 1 of 1
        REGION 5 SURFACE WATER SYSTEMS THE OIG REVIEWED
                  THAT SUBMITTED ERRONEOUS DATA
State     Reviewed    Invalid    Potentially    Total    Percent
                                 Falsified
IL          129           3           1            4        3.1
IN           57           1           3            4        7.0
MI           70           1           0            1        1.4
MN           26           0           0            0        0.0
OH          149           2           2            4        2.7
WI           20           1           0            1        5.0
R5            1 [24]      0           0            0        0.0
Total       452           8           6           14        3.1
This exhibit shows that of the 452 Region 5 PWSs reviewed, 14 PWSs reported either
invalid or potentially falsified test data.  Of the 14, eight reported invalid data, while
the remaining six PWSs reported potentially falsified data.  We reviewed the entire
population of community surface PWSs in Region 5, which was not a part of the
national random sample selection.
   24Region 5 was directly responsible for oversight of PWSs on Indian lands.  This one PWS was an
Indian land system in Michigan.


-------
                                                                   Exhibit 6
                                                                   Page 1 of 1
        REGION 6 SURFACE WATER SYSTEMS THE OIG REVIEWED
                  THAT SUBMITTED ERRONEOUS DATA
State     Reviewed    Invalid    Potentially    Total    Percent
                                 Falsified
AR            7           1           0            1       14.3
LA            3           1           1            2       66.7
NM            1           1           0            1      100.0
OK           19           0           4            4       21.1
TX           18           1           1            2       11.1
R6            1 [25]      0           0            0        0.0
Total        49           4           6           10       20.4
This exhibit shows that of the 49 Region 6 PWSs reviewed, 10 PWSs reported either
invalid or potentially falsified test data. These results should not be used for statistical
projection purposes by region/state because the statistical sample we used for our
review was randomly selected on a national basis from the universe of 4,417
community surface PWSs located in the continental United States. The  sample
excluded Region 5 PWSs, which we previously reviewed.
    [25] Region 6 was directly responsible for oversight of PWSs on Indian lands.  This one system was an
Indian land system in New Mexico.


-------
                                                                    Exhibit 7
                                                                    Page 1 of 1
        REGION 7 SURFACE WATER SYSTEMS THE OIG REVIEWED
                   THAT SUBMITTED ERRONEOUS DATA
  State    Total Reviewed    Invalid    Potentially Falsified    Total    Percent
  IA              4             0                 0                0        0.0
  KS              6             0                 0                0        0.0
  MO              5             0                 0                0        0.0
  NE              0 [26]       N/A               N/A              N/A       N/A
  R7              1 [27]        0                 0                0        0.0
  Total          16             0                 0                0        0.0
This exhibit shows that of the 16 Region 7 PWSs reviewed, none reported invalid or
potentially falsified data. These results should not be used for statistical projection
purposes by region/state because the statistical sample we used for our review was
randomly selected on a national basis from the universe of 4,417 community surface
PWSs located in the  continental United States. The sample excluded Region 5 PWSs,
which we previously reviewed.
    [26] We did not review data for any Nebraska PWSs because none were selected in the random sample.

    [27] Region 7 was directly responsible for oversight of PWSs on Indian lands.  This one system was an
Indian land system in Kansas.


-------
                                                                 Exhibit 8
                                                                 Page 1 of 1
       REGION 8 SURFACE WATER SYSTEMS THE OIG REVIEWED
                  THAT SUBMITTED ERRONEOUS DATA
  State    Total Reviewed    Invalid    Potentially Falsified    Total    Percent
  CO             10             0                 0                0        0.0
  MT              3             0                 0                0        0.0
  ND              4             0                 0                0        0.0
  SD              1             0                 0                0        0.0
  UT              1             0                 0                0        0.0
  WY              3             0                 0                0        0.0
  Total          22             0                 0                0        0.0
This exhibit shows that of the 22 Region 8 PWSs reviewed, none reported invalid or
potentially falsified data.  These results should not be used for statistical projection
purposes by region/state because the statistical sample we used for our review was
randomly selected on a national basis from the universe of 4,417 community surface
PWSs located in the continental United States. The sample excluded Region 5 PWSs,
which we previously reviewed.

-------
                                                                      Exhibit 9
                                                                      Page 1 of 1
        REGION 9 SURFACE WATER SYSTEMS THE OIG REVIEWED
                   THAT SUBMITTED ERRONEOUS DATA
  State    Total Reviewed    Invalid    Potentially Falsified    Total    Percent
  AZ              4             0                 0                0        0.0
  CA             33             2                 0                2        6.1
  NV              1             0                 0                0        0.0
  R9              2 [28]        1                 0                1       50.0
  Total          40             3                 0                3        7.5
This exhibit shows that of the 40 Region 9 PWSs reviewed, three reported erroneous
data.  Two were located in the State of California, while the third was an Indian land
system in New Mexico. These results should not be used for statistical projection
purposes by region/state because the statistical sample we used for our review was
randomly selected on a national basis from the universe of 4,417 community surface
PWSs located in the continental United States.  The sample  excluded Region 5 PWSs,
which we previously reviewed.
    [28] Region 9 was directly responsible for oversight of PWSs on Indian lands.  One of these two systems
was in California, while the other was in New Mexico.  Region 9 officials had an agreement with Region 6
officials to oversee this New Mexico system, which normally would fall within Region 6's geographical
coverage.


-------
                                                                Exhibit 10
                                                                Page 1 of 1
       REGION 10 SURFACE WATER SYSTEMS THE OIG REVIEWED
                  THAT SUBMITTED ERRONEOUS DATA
  State    Total Reviewed    Invalid    Potentially Falsified    Total    Percent
  ID              2             1                 0                1       50.0
  OR              9             1                 0                1       11.1
  WA              5             0                 1                1       20.0
  Total          16             2                 1                3       18.8
This exhibit shows that of the 16 Region 10 PWSs reviewed, three reported invalid or
potentially falsified data.  These results should not be used for statistical projection
purposes by region/state because the statistical sample we used for our review was
randomly selected on a national basis from the universe of 4,417 community surface
PWSs located in the continental United States. The sample excluded Region 5 PWSs,
which we previously reviewed.

-------
                                                             EXHIBIT 11
                                                               Page 1 of 1
             CALCULATION OF PWSs WITH QUESTIONABLE
               DATA PROJECTED TO THE PWS UNIVERSE
The 18.3 percent figure was derived by aggregating our prior results from our audit of
Region 5 with our results from the national audit of the remaining regions. We
calculated this percentage as follows:
 Step 1.   Number of PWSs with Data that Appeared Questionable (National)
           divided by Number of PWSs in Sample (National):  52 / 271 = 0.192

 Step 2.   0.192 x 4,417 (Universe of National Sample) = 848

 Step 3.   848 + 42 (Region 5 PWSs with Questionable Data) = 890

 Step 4.   4,417 (Universe of National Sample) + 452 (Universe of Region 5
           PWSs) = 4,869

 Step 5.   890 / 4,869 (Total Universe) = 0.183, or 18.3 percent
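The five steps of the projection can be reproduced as straightforward arithmetic. A minimal sketch (figures taken from this exhibit; the variable names are ours):

```python
# Reproducing the Exhibit 11 projection of PWSs with questionable data.
national_questionable = 52   # Step 1 numerator: sampled PWSs with questionable data
national_sample = 271        # Step 1 denominator: PWSs in the national sample
national_universe = 4417     # universe the national sample was drawn from
region5_questionable = 42    # Region 5 PWSs with questionable data
region5_universe = 452       # Region 5 PWSs, reviewed in their entirety

rate = national_questionable / national_sample            # Step 1: ~0.192
projected = round(rate * national_universe)               # Step 2: 848
total_questionable = projected + region5_questionable     # Step 3: 890
total_universe = national_universe + region5_universe     # Step 4: 4,869
overall = total_questionable / total_universe             # Step 5: ~0.183

print(f"{overall:.1%}")  # prints 18.3%
```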

-------
                                                                  EXHIBIT 12
                                                                      Page 1 of 4
                            SITE VISITS SUMMARY
The following information relates to some of the 15 site visits we conducted. We
visited these PWSs to further evaluate questionable test data reported by certain PWS
operators we identified during our file reviews. This information includes two
examples where we determined, upon completion of the site visit, that problems did not
exist.

An Oregon PWS Operator Used Improper
Procedures to Record Test Results

A PWS operator at a small-sized system in Oregon used improper testing procedures
and equipment to record some test results.  For example, he used his personal hot tub
testing equipment to determine pH levels.  The operator reported invalid test data for
turbidity, pH, and water temperature.

The operator reported data to its state on MORs which showed little variation in values
for turbidity, and no variation in pH and water temperature readings, from September
1993 through July 1994.  Our site visit showed that: (1) turbidity sampling and
analysis were not done in accordance with proper laboratory techniques; and (2) the
instruments, methodology, and monitoring practices for pH and temperature were
unacceptable.  Moreover, we found that the State of Oregon had not conducted a
sanitary survey of this system since 1989.
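The pattern described above, months of perfectly unvarying readings, can be screened for mechanically. A minimal sketch (an illustration only, not the Region 8 computer model mentioned later in this report; the function name and threshold are ours):

```python
# Illustration only: a simple screen that flags a reported series showing
# no variation at all, like the constant pH and temperature values here.

def flag_no_variation(readings, min_run=30):
    # A long, perfectly constant run of reports warrants a closer look,
    # though by itself it is not proof of falsification.
    return len(readings) >= min_run and len(set(readings)) == 1

daily_ph = [6.5] * 330                    # eleven months of identical reports
daily_turbidity = [0.4, 0.5, 0.3] * 100   # normal day-to-day variation

print(flag_no_variation(daily_ph))         # True
print(flag_no_variation(daily_turbidity))  # False
```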

       Turbidity Testing Conducted Improperly

We found that turbidity sampling and analysis were not conducted in accordance with
proper laboratory techniques.[29]  According to the operator, a 1.0 nephelometric
turbidity unit (NTU) primary standard was used once a week to standardize the
machine.  However, secondary standards were never used, and the machine was never
calibrated prior to turbidity measurements.  In addition, the operator collected and
tested only one sample for turbidity each day, instead of every four hours, as required
under the Surface Water Treatment Rule.
    [29] EPA Method 180.1

-------
                                                                  EXHIBIT 12
                                                                     Page 2  of 4
       Testing for pH Was Also Conducted Improperly
For a period of eleven consecutive months, from September 1993 through July 1994,
the operator reported a pH of 6.5 every day but one.  Based on our site visit, we found
that: (1) a pH measurement was taken on the first day of each month and the operator
recorded and reported that same value for the rest of the days in the month; (2) pH was
measured colorimetrically, which is not an approved method; and (3) the pH instrument
at the system only had a range of 6.8 to 8.2; therefore, a reading of 6.5 was not
possible.  According to the operator, he used a color meter that he used for his own
hot tub to come up with the readings of 6.5.  He said that his personal hot tub
colorimeter had a scale that included 6.5 and he would compare the color of the sample
being tested with the colors on his meter.  Region 10 officials agreed that this PWS's
reported pH readings outside the 6.8 to 8.2 range were unacceptable.  According to
Regional officials, the PWS recently purchased an EPA-approved pH meter to obtain
more accurate readings.
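The reasoning that ruled out the reported 6.5 readings is a simple plausibility check: a value outside the measuring instrument's own scale cannot be a genuine measurement. A minimal sketch (an illustration only, not an EPA tool; the function name is ours):

```python
# Illustration only: a reading the instrument could not display was not
# measured on it.

def reading_is_plausible(value, scale_min, scale_max):
    # True only if the reported value falls on the instrument's scale.
    return scale_min <= value <= scale_max

# The Oregon system's pH meter only covered 6.8 to 8.2, yet the operator
# reported 6.5 nearly every day.
print(reading_is_plausible(6.5, 6.8, 8.2))  # False: off the meter's scale
print(reading_is_plausible(7.1, 6.8, 8.2))  # True
```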
      Temperature Testing Conducted Improperly

The operator also reported water temperature at 5 degrees Centigrade every day from
October 1993 through July 1994.  Based on our site visit, we found that: (1) a
temperature measurement was taken on the first day of each month and the operator
then recorded and reported that reading for the rest of the days in the month; and (2)
temperature was measured using a non-scientific thermometer.  According to the
operator, when he was hired and trained, he was told to test for pH and temperature
only on the first day of the month and to record that same value for the rest of the
month.

      Problems with State Oversight

According to Region 10's Drinking Water Program Section Chief, the State of Oregon
had not  performed a sanitary survey at this  system since October 1989—over five years
ago.   According to current Federal requirements in 40 CFR 141.21, states generally
must conduct a sanitary survey at least once every five years at each surface PWS.
The Region 10 Chief told us that this PWS  will soon be merging with other PWSs and
will cease to be a separately regulated PWS.

-------
                                                                    EXHIBIT 12
                                                                        Page 3 of 4
An Indian Land PWS in New Mexico
Reported Unreliable Turbidity Results
A PWS operator at a medium-sized Indian land system in New Mexico[30] reported
unreliable effluent turbidity values due to the use of improper laboratory techniques.
Beginning in July 1993, we noted this system's reported effluent turbidity levels
dropped from around 1.0 NTUs to around 0.5 NTUs or below and stayed at this new
level in the ensuing months.[31]  Based on our site visit, we determined that: (1)
secondary standards used by the operator were over a year old, and a primary standard
was not used to check the accuracy of the secondary standard; (2) the secondary
standards used by the operator were not within the range of turbidity values expected
to be achieved (that is, 0 to 1.0 NTU); and (3) the operator did not calibrate the bench
model turbidimeter prior to sample measurements.

A North Carolina PWS
Reported Invalid pH Test Results

A PWS operator at a medium-sized system in North Carolina reported invalid data for
pH for over three years due to improper testing procedures and poor equipment.  The
operator reported identical pH readings of 8.4 for filtered and finished water to its state
nearly every day from January 1991 through May 1994.  During our site visit, we
determined that operators used: (1) improper procedures to read the pH levels and (2)
a crude colorimetric wheel to measure pH.  For example, we found that operators read
their results by holding the color wheel up to a window with evergreen trees in the
background.  Because the color wheel used varying shades of green to measure pH
values, light filtered through the green background of the trees would have made it
difficult to obtain accurate readings.

This PWS obtained a new and more scientific pH meter in March  1995. Recorded  pH
values in the PWS's daily logs began to fluctuate beginning that month.  Although
erroneous readings were reported in  the past, there did not seem to be an ongoing
reporting problem as of March 1995.  The system's purchase of a  new pH meter, along
    [30] Region 9 is responsible for direct oversight of this system.  Although the system is in New Mexico,
which normally falls within the coverage of Region 6, Region 9 officials have an agreement with Region 6 to
monitor the system.

    [31] In July 1993, the Surface Water Treatment Rule became effective.  Under the Rule, a PWS using
conventional or direct filtration was deemed in violation of the turbidity standard if more than 5 percent of its
test readings for the month were over 0.5 NTU.  Prior to the Rule, the violation standard was 1.0 NTU.
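The post-July-1993 violation test described in this footnote can be expressed as a simple monthly check. A minimal sketch (an illustration only; the function name and sample data are ours):

```python
# Illustration of the Surface Water Treatment Rule turbidity test for
# conventional or direct filtration described above: a system is in
# violation for the month if more than 5 percent of its readings
# exceed 0.5 NTU.

def violates_turbidity_standard(monthly_readings, limit_ntu=0.5,
                                allowed_share=0.05):
    # Count readings over the limit and compare their share to 5 percent.
    over = sum(1 for r in monthly_readings if r > limit_ntu)
    return over / len(monthly_readings) > allowed_share

ok_month = [0.3] * 29 + [0.6]        # 1 of 30 readings over (about 3.3%)
bad_month = [0.3] * 27 + [0.6] * 3   # 3 of 30 readings over (10%)

print(violates_turbidity_standard(ok_month))   # False
print(violates_turbidity_standard(bad_month))  # True
```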


-------
                                                                    EXHIBIT 12
                                                          .    •      ,    Page 4 of 4

with training for the operators, should result in more accurate, and fluctuating,
readings.

A Tennessee PWS Operator Reported                                .
Invalid Chlorine Residual Results

A PWS operator at a small-sized system in Tennessee reported invalid data for chlorine
residual for over two years.  The operator reported identical chlorine residual readings
of 2.0 mg/l for "on top of filter," "plant effluent," and "distribution system" from
September 1991 through November 1993.  A state official told us that: (1) the city
which operated the PWS had difficulty keeping a certified operator on-site because the
city was unable to pay much in wages and (2) he had visited the treatment plant in the
past and found operators either asleep or watching television.  The city, however, has
taken steps to correct deficiencies, including the recent hiring of a State-certified
operator.  It was evident during our site visit that the newly hired operator was making
a serious effort to provide a safe product to his customers and meet State and Federal
requirements.  The operator's daily logs showed that he was recording more accurate
data than others reported in the past.

Six Site Visits
Showed  Accurate Testing

Six of the 15 PWSs we visited seemed to be operating satisfactorily.  Our inspections
of these PWSs' facilities, equipment, and records, and discussions with water plant
officials, showed that the reported data we initially questioned were likely to be valid.

For example, one PWS consistently reported finished turbidity readings of 0.02 from
September 1991 through March 1992.  Turbidity measurements at that level are very
low; that is, the water would have to be extremely clear and free of particulate matter.
During our on-site visit, however, we found that the PWS used a continuous
monitoring device which had an automatic alarm system that shut the plant down when
turbidity and chlorine residuals reached a pre-set limit.  As a result, the system was
prevented from exceeding certain limits.  Independent tests we took during our visit did
not significantly differ from the reported turbidity results.

Another PWS consistently reported chlorine residuals in the distribution  system at 0.8
for 20 consecutive months.  The operator at the treatment plant stated that he began
using  a manual Hach  color wheel to measure chlorine residuals because the in-line
chlorine monitor was not working properly.  The operator rounded chlorine residual
measurements to the nearest tenth.  Independent tests we took during our
visit did not significantly differ from the reported chlorine residual readings.

-------
                                                          APPENDIX 1
                                                            Page 1 of 3
             UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
                        WASHINGTON, D.C. 20460


                                SEP 27 1995

                                                          OFFICE OF WATER

MEMORANDUM

SUBJECT:   OIG Draft Report "EPA Procedures to Ensure Drinking Water
           Data Integrity"  (E1HWE5-23-0001)

FROM:      Robert Perciasepe
           Assistant Administrator

TO:        Michael Simmons
           Deputy Assistant Inspector General
             for Internal and Performance Audits

     The purpose of this memorandum is to respond to the
recommendations in the OIG's draft report entitled "EPA Procedures
to Ensure Drinking Water Data Integrity".  I appreciate the
cooperative spirit your staff has shown in carrying out this study.
Before responding to the recommendations, I would like to provide
our thoughts on the principal findings of the draft report.

     The principal findings are:

     •  about 12 percent of the public water systems that use
        surface water reported erroneous data one or more times
        from 1991 through 1994,

     •  these systems serve only 0.1 percent of the population, and

     •  over half of the erroneous data cases were due to lack of
        training or improperly functioning equipment.  Less than
        half of the cases were classified as "potentially falsified".

In light of these findings, the draft report states that "We did
not detect any significant internal control weaknesses during our
audit."

     After reviewing your findings, we are in total agreement that
data integrity, and especially data falsification, is not currently
a significant problem in the public water supply supervision
program and is certainly not a significant weakness.

     Since the draft report was issued, however, we were very
surprised to learn that the Inspector General has nominated
drinking water data integrity as an Agency level weakness.  We do
not believe that data integrity is an appropriate candidate for
such a nomination.  Further, we believe that the draft report does
not support this nomination either.  We have already nominated and

-------
                                                           APPENDIX 1
                                                             Page 2 of 3
 are considering other issues that are better candidates.

     The report provided recommendations for reducing the potential
for data to be reported improperly.  We have carefully considered
these recommendations and are generally in agreement with them,
although significant resource constraints that we are facing will
limit the scope of some of our responses.  Our specific responses
to the recommendations are:

Recommendation #1:  Request that the regions coordinate with the
States to follow up on the 48 public water systems, or operators,
which we found reported invalid or potentially falsified test data
and consider providing training to, or taking enforcement action
against, them.

     Response:  We agree that follow up is needed on the 48 systems
     which the IG has identified as reporting invalid or
     potentially falsified data, and we will ask the regions to
     encourage States to follow up on these cases and take
     appropriate action.  Some States have already begun
     investigating these systems.

     We would like to point out that training and enforcement may
     not be the only types of appropriate response.  In some
     instances, the operators that were involved are gone, so the
     appropriate response may be no action.  Further, it is
     difficult to take an enforcement action against systems which
     may have falsified data.  Our regions are concerned that, even
     where they have evidence of potential criminal conduct, the
     criminal investigators often decline to investigate or
     prosecute because the case is not a high priority.  This is a
     clear disincentive for addressing cases of potentially
     falsified data.

Recommendation #2:  Consider incorporating the computer model
developed by Region 8, or a comparable model, in the design,
development and implementation of SDWIS to assist regions and
States in identifying systems that report erroneous data.

     Response:  We would like to explore the feasibility of this
     recommendation; however, we cannot commit to this activity at
     this time given our budget constraints.

Recommendation #3:  Revise EPA's Appendix G guidance to include
additional steps to identify erroneous data and provide updated
examples to illustrate the types of data patterns indicative of
erroneous data, and redistribute this revised guidance to the
regions.

     Response:  Assuming we receive sufficient resources in next
     year's budget, we will review the Inspector General's specific
     suggestions for revising the Appendix G guidance and make any
     necessary revisions.  If we revise the guidance, we will
     distribute it to our Regional Offices.

-------
                                                           APPENDIX 1
                                                             Page 3 of 3
Recommendation #4:  Request that the regions distribute the revised
Appendix G guidance to the States.

     Response:  If we revise the guidance,  we will  ask  the Regions
     to distribute it to their  States.

Recommendation #5: Update EPA's sanitary survey training materials
to include reviews of data quality (for example, comparison of MORs
with on-site  logs and bench sheets).

     Response:  We have already added a one-hour data falsification
     module to our 4-day sanitary survey training course which
     addresses the need to compare monthly operational reports
     (MORs) with on-site logs and bench sheets.  We have also
     drafted "EPA/State Joint Guidance on Sanitary Surveys" that
     identifies eight recommended elements of a sanitary survey.
     One of these elements, "Monitoring, Reporting, and Data
     Verification," includes reviewing the validity of data reported
     to the State; reviewing bench sheets, on-site logs, and monthly
     operational reports; and the calibration of process control
     and compliance equipment.  This guidance will be finalized in
     November.

Recommendation #6:  Establish schedules, with the regions, for
providing sanitary survey training to state officials.

     Response:  Because of significant travel and resource
     constraints, we cannot commit at this time to establishing
     schedules for providing sanitary survey training to states.

Recommendation #7:  Discuss with EPA Regions and State officials
the feasibility of including a certification block on MOR forms for
PWS operators to sign and certify to the accuracy of the reported
data.

     Response:  We have discussed this recommendation with our
     regional counterparts at a national meeting that was held on
     September 19-21, 1995.  Based on this discussion, we will send
     a memorandum to the Regions requesting them to inform their
     states of your findings and suggest that States consider
     including a certification block on the forms.  Unfortunately,
     resource limits preclude us from following up with the States
     to track their progress in implementing this suggestion.

Recommendation #8:  Continue to work with Regional and State
stakeholders to minimize operational data reporting in streamlining
the regulatory paperwork requirements.

     Response:  We plan to continue this effort and appreciate the
     IG's support.

     Thank you for the opportunity to review this draft report.
If you would like to discuss this further, please contact me or
Cynthia C. Dougherty on (202) 260-5543.

-------
                                                              APPENDIX 2
                                                               , Page 1 of 1
                             ABBREVIATIONS


AA         Assistant Administrator

CDC        Centers for Disease Control and Prevention

CFR        Code of Federal Regulations

EPA        Environmental Protection Agency

FRDS       Federal Reporting Data System

GAO        General Accounting Office

MOR        Monthly Operational Report

NTU        Nephelometric Turbidity Units

OECA      Office of Enforcement and Compliance Assurance

OGWDW    Office of Ground Water and Drinking Water

OIG        Office of Inspector General

PWS        Public Water System

SDWIS      Safe Drinking Water Information System

USC        United States Code

-------
                                                                 APPENDIX 3
                                                                    Page 1 of 2
                               DISTRIBUTION
Inspector General (2410)

Assistant Administrator for Water (4101)

Assistant Administrator for Enforcement and Compliance Assurance (2201)

Director, Office of Ground Water & Drinking Water (4601)

Director, Region 1 Water Management Division

Director, Region 2 Water Management Division

Director, Region 3 Water Management Division

Director, Region 4 Water Management Division
Director, Region 5 Water Division (W-15J)

Director, Region 6 Water Management Division

Director, Region 7 Water Management Division

Director, Region 8 Water Management Division

Director, Region 9 Water Management Division

Director, Region 10 Water Division

OIG Office of Investigations (II-13J)

Agency Follow-up Official (3304)
  Attention:  Assistant Administrator for the Office of
                Administration and Resources Management

Agency Follow-up Coordinator (3304)
  Attention:  Director, Resources Management Division

-------
                                                                    APPENDIX 3
                                                                      Page 2 of 2
 Audit Follow-up Coordinator (3304)
  Attention:  Management Controls Branch

Audit Follow-up Coordinator (3802F)
  Attention:  Office of Policy, Training, and Oversight Division

Audit Follow-up Coordinator (1104)
  Attention:  Executive Support Office

 Region 1 Office of External Programs

 Region 2 Public Affairs Branch

 Region 2 Intergovernmental Relations Branch

 Region 3 Office of External Affairs

 Region 4 Public Affairs

 Region 5 Public Affairs (P-19J)

 Region 5 Intergovernmental Relations Office (R-19J)

 Region 6 Office of External Affairs

Region 7 Public Affairs

Region 7 Intergovernmental Liaison Office

Region  8 Public Affairs

Region 9 Public Affairs

Region  10 External Affairs Office

Headquarters Library (3404)

-------