OFFICE OF INSPECTOR GENERAL

Memorandum Report

EPA Claims to Meet Drinking Water Goals Despite Persistent Data Quality Shortcomings

Report No. 2004-P-0008
March 05, 2004

Report Contributors: Jill Ferguson, Linda Pettit-Waldner, Tim Roach, Dan Engelberg

Abbreviations
EPA - Environmental Protection Agency
GPRA - Government Performance and Results Act
M/R - Monitoring and Reporting
OIG - Office of Inspector General
PWSS - Public Water System Supervision
SDWIS/FED - Safe Drinking Water Information System/Federal Version

UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460
OFFICE OF INSPECTOR GENERAL

March 05, 2004

MEMORANDUM

SUBJECT: EPA Claims to Meet Drinking Water Goals Despite Persistent Data Quality Shortcomings, Report No. 2004-P-0008

FROM: Kwai Chan /s/
Assistant Inspector General
Office of Program Evaluation

TO: Benjamin Grumbles
Acting Assistant Administrator
Office of Water

In each of the past 4 years, the Environmental Protection Agency (EPA) incorrectly reported meeting its drinking water goal under the Government Performance and Results Act (GPRA). The Agency reported meeting its annual performance goal for drinking water quality even though it concurrently reported that the data used to draw those conclusions were flawed and incomplete. In each of those years, EPA reported that it met its annual goal of 91 percent of the population drinking water that met health-based standards. However, EPA's own analysis, supported by our review, indicated the correct number was unknown but less than what was reported. We must note that this inaccuracy in reporting does not necessarily indicate a direct or immediate threat to human health.

Purpose

We initiated this review to evaluate the "drinking water performance measure," a key component of EPA's GPRA goal of "Clean and Safe Water." The evaluation questions were: (1) how do incomplete or inaccurate drinking water data affect the drinking water GPRA calculation; and (2) what actions has EPA undertaken to ensure that drinking water data collected and distributed to the public are reliable and valid? During the preliminary research phase, we learned that the Office of Water was conducting analyses that largely overlapped our own, and was working with States and other stakeholders to address data quality problems. Since we had already completed work on our first question but not the second, we are reporting the results on the first and suspending our work on the second. Details on our scope and methodology, as well as background on GPRA and drinking water reporting, are in Appendix A.

Results

EPA has reported just meeting its annual performance goal for drinking water for fiscal years 1999 through 2002. In each of those years, EPA reported performance equaling the 91 percent GPRA annual performance goal. However, because EPA and Office of Inspector General (OIG) reviews indicated that performance is less than what EPA reported, due to missing data on violations of drinking water standards, the Agency did not in fact meet its drinking water performance goals for these 4 years. Our assessment also mirrors statements made by the Agency in its performance reports and elsewhere.

EPA Consistently Reported Meeting Drinking Water Goals

EPA officials and reports consistently noted that national drinking water performance goals were being achieved.
Annual performance reports, the 2003 Draft Report on the Environment, and statements by Agency officials indicated that national drinking water quality was high and EPA was progressing toward its goal of having 95 percent of the population drinking water that meets health-based standards. This was also repeated by the media. Figure 1 summarizes EPA's most recent annual performance report about drinking water quality, showing Agency claims that it just met its performance goal for each of the last 4 fiscal years:

Figure 1: EPA Reports Meeting Drinking Water Performance Goals
[Excerpt from EPA's performance report for Annual Performance Goal 8, Safe Drinking Water. The excerpt shows planned and actual results of 91 percent for fiscal years 1999 through 2002, and the FY 2002 result that 244 million people, 91 percent of the 268 million people served, received water from the 53,437 community water systems meeting all health-based standards.]
Source: EPA's 2002 Annual Performance Report, page II-22.

In addition to annual performance reports, EPA portrayed its success at improving drinking water quality through other reports and through statements by Agency officials. For example, the 2003 Draft Report on the Environment stated, "In 2002, (S)tates reported that 94 percent of the population served by community water systems were served by systems that met all health-based standards, up from 79 percent in 1993."1 A July 20, 2003, statement from the Assistant Administrator for Research and Development repeated this conclusion that, in 2002, 94 percent of Americans were served by drinking water that met health-based standards.

Figure 2: Statement on Drinking Water Quality
"Our drinking water is purer. In 2002, 94 percent of Americans were served by drinking water systems that meet our health-based standards - an increase of 15 percent in the last decade."
Source: EPA's June 23, 2003, Press Release

Using Agency reports, the media communicate such information to the public. For example, a June 23, 2003, New York Times editorial, "An Environmental Report Card," used language similar to the press release cited in Figure 2 to report that, "Fully 94 percent of Americans are served by drinking water systems that meet federal health standards, as opposed to 79 percent 10 years ago." Most recently, the new Administrator's Draft 500-Day Water Quality Plan continued using drinking water quality projections that remained unchanged from previous claims and press releases. The Plan stated, "(i)n 2002, 93.6 percent of the population received drinking water that met all health-based standards. By 2015, all people served by community water systems will receive drinking water that meets standards."

EPA Reports Data Quality Problems While Reporting Performance Goals Met

EPA's recent Performance Reports contain statements about the quality of the drinking water data used to report under this performance measure.
In the last three Annual Performance Reports the Agency's message was consistent: there are problems with the quality of data in the Safe Drinking Water Information System/Federal Version (SDWIS/FED). These statements were echoed in other reports, such as the 2003-2008 Strategic Plan. In that Plan, EPA reported, "the baseline statistic of national compliance with health-based drinking water standards likely is lower than reported." However, EPA continued to report that drinking water performance goals were being met. The following tables contain disclaimers related to data quality problems that EPA has included in reports:

1 This statistic is based on the 2002 calendar year, while the 91 percent in EPA's 2002 annual performance report is based on the Federal fiscal year.

Statements in EPA Annual Reports

1999 Annual Performance Report - There is no indication of data quality in the discussion of performance for this Annual Performance Goal.

2000 Annual Performance Report - "There are recurrent reports of discrepancies between national and state data bases . . . Given the particular need for confidence in the completeness and accuracy of data about drinking water quality, EPA designated SDWIS content as an Agency material weakness in 1999, under the Federal Managers' Financial Integrity Act."

2001 Annual Performance Report - A technical appendix noted under-reporting of monitoring and violations data to EPA and that "failures to monitor could mask treatment technique and MCL violations."

2002 Annual Performance Report - "The most significant data quality problem is under reporting to EPA of both monitoring and reporting violations and incomplete inventory characteristics . . . failures to monitor could mask treatment technique and MCL violations. Such underreporting of violations limits EPA's ability to precisely quantify the population served that are meeting health based standards."

Statements in Other EPA Reports

Fiscal Year 2001 Annual Plan - "SDWIS data quality has been problematic. It has been demonstrated that there are discrepancies between SDWIS data and state databases. In addition, utilities have pointed out specific data quality problems."

2003 Draft Report on the Environment (Technical Document) - "Underreporting and late reporting of CWS violations data by states to EPA affect the ability to accurately report the quality of our nation's drinking water... Based on this analysis, the agency estimated that states were not reporting 40 percent of all health-based violations to EPA."

2003-2008 Strategic Plan - "Routine data analyses of the Safe Drinking Water Information System (SDWIS) have revealed a degree of nonreporting of violations of health-based drinking water standards ... As a result of these data quality problems, the baseline statistic of national compliance with health-based drinking water standards likely is lower than reported."

Data Reliability Analysis of the EPA SDWIS/FED and Plan (Draft) - "If the quality of the data measured and reported to SDWIS-FED, the source of the data for the [GPRA performance] calculation, are less than 100 percent as defined by this analysis, then the progress toward meeting the strategic goal may not be as great as reported."

The information in the previous figures and tables indicates that while the Agency consistently reported meeting its drinking water performance goals, the Agency also consistently acknowledged problems with drinking water data quality.
Therefore, we believe the Agency has wrongly reported that it met its 91 percent performance goal for the years 1999 to 2002; the actual number is lower by some unknown amount.

EPA and OIG Reviews Indicated GPRA Measure Less Than Reported

Since 2000, EPA has developed two drinking water data quality reports (with the second still in draft form). Both reports noted problems with under-reporting, in that States did not transmit all health-based violations and all monitoring and reporting violations into SDWIS/FED, which houses compliance information about drinking water systems. Our analysis confirmed these data quality problems of under-reporting of violations of health-based drinking water standards.

EPA Reports Indicate Drinking Water Data Quality Improved But Still "Low"

EPA has been conducting "data verifications" of drinking water data since 1991. In 2000, EPA issued the "Data Reliability Analysis of the EPA Safe Drinking Water Information System/Federal Version (SDWIS/FED)" report from data verifications conducted between 1996 and 1998. This report identified problems with the accuracy and completeness of SDWIS/FED data. The followup 2003 report, currently under internal draft review, includes information from data verifications from 1999 to 2001. These two reports distinguish between "health-based" and "monitoring and reporting" (M/R) violations.

As shown in Figure 3, the first report indicated that "data quality" for health-based violations at the community water systems (or "systems") whose records were examined during the 1996-1998 data verification time period was 40 percent.2 That is, 40 out of every 100 health-based violations that should have been in SDWIS/FED were in SDWIS/FED, and 60 out of 100 were not. In the second time period (1999-2001), data quality improved to 65 percent. According to EPA's own assessment, this is still in the "low quality" range.

Figure 3: Despite Improvements, "Data Quality" Remains a Problem
[Bar chart comparing data quality percentages for health-based violations and M/R violations across the two data verification periods. EPA's data quality ranges: Low, 0 to 70 percent; Moderate, 71 to 90 percent; High, 91 to 100 percent.]

Data quality for the monitoring and reporting component of drinking water compliance also improved among the communities audited - from 9 percent to 23 percent between the two time periods. In other words, the data quality problem for M/R was at 91 percent and has been reduced to 77 percent, an overall improvement of 14 percentage points.

2 EPA has defined SDWIS/FED data quality as the percent of data that should be in SDWIS/FED that are in the database with no discrepancies or errors.

Because the enforcement portion of SDWIS/FED tracks only non-compliance with drinking water regulations, the monitoring and reporting violations indicate instances in which information about water quality was not reviewed by the State to make compliance determinations. The reported levels of health-based and monitoring and reporting data quality indicate that EPA performance reports reflected a best-case scenario and that in all likelihood performance was lower than reported.

EPA has recognized the data quality limitations of SDWIS/FED and their impact on the Agency's ability to manage the drinking water program, as well as to accurately report to Congress and the public. In the draft 2003 Data Reliability Analysis, EPA points out that, "Overall, the violations data that are reported to and accepted by SDWIS/FED are highly accurate. The weak link in data quality continues to be the large number of violations that are not reported to SDWIS/FED (as estimated by Completeness), with monitoring and reporting data being the least complete and of very low quality."
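EPA's definition of data quality (see footnote 2) is, in effect, a completeness ratio. The short sketch below simply restates that arithmetic; the function and counts are hypothetical illustrations, not values taken from the data verification database.

```python
# Illustrative only: hypothetical counts, not data verification results.
# EPA defines SDWIS/FED "data quality" as the percent of the data that should
# be in SDWIS/FED that are present with no discrepancies or errors.

def data_quality_percent(present_and_correct, should_be_present):
    """Completeness-based data quality measure, expressed as a percentage."""
    if should_be_present == 0:
        return 100.0
    return 100.0 * present_and_correct / should_be_present

# Mirrors the health-based violation figures cited above: 40 of every 100
# required records present (1996-1998 verifications), 65 of 100 (1999-2001).
print(data_quality_percent(40, 100))   # 40.0 -> "low" on EPA's 0-70 percent range
print(data_quality_percent(65, 100))   # 65.0 -> still "low"
```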
Recent improvements in drinking water data quality are attributed to EPA, State, and third party efforts to identify data quality deficiencies and implement activities to remedy those deficiencies. This has been a long-term effort, signified by the publication of reports outlining problems with drinking water data quality and activities to fix those problems. The 2003 Draft Data Reliability Report of SDWIS/FED noted that in 1999, a workgroup of EPA, State, and stakeholder representatives developed a data quality action plan. Components of that action plan included setting data quality goals for SDWIS/FED, quantifying and qualifying the quality of SDWIS/FED data, and taking interim steps to improve data quality. The Draft Data Reliability Report also noted that EPA and States undertook and completed a number of activities to reach their data quality improvement goals. Actions taken by EPA and the States included: (1) improved data entry processes, tools, and training for regions and States; (2) improved and simplified data retrieval and reporting tools; (3) improved data verification audit procedures; and (4) accelerated ongoing data quality improvement activities (development of SDWIS/STATE, and electronic reporting between utilities, labs, and States). A second data quality action plan is being developed and implemented. When we met with EPA officials to discuss our draft report, they told us that data quality continues to improve, as measured by the most recent data verifications.

EPA also noted in March 2003 in the Draft Strategic Plan for 2003-2008 that it would consider how best to classify water systems that experienced monitoring and reporting violations. Options included (1) classifying systems with monitoring and reporting violations as not being in compliance with health-based standards, and (2) excluding these systems from the GPRA calculation. By doing so, the Agency would remove from its performance reporting the systems that it cannot determine provided water that met all health-based standards. However, the final Strategic Plan issued in September 2003 stated, "(The) Agency is currently engaged in statistical analysis to more accurately quantify the impact of these data quality problems."

During our meeting with Agency officials to discuss the draft report and these two options, they explained that there is a potential for water systems with no health-based violations to be eliminated from the GPRA calculation because of one M/R violation. They believe that this would distort their reporting under GPRA, and they are studying the issue. While we understand that this potential exists, we also believe that EPA's current policy of treating systems with M/R violations as being in compliance with health-based standards can also distort the GPRA measure. The Agency's 2001 and 2002 annual reports (see page 4) note that there is a potential for M/R violations to mask violations of drinking water standards. Our review of the EPA data verification database confirmed the Agency's conclusion that States did not report all health-based violations into SDWIS/FED.
The actual percentage of people drinking water that met health-based standards in this sample was likely lower than reported, but we cannot use the database to determine a range for the nation as a whole. This is because the methods used to select the 761 water systems do not support estimates for the nation's approximately 54,000 community water systems.3 In the data verification database, we observed that the error rate was high for systems with health-based violations for the contaminants reviewed during data verifications. Of the 71 systems with violations for the drinking water standards that were reviewed during the data verifications, 17 had not been reported into SDWIS/FED.4 Overall, the errors in the reporting of health-based violations meant that the drinking water performance measure among the 761 systems was overstated. This suggests that by utilizing incomplete results from SDWIS to report performance under GPRA, EPA portrayed an incorrect picture of the percentage of people drinking water that met all health-based standards.

There is always potential for errors when collecting any type of information. In Appendix A, Figure 5 illustrates the flow of information from the water system, through the laboratories, and to the State and EPA. State data verifications identify errors related to data analysis and reporting of violations.

Conclusions

Congress, the public, and the media rely on transparent and accurate reports about drinking water quality. The significance of this measure is that each percent of the population served by community water systems receiving drinking water meeting all health-based standards represents more than 2.6 million people in the United States. As EPA persistently reports meeting its drinking water performance goal while acknowledging drinking water data quality problems, it in fact has not accurately reported its performance to the approximately 268 million people drinking water from community water systems.

3 A Community Water System serves at least 25 people or 15 service connections on a year-round basis.
4 Our review also indicated that 2 of the 761 systems were identified to have a health-based violation in SDWIS/FED when no violation actually occurred.

EPA's increasing candor about the limitations of basing performance measures on its compliance database (SDWIS), and the fact that it identified possible corrective actions in its 2003-2008 Draft Strategic Plan for addressing the problems of water system monitoring and reporting violations, indicate the Agency's willingness to consider alternative approaches for how it reports performance. We suggest that while EPA and States continue moving forward to correct data deficiencies, the Agency should also identify methods to better account for the impact of the "large number" (as described in the Draft 2003 Data Reliability Report) of violations that are not reported to SDWIS/FED. It should determine how best to account for community water systems with monitoring and reporting violations when reporting into GPRA and adjust the measure to reflect this. Options include those mentioned above that were described in the Draft Strategic Plan. In order to address broader concerns over this measure, given the inherent problems of utilizing SDWIS for reporting on performance, we also suggest that in the future the Agency move toward employing an altogether different methodology for reporting performance for this Annual Performance Goal. One approach would be for EPA to base its future reporting on a stratified sample of the nation's 54,000 community water systems and audit those systems for compliance with health-based drinking water standards. This has the potential to provide a more accurate and transparent accounting of the nation's drinking water quality.
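To illustrate the kind of approach suggested above, the following sketch shows how a stratified, audit-based estimate could be assembled. It is a minimal sketch under assumed conditions: the strata, field names, and figures are hypothetical illustrations, not EPA data or an EPA methodology, and a real design would also need defensible stratum weights and estimates of sampling error.

```python
# Illustrative sketch of a stratified estimate; not EPA's methodology or data.
# Assumes each audited system record carries: a stratum (e.g., a size class),
# the population served, and whether the audit found any health-based violation.
from dataclasses import dataclass

@dataclass
class AuditedSystem:
    stratum: str      # hypothetical size class, e.g., "small", "medium", "large"
    population: int   # population served by the system
    compliant: bool   # no health-based violations found during the audit

def stratified_compliance_estimate(sample, stratum_populations):
    """Weight each stratum's in-sample compliance rate by the total population
    that stratum serves nationally, then report a single percentage."""
    total_pop = sum(stratum_populations.values())
    estimate = 0.0
    for stratum, stratum_pop in stratum_populations.items():
        systems = [s for s in sample if s.stratum == stratum]
        served = sum(s.population for s in systems)
        served_compliant = sum(s.population for s in systems if s.compliant)
        rate = served_compliant / served if served else 0.0
        estimate += rate * (stratum_pop / total_pop)
    return 100.0 * estimate

# Hypothetical inputs: national population served per stratum, plus a tiny sample.
national = {"small": 25_000_000, "medium": 90_000_000, "large": 150_000_000}
sample = [
    AuditedSystem("small", 1_200, False),
    AuditedSystem("small", 900, True),
    AuditedSystem("medium", 40_000, True),
    AuditedSystem("large", 600_000, True),
]
print(round(stratified_compliance_estimate(sample, national), 1))
```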
Agency Comments and OIG Response

In the Agency's February 2, 2004, response to our draft report, the Agency did not directly acknowledge our principal finding concerning the incorrect conclusions about drinking water performance contained in recent annual performance reports. In addition, while the Agency agreed to continue to improve how EPA communicates health risks associated with drinking water, it made no commitment to specific steps to correct the inconsistencies we had pointed out. Appendix B contains Agency comments, and Appendix C contains some of our specific responses to those comments.

Based on the Agency's comments, we made several revisions and clarifications to our report. However, because data quality within SDWIS is not the principal focus of this report, the comments did not address our principal concern: that for 4 years, EPA has reported to Congress and the public that it met an important annual performance goal when available evidence indicates it did not. After reviewing EPA's comments, we continue to believe that the Agency inappropriately claimed to have met performance goals for its drinking water program for the past 4 years. Steps to account for missing and inaccurate data when reporting performance under GPRA are being considered by EPA, but no decisions have been made. We reiterate our suggestion that EPA change how it reports under GPRA to compensate for known concerns over the reliability of this measure.

If you or your staff have any questions, please contact me at (202) 566-0827, or Dan Engelberg, Director of Water Issues, at (202) 566-0830.

Appendix A
Background, Scope, and Methodology

Background

In 1993, Congress enacted GPRA to shift Federal planning, management, and decision-making away from a traditional focus on resources and activities to a focus on results and outcomes. The Office of the Chief Financial Officer and program offices produce an annual report to Congress on the Agency's progress toward achieving annual strategic goals. In response to GPRA, EPA established major goals, including "Clean and Safe Water." One of the sub-objectives under this goal was "Water [that is] Safe to Drink," which is designed to reflect the quality of the drinking water supplied to the population. In 1999, EPA established a measure of progress toward meeting this sub-objective: by the year 2005, 95 percent of the population served by community water systems would have water that is safe to drink, meaning that the water meets all applicable health-based standards. As is shown in Figure 4, over the past 9 years, EPA has reported an increasing percentage of the population drinking water that meets health-based standards. In its new strategic plan, EPA has retained the performance sub-objective of water that is safe to drink and the measure of percent of population, but has extended the timetable to accomplish the objective by 3 years, to 2008.

Figure 4: Population Reported by EPA Meeting Health-Based Standards
[Bar chart showing the percentage of the population reported as served by community water systems meeting all health-based standards for fiscal years 1993 through 2002, rising from 79 percent in fiscal year 1993 to 91 percent in each of fiscal years 1999 through 2002.]
Source: EPA 2002 Drinking Water Factoids. This data is based on the Federal fiscal year, which ends September 30.
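The GPRA number summarized in Figure 4 is a population-weighted compliance percentage. The sketch below is an illustrative aggregation only; the record layout and sample values are hypothetical and greatly simplified, and do not reflect SDWIS/FED's actual schema or EPA's calculation. It also shows why unreported violations inflate the result: because SDWIS/FED records only exceptions, a system whose violations were never reported is indistinguishable from a compliant one.

```python
# Illustrative sketch only: hypothetical records, not the SDWIS/FED schema.
# Each tuple: (system id, population served, fiscal years with at least one
# reported health-based violation). Systems whose violations were never
# reported look exactly like compliant systems in this calculation.

systems = [
    ("CWS-001", 120_000, {2001}),
    ("CWS-002", 3_000, set()),
    ("CWS-003", 850_000, set()),
    ("CWS-004", 45_000, {2002}),
]

def percent_meeting_standards(records, fiscal_year):
    """Percent of population served by community water systems with no
    reported health-based violations in the given fiscal year."""
    total = sum(pop for _, pop, _ in records)
    compliant = sum(pop for _, pop, years in records if fiscal_year not in years)
    return 100.0 * compliant / total if total else 0.0

print(round(percent_meeting_standards(systems, 2002), 1))  # 95.6 for this toy data
```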
The Draft Report on the Environment reported that 94 percent of the population served by community water systems drank water that met all health-based standards for the 2002 calendar year.

The drinking water performance measure is based on compliance information contained in SDWIS/FED. The information utilized from this database is derived from sampling and analyses of drinking water, and assessments of treatment techniques, from the approximately 54,000 community water systems that supply water to 268 million Americans. SDWIS/FED is designed to support many program management functions, including storing basic water system information, enforcement actions, and sampling results for unregulated contaminants.

Public water systems are responsible for monitoring their own systems and collecting and reporting sampling results to a primacy agent (typically State or Tribal drinking water programs). All States have primacy except for Wyoming. Primacy agents determine compliance with drinking water regulations and report violations to EPA (via SDWIS/FED). This process is illustrated in Figure 5. SDWIS/FED contains data when violations of drinking water standards and mandated treatment techniques are reported into it. To measure program performance, EPA aggregates the SDWIS/FED data into a national measure of overall compliance with health-based drinking water standards, which it reports as a percentage.

Figure 5: SDWIS Data Flow
[Diagram of PWSS data flow to SDWIS under State primacy agreements: public water systems send water samples to certified laboratories; sampling results are reported to the State (or, by PWS-State arrangement, directly by the system); the State makes compliance determinations and reports violations to EPA via SDWIS/FED.]
Source: EPA Office of Water

Scope and Methodology

We reviewed a database containing the results of a series of contractor-conducted audits (known as data verifications) of public water system data in State files.5 We used this database to discuss two sources of errors that affect the precision of the data used in the drinking water quality measure: (1) errors in the process of reporting drinking water data from States to EPA, and (2) problems associated with under-reporting of drinking water information because of a reliance upon an exceptions-based database (meaning that only violations are recorded) for tracking drinking water violations.

We followed all but one of the applicable Government Auditing Standards, issued by the Comptroller General of the United States. We were unable to adhere to the standard that we assess management controls. Specifically, there were no written procedures to document the flow of information from data verifications into the data verification database. Based on our discussions with Office of Water staff, we believe that the data in this database were sufficient for the purposes of our review. We followed all other standards for performance audits or evaluations. All work was completed between August 2002 and September 2003.

5 We focused on data from community water systems because compliance data from this group are used for the drinking water GPRA measure.

Appendix B
Agency Comments

MEMORANDUM

SUBJECT: Comments on the Draft Report EPA Reports Meeting Drinking Water Goals Despite Persistent Data Quality Shortcomings

FROM: Benjamin H. Grumbles
Acting Assistant Administrator

TO: Nikki L. Tinsley
Inspector General

Dear Ms. Tinsley:

Thank you for the opportunity to comment on your office's draft report, "EPA Reports Meeting Drinking Water Goals Despite Persistent Data Quality Shortcomings." I appreciate your general interest in this issue. The Office of Water recognizes the importance of high quality data and is committed to continuing to make improvements in this area in the drinking water program as well as across our other activities.

EPA's data verification audits and associated analyses, which are also the basis for your draft report, indicate that the data in SDWIS-FED are highly accurate with very few errors, but are still incomplete. EPA and the states have made significant progress in improving the quality of SDWIS-FED data since we first became aware of this issue. We acknowledge, however, that more work remains to be done. EPA's report, Drinking Water Data Reliability Analysis and Action Plan (2003), to be released shortly, will highlight our continuing efforts and additional steps that we intend to take in partnership with states to further improve drinking water data quality.

While the Agency does use SDWIS-FED data to meet reporting requirements under the GPRA, we are aware of the data's shortcomings and have been diligent in flagging those to key audiences as well as to the general public. EPA in its GPRA reporting is using the data that is available to us through the national reporting system. We will continue to explore ways to communicate the range of issues associated with the nature and quality of SDWIS-FED data and the relationships to public health risk with your suggestions in mind. We will also continue to engage in discussions with states regarding potential new approaches for reporting drinking water data (e.g., electronic transfer of monitoring results from the laboratory to the federal database).

In a broader context, I would like to note that in the vast majority of instances where states make compliance determinations these determinations are correct. Most of the determinations correctly find that public water systems are meeting health-based standards and thus do not require an entry to be made in SDWIS-FED, which is a violations-only database. I mention this not to diminish the very real need to improve data quality, but as an important reminder that SDWIS data quality and drinking water quality are far from synonymous.

I have attached a more detailed set of comments on specific aspects of the draft to assist your office as you prepare a final report. Please call Cynthia Dougherty, Director of the Office of Ground Water and Drinking Water, at (202) 564-3750 if you would like further clarification on any of these issues.

Attachments

cc: Kwai-Cheung Chan
Mike Shapiro
Cynthia C. Dougherty
Elizabeth Con-
Dan Engelberg
Jill Ferguson
Linda Pettit-Waldner
Tim Roach
Michael Mason

Office of Ground Water and Drinking Water
Specific Comments on Draft Inspector General Report, EPA Reports Meeting Drinking Water Quality Goals Despite Persistent Data Quality Shortcomings (12/23/03)

1) In general, the IG draft report addresses two topics EPA is actively engaged in analyzing:

• The quality of data reported by states to EPA's database of record, the Safe Drinking Water Information System/Federal Version (SDWIS/FED), for public water system inventory, violations, and enforcement actions, and
• The implications of that quality on a Government Performance and Results Act (GPRA) measure for populations receiving drinking water from community water systems that are in compliance with all health-based standards.

We are addressing both of these issues through our work internally and with states to continue to improve data quality and to identify new ways of communicating its significance to the public. We are also at this time near completion of our second comprehensive report on the quality of data in SDWIS. We expect to release this final report in the near future and will provide a copy to the IG at that time.

2) The IG draft report characterizes EPA as having "mistakenly" reported meeting its drinking water goal under the Government Performance and Results Act (GPRA). In actuality, however, we use the best available data reported to us by the states under their primacy agreements with the Agency. In using this data to describe results under the GPRA, we have tried to be clear that the results are based on the data as reported in SDWIS and that our audits indicate that there is incomplete reporting of this data. We have also made our first triennial report on SDWIS data quality, published in October 2000, available to the public on our website. We will soon be making our second report, noted above, on data quality available on the web.

3) EPA's data verifications indicate that the data the states report to the Agency are very accurate, although incomplete.

4) Several of the IG's comments, including an incorrect flow chart, indicate the need for improved understanding of data flow from public water systems to states to EPA. We have prepared a revised flow chart at the same level of detail (attached). We would suggest a meeting between OGWDW and the IG's staff before you finalize your draft report to ensure an accurate understanding of key details related to data flow which are beyond the depictions in the chart. Such a meeting would also serve as an opportunity for us to provide, and discuss where necessary, other detailed edits to the draft report for purposes of accuracy.

5) The draft report includes a chart that depicts data quality improvements, but the draft report includes only brief discussion on this point. Specifically, we believe that the draft report should recognize that data quality has improved based on actions taken jointly by EPA and states since 1998. We are concerned that the absence of elaboration on this point undercuts the concerted efforts as well as the progress that EPA and states have made and are continuing to make.

6) To achieve a balanced examination, we suggest that the IG evaluate factors that could affect the results of data quality calculation in either direction, rather than emphasizing only factors that might appear to reduce the reported levels of the population receiving safe water all the time.
We would like to take this opportunity to draw certain key points to the IG's attention for discussion in the draft report.

• In developing our own report, we have been examining the issue of over-reported violations. We looked at all the large water systems (over 50,000 population served) that had been identified as being in violation and found that one-third or more had corrected the violations and should not have been reported as being in violation in 2001. If considered, this could have the effect of increasing the GPRA percentage.
• Another noteworthy factor is that for many large water systems (which have the greatest effect on the GPRA number) a violation may not affect water quality throughout the entire water system, even though for GPRA accounting purposes the entire water system is credited with a violation. Quantitative consideration of this factor, while challenging to do, would likely contribute additional and substantial population to be counted as receiving drinking water which had no violation.
• A further set of potentially relevant factors is the impact of violation timing, frequency, and duration on the significance and potential public health consequences of violations.
• Similarly, the varying nature of violations (e.g., a one-time violation for a chronic contaminant versus for an acute contaminant, or a violation significantly above the standard versus one that is close to the standard) may have differing public health implications.

We are very interested in finding ways to communicate these complexities succinctly to enhance public understanding of the GPRA measure and what it means. We would welcome the IG's comments on these factors.

7) The draft report characterizes the success of improved drinking water quality (distinct from data quality) as the Agency's result when in fact it is the result of a broad partnership that includes EPA, the states that are the primary implementers of the national drinking water program, and public water systems that carry out the regulatory requirements.

8) An area that the IG touches on in the draft report and which resurfaces in the context of the conclusions is the potential impact of monitoring and reporting violations on GPRA reporting. Monitoring and reporting violations, however, may have no link to whether a water system met health-based standards. For instance, where monitoring and reporting violations are scattered among numerous water systems that otherwise routinely demonstrate that they meet health-based standards, the likelihood of a significant impact on EPA's GPRA reporting is less than if monitoring and reporting violations occur repeatedly within the same water systems. We suggest that the IG reconsider whether to emphasize this complex issue as part of its conclusions in the absence of further analysis.

9) In the first paragraph on page 7, it is unclear to us whether the IG is discussing the utility of the data verification database or the SDWIS database, and also unclear whether the estimates under discussion are the data quality estimates or the GPRA number. In this and the following paragraph, there also appears to be a misunderstanding about the regulatory framework of the SDWA. Under the drinking water program's regulatory structure, health standards for multiple contaminants are addressed within single rulemakings. The draft report indicates that data verifications only evaluate eight drinking water standards.
This is incorrect and affects the IG's conclusion about extrapolating data verification results for data quality purposes and possible "larger discrepancies" in the Agency's GPRA calculation. In fact, EPA's data verifications examined all 87 contaminants that were regulated under the SDWA in 2001, and EPA will continue to evaluate all regulated contaminants in future data verifications.

10) We believe there is a wider range of circumstances affecting shortcomings in the data that should be considered in an evaluation, including but not limited to:

• Relationship of waivers, variances, and exemptions to violation data points;
• Conditions in state programs that result in non-reporting of violations;
• Methodologies that would improve evaluation and understanding of state program processes to determine compliance;
• Development of methodologies for estimating the proportion of populations in larger water systems actually affected by violations, rather than charging the entire water system with a violation that only affects a portion of its population.

11) Concerning the listing "Stakeholders Identified Other Potential Sources of Error" in Appendix B, the draft report does not present an evaluation of these potential sources of error nor indications of which may be more problematic. In general, the reference to "error" is inappropriate. Some of these appear to be undocumented and/or unevaluated opinions about potential sources of error and others are not potential sources of error, but rather process vulnerabilities. This particularly applies to the GPRA measure section.

Appendix C
Additional OIG Responses

Note to Agency Comment #2:
We agree that the Agency currently uses the best available data reported by the States and has moved in recent years to be more transparent in the presentation of problems with SDWIS data. For this reason, we have changed "mistakenly" to "incorrectly" in the report. We realize that this is not an error of oversight on the part of EPA. However, EPA continues reporting that GPRA goals are met while warning about the implications of missing data. In our view, correctly reporting whether it has met a performance goal is at least as important as disclosing the existence of errors.

Note to Agency Comment #4:
The final report contains the data flow chart provided by the Office of Water.

Note to Agency Comment #5:
Protecting and improving the nation's drinking water quality and drinking water data quality is a collective effort with credit for success attributable to many parties. We did not intend to reduce the share of credit to any one group by briefly noting data quality improvement efforts undertaken in previous years. We did intend to highlight that drinking water data quality improvements are a result of activities to improve data management systems and processes. The issues regarding the transparency of GPRA reporting are the focus of our report, which is why we did not further elaborate on data quality improvement activities. However, we have changed our presentation in the final report to better reflect the shared responsibility and accomplishments of EPA and its partners.

Note to Agency Comment #6, first bullet:
We agree that the GPRA percentage is affected by water systems with incorrectly reported health-based violations. Our review of the data verification database factored in 2 such systems out of the 71 that experienced health-based violations.
We were aware of other drinking water databases, but chose to review only the data verification database because of the semi-random sampling methodology used for selecting the 761 community water systems.

General Note to Agency Comment #6:
These factors all contribute to the complexity of presenting a picture of the nation's drinking water quality for the purposes of GPRA reporting. We suggest that while EPA works to address data issues such as those described here, the Agency also more clearly report that the absence of drinking water data in SDWIS/FED has an effect on the accuracy of the annual GPRA reports.

Note to Agency Comment #7:
See our response to EPA's Comment 5.

Note to Agency Comment #8:
EPA's 2001 and 2002 Annual Performance Reports noted that failures to monitor could mask violations of health-based standards (see page 4). We agree with this position. We feel that EPA is mistaken in asserting that the impact on GPRA reporting is less if M/R violations occur repeatedly in a single water system than if the same number of M/R violations are spread among several different systems.

Note to Agency Comment #9:
We clarified language and corrected errors in the draft report.

Note to Agency Comment #10:
We agree that these are important factors that affect the accuracy and validity of the GPRA measure. For the purposes of this report, our focus was on the implications of reporting success at meeting the drinking water GPRA goal while concurrently reporting problems with the completeness of drinking water data.

Note to Agency Comment #11:
We removed references to potential sources of error based on Agency comments.

Appendix D
Report Distribution

Assistant Administrator, Office of Water (4101)
Director, Office of Ground Water and Drinking Water (4607)
Comptroller (2731A)
Agency Followup Official (the CFO) (2710A)
Agency Audit Followup Coordinator (2724A)
Associate Administrator for Congressional and Intergovernmental Relations (1301A)
Associate Administrator, Office of Public Affairs (1101A)
Inspector General (2410)

OFFICE OF INSPECTOR GENERAL

Evaluation Report

Impact of EPA and State Drinking Water Capacity Development Efforts Uncertain

Report No. 2003-P-00018
September 30, 2003

Report Contributors: Leah Nikaidoh, Michael D. Davis, Tim Roach, Thane Thompson, Robert Bronstrup

Abbreviations
CFR - Code of Federal Regulations
EPA - Environmental Protection Agency
DWSRF - Drinking Water State Revolving Fund
GAO - General Accounting Office
GPRA - Government Performance and Results Act
HSNC - Historical Significant Non-Compliance
OGWDW - Office of Ground Water and Drinking Water
OIG - Office of Inspector General
SDWA - Safe Drinking Water Act
SDWIS/FED - Safe Drinking Water Information System/Federal Version
T/M/F - Technical, Managerial, and Financial

Cover: Photo of water tower by EPA OIG.

Table of Contents
Purpose 2
Background 2
Scope and Methodology 4
Results of Review 4
Conclusions 10
Recommendations 10
Agency Response and OIG Evaluation 11
Action Required 12
Appendices
A Details on Capacity Development Design and Early Implementation Results 13
B Details on Capacity Development Performance Measurement 21
C Details on Scope and Methodology 25
D Agency Response to Draft Report 29
E OIG Analysis of Agency Response 47
F Report Distribution 57

UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C.
20460
OFFICE OF INSPECTOR GENERAL

September 30, 2003

MEMORANDUM

SUBJECT: Impact of EPA and State Drinking Water Capacity Development Efforts Uncertain, Report No. 2003-P-00018

FROM: Dan Engelberg /Signed by Kwai Chan for/
Director for Water Issues
Office of Program Evaluation

TO: G. Tracy Mehan III
Assistant Administrator
Office of Water

This is our final report on the subject evaluation conducted by the Office of Inspector General (OIG) of the U.S. Environmental Protection Agency (EPA). This report contains findings that describe the problems the OIG has identified and corrective actions the OIG recommends. This report represents the opinion of the OIG, and the findings contained in this report do not necessarily represent the final EPA position. Final determinations on matters in this report will be made by EPA management in accordance with established resolution procedures. The Assistant Administrator for the Office of Water responded to our draft report on June 23, 2003, and that response is included as an appendix in this report.

This report identifies issues that EPA needs to address to ensure that, as States continue to implement capacity development, they provide sufficient managerial and financial assistance to community water systems. Providing this assistance is vital to the long-term sustainability of these systems, and will ultimately help systems meet drinking water standards. EPA also needs to have a performance measurement process to determine if, nationally, the overall capacity, or healthiness, of community water systems is improving. Without this strategic information, EPA cannot make critical management decisions to provide resources and assistance to States as the States work to implement their capacity development strategies. These issues are summarized below and presented in detail in Appendices A and B.

Purpose

Congress, in passing the 1996 Safe Drinking Water Act (SDWA) amendments, provided funding for capacity development to meet significant challenges facing community water systems, including aging infrastructure, underfunding, and meeting drinking water regulations. Capacity development is a way of structuring drinking water protection programs to assist water systems in attaining the technical, managerial, and financial (T/M/F) capacity to achieve and maintain long-term sustainability.1 Capacity development is based on the idea that systems that achieve and maintain capacity will be the best prepared to meet current and future drinking water challenges, such as new Federal drinking water regulations and substantial infrastructure needs. Therefore, it is critical that capacity development is designed and implemented in a manner that will effectively ensure that systems needing capacity assistance will get this help before ultimately falling into noncompliance.

Since the EPA drinking water capacity development process is relatively new, our national examination focused on the design and early implementation of this program. Our initial objective was to evaluate EPA and State formulation and initial implementation of capacity development programs to determine the extent to which such programs have been formulated and initially implemented consistent with the specific requirements and overall objectives of the SDWA.
Specific emphasis was to be given to evaluating how States are integrating capacity development, together with other SDWA initiatives and drinking water program activities, to assist community water systems to consistently achieve the health objectives of the SDWA. During the course of our work, we identified an additional issue related to performance measurement. As a result, this report addresses the following two questions:

• Did EPA and State design of the capacity development strategies ensure that community water systems have received T/M/F capacity assistance?
• How has EPA planned to assess the performance of capacity development initiatives at a national level, and how strong is the design for evaluation?

Background

Almost 264 million people in the United States obtain their drinking water from 54,000 community water systems. These systems vary from very small rural systems to very large systems. While they all share problems with aging infrastructure, underfunding, and meeting regulations, small systems have had great difficulty in keeping up with SDWA regulations.

1 EPA has defined the three key capacity components needed for proper operation of a drinking water system as follows:
• Technical: The physical and operational ability of a drinking water system to meet SDWA requirements.
• Managerial: The ability of the system operators to conduct their affairs in a manner enabling the system to achieve and maintain compliance.
• Financial: The ability to acquire and manage sufficient financial resources to allow the system to achieve and maintain compliance.

EPA recently estimated that over the next 20 years, the nation is facing a deficit of nearly $265 billion in resources available to meet projected drinking water infrastructure needs. At a January 2003 EPA conference on the infrastructure gap, the Assistant Administrator for Water promoted capacity development investment as a means to help address this issue.

Congress amended the SDWA in 1996, providing for a variety of initiatives to assist States and public water systems in providing safe drinking water to the public. Capacity development, the Drinking Water State Revolving Fund (DWSRF), operator certification programs, and such resources as the Environmental Finance Centers and Small System Technical Assistance Centers, were instituted to provide assistance to States and community water systems. Congress established capacity development with the intent of focusing on those systems most in need of assistance. These were primarily small systems (serving populations of 3,300 or less). Although small systems make up the majority of community water systems (46,000 of the 54,000 total systems), they only serve about 10 percent (25 million people) of the population. However, in 2000, small systems accounted for 90 percent of all systems that had a "History of Significant Noncompliance" (a system violating one or more National Primary Drinking Water Regulations in any three quarters within a 3-year period).

All three components of capacity development (technical, managerial, and financial) are critical to the successful operation of community water systems. EPA uses the diagram in Figure 1 to illustrate the interrelated nature of T/M/F capacity. EPA, States, and drinking water systems house T/M/F expertise in different program areas at different levels. The success of water systems' achieving capacity to run their operations in an efficient, business-like manner rests on water system owners and operators being able to effectively understand, communicate, and coordinate the various T/M/F needs. States, through the design and implementation of their capacity development strategies, have approached capacity development in different ways, to meet the unique issues facing their systems.

Figure 1: EPA Diagram on T/M/F Capacity
[Diagram illustrating the interrelated elements of technical capacity (e.g., source water adequacy, infrastructure adequacy), managerial capacity (e.g., staffing and organization, effective external linkages), and financial capacity (e.g., revenue sufficiency, creditworthiness and controls).]

As detailed in Appendix A, Table A-1, the 1996 SDWA Amendments present four key attributes that are needed to promote a capacity development process that assists public water systems in attaining T/M/F capacity. Specifically, a successful capacity development process should be:
The success of water systems' achieving capacity to run their operations in an efficient, business-like manner rests on water system owners and operators being able to effectively understand, communicate, and coordinate the various T/M/F needs. States, through the design and implementation of their capacity development strategies, have approached capacity development in different ways, to meet the unique issues facing their systems. /Technical Capacity / / Soiure Water IdequacM ]• InfriBlrueture AdiquaEV Managerial Capacity • dtatung and Mganlzatlon \ i ElfLttlvti L'KkTikal linkage! Financial Capacity i Bovnnug iiiffidnnty • CrtditwwlhinaBa and commix Figure 1: EPA Diagram on T/M/F Capacity As detailed in Appendix A, Table A-l, the 1996 SDWA Amendments present four key attributes that are needed to promote a capacity development process that assists public water systems in attaining T/M/F capacity. Specifically, a successful capacity development process should be: ------- • Flexible so that EPA and States can maximize the use of resources and capabilities to implement processes that meet the unique needs of each State. • Proactive in identifying and prioritizing Ihose water systems most in need of improving T/M/F capacities. • Integrated so that the resources of all Federal and State drinking water programs are utilized. • Accountable in being able to demonstrate that capacity development helps water systems provide safe water to customers. All four attributes do not need to be present to the same degree for capacity development programs to be successful. However, we believe that the combined presence of these attributes promotes a capacity development process lhat assists public water systems in attaining T/M/F capacity. EPA's Office of Ground Water and Drinking Water (OGWDW) is responsible for governing capacity development as a national program by providing guidance to States, as well as requiring accountability from States with respect to congressional expectations. Scope and Methodology We performed our evaluation in accordance with Government Auditing Standards, issued by the Comptroller General of the United States. We conducted our field work from August 2001 to November 2002. We focused our review on six States: Arizona, Illinois, Massachusetts, Nebraska, South Carolina, and Washington; related EPA regional offices; and EPA Headquarters. To draw conclusions about the design and initial implementation of EPA and State capacity development activities, we collected information from a broad range of sources (see Appendix C, which provides greater detail on our Scope and Methodology). Results of Review We identified two issues relating to our review questions: • With assistance from EPA, States designed capacity development strategies that generally met the requirements of the 1996 SDWA Amendments. However, EPA and States need to improve the continuing implementation of these strategies to ensure that managerial and financial capacity needs are being addressed, and EPA needs to improve its oversight of States' capacity development programs. • EPA has not developed or implemented a plan to assess the performance of the capacity development initiative, and is currently unable to report on the results lhat the capacity development program is having on a national basis. ------- These issues are discussed below. 
Capacity Development Design and Early Implementation Results Mixed EPA, working closely with States and stakeholders, provided guidance and publications to ensure that States' capacity development strategies generally were designed to provide capacity assistance to community water systems, while providing the States with flexibility to design strategies that met their individual needs. EPA's and States' implementation efforts, however, need to be more proactive, integrated, and accountable, to ensure that community water systems become and stay "healthy." When we assessed EPA and State programs using the four attributes of flexibility, proactivity, integration, and accountability, we determined that EPA and States have focused on providing technical assistance, and that more needs to be done to provide the managerial and financial capacity assistance. All three components of capacity development are critical for community water systems to operate effectively. States had success wilh providing technical capacity development that was proactive, integrated, and flexible. This is to be expected, to a certain extent, since drinking water programs have historically been technically focused. However, the States had less success providing managerial and financial assistance to water systems. EPA staff told us the Agency also did not provide sufficient oversight to States to hold them accountable for ensuring that systems most in need of T/M/F capacity assistance were being helped. We attribute the lack of accountability to the fact that during the design of the capacity development program, stakeholders disputed EPA's authority to hold States accountable for capacity development through DWSRF withholding. During our review, EPA staff indicated that there continues to be an overemphasis on technical capacity, at the expense of managerial and financial capacities. State-level managers identified the lack of managerial and financial capacities as significant obstacles to systems sustaining technical capacity. Massachusetts officials staled that all technical capacity deficiencies are caused by managerial and/or financial problems. Nebraska officials said 70 to 80 percent of their major deficiencies are because of managerial or financial problems. State Programs Had High Degree of Flexibility. EPA and States approached capacity development in a highly flexible manner. Through the 1996 SDWA Amendments, Congress required EPA to afford States the flexibility to tailor capacity development strategies to meet the varying needs of their community water systems. EPA promoted flexibility through its capacity development handbook, guidance, and stakeholder process when designing the capacity development process. The guidance that EPA produced was the result of a thorough stakeholder consultation process that included State officials. States embraced this opportunity to craft strategies to meet the needs facing their water systems. States used the guidance and materials that EPA developed in conjunction with its stakeholders to create and implement capacity development strategies for existing systems by October 1,2000. The flexibility EPA afforded States is seen in the varied approaches among their capacity development strategies, as shown in Appendix A. ------- During our review, EPA officials discussed 1he difficulties they sometimes faced regarding flexibility. 
The EPA Capacity Development Coordinator said simply creating the capacity development guidance had been a challenge, noting that many of the individuals involved were adamant that EPA was neither authorized nor directed by Congress to develop guidance. Rather, these individuals insisted that EPA was only to act as an information source and was, therefore, only empowered to publish information for use at the States' discretion. The highly flexible, partnering environment that encompasses capacity development has caused confusion and disagreement as to how much authority EPA has. Although EPA has been extremely responsive to the flexibility needs of capacity development, this responsiveness has resulted in EPA providing guidance that did not hold States sufficiently accountable for ensuring that public water systems are achieving and maintaining capacity, as discussed below.

EPA and State Efforts Can Be More Proactive. We determined that while EPA provided tools to States for developing proactive capacity development strategies, States could improve their efforts to proactively assist public water systems. Proactivity is an important attribute because it describes how EPA and the States plan and manage their work to prevent water systems from having difficulties in providing safe drinking water to the public. EPA provided the States with guidance and publications that identified how they could design proactive capacity development strategies to reach all water systems. For example, the publication Information for States on Implementing the Capacity Development Provisions of The Safe Drinking Water Act Amendments of 1996 provides examples of tools, such as sanitary surveys, that States could use when assessing water system capacity.

All six States were working to prevent technical deficiencies in water systems by providing assistance through activities such as conducting sanitary surveys. Three States (Arizona, Illinois, and South Carolina) were still in the process of developing assessment tools to help water systems address managerial and financial deficiencies. The three other States (Massachusetts, Nebraska, and Washington) were proactive in their efforts to assess and deliver assistance to water systems in developing their managerial and financial capacities. However, we determined that all six States can improve their efforts to deliver managerial and financial assistance to systems before technical problems occur.

The SDWA Amendments identify four sequential, closely linked activities that describe how States can provide proactive capacity assistance to community water systems:

• Assessing water system T/M/F capacities.
• Prioritizing systems based on their capacity needs.
• Delivering T/M/F capacity development services to systems most in need.
• Collecting information to determine whether water systems are achieving results.

-------

The extent to which States designed and were proactively implementing capacity development strategies regarding the first three activities is addressed in Appendix A, Table A-2; details on collecting information are in Appendix B.

Capacity Development Integration into Established Programs Varied. There was a great deal of variability in the degree to which States integrated the three capacity elements (T/M/F) into the following established drinking water programs: (1) sanitary survey, (2) operator certification, (3) enforcement, and (4) DWSRF.
Emphasis was being placed on improving the technical capacities of systems, with less attention being devoted to developing methods to assess and deliver managerial and financial capacity assistance. As a result, in these four programs, EPA and the States were not maximizing the investment in capacity development - a process that cuts across all drinking water programs - to assist water systems in acquiring and maintaining their managerial and financial capacities. Effectively incorporating capacity development into these four programs, which comprise a major share of States' interactions with water systems, will help to ensure that T/M/F assistance will be provided to water systems. Details on integration efforts for the four programs noted are in Appendix A, Table A-3.

Accountability Process Needs Further Effort. EPA did not perform a meaningful evaluation of State capacity development activities as required by the 1996 SDWA Amendments. Although EPA received the reports required under SDWA, there was no systematic process to review them for purposes of making DWSRF withholding determinations, in accordance with the Amendments. Specifically, EPA Headquarters did not:

• Issue guidance to regions on conducting reviews.
• Identify what constituted passing criteria for evaluating State capacity development efforts.
• Identify what sanctioning actions were available to EPA.

Details on these issues are addressed in Appendix A, Tables A-4 through A-6. Without an effective assessment process, EPA Regions will be unable to execute their oversight responsibilities to determine whether States are making progress in implementing their capacity development strategies as Congress intended, especially in the areas of providing managerial and financial capacity assistance. Providing managerial and financial assistance is vital to the long-term sustainability of public water systems. Ultimately, without sufficient assistance in all three areas of capacity development, there is no assurance that those water systems most in need will receive adequate help and that all public water systems will have the tools they require to maintain compliance and long-term sustainability.

Congress directed that States be held accountable, through DWSRF withholding, for "... developing and implementing a strategy to assist public water systems in acquiring and maintaining technical, managerial, and financial capacity." EPA is responsible for annually assessing the implementation of State strategies to determine whether States should incur DWSRF withholding. The DWSRF withholding provision that Congress linked to capacity development illustrates how critical EPA oversight of State strategies is to ensuring States are accountable for designing and implementing effective strategies.

-------

The guidance that EPA uses to hold States accountable for successful implementation of capacity development is not effective. EPA issued Guidance on Implementing the Capacity Development Provisions of the Safe Drinking Water Act Amendments of 1996 (EPA 816-R-98-006) in July 1998 to establish national policy regarding implementation of the capacity development provisions of the 1996 SDWA Amendments. In this guidance, EPA listed the "minimum requirements" States must meet to avoid DWSRF withholding. These minimum requirements include the submission of five reports, relating to the States' design and ongoing implementation of their capacity development programs.
Although EPA stated that all States submitted the five reports required under the guidance, we concluded the guidance was too general or lacking in detail for EPA to use the reports to perform effective reviews of State progress. We also determined that EPA's regulations do not provide sufficient passing criteria for assessing States' progress in implementing their capacity development strategies. In August 2000, EPA promulgated DWSRF withholding regulations, as part of 40 Code of Federal Regulations (CFR) 35, Drinking Water State Revolving Funds: Interim Final Rule, indicating EPA will withhold funds from a State if it does not adequately develop and implement a strategy to assist public water systems in acquiring T/M/F capacity. We determined that EPA's withholding regulations do not sufficiently define the various criteria necessary to conduct a meaningful and effective assessment of a State strategy. These criteria need to define "developing," "implementing," "acquiring," and "maintaining."

Because of a lack of adequate guidance and defined criteria for assessment, EPA does not have a credible sanctioning mechanism for use under the DWSRF withholding provisions. Maintaining State accountability for capacity development activities requires that EPA incorporate three steps: (1) common knowledge of the availability of sanctions, (2) a defined process for applying sanctions, and (3) a willingness on the part of EPA to use sanctions. Although EPA made States aware of the possibility of the withholding sanction, it has not developed a process for initiating corrective actions, and EPA staff indicated a reluctance to use the sanctions.

Region 7 has developed a comprehensive evaluation, which could be used by all Regions, to periodically assess, in some depth, whether States are meeting the SDWA requirements, including implementation of capacity development strategies. In April 2002, it initiated a comprehensive review of Nebraska, including identification of strengths of Nebraska's programs (including capacity development, operator certification, etc.), as well as areas that needed improvement. Region 7 also included steps that it will take to continue to assist Nebraska in its capacity development implementation. This type of comprehensive evaluation may provide a good model for EPA to consider using for assessments and, along with better defined criteria, could be used to provide a meaningful assessment of States' progress as part of Regions' oversight responsibilities.

-------

Capacity Development Accomplishments Uncertain Due to Lack of Performance Measures

In addition to the capacity development implementation issues presented, EPA has not developed or implemented a plan to assess the performance of the capacity development initiative, and is currently unable to report on the results that the capacity development program is achieving on a national basis. In earlier reports, we discussed EPA's progress in developing performance measures and meeting the requirements of the Government Performance and Results Act (GPRA).2 These efforts resulted in the Inspector General, in September 2002, naming EPA's difficulty in linking mission to management a key management challenge. Measuring performance is important because it provides accountability, communicates the value of the program to others, and gives managers information to make decisions for program improvement. Equipped with such information, EPA and Congress can better allocate resources to improve human health and the environment.
For example, under the President's Management Agenda, all Federal agencies will be expected to use OMB's Program Assessment Rating Tool (PART) results and performance measures to support and explain budget requests. OMB plans to assess EPA's drinking water program, using PART, in FY 2006. However, EPA specifically has not:

• Identified capacity development goals;
• Developed performance measures to assess progress toward the goals;
• Collected data on capacity development performance measures; and
• Analyzed data and reported on capacity development performance results.

Details on these issues are in Appendix B, Table B-1. Instead of developing national measures, EPA relied on States to identify those measures, based on their individual capacity development strategies. But the measures identified by States, being based on individual strategies, were not similar enough to be used for comparison, or their results consolidated, to account for results at a national level. For example, there was a high degree of variability in baseline measures for the six States that we reviewed, as shown in Appendix B, Table B-2.

Having an established national measure for capacity is critical for EPA and Congress to determine the extent to which systems are becoming and staying healthy. Without this information, EPA cannot report to Congress on its success in implementing the capacity development provisions of the 1996 SDWA Amendments. Further, EPA ultimately does not know whether it is, in fact, maximizing its efforts to improve the ability of water systems to deliver safe water to the public. Given the severe budgetary circumstances that many States face, EPA and the States must be able to demonstrate that the financial investment in capacity development is yielding results and deserves continued support.

2 EPA's Progress in Using the Government Performance and Results Act to Manage for Results [EPA-Office of Inspector General (OIG) 2001-B-000001]; and Audit of EPA's Fiscal 2000 Financial Statements [EPA-OIG 2001-1-00107].

-------

We also identified four challenges that EPA faces and needs to address to be able to measure capacity development: (1) having limited control over implementation; (2) lack of comparability due to high variability in approaches; (3) difficulty in showing periodic progress because demonstrating outcomes can take years; and (4) States' concerns over data collection burden and its accuracy. These challenges are discussed further in Appendix B, Table B-3.

Conclusions

Capacity development is a cornerstone of EPA's and States' drinking water programs. All other drinking water programs support water systems' efforts to achieve and maintain capacity. Thus, how well EPA and States are able to deliver capacity assistance to public water systems will, ultimately, affect drinking water quality. This is why it is so important for EPA and States to provide not only technical assistance but also the financial and managerial assistance that systems require. EPA also needs to know how well States are doing in providing T/M/F assistance, to strategically and proactively invest the resources necessary to help States with their capacity development efforts. Through more efficient investments in developing the capacity of drinking water systems, EPA and the States can better help public water systems meet current and future financial and regulatory challenges.
To do this, EPA and State capacity development processes need to provide the appropriate amount of T/M/F assistance to help systems achieve and maintain long-term sustainability. They need to ensure that programs are balanced - flexible, proactive, integrated, and accountable. Further, EPA needs to address the challenges to implementing an adequate performance measurement system for capacity development, to identify performance gaps and set target goals to improve overall drinking water quality.

Recommendations

For EPA to achieve the great potential that capacity development can provide to our Nation's community water systems, EPA needs to improve its oversight of States' capacity development efforts, including having the right information and related indicators or measures to assess capacity development progress. Therefore, we recommend that the Assistant Administrator for Water:

1. Develop a national capacity development strategy that promotes T/M/F capacity in a proactive, integrated, flexible, and accountable manner throughout its key drinking water programs, and provide additional guidance and/or information, accordingly.

2. Revise 40 CFR 35.3515 (DWSRF withholding regulations) to provide more specific criteria that will allow EPA to conduct meaningful annual assessments of State capacity development strategies. These revisions should include defining the terms "developing," "implementing," "acquiring," and "maintaining," as criteria for EPA to conduct annual assessments of State capacity development strategies.

10

-------

3. Develop the comprehensive evaluation to be used to assess implementation of States' capacity development strategies, consistent with differing States' needs and circumstances, and require the use of this tool by Regions as part of their oversight responsibilities.

4. Work with its partners and stakeholders to:
(a) Identify a set of common measures that can be used to develop and implement national performance goals.
(b) Determine what common capacity development data and/or information resources are available that could be used to support a national capacity development measure, while minimizing data collection burdens to States and water systems.

5. Using the results of Recommendation 4, develop national capacity development measures by:
(a) Identifying capacity development goals to be accomplished, as part of the drinking water annual performance goals.
(b) Developing specific capacity development measures that support the capacity development annual performance goals.
(c) Either modifying already existing data collection efforts or developing new data collection processes for capacity development performance measures.
(d) Analyzing results of capacity development performance on a national basis, and reporting progress to Congress and the public, as required by GPRA.

Agency Response and OIG Evaluation

In our draft report, we made recommendations to promote the use of all three elements of T/M/F in capacity development, as well as to develop a performance measurement system to assess capacity development achievements. The Assistant Administrator for the Office of Water provided a response to our draft report on June 23, 2003 (see Appendix D). The Assistant Administrator substantially disagreed with the message in our report. Substantively, the Office of Water felt we misinterpreted Congress' intent when it passed the SDWA Amendments, and also cited potential errors in the information presented.
From a process standpoint, the Office of Water felt that we had changed the focus of our study from capacity development to performance measurement, and that meaningful results could not be obtained, since capacity development is still in the early stages of implementation. The Office of Water also felt that States were misled, since the report mentions States by name, contrary to the agreement OIG made with the States. We generally do not agree with these concerns, and have responded to them in Appendix E, Notes 7 to 12.

EPA disagreed with the recommendations we presented in the draft report. In response to our recommendations to change capacity development guidance or regulations, to enhance delivery of

11

-------

managerial and financial capacities through better oversight, EPA pointed out that these steps would impede the flexibility afforded to States. We disagree with EPA's opinion. States are afforded a high degree of flexibility within the boundaries of delivering T/M/F assistance to public water systems, and EPA has the responsibility to ensure that this happens. Therefore, we maintain our recommendations that EPA needs to work with its stakeholders and partners to better promote and integrate managerial and financial capacity efforts in States' drinking water strategies.

EPA also did not support measuring capacity development progress, stating that the current GPRA drinking water goal sufficiently captures capacity development results. We do not agree with EPA's position. We maintain that without defined annual goals for capacity development, EPA will not have the ability to manage this program effectively and to target resources where they will do the most good. Additionally, EPA's overall drinking water goal measures drinking water system compliance only, and accumulates this information in the Safe Drinking Water Information System/Federal Version (SDWIS-FED), an "exceptions" reporting database, with only systems in noncompliance being tracked. SDWIS-FED was designed to track enforcement activities, and was not designed to measure managerial or financial data. Therefore, SDWIS-FED may provide some measure of technical capacity, but not managerial or financial capacity results. Relying on this measure to ascertain adequate capacity is therefore incomplete, at best. Detailed analyses of the Agency's comments are in Appendix E.

Action Required

In accordance with EPA Manual 2750, you are required to provide a written response to this report within 90 calendar days of the date of this report. You should include a corrective action plan for agreed-upon actions, including milestone dates. We have no objections to the further release of this report to the public. For your convenience, this report will be available at

We appreciated the active participation of the EPA Office of Water and selected Regional and State officials. If you or your office have any questions, please contact me at (202) 566-0830 or Leah Nikaidoh at (513) 487-2365.
12

-------

Appendix A
Details on Capacity Development Design and Early Implementation Results

Table A-1: Capacity Development Attributes Presented in 1996 SDWA Amendments

Proactivity was required in the capacity development section of the Amendments, Public Law 104-182, §1420(c)(2)(A), which stated: In preparing the capacity development strategy, the State shall consider, solicit public comment on, and include as appropriate - (A) the methods or criteria that the State will use to identify and prioritize the public water systems most in need of improving technical, managerial, and financial capacity.

Integration was identified in the findings section of the Amendments, Public Law 104-182, §3(8)(B), which stated: [M]ore effective protection of public health requires...maximizing the value of the different and complementary strengths and responsibilities of the Federal and State governments in those States that have primary enforcement responsibility for the Safe Drinking Water Act.

Flexibility was identified in the findings section of the Amendments, Public Law 104-182, §3(4), which stated: States play a central role in the implementation of safe drinking water programs, and States need increased financial resources and appropriate flexibility to ensure the prompt and effective development and implementation of drinking water programs.

Accountability was required in the capacity development section of the Amendments, Public Law 104-182, §1420(c)(1), which stated: ...State[s] shall receive only [a portion] of the allotment that the State is otherwise entitled to receive under [DWSRF], unless the State is developing and implementing capacity development strategies that assist water systems in acquiring and maintaining technical, managerial, and financial capacity.

13

-------

Table A-2: Proactivity Efforts by States

Assessing: Arizona, Illinois, and South Carolina did not conduct managerial or financial capacity assessments of community water systems. The reasons involved insufficient design of capacity assessment tools, or components of capacity development strategies not yet being developed. Massachusetts and Nebraska assessed the managerial and financial capacity of water systems through the sanitary survey. Washington conducted managerial and financial assessments through a review of water system planning documents. These reviews occurred on a 6-year cycle for systems with 1,000 or more connections. Other reviews occurred only when a system was new or expanding, applying for a DWSRF loan, or experiencing problems. (Washington applies Federal drinking water regulations to approximately 16,000 water systems. Due to the large number of systems, State officials limit planning document reviews.)

Prioritizing: Arizona ranked all water systems by assigning point values to each system, based on a variety of data. However, water systems with managerial or financial deficiencies were not prioritized because these capacities were not assessed by the State. Subsequent to our site visit, Arizona designed a capacity assessment tool that included managerial and financial components, such as water system security and fiscal controls. The other five States grouped water systems into priority tiers. Massachusetts and Nebraska prioritized capacity assistance based on results from their water system T/M/F assessments. In Washington, water systems were prioritized based on the operating permit "color," which is determined through T/M/F assessments.
Illinois did not prioritize capacity assistance based on water system managerial or financial deficiencies, but was in the process of developing a capacity assessment tool that will allow it to prioritize systems into three priority tiers. South Carolina was also experiencing difficulties prioritizing assistance based on water systems' financial deficiencies because the sanitary survey did not incorporate financial assessments into the initial review process. South Carolina officials indicated they would like assistance in solving this problem.

Delivering: States that identified and prioritized water systems with managerial and financial deficiencies were delivering needed assistance. In addition, States that had not identified and prioritized water systems based on managerial and financial deficiencies were also delivering various levels of managerial and financial assistance. However, by not first identifying and prioritizing water systems with managerial and financial deficiencies, States may miss the opportunity to prevent technical failures from occurring. For example:

• Massachusetts delivered capacity assistance to water systems by requiring systems with T/M/F deficiencies to follow plans to address identified deficiencies. Third parties were hired to assist systems in the development of these plans.
• South Carolina, which had difficulties with its assessment and prioritization methods, also delivered this type of assistance to water systems through T/M/F capacity development plans. However, managerial and financial assistance was only delivered to systems once a technical deficiency was identified.
• Illinois explained that field engineers and third parties worked to educate water boards and water system managers about the managerial and financial operations of water systems. The State indicated it will also encourage water systems to undertake voluntary T/M/F capacity development plans once its capacity assessment tool is developed.

14

-------

Table A-3: Integration Efforts

Sanitary Survey: All States are required to use sanitary surveys to perform compliance assessments of public water systems. Sanitary surveys may also be used to perform assessments of the managerial and financial capacity of water system management and operators. EPA officials, through statements, training information, and capacity development material, identified the importance of sanitary surveys in ensuring technical capacity, and to a lesser degree, integrated financial and managerial capacities in their materials. When we asked EPA officials if sanitary surveys required information about managerial and financial capacity, they said:

• Although most of the required elements of a sanitary survey are geared toward determining technical capacity, a number of significant deficiencies may disclose underlying problems with managerial and/or financial capacity.
• States were allowed to create their own definition of what constituted a significant deficiency identified as part of the sanitary survey, but definitions for managerial and financial significant deficiencies did not exist because EPA and States did not consider such deficiencies significant to public health.
• The sanitary survey is for compliance-oriented activities and is not related to capacity development.

While EPA identified the sanitary survey as a tool States could use for evaluating capacity, the Agency has not conveyed that managerial and financial assessments are just as important as technical assessments.
This is evident from the fact that only Massachusetts used the sanitary survey to identify managerial and financial violations. Nebraska and Washington also used the sanitary survey, to varying degrees, to assess all three components of capacity, while Arizona, Illinois, and South Carolina are in the process of incorporating managerial and/or financial capacity assessments into their sanitary surveys. As a result, we concluded the sanitary survey needs to better integrate all three components of capacity to ensure that a water system assessment is complete.

Operator Certification: Operators are responsible for the day-to-day management of a water system's technical operations and, therefore, are critical to ensuring the drinking water delivered to the public is safe. Operators can also be responsible for the management and financial budgets of systems, and can be a critical link to water boards and directors. Massachusetts, Nebraska, and Washington integrated all three aspects of T/M/F in the training portion of their operator certification programs. For example:

• Massachusetts offers courses in the financial, managerial, and operational aspects of running a successful water system, and operators are required to be more informed on managerial and financial capacity matters as part of their duties and responsibilities.
• Washington requires operators to have training about the T/M/F operations of a water system to maintain their certifications.

15

-------

Enforcement: Massachusetts has incorporated managerial and financial capacity requirements into its regulations, and South Carolina requires systems in noncompliance to develop business plans that contain all three elements of capacity. Illinois includes voluntary managerial and financial self-assessment as part of its enforcement agreements. The enforcement programs in Nebraska and Washington have not integrated managerial and financial capacity into their operations; however, Washington is attempting to do so. Although enforcement is often seen as the last resort to address noncompliant water systems, State enforcement programs can be used to promote long-term managerial and financial capacity with systems.

DWSRF Loans: EPA requires that loans go to systems that either have adequate capacity or will achieve capacity through the loan project. The Drinking Water National Information Management System that EPA uses to track the DWSRF program cannot determine what T/M/F problems the loans were used to solve. Furthermore, the EPA capacity information about the DWSRF program is focused mostly on the financial ability of systems to repay the loans, rather than assessing the overall T/M/F health of systems. Ensuring DWSRF loan recipients have viable sources of funds from which to repay loans is a legitimate concern, but Congress also intended States to utilize all three elements of capacity. Therefore, in addition to focusing on the ability of systems to repay loans, the States should also address managerial and technical issues to ensure the financial investment is protected.

DWSRF Set-Asides: An EPA Headquarters manager of the capacity development program told us EPA wants its regions to encourage States to use DWSRF set-asides for T/M/F activities. Future EPA plans include promoting the use of DWSRF set-asides for capacity activities in those States that do not currently utilize set-aside funds, and working with regions and States to encourage States to revise their strategies as necessary.
Four of the States in our review (Massachusetts, Washington, Arizona, and Nebraska) used between 15 and 31 percent of the DWSRF for set-asides to assist in capacity development efforts. However, Illinois and South Carolina used only 4 percent for set-asides, in order to fund the administration of their DWSRF.

16

-------

Table A-4: Accountability Issues

EPA Guidance Inadequate: EPA's guidance was too general or lacking for EPA to perform effective reviews of these reports to assess States' progress. Stakeholders disputed the Agency's authority to promulgate such guidance and perform these reviews. For example, in October 1998, the National Rural Water Association testified before the House Subcommittee on Health and Environment that EPA was not authorized, through the SDWA Amendments, to establish guidance regulating capacity development strategies. This view was prevalent among stakeholders, who noted that, among other things, the guidance impacted State flexibility. Examples of stakeholders' comments on the EPA draft capacity development guidance are presented in Table A-5. The stakeholders' dispute of EPA's authority to hold States accountable for capacity development through DWSRF withholding contributed to EPA providing minimal oversight guidance to Regions and States.

EPA guidance for the States in preparing strategies included six general "suggestions" or ideas for them to consider in their strategies. The guidance included a worksheet States could use to ensure that SDWA capacity development requirements were met. For example, the worksheet suggested that States describe how they solicited public comments and considered these comments. However, EPA did not indicate what it expected to see in the write-ups responding to these questions. In some instances, the EPA capacity development guidance provided to the regions was too vague for the regions to oversee States in designing and implementing strategies. Also, EPA provided no review guidance to the regions for three of the five required reports, as presented in Table A-6. Without specific guidance, regions had no standard by which to hold States accountable for capacity development through the DWSRF withholding provisions.

Withholding Regulations Inadequate: EPA's withholding regulations are inadequate because they do not sufficiently define the various criteria necessary to conduct a meaningful and effective assessment of a State strategy. These criteria need to include definitions of "developing," "implementing," "acquiring," and "maintaining." Without these definitions, the withholding regulations are not specific enough to perform effective annual assessments. Therefore, there is no way to determine whether the strategy that a State is "designing and implementing" is effectively providing the necessary services to the neediest systems. As noted in Table A-5, EPA stated that its authority to issue regulations and even guidance was questioned by various stakeholders. We asked the OIG's Office of Counsel to review the 1996 SDWA Amendments and applicable regulations to determine whether EPA's authority truly was limited. The Office of Counsel attorney recognized that the provisions in the SDWA Amendments provided a great deal of leeway for the States to design capacity development strategies.
However, the attorney stated that through 40 CFR 35.3515, EPA, for the purpose of executing its responsibility under the Amendments to perform assessments, can provide more specific guidance or regulations to adequately perform these reviews.

17

-------

Sanctioning Process Ineffective:

Common Knowledge of the Availability of Sanctions. EPA, through its capacity development guidance, has made States aware of the possibility that DWSRF withholding can be used as a sanctioning mechanism. In fact, the majority of the guidance is focused on DWSRF withholding. All six States visited were aware of DWSRF withholding as a sanctioning mechanism.

Process for Applying Sanctions. EPA defined a process for a sanctioning mechanism, but did not establish this process as part of its official capacity development guidance (EPA 816-R-98-006). EPA delineated a DWSRF withholding sanctioning process through an e-mail sent to regional capacity development and DWSRF coordinators on October 26, 2000. Although this e-mail generally establishes a process for regions to initiate a review to decide whether to implement a DWSRF withholding sanction, it does not specifically establish a process for applying withholding as a capacity development sanction. According to EPA, until a sanctioning process is part of the EPA capacity development guidance, Regions may be reluctant to employ DWSRF withholding against States. Also, as part of this process, EPA guidance needs to include criteria that can be applied when assessing capacity development progress.

Willingness to Use Sanctions. The Capacity Development Coordinator for EPA told us that he was suspicious that no withholding recommendations were made. However, he indicated that several Regional capacity development coordinators were instructed by their management to accept what the States provided. There was also uncertainty among Regions about whether they would be able to actually implement sanctions through DWSRF withholding.

18

-------

Table A-5: Examples of Stakeholders' Comments to EPA Capacity Development Draft Guidance

Stakeholder Comment: Many commentors were concerned with whether an appropriate balance could be struck between providing State flexibility while ensuring national program accountability in implementation. They stressed the need for EPA to acknowledge State program diversity in capacity development, and that only limited or no oversight by EPA would be necessary.
EPA Response: While the law clearly contemplates such flexibility, it also clearly establishes what States must do regarding capacity development if they wish to avoid a capacity development-related withholding from their DWSRF. The guidance provides a simple objective framework for EPA to use in assessing whether or not a State has done what the law requires in order to avoid capacity development-related DWSRF withholding.

Stakeholder Comment: Many commentors expressed concern regarding the proposed scope of EPA review of State strategies.
EPA Response: The Agency [EPA] believes that it has proposed the narrowest and most limited review possible which is still consistent with the requirements of the statute. The Agency's proposed review of State capacity development strategies is narrowly focused on ensuring that these statutory requirements are met.

Stakeholder Comment: Many commentors stated their belief that EPA does not have statutory authority to issue guidance related to capacity development strategies.
EPA Response: The basis for this guidance is the EPA Administrator's authority to issue guidance and regulations under the DWSRF section. This guidance establishes national policy regarding implementation of the DWSRF withholding related to capacity development strategies under sections 1420(c) [capacity development strategies] and 1452(a)(1)(G)(i) [DWSRF withholding relating to capacity development strategies] of the SDWA as amended.

Stakeholder Comment: A number of commentors expressed concern regarding the proposed requirements for documentation of the strategy and documentation of ongoing strategy implementation.
EPA Response: The Agency has simplified and streamlined the strategy documentation and reporting requirements.

19

-------

Table A-6: Analysis of EPA Guidance to Regions on Review of State Reports

• List of Systems in HSNC (Historical Significant Non-Compliance) - No guidance was established.
• State Capacity Development Strategies - The EPA guidance for Regions to use when assessing proposed State strategies was identical to that provided to the States. Regions were instructed to use the same general suggested worksheet questions as States when assessing the proposed design of State strategies.
• Annual State Strategy Implementation for DWSRF Withholding Determination - No guidance was established. EPA stated to us: "The determination of whether the state is implementing its strategy is the responsibility of the capacity development coordinator. There is no national guidance that outlines what a state implementation report should contain. When making a decision to withhold DWSRF funds, each regional coordinator should use a state's initial strategy as a baseline to determine the appropriate content of the report as well as progress made towards improving technical, managerial and financial capacity of public water systems."
• August 6, 2001 Report to the Administrator on the Success of Initial Capacity Development Efforts - A January 25, 2001 EPA headquarters memo to regional capacity development coordinators stated the only statutory requirement was that a report needed to be submitted and the structure and details of the report were up to each State. The memo only offered "suggestions" to the States on how they may want to consider structuring the report.
• State Report to Governor on Capacity Development Strategy Progress - No guidance was established.

20

-------

Appendix B
Details on Capacity Development Performance Measurement

Table B-1: EPA's Difficulties in Addressing Performance Measurement Steps

Identify Goals - EPA's Current Goal Structure Does Not Address Capacity Development Objectives: EPA did not include an annual performance goal for capacity development, a critical Amendment requirement, in its annual plans. EPA also has several other annual performance goals relating to the Clean and Safe Water goal that affect safe drinking water, stemming from requirements in the 1996 SDWA Amendments. For example, in EPA's fiscal 2002 annual performance goals, it included: "[States] Ensure that 100% of community water systems are complying with the Consumer Confidence Rule (CCR) by issuing annual consumer confidence reports." EPA proposed an annual performance goal for capacity development for fiscal 1999. However, this annual performance goal was not adopted. An EPA Office of Water official said that this goal was dropped, in part, because the Agency was directed to decrease the number of output measures and goals for that year.
Develop Performance Measures - Measures Selected Cannot Be Used at National Level: EPA has not selected capacity development performance measures necessary to enable the Agency to assess its national progress in achieving capacity development goals or intended outcomes. This situation is the result of EPA not identifying how capacity development would fit into its annual performance goals, as well as EPA's limited control over States' drinking water programs and the variability in States' capacity development processes. EPA also did not identify nation-wide performance measures, relying instead on States to identify those measures, based on their individual capacity development strategies. But the measures identified by States, being based on individual strategies, were not similar enough to be used for comparison, or their results consolidated, to account for results at a national level. The six States identified output rather than outcome measures in their capacity development strategies and, therefore, did not provide a means to assess long-term, outcome-oriented results.

21

-------

Collect Data - Data Collection Not Providing Useful Measurement Information: EPA and the States had five reporting requirements under the SDWA Amendments. Because EPA did not develop national capacity development goals and related performance measures, resulting data collection efforts have not been consistent. Additionally, the data that EPA has collected through various reporting efforts were not consistent enough, and were not always complete or accurate. A comparative analysis of the six States' baseline measures against their August 6, 2001, reports to EPA (the fourth report listed in Table A-6) on the success of enforcement mechanisms and initial capacity development efforts in relation to fiscal 2000 strategies showed that none of the States provided data or other specific information that related to how much progress they had made on their initial capacity development efforts, based on their strategy baselines. Of the six States, only South Carolina provided any specific data, by including information on its progress to reduce HSNCs.

Analyze and Report on Results - Analysis and Reporting Will Be Difficult for EPA: EPA has not identified goals, developed measures, or collected the sufficient, complete data needed to perform analysis and report on results at a national level. EPA has treated various capacity development reporting efforts as a data sharing effort for the States, instead of strategically assessing the information to make decisions at a national level. When we asked EPA staff whether they had analyzed the State responses to the HSNC lists and August 6, 2001, reports, they responded that they had not, even though "we know it's important and will likely use it in the future." EPA staff indicated plans to use the HSNC reports to publish a summary document, and then encourage States to use these summary reports as implementation tools. However, since the HSNC reports do not contain the information needed to assess and report on T/M/F capacities and why HSNCs continue to experience deficiencies, it is doubtful that this information, in its current form, would be useful in assessing system capacity or in improving State strategy implementation.
22

-------

Table B-2: Six States' Baseline Measures from Capacity Development Strategies

Arizona: Identify baseline from 1998 or 1999 data that are tracked in State system, including basic system data such as owner, system type, population served, and number of enforcement/compliance actions. Will consider additional approaches to measuring capacity, including outreach activities (third party providers, sanitary surveys), operator certification, compliance data, and assessment surveys.

Illinois: Uses a combination of compliance with regulations, plus a factor of improved operations, based upon information obtained from sanitary surveys. Will also assess quality of consumer confidence reports.

Massachusetts: Will use sanitary surveys ("Comprehensive Compliance Evaluations") to assess baseline capacity status of systems. Based on results of evaluation, systems will be assigned capacity status: adequate, conditional, inadequate. Results will then be input into electronic tracking system. State will track and evaluate status of systems as they move from one status to another, as indicator of capacity improvement. Measuring performance based upon number of systems moving from one category to another. Massachusetts has used its State-wide data to identify trends, and develop assistance to target different problems that are surfacing.

Nebraska: Number of sanitary surveys performed on an annual basis, site visits by outreach team, followup with systems via surveys to solicit feedback on assistance systems have received, followup on deficiencies being found.

South Carolina: If sanitary survey yields poor results, system prepares a business plan, which includes information on facilities, management, and financial planning. State will determine if a system has a business plan. Development of benchmarks from systems' annual financial statements will provide empirical database for measurement of viability (ability to meet compliance standards).

Washington: Use three-component formula to measure capacity: (1) operating permit status; (2) system has completed water system planning document that includes three capacity components; and (3) enforcement actions. Data will be gathered, based upon this formula, to prioritize systems for capacity assistance.

23

-------

Table B-3: Performance Measurement Challenges (EPA Faces Four Challenges in Establishing an Effective System)

Having Limited Control Over Implementation. EPA's limited control over drinking water program activities makes it difficult for the Agency to exert direct control over achieving national goals. Responsibilities for day-to-day implementation of Federal drinking water regulations have basically been delegated to the States. While EPA has oversight responsibility, it has limited control over what States do. This is a problem faced by Federal programs, according to GAO. Because EPA relies on States and other partners, such as rural water associations and third party contractors, to implement capacity development, EPA has to either get buy-in from stakeholders on measures, or adopt new regulations requiring reporting on specific, mandated measures.

Lack of Comparability Due to High Variability in Approaches. Congress afforded the States great flexibility in designing and implementing capacity development. As a result, variability at the State level in capacity development activities can be so extensive that there is little commonality in outputs.
Thus, there is currently little that EPA can measure in common to provide a "national" picture.

Difficulty Showing Periodic Progress Because Demonstrating Outcomes Can Take Years. The length of time that capacity development activities take to produce outcomes makes it difficult for EPA to report progress on an annual basis. Long-term outcomes may take years to manifest themselves, and capacity development is generally a long-term sustainability process. Additionally, external factors can make it difficult to show a direct relationship between a specific program goal and the activities or outputs that caused the eventual outcome.

States' Concerns Over Data Collection Burden and Its Accuracy. The time and cost of collecting comprehensive data is also an obstacle for developing performance measures. GAO has indicated that the following challenges with data collection were raised by Federal managers: (1) using data collected by others, (2) ascertaining the accuracy and quality of performance data, and (3) acquiring data in a timely way. During our review, five of six States (all except South Carolina) expressed concerns over capacity development data collection burdens, such as additional paperwork requirements.

24

-------

Appendix C
Details on Scope and Methodology

To determine initial EPA and State success at designing and implementing capacity development strategies, we reviewed documents and conducted interviews and site visits at EPA Headquarters, regions, and a total of nine States. We conducted initial research to learn about State and EPA efforts to help systems develop their T/M/F capacities. This consisted of reviewing the 1996 SDWA Amendments and visiting three States - Kansas, Minnesota, and Vermont - judgmentally selected through discussions with Office of Water staff. We reviewed these States' capacity development strategies and implementation. We also interviewed water system operators regarding capacity development outreach.

During our research and field work phases, we also reviewed capacity development policy and guidance issued by the Office of Water. We interviewed Office of Water officials to determine how guidance and policy were developed, and how the Office communicates with regional offices and States. We met with regional capacity development coordinators to determine how they have assisted States regarding capacity development strategies within their drinking water programs. We also obtained legal advice from the OIG Office of Counsel regarding the 1996 SDWA Amendments and related regulations.

After conducting our initial research and identifying T/M/F capacity as the three components that Congress stated were needed for systems to have the ability to deliver safe water, we also identified four attributes (proactivity, integration, flexibility, and accountability) that Congress described to EPA and States as necessary to enhancing a system's capacity.

Sample Selection

We selected six States for a more detailed analysis of capacity development: Arizona, Illinois, Massachusetts, Nebraska, South Carolina, and Washington. Our selection was based on information reported in EPA's annual report of drinking water violations, Factoids: Drinking Water and Ground Water Statistics for 2000, for 1998 through 2000.3 The information reported in Factoids used for our sample selection was drawn from the Safe Drinking Water Information System/Federal Version (SDWIS/FED), which EPA uses to measure the quality of drinking water in States and the nation and the effectiveness of drinking water programs.
We ranked all States based upon percentage of population and exposure risk from health-based violations identified in SDWIS/FED. The States we selected were among those that (according to SDWIS/FED) had the greatest number and the highest percentage of their population receiving water from public water systems with one or more health-based violations. We used this approach in order to be able to determine progress in designing and implementing capacity development in those States where it is most needed (based on how EPA currently measures performance). This also allows us to assess the prospect of EPA achieving its stated GPRA goals for its drinking water program.

After the States had been selected, it came to our attention that there may be data errors in the SDWIS/FED database that could impact the validity of the information in Factoids. Officials from five of the six States we visited also raised concerns about the quality of data in SDWIS/FED, and EPA officials from OGWDW agreed that the data contained in SDWIS/FED may not be complete for all States. After reviewing these concerns, we decided not to alter the initial list of States. First, given these States' prominent position relative to others in SDWIS/FED, removing any of them would have been difficult to defend given the information we had in hand. Second, expending resources to collect verifiable information to warrant adding others to those we had already selected would have been difficult to justify, given that sample selection was not the principal purpose of this review, which was to assess EPA and States' progress in designing and implementing capacity development.

During our visits to the States (and discussions with corresponding Regional offices), we met with officials from a variety of drinking water programs to discuss how capacity development was being implemented using the four attributes. We met with program staff involved with capacity development, public water system supervision, DWSRF, operator certification, compliance and enforcement, and third parties such as State Rural Water Associations. We prepared a compilation of the information gathered at each State during our site visits, so that the States could review the information collected and provide any clarifications or corrections. We sent these compilations to each State in March 2002. All of the States reviewed the compilations and provided comments back to us. For one State, there were a substantial number of comments provided. As a result, we returned to this State to gather additional information to support the results of our evaluation.

3 EPA 816-K-01-004 (June 2001), http://www.epa.gov/safewater. At the time of field work, this was the most recent annual report of States' performance based on violations reported in the Safe Drinking Water Information System/Federal Version published by OGWDW.

25

-------
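To make the ranking approach described under Sample Selection concrete, the sketch below shows one way such a ranking could be computed. It is illustrative only and is not part of the evaluation's methodology; the State names, figures, and record layout are hypothetical stand-ins for data that would, in practice, come from SDWIS/FED as summarized in Factoids.

# Illustrative sketch (hypothetical data): rank States by the share of served
# population receiving water from systems with one or more health-based
# violations, then take the top of the ranking as the candidate sample.

records = [
    # (State, population served by systems with health-based violations, total population served)
    ("State A", 1_200_000, 5_000_000),
    ("State B",   400_000, 1_000_000),
    ("State C", 2_500_000, 9_000_000),
]

def violation_share(record):
    _state, affected, total = record
    return affected / total

# Highest share of affected population first.
ranked = sorted(records, key=violation_share, reverse=True)

for state, affected, total in ranked:
    print(f"{state}: {affected / total:.1%} of served population affected")

# Candidate sample: the States at the top of the ranking.
candidates = [state for state, _, _ in ranked[:2]]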
We reviewed performance measurement requirements under the Government Performance and Results Act, and analyzed EPA's and the six selected States' performance information on capacity development. We assessed EPA's progress in developing a performance measurement system for capacity development by applying a framework of a four-step process presented by GAO through its analysis of GPRA requirements, and published in a series of recent reports.4 The four-step process, based upon GPRA requirements, is shown in Table C-1:

4 Effectively Implementing the Government Performance and Results Act [GAO/GGD-96-118]; Managing for Results - Analytic Challenges in Measuring Performance [GAO/HEHS/GGD-97-138]; and Managing for Results - Measuring Program Results That Are Under Limited Federal Control [GAO/GGD-99-16].

26

-------

Table C-1: Four Key Steps in the Performance Measurement Process

Identifying Goals. Definition: Specifying long-term strategic goals and annual performance goals that include the outcomes of program activities. Components: Assessing key elements of strategic plan; involving stakeholders; identifying external factors that affect goals; linking goals to the mission, as well as to day-to-day operational activities.

Developing Measures. Definition: Selecting measures to assess a program's progress in achieving its goals or intended outcomes. Components: Determining how annual performance goals will be measured; ensuring performance measures will reinforce connection between long-term strategic goals and day-to-day activities of managers; obtaining baseline data for comparison.

Collecting Data. Definition: Planning and implementing collection and validation of data on performance measures. Components: Ensuring data collection efforts are sufficiently complete, accurate, and consistent to be useful in decision making.

Analyzing and Reporting Results. Definition: Comparing program performance data with annual performance goals and reporting results to agency and congressional decision makers. Components: Comparing performance data in prior years to current year; identifying performance gaps and setting improvement goals by targeting resources to improve overall mission accomplishment; summarizing findings of program evaluations completed during year.

Because of the progressive nature of this process, if EPA faces difficulties in addressing some of the earlier steps (such as identifying goals), this will negatively impact the remaining steps. We conducted our field work from October 2001 to November 2002. This evaluation was performed in accordance with Government Auditing Standards, issued by the Comptroller General of the United States.

Prior Coverage

• EPA OIG Report No. 2001-B-000001, EPA's Progress in Using the Government Performance and Results Act to Manage for Results (June 13, 2001): This report noted that EPA needs to strengthen its partnerships with States and other Federal agencies, and invest in developing performance information that is more outcome oriented.

• General Accounting Office Report No. GAO/T-RCED-00-298, Drinking Water: Spending Constraints Could Affect States' Ability to Meet Increasing Program Requirements (September 19, 2000): This testimony stated that the amount of Federal funding available to the States for the 1996 SDWA Amendment requirements had less impact on States' ability to implement their drinking water programs than State-imposed spending constraints.

27

-------

• General Accounting Office Report No.
GAO/RCED-99-31, Safe Drinking Water Act: Progress and Future Challenges in Implementing the 1996 Amendments (January 1999): This report indicated that the States lacked the resources needed to fully develop and implement drinking water programs, and that State legislatures were not putting needed authorities into place. The report also noted problems encountered by small water systems that do not have the T/M/F capacity to comply with current and future requirements.

28
-------

Appendix D

Agency Response to Draft Report

UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, DC 20460

June 23, 2003

OFFICE OF WATER

MEMORANDUM

SUBJECT: Response to the Findings and Recommendations of the Draft Report, Impact of EPA and State Drinking Water Capacity Development Efforts is Unknown

FROM: G. Tracy Mehan, III, Assistant Administrator /signed by Benjamin Grumbles, Deputy Assistant Administrator/

TO: Nikki L. Tinsley, Inspector General

Thank you for the opportunity to review and comment on your draft report, Impact of EPA and State Drinking Water Capacity Development Efforts is Unknown. The Environmental Protection Agency (EPA) and States have coordinated with the Office of Inspector General (OIG) throughout this lengthy evaluation. The Agency provided significant program materials at your request, and professional staff have been available for numerous meetings.

In general, we are deeply disappointed in the report: in its substantive content, study design and evaluation methods, factual accuracy, and overall tone. This negative assessment is strongly shared by the six States that participated extensively in your investigation of the national Capacity Development program. Their input and specific comments are reflected in this response. Even though your office took an additional year beyond your target date, the report is seriously flawed and demonstrates a lack of understanding of the program. After more than one and a half years of investigation and report preparation, the OIG found nothing positive in a program that is spawning new ideas, new ways of doing business, and developing and implementing proactive programs that your report underscores as a key feature of a sound capacity development program.

Contrary to the mutually agreed-upon objective, the final evaluation shifted gears without consultation and is narrowly focused on OIG's view that the Safe Drinking Water Act's (SDWA) Capacity Development program lacks nationally quantifiable performance goals and measures. We believe this view is fundamentally flawed and that the States and EPA have developed accountability measures that meet Congressional intent and capture the public health protection benefits of State Capacity Development programs. OIG's opinion of the respective roles of EPA and the States in the Federal-State partnership is not supported by the SDWA's capacity development provisions or the legislative history.

Note 1   Note 2   Note 3   Note 4

29
-------

Therefore, most of the report's recommendations would be better characterized as OIG's critique of the legislation and its perceived shortcomings, instead of how EPA and the States should measure program success.

On a positive note, we do appreciate that the report acknowledges the challenges EPA would face in developing a national performance-based measurement system. We agree with four of the five challenges identified in the report on pp.
16-17:

1. EPA's limited control over implementation, since States are delegated primary responsibility for implementing the drinking water program;
2. the Congressionally allowed variability in State approaches makes it difficult to identify common measures that can be aggregated at the national level;
3. demonstrating outcomes can take years; and
4. States are increasingly concerned over data collection and reporting burdens.

It is precisely because the OIG recognizes these challenges and statutory limitations, but then chooses to ignore them in the report's findings, that we are disappointed with the content and tone of the draft report.

You provided us with a very brief period to review and comment on the draft report given your lengthy study period and extended report drafting, the complexity of the program, and your request that we consult with the States and EPA Regions. We have received significant comments from our Regions and the six States that participated in the evaluation. The following, by consensus, are the major areas of concern with this report:

1. The study objectives of the report changed after the data collection phase, and the data were collected too early to provide meaningful results;
2. States and EPA disagree with the OIG's interpretation of Congress' directive and intent;
3. OIG missed an important opportunity to provide substantive input on all components of the capacity development program (only the existing systems program was evaluated);
4. States were misled, as the report mentions States by name, contrary to the agreement the OIG made with the States; and
5. there are a number of factual errors that impact the accuracy and credibility of the findings.

Evaluation Objectives and Study Focus Changed After the Data Collection Phase

The Capacity Development program is a cornerstone in the protection of our nation's drinking water. Capacity development is the process of water systems acquiring and maintaining adequate technical, managerial, and financial capabilities to enable them to consistently provide safe drinking water on a sustainable basis.

Note 5   Note 6   Note 7   Note 8   Note 9   Note 10   Note 11   Note 12

30
-------

The SDWA's capacity development provisions provide a framework for States and water systems to work together to ensure that systems acquire the capacity needed to realize SDWA's public health protection objectives. EPA, in conjunction with our State partners, has spent a considerable amount of time and resources in developing and implementing an effective yet flexible Capacity Development program. However, because the evaluation's objectives and study questions changed significantly after your investigators collected data from EPA and the States, we respectfully, but strongly, disagree with the approach and findings.

States have expressed surprise and serious concern to the Office of Water (OW) that the focus of the report has changed. For example, when first approached by the OIG, the State of Arizona was told the report would be a study that focused on three key areas:

1. State successes in capacity development implementation;
2. the challenges States face in implementing their strategies; and
3. overall program implementation issues and concerns.

In addition, the investigators indicated that the report would contain examples of successful program elements from States that would foster productive information transfer among the States.
Consistent with that commitment by OIG, the Engagement Letter formally signed by OW and OIG in October 2001 sets forth the following as the purpose and focus of the evaluation:

"OBJECTIVE: The OIG's objective is to evaluate EPA and State formulation and initial implementation of Capacity Development programs to determine the extent to which such programs have been formulated and initially implemented consistent with the specific requirements and overall objectives of the SDWA. Specific emphasis will be given to evaluation of how States are integrating capacity development together with other SDWA initiatives and drinking water program activities to assist community water systems to consistently achieve the health objectives of the SDWA."

We agreed with OIG that this broad assessment of program start-up and challenges could potentially assist States and EPA in shaping future phases of the program. This focus of the evaluation remained in place through completion of the data collection phase that involved the six States and EPA Regions. It was only after site visits and data collection were completed that OIG apparently changed from a study of State successes, challenges, and overall program issues to a national performance measures focus. Because of this eleventh-hour change, the information collected is not related to the changed focus of your evaluation. EPA and the States were never provided the opportunity to submit information directly related to the final study questions as presented in the draft report. Changing the focus of the evaluation report after data collection renders the data collected inadequate to answer the revised objectives of the evaluation. Therefore, we cannot concur with the report's recommendations.

Note 13   Note 14   Note 15

31
-------

Both the States and EPA believe this surprise switch from the stated purpose undermines the fairness, quality, and credibility of the draft report. We strongly recommend that OIG take the time necessary to refocus the draft on the original purpose, which is supported by the data collected in the course of your investigation.

Data Collected Too Early to Provide Meaningful Results

When the OIG began the evaluation, EPA and the States were in the very early stages of program implementation. As we stated to OIG during the planning phase of this evaluation, the evaluation was conducted too early to provide meaningful results in terms of assessing program performance for capacity development strategies pursuant to SDWA section 1420(c). Capacity development is universally recognized as a long-term effort. Yet, depending upon when a State's capacity development strategy was approved, most States had only approximately one year of initial implementation efforts for the OIG to evaluate.

When the investigation commenced, the OIG staff recognized this issue and therefore agreed to the three study questions mentioned above. The draft report does not reference these questions, but rather focuses on national performance assessment measures and quantifiable results against those measures. This focus is premature given that States had just begun to implement their strategies. Measurable results from long-term strategies cannot realistically be expected at such an early stage of program implementation. The report does little to acknowledge this fact and leaves the reader to believe that States chose to ignore aspects of strategies rather than acknowledging that measurable progress from successful implementation may take several years.
EPA Disagrees with OIG's Interpretation of Congressional Intent

We disagree with the OIG's understanding of congressional intent of the capacity development strategies. Congress clearly intended that EPA provide States as much flexibility as possible, and that EPA's role in capacity development is to ensure that States are implementing capacity development strategies that are tailored to meet each State's specific and unique needs. The conference report for the 1996 SDWA Amendments (Report 104-741, 104th Congress (1996)) could not have made this point more clearly:

"States are also to adopt and implement a capacity development strategy. This is intended to encourage States to continue to focus resources on capacity development initiatives. States are required to consider, solicit public comment on, and include as deemed appropriate by the State, a number of elements and criteria. The Conferees do not expect that every State will adopt the same capacity development strategy and do not expect States to include elements in section 1420(c) that the States determine are not appropriate. It is not expected that every State will give the same consideration to each of the elements listed in section 1420(c). Rather, the Conferees expect that, as suggested by existing State capacity development programs, State capacity development strategies developed under this section will vary according to the unique needs of the State. The Conferees encourage this diversity and indicate that EPA should give deference to a State's determination as to content and manner of implementation of a State's plan, as long as the State has solicited and considered public comment on the listed elements and has adopted a strategy that incorporates appropriate provisions."

Note 16   Note 17   Note 18

32
-------

In addition, contrary to OIG's contention, Congress precluded EPA from assessing system capacity for the purposes of withholding funds. Specifically, SDWA section 1420(c)(4) prohibits EPA from making any such assessment: "The decisions of the State under this section regarding any particular public water system are not subject to review by the Administrator and may not serve as the basis for withholding funds under section 1452." This provision effectively prohibits EPA from forcing States to meet any requirements other than those for content, deadlines for implementation, and reporting in order to avoid sanctions through the withholding of DWSRF assistance. Strategies were intended by Congress to assist (not force or require) public water systems in acquiring and maintaining capacity. We welcome additional legal consultation with your office and the Office of General Counsel on this statutory interpretation issue.

Missed Opportunity for Meaningful Program Input

As previously discussed, the OIG missed an important opportunity to provide substantial input on program improvements. Part of this failure is attributed to OIG's decision to narrowly cover only capacity development strategies. In order to truly evaluate the efforts and successes of the capacity development program, your evaluation should have been designed to cover all aspects of SDWA section 1420. The new systems program, which is the proactive segment of the capacity development program, was not mentioned in your report. During your preliminary research phase, EPA staff encouraged the OIG to take a closer look at this program. At the time OIG staff began site visits to the States, the new systems program already had more than two years of implementation to its credit.
In many areas of the report, the OIG criticizes EPA and States for placing too much emphasis on technical capacity and not enough on managerial and financial capacity. However, the report fails to even mention several significant efforts that are focused on managerial and financial capacity. The report does not discuss the Environmental Finance Centers (EFCs) (SDWA section 1420(g)) and the Small System Technology Assistance Centers (TACs) (SDWA section 1420(f)). Both of these university-based programs are congressionally funded each year specifically to work with EPA and States to develop tools and training that directly promote all three elements of capacity.

Note 19   Note 20

33
-------

The EFCs have established expertise in financial capacity, and brought that perspective to their work assisting States in developing their capacity development strategies. They continue to develop tools and training that help systems build financial capacity. The TACs have also focused their training and tool development efforts over the last few years to include more projects to help systems build managerial capacity.

In addition, the States face barriers in working to assist systems. First, the SDWA requires that the strategy address existing systems. This in itself is a daunting task since many States have thousands of public water systems to regulate. Second, States recognize that the resources simply don't exist to assist all of these systems, and must prioritize systems most in need of assistance. This means that the States are focusing on the "worst of the worst" systems, many of which have been in and out of compliance for years and may resist State attempts to provide assistance.

States Misled on Their Participation in the Evaluation

As you know, all six States believe OIG violated commitments it made concerning their participation in the study. At the outset of the project, OIG agreed not to identify the basis for the selection of the States in the study. This was based upon an understanding that the data used for the selection was flawed for this purpose. It also reflects a recognition that this information was irrelevant to the core purpose of the study. Nonetheless, well into the study, OIG informed the States of a change in its position on this issue. OW's understanding of the OIG's commitments on this issue is identical to that of the participating States. We are troubled that OIG's reversal damages EPA's relationship with our State partners. It will certainly impair OIG attempts to obtain State cooperation in future projects.

Numerous Factual Errors Impact the Report's Accuracy

Finally, there are numerous factual errors throughout the report. In general, especially given the time frame from start to finish, we are disappointed in the quality of the report represented by these errors. One representative error is discussed below, and we have included an extensive list as an attachment. We hope that you will address these issues to ensure the accuracy of the report.

An example of a fundamental error occurs on page 28, where the report incorrectly describes the funding source for State capacity development programs. The report erroneously indicates that States may use 31% of their Drinking Water State Revolving Fund (DWSRF) for capacity development. The DWSRF funds a host of programs, and capacity development is only one of the eligible programs.
The theoretical maximum in set-aside dollars that could be used to support capacity development activities is 22% (10% for State program management, 10% for local assistance, and 2% for technical assistance to systems). However, this would be to the exclusion of other pressing local and State needs that are eligible for those same funds. Yet the report does not describe the competition that exists within States for those scarce infrastructure funds. Neither does it mention the matching requirement in order for those funds to be used for capacity development program implementation, a difficult undertaking in current economic times.

Note 21   Note 22   Note 23   Note 24

34
-------

It is well documented in several reports, both by EPA and other drinking water industry organizations, that States are faced with enormous infrastructure needs that far outpace DWSRF funding. Every State faces the very real fact that every dollar set aside from the DWSRF to implement programs results in fewer dollars overall to fund needed infrastructure. These two issues, required matching funds and competition for money within the DWSRF, work to limit the actual monies each State could set aside for capacity development, and they should be discussed to provide a complete and balanced picture.

On future technical reports such as this evaluation, you may wish to consider the benefits of conducting an independent technical peer review. It is standard practice for comparable studies in most other Federal government organizations and often improves quality.

Conclusion

We are including additional comment and information by attachment. As requested, Attachment 1 lists each of your recommendations and our rationale for nonconcurrence. Attachment 2 provides a more detailed list of factual errors we identified in the draft report. Attachment 3 provides comments EPA received from the States that participated in the evaluation.

I hope that these concerns can be addressed in your final report. I believe they will make it stronger, clearer, more accurate, and more informative. Please include this response, in its entirety, in your final report. We also welcome the opportunity to meet with you in person to discuss our concerns in more detail. If there are additional questions or if you wish further clarification of our comments, please contact William R. Diamond, Director, Drinking Water Protection Division, Office of Ground Water and Drinking Water, at (202) 564-3751. We look forward to working with you on this project and future reports concerning the quality of our nation's drinking water.

Attachments

cc: Kwai-Cheung Chan
Cynthia C. Dougherty
William R. Diamond
Regional Water Division Directors
Regional Drinking Water Branch Chiefs
David Terry, Massachusetts Department of Environmental Protection
Alton Boozer, South Carolina Department of Health and Environmental Control
Marcia Willhite, Illinois Environmental Protection Agency

35
-------

Jack L. Daniel, Nebraska Department of Regulation and Licensure
Karen Smith, Arizona Department of Environmental Quality
Gregg Grunenfelder, Washington Department of Health
Dan Engelberg
Leah Nikaidoh
Judy Hecht

36
-------

Attachment 1: Response to Specific Recommendations

Recommendation 2-1(a): In order for EPA to effectively manage its capacity development program, it needs to have the right information and related indicators or measures to assess capacity development progress.
Therefore, we recommend that the Assistant Administrator for Water work with its partners and stakeholders to develop and adopt a national definition of "adequate" T/M/F capacity.

Nonconcur. Through capacity development guidance that we developed with stakeholders, we believe that we have adequately defined what is meant by technical, managerial, and financial capacity (as noted on pages 2 and 3 of the draft report) and the key components to achieving capacity. It is not likely that a national definition of "adequate" TMF capacity can be developed that will address all State capacity development program elements. By the intent of Congress, State capacity development strategies focused on flexibility and moved away from set performance measures. Rather than have a mandated program, States were allowed to choose first to participate, and second, to be allowed the flexibility to develop their own State-specific strategies. In several sections of the report, the OIG admits it would be very difficult to assess achievements of the programs given the criteria included in the SDWA. If EPA were to develop a national standard definition of TMF capacity beyond the existing guidance, it would in effect eliminate most of the intended flexibility from the program.

Recommendation 2-1(b): In order for EPA to effectively manage its capacity development program, it needs to have the right information and related indicators or measures to assess capacity development progress. Therefore, we recommend that the Assistant Administrator for Water work with its partners and stakeholders to identify a set of common measures that can be used to develop and implement national performance goals.

Nonconcur. The measures that we have capture overall public health protection and therefore reflect our capacity development activities.

Recommendation 2-1(c): In order for EPA to effectively manage its capacity development program, it needs to have the right information and related indicators or measures to assess capacity development progress. Therefore, we recommend that the Assistant Administrator for Water work with its partners and stakeholders to determine what common capacity development data and/or information resources are available that could be used to support a national capacity development measure, while minimizing data collection burdens to States and water systems.

Note 25   Note 26

37
-------

Nonconcur. States established baselines and methods to measure progress as suggested by SDWA 1420(c)(2)(D). We are trying to improve program evaluation, but we must target data collection to minimize burden to States. We don't think that nationally reportable measures are available or effective to identify the contribution of capacity development to our overall GPRA objective. For enhanced program implementation, we could analyze/evaluate State program documentation and other data (Community Water System Survey, SDWIS data, historical Significant Non-Complier list) to determine if there are commonalities.

Note 27

Recommendation 2-2(a): Using the results in Recommendation 2-1, develop national capacity development measure(s) by identifying capacity development goals to be accomplished, as part of the drinking water annual performance goal(s).

Nonconcur. The feasibility of measures that could be assessed on a long-term basis is uncertain. Primary data sources we would consider have longer reporting horizons.
For example, the Community Water System Survey is conducted every 4-5 years; the list of systems in historical Significant Non-Compliance is submitted from States to EPA every three years. The feasibility of identifying measures that can be assessed on an annual basis, beyond our existing GPRA measures, is even less likely.

Recommendation 2-2(b): Using the results in Recommendation 2-1, develop national capacity development measure(s) by developing specific capacity development measures that support the capacity development annual performance goal(s).

Nonconcur. Refer to the response for 2-2(a).

Recommendation 2-2(c): Using the results in Recommendation 2-1, develop national capacity development measure(s) by either modifying already existing data collection efforts or developing new data collection processes for capacity development performance measures.

Nonconcur. We believe that we could pursue ways to modify existing data collection efforts to collect more meaningful data. In an effort to minimize State burden, we would not develop new data collection processes.

Recommendation 2-2(d): Using the results in Recommendation 2-1, develop national capacity development measure(s) by analyzing results of capacity development performance on a national basis, and reporting progress to Congress and the public, as required by GPRA.

Nonconcur. Current measures developed for GPRA reporting are sufficient to assess drinking water program effectiveness and are in full compliance with GPRA. GPRA measures for segment-by-segment core program activities are not feasible, nor will they provide meaningful indicators of progress.

Note 28   Note 28   Note 29   Note 30

38
-------

Recommendation 3-1: Revise 40 CFR 35.3515 (DWSRF withholding regulations) to provide more specific criteria that will allow EPA to conduct meaningful annual assessments of State capacity development strategies. We suggest that EPA also include as criteria the results of Recommendation 2-1, in regards to defining what constitutes adequate T/M/F capacity.

Nonconcur. EPA does not believe that the DWSRF regulation is the appropriate mechanism for dictating to the regions how to assess State reports for the purposes of withholding DWSRF funds. We also believe this is inconsistent with Congressional intent. We do believe that we can provide informal guidance and assistance to the regions in reviewing the annual capacity development reports for the purpose of making withholding determinations, provided we determine that such guidance adds value to the process.

Recommendation 3-2: Utilizing the outcomes of Recommendations 2-1 and 2-2, develop a national capacity development strategy that promotes T/M/F capacity in a proactive, integrated, flexible, and accountable manner.

Nonconcur. Our guidance development, outreach, and advocacy work clearly articulates our vision and outlines our strategy.

Recommendation 3-3: In light of Recommendations 3-1 and 3-2, modify capacity development guidance accordingly.

Nonconcur. The draft report fails to recognize that Congress chose a flexible approach for States in developing and implementing the capacity development program. Rather than a top-down approach, Congress asked that EPA set broad guidelines and that States take charge of developing strategies for addressing the unique characteristics of their State drinking water systems. Changing the capacity development guidance would remove the flexibility we have allowed States to have in implementing the program.
Note 31   Note 32   Note 33

39
-------

Attachment 2: Factual Errors and Other Comments

General

The cover letter incorrectly identifies Bill Diamond as the Director of the Office of Ground Water and Drinking Water. He is the Director of the Drinking Water Protection Division within the Office of Ground Water and Drinking Water. Also, the Regional Capacity Development Coordinators and Regional Water Division Directors are different people. The coordinators are staff, not supervisors.

Report Title

The title of the report should be changed from "Impact of EPA and State Drinking Water Capacity Development Efforts is Unknown" to "Impact of EPA and State Drinking Water Capacity Development Efforts Uncertain or Needs to be More Fully Documented." Implying that EPA and the States have no idea about the impact of the capacity development program is not helpful and leaves the reader with the idea that the capacity development program is not at all useful.

Executive Summary:

• Page i, 1st paragraph: The second sentence reads, "These systems vary from very small rural systems to very large systems, and they all face problems with aging infrastructure, underfunding, and future drinking water regulations." We recommend changing the phrase "they all face problems" to "many face challenges".

• Page i, 1st paragraph: The last sentence does not logically follow the sentence above it. It refers to the estimated payment gap of $265 billion, which includes both capital and Operations & Maintenance costs. Most of the gap will likely be felt by larger systems, which have greater needs. Also, the sentence before refers to "expanding requirements of the SDWA", but most of the gap is related to costs needed simply to operate water utilities and has nothing to do with SDWA requirements. For example, most of the need and gap is due to payments needed for pipes.

• Page i, 2nd paragraph: The conference held in January 2003 was on "innovative approaches to addressing the infrastructure gap", not on the gap in and of itself. Please cite where the AA for Water called for capacity development investment as a means to address the issue.

• Page i, 3rd paragraph: Add "program" at end of the first sentence.

• Page i, last paragraph: Change the phrase "... unable to determine the extent to which those capacity development efforts have been effective" to "... unable to fully determine the extent to which those capacity development efforts have been effective". This change in language more accurately portrays the current situation.

Note 34   Note 35   Note 36   Note 37   Note 38   Note 39   Note 40

40
-------

• Page ii, 1st paragraph: We do not agree with the conclusion that EPA cannot report to Congress on the successes of the capacity development program as envisioned by the 1996 SDWA Amendments. We can analyze existing data submitted by States. The data necessary to report successes to Congress wasn't available at the time the evaluation was conducted.

• Page ii, 2nd paragraph: Mentioning the budget crises that States are currently facing is somewhat irrelevant given that most of the funds that are being used for capacity development are Federal dollars from DWSRF set-asides.

Note 41   Note 42

Chapter 1

• Page 1, 1st paragraph: The second sentence reads, "These systems vary from very small rural systems to very large systems, and they all face problems with aging infrastructure, underfunding, and future drinking water regulations." We recommend changing the phrase "they all face problems" to "many face challenges".
• Page 2, 1st paragraph: The report mentions the use of the number of water supplies with a historical significant non-compliance (HSNC) as a means of gauging the degree of problem a State may be having in implementing the drinking water program. OIG did not take into account the enforcement actions that had been taken to assure water supplies were committing to corrective actions that, given time for implementation of construction programs, would result in attainment of compliance. OIG also failed to recognize the overall high compliance rates that are being achieved.

• Page 2, 1st paragraph: What was the number of systems that had a history of SNC in 2000? Without a number there is no sense of the significance of the 91 percent value. Is it 91 percent of 10 systems, 2,000, or 20,000?

• Page 2, 1st paragraph: What is the source for saying that "SDWA requirements had surpassed the TMF capabilities of most small systems?"

• Page 2, Treatment bullet: It is incorrect to assume that all public water systems treat their water. There are ground water systems that do not have treatment as part of the process of delivering safe drinking water.

• Page 3, 2nd paragraph: The report states that the potential amount of funds that could have been withheld between 2001 and 2003 was $359 million. The report seems to reflect that it would have been a good thing to withhold funds from States, which would have had the likely effect of defunding all set-asides and negatively impacting a State's ability to fund infrastructure projects needed for public health. Note that while EPA has not permanently withheld funds, it has, when allowed, held back full award of funds until requirements have been met.

Note 43   Note 44   Note 45   Note 46   Note 47   Note 48

41
-------

• Page 4, 1st paragraph: The third sentence should read, "...by requiring them to have to demonstrate..."

• Page 4, last paragraph: The last sentence should read, "Congress expects EPA to approve State strategies if the States considered, solicited public comment on, and included as appropriate". After that sentence, there should be two additional bullets added to mirror all of the items as listed in SDWA 1420(c)(2)(A-E), Content. The bullets that should be added are: a description of the institutional, regulatory, financial, tax, or legal factors at the Federal, State, or local level that encourage or impair capacity development; and an identification of the persons that have an interest in and are involved in the development and implementation of the capacity development strategy (including all appropriate agencies of Federal, State, and local governments, private and nonprofit public water systems, and public water system customers).

• Page 5, 1st paragraph: The first bullet should also include using SDWA resources to: encourage the development of partnerships between public water systems to enhance the technical, managerial, and financial capacity of the systems; and assist public water systems in the training and certification of operators.

• Page 5, 2nd paragraph: We believe it is incorrect to say your fieldwork ended in November 2002. It is our understanding that your fieldwork ended in February 2002 with the last State site visit.

Chapter 2

• Page 11, 3rd paragraph: Because the evaluation was premature, it is unfair to say that "The EPA Capacity Development Coordinator said that because of State flexibility, measures of capacity development success would have to be State-specific because all of the States' programs are unique.
This means that the Agency will not be able to determine whether the capacity development initiative is working, whether some activities are more or less effective than others, and which components of the program need attention or help." Once we have an adequate amount of time for implementation, EPA and States will be able to determine that the capacity development provisions of SDWA are improving public health protection. There is also no value added by quoting statements made by the EPA Capacity Development Coordinator. Throughout your evaluation time frame, there have been several HQ staff working with the capacity development program.

• Page 13, 2nd paragraph: Please explain the basis for the statement "...we found that the data that EPA has collected through various reporting efforts were not consistent enough, and were not always complete or accurate."

• Page 15, 2nd paragraph: The statement "EPA indicated it planned to periodically update this report (January 1997 report, "Initial Summary of Current State Capacity Development Activities," EPA 816-S-97-001) as States developed and implemented their capacity development programs (although it had not been updated as of the date of our draft report)." is not a true statement. EPA did update this report in July 2001. The report is titled "State Strategies to Assist Public Water Systems in Acquiring and Maintaining Technical, Managerial, and Financial Capacity: A Comprehensive Summary of State Responses to Section 1420(c) of the Safe Drinking Water Act" (EPA 816-R-01-019).

Note 49   Note 50   Note 51   Note 52   Note 53   Note 54

42
-------

• Page 15, 2nd paragraph under Analysis and Reporting, 1st sentence: Through June 30, 2002, States had only expended $30.83 million on capacity development activities under the DWSRF set-asides, not "hundreds of millions of dollars".

Chapter 3

• Page 19, 1st paragraph: What is the basis for the following statement: "While EPA provided guidance and publications to assist States in developing their capacity development strategies, the Agency has not continued to provide oversight to ensure that State strategies are being implemented to assist systems most in need of capacity assistance." We believe that EPA is providing adequate oversight to the States.

• Page 26, 1st paragraph: We believe that you are putting too much emphasis on incorporating managerial and financial capacity into sanitary surveys. Sanitary surveys are only required to be completed every three years for community water systems. State capacity development programs use system "self-assessment" forms to evaluate baseline technical, managerial, and financial capacity. The self-assessments can be conducted on a more frequent basis than sanitary surveys. Through training developed by EPA's Drinking Water Academy, we are promoting the linkages of all three elements of capacity into the sanitary survey process.

• Page 26, last paragraph: Enforcement on very small systems with managerial and financial capacity issues may or may not be a feasible method to achieve compliance, even as a last resort. In cases where community resources (manpower, financial) are severely limited, a State may decide to take an approach which provides direct assistance as a more effective remedial tool. Usually, enforcement is reserved for a very small number of cases identified as deliberate recalcitrance or resistance to assistance efforts.
• Page 27, 3rd paragraph: The third sentence should be finished as follows, "Congress intended States to utilize all three elements of capacity to ensure the system can repay the loan."

• Page 27, 5th paragraph: The first sentence indicates that "EPA requires that loans go to systems that are addressing all three TMF components". This is incorrect. EPA requires that all systems have TMF capacity in order to receive a loan, or agree to make changes to obtain capacity [40 CFR 35.3520(d)]. Therefore, not all loans were used to solve TMF problems. The last sentence in the paragraph is not needed because States are already required to address managerial and technical issues. It seems odd that elsewhere in the report, States are faulted for spending too much time on technical capacity and not the other elements, while in the DWSRF they are faulted for spending too much time on financial capacity. Our sense is that States are addressing all three elements in their capacity assessment, but that in closing the loan they are looking beyond base financial capacity requirements to ensure repayment of the loan.

Note 55   Note 56   Note 57   Note 58   Note 59   Note 60   Note 61

43
-------

• Page 27, 6th paragraph: The report states that "EPA recognizes that an infrastructure gap exists. Congress, to address this funding gap, allowed States...". There is no relationship between EPA's recognition of an infrastructure gap and Congressionally allowed DWSRF set-asides. The Gap Report came out 6 years after the Amendments authorizing the DWSRF program. We recommend deleting the italicized text.

• Page 30, 1st paragraph: The report discusses that the accountability process is ineffective. For example, Region 7 performed a comprehensive evaluation to see how its States were doing after the first year of implementation, and to see if there was anything the region could do to assist the State. EPA holds the States accountable and will recommend a course of action if we see insufficient implementation of this program. However, before we recommend withholding, we will do everything within our authority to assist the State to succeed in this program.

• Page 33, Table 3.1: You state that EPA HQ did not provide guidance to the regions on "how to" review the States' Reports to the Governor. We believe that Congress did not intend for EPA to evaluate these reports for content, but rather to ensure the reports were written and submitted to each Governor.

• Page 34, 4th paragraph: What kind of "process" are you looking for? The process for withholding and noncompliance in the DWSRF program is outlined in the regulations (40 CFR 35.3515 and 35.3585) and described in section VII.A of the preamble to the DWSRF rule.

• Page 35, 1st paragraph: The report refers to a discussion of the EPA HQ Capacity Development Coordinator's personal feelings and statements about EPA Regional office management that had instructed their regional coordinators not to use DWSRF withholding as a sanction against States failing to meet the capacity development requirements (not implementing their strategies) of SDWA. This discussion, whether true or untrue, gives the impression that EPA generally never had any intent to address non-compliance with the requirements of section 1420 of SDWA, and does not belong in a factual report.
Note 62   Note 63   Note 64   Note 65   Note 66

44
-------

Attachment 3: Comments from State Participants

Note 67

45
-------

46
-------

Appendix E

OIG Analysis of Agency Response

Note 1: EPA "expressed disappointment" in the substantive content, study design and evaluation methods, factual accuracy, and tone of our draft report. We strongly disagree with the overall characterization of our review by EPA for the following reasons:

a. EPA expressed disappointment in the "substantive content" of our draft report. We reviewed the capacity development program for existing systems based on the fact that a substantial majority of community water systems are existing systems. These systems serve 264 million people. Therefore, we believed we reviewed areas that could have a significant impact on drinking water issues. We discussed this with EPA when we entered into fieldwork in October 2001, and held monthly meetings with EPA and States during the course of our evaluation, and no concerns were raised about the overall direction of our report.

b. EPA also expressed disappointment in our "study design and evaluation methods." Two issues that seem to be motivating this comment were our initial decisions to not include the names of the States visited, and to review performance measurement of capacity development. When we began this study, the States selected for review expressed sensitivity about having been selected and being characterized as "failing." They were concerned that we had utilized information in SDWIS/FED, which they felt was unreliable. As a result, we indicated that we would not mention the States. However, we subsequently realized this was inappropriate, because including the State names would add significant, valuable information to the report. Further, we noted the information we used on the States was already available to the public via the internet and other sources. As soon as we decided the names of the States should be included in the report, we notified EPA and the six States, and the States accepted our decision.

EPA also indicates on page six of its response that the SDWIS data are flawed. However, these are the same data that EPA utilizes to report performance to Congress. Furthermore, these are also the same data EPA used in its current draft "Report on the Environment," stating that:

An increasing number of people are served by community water systems that meet all health-based drinking water standards. In 2002, states reported that 94 percent of the population served by community water systems were served by systems that met all health-based standards, up from 79 percent in 1993. Underreporting and late reporting of data affect the accuracy of this information.

While the data are considered to be flawed by EPA, at the same time, EPA uses the data to show progress and trends in its official documents.

47
-------

Regarding performance measurement, as part of any evaluation or audit, we generally look at performance measurement related to what we are reviewing. Since capacity development is still in the early stages, we thought it was particularly useful to address this issue, since being able to adequately measure performance is very useful when implementing a new strategy. We do not expect EPA to dictate to the States how to measure their individual programs. However, EPA needs to develop a means to assess sets of capacity development data from States and be able to evaluate capacity development progress on a national basis.

c.
EPA also questioned the "factual accuracy" of the draft report. There were some minor factual inaccuracies in the draft, and we appreciate the Agency's assistance in correcting those errors. One of the main purposes of issuing a draft report is to identify and correct such errors. The errors EPA identified were not significant and did not alter our positions. Further, as shown in additional comments that follow, most of the issues raised by EPA involved clarification rather than correction. We have added such clarification where appropriate.

d. Lastly, EPA expressed disappointment in the "overall tone" of the draft report. We believe that we have presented a balanced and objective assessment of EPA's and selected States' design and early implementation of capacity development. Our presentation included positive aspects of what States and EPA are doing, as well as areas that need improvement. However, based on EPA's response, we have included additional information reflecting efforts that EPA and States are making in the continued implementation of capacity development programs.

Note 2: We strongly disagree with EPA's assessment that our report is "seriously flawed" and demonstrates a lack of understanding of the capacity development program. As part of our evaluation, we clearly present all three facets of capacity development (T/M/F), and how EPA and the States designed and implemented capacity development efforts to address T/M/F. EPA stated that we changed the focus of our report from capacity development to national performance measurement. While we looked at performance measurement because reviewing GPRA-related issues is a standard part of OIG evaluations, it was not the primary focus of our report.

EPA has chosen to narrowly interpret the SDWA Amendments in regard to what it can and cannot ask or expect States to do in order to meet Congressional intent. We asked our Counsel to review the Agency's response to our draft report regarding this issue. Our Counsel stated that:

The first legislative history excerpt cited by OW talks about EPA giving deference to the states. That doesn't mean complete abdication. Even the last sentence of the legislative history language begs the question when it says that EPA should give deference to a state's plan so long as the state solicited public comment "and has adopted a strategy that incorporates appropriate provisions." One could argue that this last phrase gives EPA some say in what "appropriate provisions" should look like to some degree.

In addition, OW's next statutory citation to 42 USC 300g-9(c)(4) - which provides that a state's decision with respect to a particular public water system is not reviewable by EPA

48
-------
EPA's position does not support Congress's expectation that EPA, through withholding provisions, ensure that public water systems are maintaining and achieving capacity. In our opinion, if Congress wanted this to be solely a State program, it would not have provided such accountability provisions for EPA to utilize. Note 3: We believe that we have presented a balanced and objective assessment of EPA's and the States' design and early implementation of capacity development We certainly identified positive aspects, especially in the area of flexibility. We duly noted that each of the States we visited had designed capacity development strategies in accordance with SDWA and met all reporting requirements to date. We also determined that EPA's approach to designing and launching the capacity development was proactive and used a high degree of stakeholder participation to develop applicable guidance. EPA stated that by examining performance measures of this program we "shifted gears." However, as already discussed, we do not think that was the case. We generally always look at performance measures during our evaluations, and since we observed a management issue concerning the adequacy of measurement for the program we were reviewing, it was our duty to report on that issue. As noted, we discussed the performance measurement issues with EPA and the States at numerous meeting during our field work, and concerns were not raised at the time. Note 4: Contrary to EPA's suggestion, we were unable, through our review of capacity development and EPA's GPRA measures, to identify measures of progress for this critical program. Our findings on this issue are presented on pages 8-9 in our final report. Note 5: We don't share EPA's characterization of our report as a "critique of the legislation and its perceived shortcomings." Rather, in one respect our report can be viewed as a critique of EPA's inability to utilize all of the tools that Congress provided it with (notably the withholding provisions contained in SDWA) to encourage states to fully carry the capacity development provisions of the legislation. Note 6: These challenges, although difficult, need to be addressed. We do not view challenges to success as excuses for not succeeding. EPA is not alone in facing challenges to measuring its performance. As GAO pointed out, throughout the government GPRA is facing several significant challenges. In our draft report, we recognized challenges facing EPA in measuring capacity development We believe that the challenges to performance measurement of this program that we identified in the draft report are solvable. To this end, we offered recommendations to address these challenges. For 49 ------- example, in Recommendation 4 [formerly Recommendation 2-1 (b) and (c) in the draft report], we recommended that EPA work with States and stakeholders to develop a set of common measures that would minimize data burdens on the States. Therefore, we do not agree with the Agency's comments that we ignored these issues in our findings. Note 7: EPA contends lhat we only allowed it a "brief amount of time to comment. On the contrary, we provided 70% more time for its review than our agreement with EPA (set out in EPA Manual 2750) normally allows. According to this manual, the action official is provided 30 days to respond to draft reports. We granted an additional 3 weeks that was requested. However, we declined a second extension requested by your office. On balance we believe that we were quite fair. 
Moreover, during our monthly meeting in May 2002, and in prior meetings, States expressed the desire to respond to the draft report. We left the decision of whether to include comments from the States up to your office.

Note 8: We do not agree with EPA's assessment. Throughout our report, we indicate that this evaluation was a design and early implementation review. During our discussions with EPA and States, we explained the issues we were developing and provided an opportunity for EPA to address issues early on, before programs become too ingrained.

Note 9: EPA misconstrued our point. We were not suggesting in our draft report that EPA has the authority to get involved in States' assessments of individual systems. Rather, EPA is required to ensure, as part of its withholding determinations, that States "continue to focus resources" on capacity development initiatives. (EPA expanded on this item at "Note 19" of its comments.)

Note 10: We appreciate that EPA may feel that we should have studied new as well as existing systems. Our decision to review existing systems was one that we made early in our review. As we point out in Note 1(a), we decided to review the capacity development program for existing systems because they comprise a substantial majority of community water systems that provide drinking water to the public. This decision to review existing systems was made during our preliminary research phase, and discussed with EPA as part of the initiation of field work in October 2001. We believed that this was a significant area to be reviewed. Additionally, our review was facilitated by the fact that EPA tracks and reports to Congress on the compliance of existing systems, as part of its GPRA performance reporting.

Note 11: This issue is addressed in Note 1(b).

Note 12: As we point out in Note 1(c), like most internal draft reports, this draft contained some errors. Correcting potential errors is in fact one of the principal reasons why we share our reports with program offices prior to release to the public. However, it should be noted that most of the items raised by EPA in this area were items requiring additional clarification rather than correction. We have added such clarification, where appropriate.

Note 13: See Notes 1(b) and 3.

Note 14: See Note 3.

50
-------

Note 15: See Note 3.

Note 16: We regret that EPA believes that our discovery during fieldwork of its failure to adequately measure performance (which it characterizes as "changing the focus" of our study) prevents us from drawing findings about this new information, and that the Agency is, therefore, unable to concur with our recommendations. We disagree with EPA's assessment that the data collection performed did not allow the States the ability to submit information directly to the OIG for consideration. The information collected on the performance measurement issues that related to the States was collected during field work, and reviewed by the States as part of the compilations we prepared and sent to the States for review. The national performance information relating to EPA is public information available on EPA's web site and, therefore, EPA should be able to address recommendations relating to performance measurement accordingly.

Note 17: See Note 3.

Note 18: See Note 8.

Note 19: See Note 9.

Note 20: See Note 10.

Note 21: During our evaluation, we specifically asked the States and Regions about the use of Environmental Finance Centers and Technology Assistance Centers.
We have identified on page 3 of the report that SDWA funded these tools to assist States in implementing SDWA.

Note 22: See Note 1(b).

Note 23: See Note 12.

Note 24: We agree that EPA's presentation of the 31 percent of DWSRF funding for use for capacity development provides more detail, and offers a better picture of the availability of set-aside funds for capacity development.

Note 25: We reorganized our recommendations. As a result, we dropped this recommendation from the final report. It is subsumed under other recommendations (notably Recommendation 4). If EPA, as directed in Recommendation 4, can capture results on a national level for capacity development success, a specific definition of adequate capacity may not be needed. We disagree, however, with EPA's assertion that a national standard definition of adequate T/M/F capacity would effectively eliminate flexibility. We suggest that EPA consider outcome-based measures of this critical program. Adequate capacity is an end result, which all systems should be expected to achieve. The tools (or outputs) that States use to assist systems do not have to be uniform - States can and should be allowed the flexibility to implement programs.

51
-------

Note 26: We completely disagree with EPA's view that the SDWIS/FED-based performance measure is adequate. EPA has identified significant errors in this information source. SDWIS/FED was designed to do one thing, but is being used for another. It is a repository of information about self-reported violations by public water systems. Only systems in non-compliance are tracked. SDWIS/FED was not designed to measure managerial or financial data. Therefore, at best, SDWIS/FED can provide some measure of technical capacity, but not managerial or financial capacity results. Also, as presented in our draft report, program managers make day-to-day decisions using annual performance plans, describing annual performance goals, and measuring outputs and outcomes, but EPA does not have any such information that specifically relates to capacity development. During our fieldwork, the Massachusetts Drinking Water Program Director stated that he was working with OGWDW on how States can capture results and measure success of their capacity development programs. EPA should continue these efforts, and work to identify and adopt national capacity development annual performance goals. Our recommendation remains in the final report and is now Recommendation 4(a).

Note 27: We do not share EPA's pessimism that nationally reportable measures are not available or effective to identify the contribution of capacity development to its overall GPRA objective. For example, OGWDW has been working with stakeholders to address data issues, including OGWDW's efforts to obtain more parametric data. The Chief of OGWDW's Infrastructure Branch said, during presentations that he held with various stakeholder groups, that there were perhaps 8 to 10 data elements that States probably already collect that EPA could use to better measure drinking water quality. As we stated in our report, performance measurement is not an easy process. However, we continue to believe that the Office of Water needs to address this issue, and work with its stakeholders and partners to get this process started, as part of the continued implementation of capacity development. Our recommendation remains in the final report, and is now Recommendation 4(b).
Note 28: We understand the difficulties EPA may face in implementing this recommendation. However, as presented in the draft report, States (such as Massachusetts) are gathering a variety of data on capacity development activities and analyzing this information annually to assess progress and identify trends for decision making and outcome measurement. As we recommended, EPA should partner with States, such as Massachusetts, to assist EPA in developing annual performance goals for capacity development. Our recommendation remains unchanged, and is now Recommendation 5(b).

Note 29: Depending on what capacity development annual performance goals are adopted, EPA would be in the best position to determine whether new information needs to be collected or, as EPA indicated, whether existing data collection efforts provide the data. Additionally, States, as they implement their capacity development strategies, should be gathering data to measure results against their baselines. Thus, States should already have information available for EPA to use. Our recommendation remains unchanged, and is now Recommendation 5(c).

Note 30: As we provided in the draft report, EPA has established annual performance goals for other drinking water activities. EPA believes that, in these instances, annual goals are useful for measuring progress. EPA also stated, in its response, that capacity development is a cornerstone of the drinking water program. Thus, an important program, such as capacity development, should be measured as well. EPA's program managers need periodic information on the progress being made in capacity development in order to make resource and policy decisions. Our recommendation remains unchanged, and is now Recommendation 5(d).

Note 31: For the final report, our recommendation remains unchanged, and is now Recommendation 2. However, the Assistant Administrator stated that EPA can provide informal guidance to Regions and States to perform annual assessments (as part of the withholding determination process), including appropriate criteria. In responding to the final report, EPA should provide an action plan and milestone dates for completing such actions. The guidance, however, should be provided formally, to ensure consistent application by Regions and States.

Note 32: Based upon our findings in the draft report, EPA needs to better articulate proactivity, integration, and accountability to ensure that States' capacity development efforts incorporate managerial and financial capacity assistance. A national strategic plan would include both short- and long-range goals from which to measure progress. EPA does not have such measures. For the final report, our recommendation remains unchanged, and is now Recommendation 1.

Note 33: We do not agree with EPA's response. Providing guidance does not necessarily impede flexibility, especially if the guidance will help States to successfully develop and implement their capacity development programs. As done with the current capacity development guidance, EPA can make changes to guidance using stakeholder participation, thus getting buy-in from stakeholders and ensuring that flexibility is maintained. Our recommendation remains unchanged, and is now Recommendation 1.

Note 34: Corrected.

Note 35: Agreed. We made changes to the final report accordingly.

Note 36: Agreed. We made changes to the final report accordingly.

Note 37: The information provided by EPA gives a more detailed explanation of what makes up the infrastructure gap.
No changes were made to the final report.

Note 38: The Assistant Administrator's speech is located on EPA's web site, at www.epa.gov/water/speeches/sustaining.html. The Assistant Administrator specifically stated in his speech that SDWA capacity development can assess the management of water systems so that they are able to provide T/M/F capacity, reduce long-term costs, and improve system performance.

Note 39: Corrected.

Note 40: Agreed. We made changes to the final report accordingly.

Note 41: EPA did not agree with our conclusion, stating that it can analyze existing data submitted by States, but that the data were not available at the time of our review to do this analysis. This implies that EPA, in the future, will be able to analyze such data and report results to Congress. As Recommendations 4(b) and 5(c) indicate, EPA should begin setting up a framework to review available data, as well as identify data gaps, and determine how it will report results to Congress. We have not changed our conclusion in the final report.

Note 42: We do not agree with EPA's assessment. State budget problems could and do impact how States will spend resources, including on such programs as capacity development. In fact, two States (Arizona and Illinois) specifically mentioned budget constraints affecting implementation of capacity development when they responded to the draft report.

Note 43: Agreed. We made changes to the final report accordingly.

Note 44: The purpose of providing this information is to demonstrate the needs that small systems have for capacity development assistance, and not to imply that HSNCs are a way to gauge States' drinking water problems. Also, as stated in EPA's response, States' initial capacity development efforts have focused on those systems that are the "'worst of the worst' ... many of which have been in and out of compliance for years and may resist State attempts to provide assistance." These systems would be HSNCs; therefore, the information in our report is relevant.

Note 45: Of the total number of HSNCs for 2002 (4,752 systems), 4,281 were small systems.

Note 46: We paraphrased this information from the 1996 Amendments, Section 3(2), which stated: "because the requirements of the Safe Drinking Water Act (42 U.S.C. 300f et seq.) now exceed the financial and technical capacity of some public water systems, especially many small public water systems, the Federal Government needs to provide assistance to communities to help the communities meet Federal drinking water requirements...."

Note 47: We obtained this information from EPA's web site, at www.epa.gov/OGWDW/kids/treat.html. This information was provided to give the general reader an understanding of the drinking water process.

Note 48: The purpose of providing this information was solely to give the reader an understanding of the value of the DWSRF and the capacity development program, and to illustrate the important responsibility that EPA and the States have to ensure adequate strategy implementation.

Note 49: Corrected.

Note 50: Agreed. Because of editorial considerations, this information is not presented in the final report.

Note 51: Agreed. Because of editorial considerations, this information is not presented in the final report.

Note 52: EPA's information is incomplete. We made a second visit to Illinois in May 2002, and met with OGWDW officials in August 2002. We obtained additional information regarding performance measurement in October and November 2002.
Therefore, field work completion is stated correctly as November 2002.

Note 53: During our evaluation, we did not encounter any statements contradicting the one made by the Capacity Development Coordinator. We have emphasized in the report that this is an early implementation evaluation, and our observations concerning performance measurement are based on the fact that EPA has yet to develop any national capacity development goals.

Note 54: This is our conclusion, based upon our analyses of various data collection efforts. In its response, EPA did not question this information specifically.

Note 55: This 2001 publication does not indicate that it was performed as a followup to the 1997 publication.

Note 56: EPA misunderstood our point that States could potentially lose out on the use of funds if they are not sufficiently implementing their capacity development strategies. While States may have expended only $30.83 million on capacity development activities, the fact remains that EPA's annual withholding determinations for fiscal years 2001-2003 could have impacted a total of $359 million in DWSRF funding to States.

Note 57: This is our conclusion, based on our evaluation of the design and early implementation of States' capacity development programs, and the supporting evidence presented in Chapter 3 of the draft report.

Note 58: We emphasize sanitary surveys because the States selected for review rely heavily on this drinking water program to gather information on T/M/F capacities and to target systems for assistance. Other drinking water programs were discussed in the draft and final reports, including operator certification and enforcement. We consider self-assessments to be a tool, not a drinking water program; therefore, we did not present this information in the report.

Note 59: We do not disagree with EPA's comments. As stated in Note 45, enforcement is discussed because this was a program that the six sampled States were using as part of their capacity development efforts.

Note 60: Agreed.

Note 61: We have corrected the language in the report to state that loan recipients either have achieved or will achieve T/M/F capacity through their projects.

Note 62: Agreed. The information was dropped from the final report.

Note 63: We have added information about Region 7's comprehensive evaluations and included, as Recommendation 3, that all EPA Regions adopt a similar means to assess States' capacity development programs.

Note 64: We partially agree with this statement by EPA. While SDWA does not specifically require EPA to review the reports to the Governors as part of its oversight responsibilities, EPA should look at reports like these as an opportunity to assess State progress.

Note 65: We believe that information and instructions that were provided to the Regions in an informal manner should be included formally in the capacity development guidance document.

Note 66: While we believe that the Capacity Development Coordinator's opinion sheds light on the difficulties faced in designing and implementing the capacity development program, his statement may give the impression that States should have had funds withheld. We have added clarifying language to the final report.

Note 67: Generally, the States presented the same issues raised by EPA in its response to our draft report. Therefore, we did not include the States' responses in the final report.
Appendix F
Report Distribution

Headquarters
Assistant Administrator, Office of Water (4101)
Director, Office of Groundwater and Drinking Water (4607)
Agency Followup Official (2201A)
Agency Followup Coordinator (2710A)
Audit Followup Coordinator, Office of Water
Associate Administrator for Congressional and Intergovernmental Relations (1301A)
Director, Office of Regional Operations (1108A)
Associate Administrator for Communications, Education, and Media Relations (1101A)
Audit Liaison, Office of Chief Financial Officer

Regions
Regional Water Directors

States
State Environmental Directors:
Arizona
Illinois
Massachusetts
Nebraska
South Carolina
Washington

Office of Inspector General
Inspector General (2410)