                     U.S. ENVIRONMENTAL PROTECTION AGENCY

                     OFFICE OF INSPECTOR GENERAL
                     EPA Must Improve Oversight of
                     State Enforcement

                     Report No. 12-P-0113
                     December 9, 2011

Report Contributors:                         Dan Engelberg
                                             Kathlene Butler
                                             Charles Brunton
                                             Anthony Chirigotis
                                             Julie Hamann
                                             Tiffme Johnson-Davis
                                             Genevieve Soule


OIG Report No. 12-P-0113, EPA Must Improve Oversight of State Enforcement, was reissued on
January 30, 2012. The original version of the report, issued December 9, 2011, contained an
administrative error in appendix C. On pages 38 and 39, the data underlying the
"RCRA Subtitle C (avg 2003-2009)" column heading, for both the "SNC" and "Penalty"
subheadings, contained a transposition error covering FY2007-2009. The OIG corrected this
error in the appendix. The correction also led to a change in our statement concerning North
Dakota's penalty assessments on page 15, and OIG responses 6 and 18 on pages 45 and 53. The
changes do not affect our findings and recommendations.


Abbreviations

CAA         Clean Air Act
CMS         Compliance Monitoring Strategy
CWA        Clean Water Act
ECHO       Enforcement and Compliance History Online
ECOS        Environmental Council of the States
EPA         U.S. Environmental Protection Agency
FTE         Full-time equivalent
FY          Fiscal year
GAO         U.S. Government Accountability Office
HPV         High-priority violation
NPDES       National Pollutant Discharge Elimination System
OECA       Office of Enforcement and Compliance Assurance
OIG         Office of Inspector General
OTIS         Online  Tracking Information System
RCRA       Resource Conservation and Recovery Act
SNC         Significant noncompliance
SRF         State Review Framework
   Hotline

   To report fraud, waste, or abuse, contact us through one of the following methods:

   e-mail:   OIG_Hotline@epa.gov
   phone:    1-888-546-8740
   fax:      202-566-2599
   online:   http://www.epa.gov/oig/hotline.htm

   write:    EPA Inspector General Hotline
             1200 Pennsylvania Avenue NW
             Mailcode 2431T
             Washington, DC 20460

                   U.S. Environmental Protection Agency
                   Office of Inspector General

                   At a Glance
                                                             12-P-0113
                                                       December 9, 2011
Why We Did This Review

We sought to determine
(1) whether the U.S.
Environmental Protection
Agency (EPA) set clear
national performance
benchmarks for state
enforcement programs, and
(2) to what extent EPA
headquarters holds regions
accountable and supports them
to ensure that all state
enforcement programs protect
human health and the
environment. The scope of our
review included selected
programs under three statutes:
Clean Water Act, Clean Air
Act, and Resource
Conservation and Recovery
Act.

Background

EPA is the steward of national
environmental protection, but
states serve as the first line of
enforcement in most cases. Past
reviews identified widespread
problems with state
enforcement.
For further information, contact
our Office of Congressional and
Public Affairs at (202) 566-2391.

The full report is at:
www.epa.gov/oig/reports/2012/20111209-12-P-0113.pdf

EPA Must Improve Oversight of State Enforcement

 What We Found
EPA does not administer a consistent national enforcement program. Despite
efforts by the Office of Enforcement and Compliance Assurance (OECA) and the
EPA regions to improve state enforcement performance, state enforcement
programs frequently do not meet national goals and states do not always take
necessary enforcement actions. State enforcement programs are underperforming:
EPA data indicate that noncompliance is high and the level of enforcement is
low. EPA does not consistently hold states accountable for meeting enforcement
standards, has not set clear and consistent national benchmarks, and does not act
effectively to curtail weak and inconsistent enforcement by states.

OECA has made efforts to improve state performance and oversight consistency,
but EPA does not manage or allocate enforcement resources nationally to allow it
to intervene in states where practices result in significantly unequal enforcement.
As a result, state performance remains inconsistent across the country, providing
unequal environmental benefits to the public and an unlevel playing field for
regulated industries. By establishing stronger organizational structures, EPA can
directly implement a national enforcement strategy that ensures all citizens have,
and industries adhere to, a baseline level of environmental protection. EPA could
make more effective use of its $372 million in regional enforcement full-time
equivalents by directing a single national workforce instead of 10 inconsistent
regional enforcement programs.
 What We Recommend
We recommend that EPA establish clear national lines of authority for
enforcement that include centralized authority over resources; cancel outdated
guidance and policies, and consolidate and clarify remaining enforcement
policies; establish clear benchmarks for state performance; and establish a clear
policy describing when and how EPA will intervene in states, and procedures to
move resources to intervene decisively, when appropriate, under its escalation
policy.

Based on EPA's suggestion in its response to our draft report, we recommend
that EPA develop a state performance scorecard. EPA did not agree with
recommendation 1, agreed with recommendations 2 through 4, and neither agreed
nor disagreed with recommendation 5. All recommendations are unresolved
pending EPA's corrective action plan.

                    UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
                                  WASHINGTON, D.C. 20460


                                                                    THE INSPECTOR GENERAL
                                   December 9, 2011

MEMORANDUM

SUBJECT:   EPA Must Improve Oversight of State Enforcement
             Report No. 12-P-0113
FROM:      Arthur A. Elkins, Jr.
             Inspector General

TO:         Bob Perciasepe
             Deputy Administrator

             Cynthia Giles
             Assistant Administrator for Enforcement and Compliance Assurance
This is our report on the subject evaluation conducted by the Office of Inspector General (OIG)
of the U.S. Environmental Protection Agency (EPA). This report contains findings that describe
the problems the OIG has identified and corrective actions the OIG recommends. This report
represents the opinion of the OIG and does not necessarily represent the final EPA position.
Final determinations on matters in this report will be made by EPA managers in accordance with
established audit resolution procedures.

Action Required

In accordance with EPA Manual 2750, you are required to provide a written response to this
report within 90 calendar days. You should include a corrective action plan for agreed-upon
actions, including milestone dates. Your response will be posted on the OIG's public website,
along with our memorandum commenting on your response. Your response should be provided
as an Adobe PDF file that complies with the accessibility requirements of Section 508 of the
Rehabilitation Act of 1973, as amended. The final response should not contain data that you do
not want to be released to the public; if your response contains such data, you should identify the
data for redaction or removal. We have no objections to the further release of this report to the
public. We will post this  report to our website at http://www.epa.gov/oig.

If you or your staff have any questions regarding this report, please contact Dan Engelberg,
Director for Enforcement and Water Program Evaluations, at (202) 566-0830 or
Engelberg.Dan@epa.gov.

                     Table of Contents
Chapters
   1    Introduction	    1

           Purpose	    1
           Background	    1
           Noteworthy Achievements	    3
           Scope and Methodology	    3

   2    EPA Must Improve Oversight of State Enforcement	    6

           OECA and Regions Have Made Efforts to Improve State Performance	    6
           State Enforcement Programs Are  Underperforming	    8
           OECA Does Not Have Sole Authority Over Regional Enforcement	   11
           EPA Headquarters Does Not Consistently Hold Regions Accountable	   11
           EPA Does Not Shift Resources to Intervene in Problem States	   17
           Conclusions	   19
           Recommendations	   21
           Agency Comments and OIG Evaluation	   21

   Status of Recommendations and Potential Monetary Benefits	   25



Appendices

   A   Details on Scope and Methodology	   26

   B   Prior GAO and EPA OIG Assessments	   32

   C   State Performance Analysis and Results	   35

   D   EPA Comments on Draft Report and  OIG Responses	   41

   E   EPA Attachments to Draft Comment  Memorandum and OIG Responses....   53

   F    Distribution	   74

                                Chapter  1
                                 Introduction

Purpose
             As documented in past reviews by the U.S. Environmental Protection Agency
             (EPA) Office of Inspector General (OIG) and others, inconsistent state
             enforcement of environmental statutes has been a persistent and widespread
             problem. We sought to determine whether EPA has set clear national performance
             benchmarks for state enforcement programs. We also sought to determine to what
             extent EPA headquarters supports regional enforcement activities and holds
             regions accountable for enforcement to ensure that all states protect human health
             and the environment.
Background
             The Clean Air Act (CAA), Clean Water Act (CWA), and the Resource
             Conservation and Recovery Act (RCRA) establish programs to protect Americans
             from environmental pollution. EPA is responsible for ensuring that environmental
             goals are met and that programs are consistently applied nationwide. While EPA
             serves as the steward of many national environmental policies, it relies on states
             to do the bulk of environmental enforcement. At EPA, the Office of Enforcement
             and Compliance Assurance (OECA) is the primary entity setting national
             enforcement policy and monitoring regional oversight of state programs.

             National consistency ensures that all Americans live in states that meet minimum
             environmental standards. Environmental pollution often does not remain
             contained within state borders, so pollution in one state may cause damage in
             another. National consistency is also important because it levels the playing field
             among regulated entities, ensuring that those regulated facilities that fail to
             comply with the law do not have an unfair economic advantage  over their law-
             abiding competitors. EPA emphasizes that national consistency  exists in
             conjunction with flexibility to allow national policy and guidance to
             accommodate differences in regions and states.  For example, the CAA allows a
             state to submit implementation plans  as long as  EPA agrees that the state's
             program meets the minimum requirements of the CAA. States may also have
             programs that are more stringent than required by federal rules.

             Under EPA's organizational structure, regional  offices and headquarters program
             offices like OECA all report directly to the EPA Administrator.  As such, OECA
             does not have sole authority over regional enforcement activities.
             EPA Can Authorize States to Act on Its Behalf

             The CAA, CWA, and RCRA provide EPA the authority to authorize states to
             enact and enforce some programs if states request authorization and EPA
             approves their basic program. Most states have acquired this authority. The EPA
             Administrator delegated the responsibility for approving state authorization
             requests to the Regional Administrators. This arrangement establishes a chain of
             authority for enforcement that begins with Congress, extends to the EPA
             Administrator, then jointly to the OECA Assistant Administrator and EPA
             Regional Administrators, and finally to authorized state environmental agencies.

             EPA Oversees States and Retains Independent Enforcement
             Authority

             Although most states have received authorization to administer most programs
             under the CAA, CWA, and RCRA, EPA retains a vital role in ensuring that states
             implement nationally consistent programs that meet federal requirements.
             According to federal regulations, EPA should provide appropriate oversight so
             that it knows when states fail to meet their federally mandated enforcement
             commitments. As such, EPA must monitor states to keep apprised of their
             enforcement activities, a task largely left to EPA regions.

             By authorizing states to  enforce portions of these acts, EPA does not forfeit its
             authority to  continue to conduct its own inspections and take action against
             violators, particularly in cases where violations are widespread or related to a
             national priority. Not only  can EPA engage in independent enforcement activities
             in states, but it can also take action against a facility when it determines a state
             either did not act or did not take strong enough action against the facility for  a
             violation (generally referred to as "over-filing"). Ultimately, EPA retains the
             authority to  withdraw state program authority if a state is not taking appropriate
             enforcement actions against violators.

             EPA Strives for Shared Accountability With States

             EPA Administrator Lisa P. Jackson identified the need to build strong state and
             tribal partnerships as one of the Agency's seven priorities. In her October 2009
             message to the House Transportation and Infrastructure Committee, she said  that
             the enforcement performance of states has varied greatly. The Administrator
             stated in the Agency's seven priorities that strong partnerships and accountability
             are very important. She said EPA must do its part to support state and tribal
             capacity and, through strengthened oversight, ensure that EPA delivers programs
             consistently, nationwide.

             In a 1997 issue paper, Cynthia Giles, then Region 3 Enforcement Director and
             currently EPA's Assistant Administrator for  OECA, noted that it is EPA's job to
             oversee what states are doing, to ensure that  all citizens have equal opportunity to
             enjoy the benefits of good health and a clean environment.1 As one of EPA's
             2011 enforcement goals, OECA set out to reset EPA's relationship with states, to
             better ensure that the Agency and states both meet their environmental protection
             commitments.2 OECA intends to achieve this goal through shared accountability
             and strengthened oversight. OECA is in the process of establishing new models
             for shared accountability and strengthened oversight in the CWA program
             through its 2009 Clean Water Act Action Plan. It intends to pursue similar
             initiatives for the CAA and RCRA.
Noteworthy Achievements
             EPA has taken a number of steps to improve oversight of state enforcement. In
             2004, EPA and others established the State Review Framework (SRF), a national
             system for reviewing state enforcement performance with respect to the CAA,
             CWA, and RCRA. In 2009, OECA began developing a revised CWA reporting
             procedure, and developed the Clean Water Act Action Plan. We present more
             information on these achievements in chapter 2.
Scope and Methodology
             We conducted this evaluation in accordance with generally accepted government
             auditing standards. Those standards require that we plan and perform our work to
             obtain sufficient, appropriate evidence to provide a reasonable basis for our
             findings and conclusions based on our audit objectives. We believe that the
             evidence obtained provides a reasonable basis for our findings and conclusions
             based on our evaluation objectives. We conducted our evaluation from September
             2010 to July 2011.

             We evaluated EPA's performance under three environmental laws to determine
             (1) whether EPA has set clear national performance benchmarks for state
             enforcement programs, and (2) to what extent EPA headquarters holds regions
             accountable and supports them in their activities. We assessed EPA oversight of
             state enforcement under the CAA Title V program, the CWA National Pollutant
             Discharge Elimination System (NPDES) program, and the RCRA Subtitle C
             program. We assessed EPA oversight for all 50 U.S. states and across all 10 EPA
             regions.

             To answer our objectives, we looked to several data sources, including EPA
             enforcement data, available SRF reports, a 10-region survey, and testimonial and
             documentary evidence gathered in interviews with seven states, EPA headquarters
             personnel, and enforcement personnel in six regional offices.
1 Cynthia Giles, Region 3 Enforcement Director, "Aiming Before We Shoot," 1997.
2 "EPA Enforcement Goals," http://www.epa.gov/compliance/data/planning/initiatives/goals.html.
              To understand whether state enforcement activities varied over time and
              geographically, we evaluated the enforcement data reported to EPA in fiscal years
              (FYs) 2003-2009. After consultation with OECA, we identified three key
              indicators of state enforcement activity: (1) the percent of facilities states
              inspected each year, (2) the percent of inspections that resulted in a state
              identifying significant noncompliance or a high-priority violation, and (3) the
              percent of violations that resulted in the state issuing a formal enforcement action
              with a penalty.
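
               To illustrate how these three indicators could be computed, the sketch below
               derives each one from hypothetical per-state annual counts. The record fields,
               function names, and example values are illustrative assumptions, not EPA
               database fields or the OIG's actual calculation.

```python
# Minimal sketch (illustrative only): computing the three state enforcement
# activity indicators from hypothetical per-state, per-year counts.
from dataclasses import dataclass


@dataclass
class StateYearCounts:
    state: str
    fiscal_year: int
    regulated_facilities: int            # size of the state's regulated universe
    facilities_inspected: int
    inspections: int
    inspections_finding_snc_or_hpv: int  # inspections identifying SNC or HPV
    violations: int
    penalty_actions: int                 # formal actions that included a penalty


def enforcement_metrics(c: StateYearCounts) -> dict:
    """Return the three indicators, as percentages, for one state-year."""
    def pct(num: int, den: int) -> float:
        return 100.0 * num / den if den else 0.0

    return {
        "pct_facilities_inspected": pct(c.facilities_inspected, c.regulated_facilities),
        "pct_inspections_finding_snc_hpv": pct(c.inspections_finding_snc_or_hpv, c.inspections),
        "pct_violations_with_penalty": pct(c.penalty_actions, c.violations),
    }


# Hypothetical example record for one state and fiscal year
example = StateYearCounts("XX", 2009, 400, 260, 300, 45, 90, 18)
print(enforcement_metrics(example))
```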

              Each of these metrics comes with several caveats about the underlying data.
              (Many of these caveats are explained in appendix A, and were previously reported
              by OIG.3) However, the OIG conducted a data reliability assessment and
               determined that the metrics offer a valid method for comparing state
               activities, taking several key factors into consideration. Among them:

                 •  The metrics factor in the size of a state's  regulated universe by using
                     percentage of facilities inspected.

                 •  Because we drew the metrics and their values from EPA databases, they
                     provide the same information the public sees when viewing state
                     enforcement performance on EPA websites.4

                 •  The three metrics represent two priority issues EPA identified as national
                     weaknesses in its initial SRF reviews.

                 •  The metrics allow for state-to-state performance comparison based on
                     priority issues to EPA. However, without corroboration they do not offer
                     reliable details about individual state performance.

              Over the 7-year period FYs 2003-2009, EPA enforcement data for these metrics
              show that some states performed well across the board, some states performed
              well in one or two statutes, and some states performed poorly across the board.
              Over the same period, state performance remained relatively consistent, though a
              few states improved over the period and a few others declined. To better
              understand state performance issues, we gathered additional evidence in states
              that ranked in the bottom quartile for two or three statutes.
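
               The sketch below illustrates, with invented numbers, the kind of calculation
               used to rank states: averaging a metric's values over FYs 2003-2009 for each
               state and flagging states whose average falls in the bottom quartile. The
               state codes and scores are hypothetical, and this is not the OIG's actual
               analysis code.

```python
# Minimal sketch (illustrative only): average one metric over FYs 2003-2009
# per state, then flag states at or below the first quartile cut point.
import statistics


def average_by_state(yearly_scores: dict) -> dict:
    """yearly_scores maps state -> list of the metric's FY 2003-2009 values."""
    return {state: statistics.mean(vals) for state, vals in yearly_scores.items()}


def bottom_quartile(averages: dict) -> set:
    """States whose average falls at or below the 25th percentile."""
    cutoff = statistics.quantiles(list(averages.values()), n=4)[0]
    return {state for state, avg in averages.items() if avg <= cutoff}


# Hypothetical values for one statute and one metric (percentages by year)
scores = {
    "AA": [60, 62, 58, 61, 59, 63, 60],
    "BB": [20, 18, 22, 19, 21, 17, 23],
    "CC": [45, 47, 44, 46, 43, 48, 45],
    "DD": [70, 72, 71, 69, 73, 68, 74],
    "EE": [30, 28, 32, 31, 29, 33, 27],
}
print(bottom_quartile(average_by_state(scores)))  # prints {'BB'} for these values
```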

              Our in-depth analysis focused on  states for which EPA data showed (1) persistent
              problems meeting national enforcement goals related to the SRF, (2) states
3 See, for example, EPA Needs to Improve Its Recording and Reporting of Fines and Penalties, Report No. 10-P-
0077; EPA Could Improve RCRAInfo Data Quality and System Development, Report No. 11-P-0096; ECHO Data
Quality Audit - Phase I Results: The Integrated Compliance Information System Needs Security Controls to Protect
Significant Non-Compliance Data, Report No. 09-P-0226; and ECHO Data Quality Audit - Phase 2 Results: EPA
Could Achieve Data Quality Rate With Additional Improvements, Report No. 10-P-0230.
4 The public database (Enforcement and Compliance History Online, or ECHO) does not provide data on all three
metrics for all years we included here.
             showing recently improved performance based on the three metrics we chose, or
             (3) states showing comparatively high performance. From this analysis, we chose
             to interview the states of South Carolina, Illinois, Louisiana, Iowa, Colorado,
             North Dakota, and Alaska, and enforcement officials in the associated EPA
              Regions 4, 5, 6, 7, 8, and 10.

             Appendix A provides additional details on the scope and methodology, and
             appendix B presents information on prior U.S. Government Accountability Office
             (GAO) and EPA OIG assessments related to state enforcement issues.
                                Chapter  2

   EPA  Must Improve Oversight of State Enforcement

             EPA does not administer a consistent national enforcement program. Despite
             OECA efforts to improve state enforcement performance, state enforcement
             programs frequently do not meet national goals, and states do not always take
             necessary enforcement actions. OECA, the EPA office responsible for
             enforcement, does not have sole authority over a national civil enforcement
             workforce or state oversight.  OECA cannot set clear and consistent benchmarks
             for state enforcement performance, or assure consistent oversight of state
             enforcement. EPA does not allocate enforcement resources according to the
             enforcement workload or high-priority state enforcement problems. As a result,
             EPA's enforcement program  cannot assure equal and sufficient protection of
             human health and the environment to all U.S. citizens or consistent enforcement
             of regulated entities.

OECA and Regions Have Made Efforts to  Improve State Performance

             In response to internal and external critiques of EPA's oversight of state
             enforcement activities, OECA has made efforts to improve state  performance and
             standardize EPA oversight by implementing the SRF, and through  several other
             headquarters efforts. Meanwhile, some regions have succeeded in improving state
             performance in some cases when they used existing tools to intervene in states.

             EPA Established the State Review Framework

             In 2004, EPA worked closely with the Environmental Council of the States
             (ECOS) to establish the SRF, a national system for reviewing state enforcement
             performance under the CAA, CWA, and RCRA. Under this system, EPA regions
             evaluate states on 12 nationally consistent elements as a means of consistently
             judging state performance, and comparing individual states to national goals.5
             OECA based many of the metrics for each element on metrics developed in EPA
             policies and guidance. The SRF allows regions to recommend improvements to
             the states in public reports. EPA conducted the first round of reviews in
              FYs 2004-2007 and is conducting the second round from FY 2008 through
              the end of FY 2012.6

             GAO reported that EPA has improved its oversight of state enforcement programs
             by implementing the SRF as  a consistent approach for overseeing the programs.
             Our evaluation verified some benefits from the SRF, and regional enforcement
5 There is a 13th optional element for compliance assistance. For this evaluation, the OIG used three of EPA's SRF
measures as proxies for state enforcement quality, those related to inspection coverage, identification of violations,
and penalty assessment.
6 To see all final SRF reports for Rounds 1 and 2, visit http://www.epa.gov/compliance/state/srf.
             officials said that the program was beneficial. First, posting reports on the EPA
             website put public pressure on states. Second, the SRF has raised the enforcement
             discussion to the level of state agency commissioners.

             An external review of the Round 1 SRF reports identified four major issues
             impairing state enforcement: data quality, identification of significant
             noncompliances (SNCs) or high-priority violations (HPVs), penalty calculation,
             and taking "timely and appropriate" enforcement actions. While OECA indicated
             that these were problems, it has yet to issue formal guidance on how to address
             the four issues, and has since focused its attention on other activities.

             EPA Undertook Additional Efforts

             In June 2010, EPA issued the Clean Water Act Action Plan Interim  Guidance in
             an attempt to target resources toward the biggest threats to water quality
             regardless of their size, and to enforce vigorously against unpermitted and illegal
             discharges. The action plan commits EPA to establish clear expectations for state
             performance, hold states consistently accountable, and hold itself to the same
             standards where EPA implements programs. Finally, the action plan commits
             EPA to provide more complete, accurate, and timely information to  the public. By
             doing so, EPA hopes to enlist the public as an ally to press the regulated
             community and government for stronger accountability. The interim guidance
             directs that (1) EPA regions and states expand NPDES annual planning to include
             consideration of enforcement and permitting in an integrated way, and (2) EPA
             regions take action in states that have demonstrated long-standing problems with
             their permit quality or enforcement programs.

             EPA is currently developing an electronic reporting rule that will require NPDES
             permittees to submit facility discharge, monitoring, and reporting information
             electronically. During the course of this evaluation, EPA was in the  process of
             modifying the noncompliance and reporting requirements for the NPDES
             delegated state programs. In response to a 2010 EPA OIG report, EPA has agreed
             to develop criteria and regularly review and update EPA-state memoranda of
             agreement.7 In addition, to increase transparency, EPA launched the CWA trends
             map and annual noncompliance report tool using FYs 2008 and 2009 CWA data.
             This tool allows the public to access  basic mapped data on assessed  penalties,
             inspection rates, and formal actions taken for all major and nonmajor facilities.

             Regions Successfully Used Existing Tools

             EPA regions have tools to intervene in states that are not enforcing environmental
             laws according to EPA's expectations.  These tools range from minor (phone calls,
             independent EPA inspections) to moderate (objecting to a state permit, taking
             independent enforcement actions, or over-filing in the state), to revoking a state's
7 EPA OIG, EPA Should Revise Outdated or Inconsistent EPA-State Clean Water Act Memoranda of Agreement,
Report No. 10-P-0224, September 14, 2010.


             authorization in the most severe cases. Our interviews showed that moderate
             intervention tools succeeded more often than other tools in motivating states to
             make fundamental changes to enforcement programs:

                 •   Region 4 objected to and assumed issuance authority for over 20 South
                    Carolina CWA permits in 2004, and took independent enforcement actions
                    on several other permits because the state legislature passed legislation
                    that conflicted with federal regulations. The EPA intervention enforced the
                    law when the state could not, and led to a state legislative change that
                    brought South Carolina's water protection up to federal standards in this
                    area.

                 •   Region 8 took over storm water inspections for Colorado in 2006 when
                    the state was not taking enforcement actions against violators. The region
                    took its own enforcement actions, and EPA enforcement data show that
                    the state program has increased its enforcement activity.

                 •   Region 7 took 15 enforcement actions against concentrated animal
                    feeding operations in Iowa in 2010. In coordination with the Iowa
                    Department of Natural Resources, Region 7 used aerial flyovers to
                    identify discharging concentrated animal feeding operations for sampling
                    inspections. Because of the enforcement effort, many of the operations
                    applied for NPDES permits and began constructing runoff controls.

             With decisive EPA oversight and accountability, state enforcement deficiencies
             such as these are more likely to be resolved.

State Enforcement Programs Are Underperforming

             Despite EPA's efforts, state enforcement programs frequently do not meet
             national enforcement goals.8 We compared state performance against some of
             OECA's national enforcement goals, and compared states to  each other based on
             three enforcement metrics. We found that states did not meet some national goals,
             and some  states performed far below the average.

             Using the  three metrics described in chapter 1, we assessed the average
             performance reported in EPA databases for  each state and each statute over the
             FYs 2003-2009 analysis period. These data show that performance was low
             across the board, and the range of average performance was wide.

             All of the  regions except for Region 2  included at least one state that performed in
             the bottom quartile in the different program analyses. In some regions, EPA staff
             and officials corroborated our assessment by agreeing that particular states in their
             region presented oversight challenges, while in other regions, EPA officials did
8 According to 2010 SRF data from EPA's Online Tracking Information System.
              not agree that any state in their region did not perform well. (See appendix C for
              state-level data.)

               State Programs Frequently Do Not Meet National Goals

              The CAA, CWA, and RCRA, and their associated regulations, establish minimum
              federal requirements for authorized state programs. The regulations require that
              states have minimum enforcement capabilities to receive authorization for these
              programs; for example, states must have the ability to assess civil penalties.
              OECA uses policies, guidance, and other programs to communicate additional
              expectations to the states on a program-by-program basis. Our review and
              analysis of EPA's enforcement data show that state enforcement performance
              frequently does not meet national goals set in these documents. (See appendix C,
              table C-2 for additional comparison data.)

                 CAA Title V Performance

                    •  EPA set a national goal that states inspect 100 percent of major CAA
                        emitters every 2 years.9 States inspected an average of 89 percent of
                       these facilities in the 2-year period, but only eight states met the
                        100 percent goal.

                    •  EPA set a national goal that states enter 100 percent of high-priority
                        CAA violations into EPA data systems within 60 days.10 The national
                        average indicates that states only enter 35 percent of HPVs in that time
                       frame, and only two states met the 100 percent goal.

                 CWA NPDES Performance

                    •  Because the CWA major inspection goal changed in 2007, we looked
                        at this metric for 2006 (prior to the change) and for 2010. In 2006, the
                       national goal was that states inspect 100 percent of majors every  year.
                        The national average was only 66 percent of facilities inspected,  and
                        only one state met the goal by inspecting 100 percent of facilities.

                    •  In 2007, EPA issued a new Compliance Monitoring Strategy (CMS)
                       that reduced the national goal so that states were to inspect 100 percent
                        of CWA majors every 2 years, starting in 2009. The national average
                       for 2010,  the year after the new goals went into effect, was only
                        61 percent. While two states met the 100 percent inspection goal,
                        13 states inspected fewer than 50 percent of major facilities, indicating
                       that they were not on track to meet the goal of 100 percent every
                       2 years.
9 EPA, "Issuance of the Clean Air Act Stationary Source Compliance Monitoring Strategy," September 2010.
10 EPA, "Issuance of Policy On Timely and Appropriate (T&A) Enforcement Response to High Priority Violations
(HPVs)," December 1998.
                  RCRA Subtitle C Performance

                     •  EPA set a national goal that state agencies inspect 100 percent of large
                        quantity waste generators every 5 years.11 In 2010, states had only
                        inspected an average of 62 percent of these facilities for the 5-year
                        period just ending, and only two states met the 100 percent inspection
                        goal.

              See appendix C for additional comparisons of performance to goals.

               State Enforcement Is Inconsistent

              Our analysis shows that in addition to not meeting goals, state enforcement is also
              inconsistent among states and between EPA regions. Based on  an average of three
              enforcement measures for each program (percentage of facilities inspected,
              percentage of SNCs or HPVs identified per inspection, and percentage of final
              actions with penalties for FYs 2003-2009), we found that state performance
              varied widely across the country.12 For example, CAA performance varied by
              almost 50 percentage points. This range in state enforcement activity illustrates
              that some states inspected facilities, identified violations, and/or assessed
              penalties for violations at a much  higher rate than  other states.
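
               As a rough illustration of this comparison, the sketch below averages a
               state's three metric percentages into a single composite score for one
               program and reports the spread across states. The states and values are
               hypothetical, and the unweighted mean is an assumption about how the
               measures were combined.

```python
# Minimal sketch (illustrative only): combine the three metric percentages into
# one composite score per state and report the range across states.
import statistics


def composite_score(pct_inspected: float, pct_snc_hpv: float, pct_penalty: float) -> float:
    """Unweighted mean of the three enforcement metric percentages."""
    return statistics.mean([pct_inspected, pct_snc_hpv, pct_penalty])


# Hypothetical per-state averages for one program over FYs 2003-2009:
# (pct facilities inspected, pct inspections finding SNC/HPV, pct violations with penalty)
states = {"AA": (95, 12, 40), "BB": (50, 4, 10), "CC": (80, 9, 25)}

scores = {state: composite_score(*vals) for state, vals in states.items()}
spread = max(scores.values()) - min(scores.values())
print(scores)
print(f"range across states: {spread:.1f} percentage points")
```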

              These programs are complex, and differences exist at all levels of analysis. In
              addition to the significant variations from state to  state, there was often a lack of
              uniformity even within states. Many of the highest-performing  states in our
              overall analysis did not meet all of the national performance goals. For example,
              Alabama ranks the highest for CAA but does not meet the SRF stated goal that
              states identify HPVs at a  rate greater than or equal to half the national average,
              and also does not meet the goal that states enter 100 percent of HPVs into EPA
              data systems within 60 days.

              Average state performance  varied by EPA region  as well. Because regional
              approaches to state oversight varied, state performance varied depending, in part,
              on the region overseeing the state. Using the same aggregate performance
              measure, we summarized state performance by region and found that performance
              at a regional level of analysis varied as well.
11 EPA, "Compliance Monitoring Strategy for the Resource Conservation and Recovery Act (RCRA) Subtitle C
Program," January 2010.
12 Because these measures combine three different metrics, they are most useful for comparing state performance
and showing the performance distribution, and they are not useful for making inferences about an individual state's
performance. Appendix A includes a discussion of the context and limitations associated with measuring and
interpreting these metrics. As described in appendix A, this analysis did not include programs in which EPA retains
sole enforcement authority.
OECA Does Not Have Sole Authority Over Regional Enforcement

             Although EPA intends for OECA to be responsible for national enforcement
             policy, OECA does not have sole authority over EPA's national enforcement
             workforce or state oversight. Instead, EPA's  10 Regional Administrators have
             oversight authority over state enforcement; they apply regional criteria about what
             constitutes good state performance and determine when and how to intervene if
             they feel that states are not performing to expectations.

             OECA's Assistant Administrator has direct authority over national criminal
             enforcement, but only over the headquarters-based civil enforcement program.
             OECA lacks critical region-level program information; OECA officials have
             requested information from the regions but the information received or calculated
             was incomplete or otherwise known to be inaccurate. Without this basic
             knowledge,  OECA cannot effectively or efficiently manage its national program.
             OECA's lack of direct authority over national enforcement responsibilities and
             resources has continued to result in inconsistent enforcement and  state oversight.

             In the absence of direct lines of authority between OECA and EPA's regional
             enforcement workforce, OECA must persuade regions to cooperate with national
             initiatives and uniform oversight practices. The OECA Assistant Administrator
             stated that OECA must apply persistent pressure to get some regions to intervene
             in difficult states.

             OECA uses  a number of tools such as the SRF, watch lists for problem facilities,
             national program managers' guidance, the EPA performance measurement
             system, and  annual region reviews to encourage regions to participate in and
             contribute to a national enforcement program. These may be useful tools for
             managing the national program. However, OECA still does not have direct
             authority over EPA's regional enforcement workforce or state oversight, and
             cannot influence state performance directly. In our opinion, EPA headquarters and
             OECA, the office responsible for establishing a national enforcement program,
             cannot hold  regions accountable for state enforcement of CAA, CWA, and RCRA
             requirements.

EPA Headquarters Does Not Consistently Hold Regions Accountable

             EPA  has not consistently held regions accountable for ensuring that states
             adequately enforce environmental laws. EPA has not set clear and consistent
             national benchmarks for state performance, and has not held regions accountable
             for abiding by national oversight guidance.

             OECA Has Not Set Consistent Benchmarks

             State enforcement programs are complex and varied. However, as the national
             steward of environmental enforcement, EPA must set national benchmarks that
              establish the enforcement expectations that states and EPA agree to meet. EPA
              has not set consistent benchmarks for state performance. As a result, EPA cannot
              hold EPA regions accountable for ensuring that states meet a national standard,
              and headquarters does not objectively know which states require immediate
              intervention. If state performance exceeds the national benchmarks, states and
              regions can then address local priorities.13

              Some state officials we spoke with were not opposed to EPA ranking their
              performance against that of other states. One state enforcement official said that it
              would be interesting to know how his state program compared to others so that he
              could ensure that his state program is performing as well as possible.

              We interviewed enforcement officials in six EPA regions and seven state
              environment agencies about how they evaluate enforcement program quality. We
              also reviewed the CAA, CWA, and RCRA for benchmarks contained within the
              law, and reviewed hundreds of associated EPA guidance and policy documents.
              We found that none of these information sources outlined clear and consistent
              benchmarks. Most of the performance requirements established in the laws and
              regulations are not easily measurable. For example, the regulations require
              appropriate penalties, but do not define "appropriate".  To bridge the interpretive
              gap, EPA has  created policies and guidance to establish expectations for a
              national enforcement program. Generally, states are not bound by EPA policy or
              guidance that EPA has not codified in the laws and regulations. However, if
              authorized states do not abide by EPA policies and guidance, EPA retains the
              authority to step in.

                 Headquarters Policies and Guidance Not Clear

                 EPA regulations and guidance documents do not set  clear and consistent
                 benchmarks for state enforcement performance. EPA has not consolidated and
                 clarified the long lists of documents to show which ones are active and
                 important to EPA, the documents sometimes conflict with one another, and
                 EPA regional offices interpret the documents differently.

                 Through its many policies and guidance documents,  EPA has given states
                 information on how to run their enforcement programs. However,  EPA has not
                 clearly identified the key documents for operating a state program. According
                 to an OECA official, OECA's internal websites list the most significant
                 enforcement documents for managing state enforcement programs. In these
                 lists, we found 9 guidance or policy documents for the CAA, 22 for the CWA,
                 and 13 for RCRA. However,  a public OECA Internet site that contained
                 enforcement policy and guidance documents listed over 200 documents for the
13 Memoranda of agreement between EPA and states are one tool for setting performance expectations for EPA and
states, but the OIG previously found that CWA memoranda of agreement did not set consistent, national
benchmarks. EPA OIG, EPA Should Revise Outdated or Inconsistent EPA-State Clean Water Act Memoranda of
Agreement, Report No. 10-P-0224, September 14, 2010.


                 CAA, over 240 for RCRA, and listed 69 as important for the CWA. This
                 inconsistency makes it difficult for state enforcement officials to know which
                 documents are the most critical and where to find them; one state official told
                 us that there was policy "overload."

                 In addition, some policies have been in interim status for more than a decade,
                 such as the "Interim Clean Water Act Settlement Penalty Policy," which was
                 issued as an interim policy in 1995, and is indicated as a key document in
                 EPA's SRF guidance. EPA has also updated some documents through
                 memoranda and amendments rather than full updates of the documents. This
                 approach may confuse states and reduce the efficiency of the day-to-day
                 program operations as states search for the appropriate guidance.

                 Policies and Guidance Conflict

                 In some situations, the guidance and policy documents conflict with one
                 another. For example, OECA updated the NPDES CMS in 2007 to establish a
                 new goal that states inspect all major facilities every 2 years rather than
                 annually. However, SRF reports still measured state performance against the
                 previous annual goal, and the SRF Plain Language Guide for CWA did not
                 indicate that the compliance monitoring system changed the inspection
                 frequency.

                 In another example, state officials expressed confusion about whether EPA's
                 CAA and RCRA Enforcement Response Policies and the SRF requirements
                 were coordinated. The officials were concerned that their state SRF report
                 could potentially indicate that the state did not meet program requirements
                 when they were abiding by the relevant EPA enforcement response policy.

                 Regions  Develop Independent  Interpretations

                 Without clear and consistent guidance from headquarters, regions develop
                 their own guidance and interpretations. Regional officials said that their states
                 need a lot of guidance because parts of the regulations are confusing, unclear,
                 or outdated. Enforcement officials in one region said that their states often
                 interpret EPA's policies differently, especially those that are not clear.
                 Enforcement officials in this region said that EPA issues numerous new
                 regulations without guidance for implementation, which makes it difficult for
                 the states and regions to adopt the new regulations. They said that as a result,
                 states are not implementing some new regulations. In one region, officials
                 stated that headquarters should provide more guidance and training to keep
                 regions and states up to date with continuously changing regulations. Officials
                 from this region said that headquarters relies too  much on the regions to
                 develop their own guidance and outreach materials, which they said should be
                 headquarters' responsibility.
             The absence of consistent benchmarks creates confusion about what EPA sees as
             a good state program and contributes to inconsistent performance. Staff and
             officials in EPA regions told us that it would be helpful if OECA clarified its
             national policy and guidance, and codified national enforcement standards in
             regulations. Staff and officials in states told us it would be helpful if EPA reduced
             policy "overload" and "overlap" by centralizing, finalizing, and keeping guidance
             up to date, and by codifying pertinent guidance in regulations. Without clear and
             consistent national benchmarks, EPA cannot operate a national enforcement
             program that consistently interprets and enforces environmental laws.

             OECA Has Not Ensured That EPA Regions Consistently Oversee
             States

             EPA regions vary in the stringency of their oversight. As a result, EPA is not
             operating a nationally consistent enforcement program, and EPA headquarters
             cannot prioritize  state enforcement program  improvements based on performance.
             Despite issuing guidance on conducting SRF oversight reviews, EPA regions did
             not consistently conduct or report on their reviews. In addition, enforcement
             officials in three  of the seven states we interviewed said that some regions held
             their states to a more stringent standard than others did because performance
             benchmarks vary by EPA region.

             EPA's SRF reviews demonstrate inconsistency among EPA regional approaches
             to oversight. EPA designed the SRF to be a nationally consistent review of state
             enforcement programs. OECA held training workshops across the nation and
             distributed almost 50 guidance documents to regions on how to conduct SRF
             reviews and write the associated reports. Despite OECA's efforts to standardize
             the process, regions implemented the SRF process inconsistently. OECA did not
             successfully enforce consistency in the state reports through its guidance or its
             headquarters review processes.

             Regions' SRF reports differed in both quantity and quality of information about
             the states. The discrepancies limit the reader's ability to find useful information in
             the reports or to compare reports for different states. For example, Region 6's
             Round 1 SRF report on Oklahoma was 22 pages long, and Region 7's on Kansas
             was 225 pages. However, we found that both reports were lacking in information
             on certain statutes and metrics. In contrast, Region 6's report on Arkansas was
             only 27 pages, but was relatively complete.

             After receiving the first 25 Round 1 reports, OECA issued a "Guide to Writing SRF
             Reports (Interim  Final)" in April 2007. In the accompanying transmittal memo,
             OECA said, "To  date, inconsistencies in the information available in the SRF
             reports inhibit our ability to determine if the reviews are being carried out in a
             consistent manner." OECA offered a report formatting guide and eight pages of
             guidance on completing the reports, and noted that regions were  "encouraged (but
             not required) to follow the format of the guide."  Despite the guidance, the OIG
             found similar inconsistencies in Round 2 SRF reports with the shortest report
             (Region 2, New Jersey) at 39 pages and the longest report (Region 7, Iowa)
             reaching 220 pages.

              The style and content of the reports also vary widely. Some regions were critical
             of their states in the SRF reports while others emphasized their states'
             accomplishments over their problems. In addition, the level of detail varied. For
             example, three Round 2 SRF reports did not include quantitative values for
             multiple metrics in their findings tables. Seven of the Round 2 reports did not
             provide national average and/or national goal information for many of the
             findings. Because these regions did not present the requirements along with the
             performance information, it was difficult for the reader to determine whether the
             state met requirements.

             Regions Have Not Effectively Curtailed  Weak and Inconsistent
             Enforcement by States

             EPA regions do not consistently intervene in states to correct deficiencies.  In a
             region where states performed better than average based on our analysis, a state
             enforcement director said that he felt that his region held its states to a higher
             standard  of performance than other regions do. A state enforcement manager in a
             different region agreed, saying  that the state's region held the state to standards
             that are higher than those  in other regions. Multiple state directors called for
             consistent benchmarks and oversight nationwide so that all regions hold all states
             to the same standards.

             OECA, regional, and state enforcement officials agree that states are
             underperforming. However, regional efforts to  correct deficiencies have not
             consistently led to changes in state performance because regions do not always
             assertively intervene in state programs when issues arise. The states highlighted
             below were among the lowest-ranked states across programs in FYs 2003-2009
             based on the average of the performance metrics we chose. In the following
             examples, to date, EPA regions had not acted decisively enough to improve
             performance. In each of the states, enforcement officials had a different reason for
             ranking low according to these  metrics, but in each case, the EPA region did not
             substantively change the states' performance.

                 •   North Dakota: EPA data show that North Dakota ranks in the bottom
                    quartile for two of three statutes (CAA  and RCRA). (See maps Cl through
                    C4 in appendix C.) According to the FY 2009 North Dakota SRF report,
                    the state inspected 100 percent of its major CWA facilities. However,
                    despite inspecting facilities, the state reported an SNC rate of 3.8 percent
                    for its major facilities, well below the national average of 23.2 percent.
                    EPA data show that the state assessed no penalties against known CWA
                    violators during the entire period of analysis (FYs 2003-2009). In
                    response to the state's low CWA performance, Region 8 enforcement
                     officials told us they increased their inspection coverage and file reviews.
                     They said that their increased attention in the state resulted in more storm
                     water permits being issued, but that the permitted facilities were not in
                     compliance when the region returned to inspect them. The state still was
                     not enforcing the CWA storm water requirements.

                 •   Louisiana: EPA data show that Louisiana ranks in the bottom quartile for
                     two statutes (CAA and RCRA) and in the bottom half for all three. In
                     2001, citizens filed a petition with EPA urging withdrawal of the state's
                     CWA NPDES program authority for many reasons, including lack of
                     adequate enforcement. This was one of many withdrawal petitions filed
                     for Louisiana. (Citizens filed petitions to withdraw CAA and RCRA
                     authority as well.) The region responded by conducting audits in the state.
                     The region found several deficiencies and required the state to change
                     some policies and develop new measures. Although the state has
                     completed the recommended actions, its poor performance persisted; our
                     analysis found that Louisiana has the lowest enforcement activity levels
                     in Region 6 and ranked in the lower half for the CWA and the lowest
                     quartile for the CAA and RCRA for FYs 2003-2009.14 (See maps C1
                     through C4 in appendix C.) State, EPA regional, and external interview
                     responses attributed Louisiana's poor performance to several factors,
                     including a lack of resources, natural disasters, and a culture in which the
                     state agency is expected to protect industry.

14 We requested documentation about the audit process and results from Region 6 staff, but regional personnel told
us that they were unable to provide documents from that period. In response to the draft report, OECA provided
documentation indicating that the state completed the required tasks.

                 •   Alaska: EPA data indicate that Alaska ranked in the bottom half for both
                     of its authorized statutes (CWA and CAA), although it only began phasing
                     in its authority for the NPDES program in 2008. (See maps C1 through C4
                     in appendix C.) Alaska's enforcement data for the CWA therefore largely
                     reflect EPA direct implementation in FYs 2003-2007. However, since
                     program authorization began, all available SRF data show that the state
                     has not taken any formal enforcement actions or issued any penalties
                     against facilities found to be out of compliance. Regional directors told us
                     that when the region authorized the state to run the program, both the
                     region and OECA officials were aware that the state lacked the capacity to
                     be successful. At the time of our review, the region had moved to delay
                     the final phase of authorization, but it had not ensured that the state
                     demonstrate a minimum level of performance before advancing to the
                     next authorization phase.

                 •   Illinois: EPA data indicate that Illinois consistently ranked among the
                     lowest-performing states for two of the three programs (CAA and RCRA)
                     and was in the bottom half for all three in FYs 2003-2009. (See maps C1
                     through C4 in appendix C.) Despite this record, EPA enforcement data
                    show that Region 5 has inspected a lower percentage of Illinois CWA and
                    CAA facilities as compared with some other states in the region, and the
                    region's RCRA inspection coverage has been declining in recent years. In
                    2011, the region developed an intervention strategy for this state, which it
                    was in the process of implementing during this evaluation. It is premature
                    for us to determine the results of the intervention.

              Although these examples suggest different reasons for low enforcement
              performance, each state provides a scenario in which EPA's stewardship over
              national enforcement has not overcome state deficiencies. The potential causes for
              deficiencies vary, and could range from a state philosophically opposed to taking
              enforcement action (North Dakota) to a state that cannot enforce because it is
              overwhelmed by a natural disaster (Louisiana). Regardless of the cause, when a
              state is operating a federal CAA, CWA, or RCRA program, EPA must intervene
              to enforce the law when states do not perform satisfactorily.

EPA Does  Not Shift Resources to  Intervene in Problem States

             If an authorized state does not operate its program as outlined in the CAA, CWA,
             or RCRA, the laws authorize EPA to revoke the state's authority to operate the
             program of concern. However, to take back authority for a program, EPA must be
             able to shift its resources to operate that program itself. EPA headquarters
             officials, regional enforcement officials, and independent organizations told us
             that the threat of EPA revoking a state's authorization was effectively moot
             because there is a general understanding that no EPA region has the resources to
             operate a state program. This reality undercuts EPA's strongest tool for ensuring
             that authorized states adequately enforce environmental laws: de-authorization.15

              OECA is  constrained from actively managing its resources to direct them to the
              most important state  enforcement problems. Federal law intends that EPA use its
              workforce efficiently and effectively to accomplish its goals.16 Under the current
              resource planning structure, EPA regions divide their resources among several
              OECA priorities, including state oversight. Figure 1 shows the average allocation
              of full-time equivalent (FTE) positions between  regions and headquarters from
              FY 2000 through FY 2010. If EPA regions report that they are having problems
              with state enforcement,  OECA cannot reallocate FTEs among regions to address
              the problems because OECA does not control  enforcement resources in the
              regions. Therefore, priority enforcement issues may not receive needed resources.
15 EPA has the power to take a number of steps to help states improve their programs. All of these steps require an
expenditure of resources. The actions taken by Regions 4, 7, and 8, outlined on page 6, took resources away from
other activities. The most severe action that EPA can take, revoking a state's authority to operate a program, takes a
large amount of resources.
16 Government Performance and Results Act (GPRA) Modernization Act of 2010.
              Figure 1: Average enforcement FTE allocations, FY 2000-FY 2010
              [Map of the 10 EPA regions showing each region's average enforcement FTE
              allocation; graphic not reproduced here.]

              Source: Office of the Chief Financial Officer data.

              Regional enforcement officials said that EPA had not allocated its enforcement
              workforce according to the regions' workloads, and that regional enforcement
              resources were insufficient to allow them to conduct their required enforcement
              work. In response to our regional survey, eight out of the nine responding regional
              enforcement directors cited a lack of resources as a barrier to using the
              enforcement tools at their disposal.17 In our interviews, some EPA regions said
              they make internal adjustments to redirect resources toward problem states.

              Our analysis of FTE data provided by OECA and the Office of the Chief
              Financial Officer showed that even though enforcement priorities have changed
              and the size of the regulated community has increased, relative allocations to EPA
              regions have effectively not changed over the past 10 years. Regional FTE
              allocations varied less than 0.3 percent from FY 2000 to FY 2010. It is reasonable
              to expect some amount of change in staffing  levels between regions during this
              time frame to reflect changing priorities and oversight needs.

              As an example, we looked at resources using a workload ratio of permits issued
              to enforcement employees. We found that the workload varied across regions,
              from 203 permits per employee (in Region 8) to 541 permits per employee (in
              Region 4). The data in figure 2 show that Region 4 has the heaviest workload.
              Appendix C, which presents the results of our overall analysis, shows that
              Region 4 also has the highest-performing states for the enforcement metrics we
              considered.18

17 Region 6 enforcement officials declined to respond to all of the survey questions (their eventual response
answered 32 percent of the survey questions). As such, we consider Region 6 nonresponsive and its lack of
participation a scope limitation. However, we analyzed all survey responses and believe that the region's lack of
response does not significantly affect the findings or conclusions reached.
18 There are limitations to using permits issued for estimating workload; for example, some permits require more
time to inspect than others do. However, this method allows for an estimation of workload for the purpose of this
evaluation.
              Figure 2: Workload assessment (based on 2010 permit levels)
              [Bar chart of workload, in permits per FTE, for each of the 10 EPA regions;
              Region 4 has the heaviest workload. Graphic not reproduced here.]

              Source: OIG analysis of FTE data from the Office of the Chief Financial Officer and enforcement
              data from OECA.
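
              The workload ratio behind figure 2 is a simple quotient: permits issued in a
              region divided by the region's enforcement FTEs. A minimal sketch of that
              calculation follows, in Python; the region names, permit counts, and FTE counts
              shown are invented placeholders, not the actual 2010 data.

                  # Workload ratio = permits issued in the region / regional enforcement FTEs.
                  # The regions and counts below are invented placeholders, not the 2010 data in figure 2.
                  permits = {"Region A": 12_000, "Region B": 4_000}
                  enforcement_ftes = {"Region A": 24, "Region B": 20}

                  for region, permit_count in permits.items():
                      ratio = permit_count / enforcement_ftes[region]
                      print(f"{region}: {ratio:.0f} permits per FTE")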

              The evidence presented in this evaluation is consistent with previous OIG
              conclusions regarding EPA workforce management. In February 2011, the OIG
              reported that EPA does not enforce a coherent program of position management
              to assure the efficient and effective use of its workforce.19 Without an Agency-
              wide position management program, EPA leadership lacks reasonable assurance
              that it is using personnel in an effective and efficient manner to achieve mission
              results. In addition, in September 2008, the OIG reported that EPA does not
              assess the resources needed and expended to accomplish its priority enforcement
              goals.20 EPA has thus far refused to accept our recommendation that it develop a
              cost-effective methodology for measuring resource inputs in the national
              priorities.

19 EPA OIG, EPA Needs Better Agency-Wide Controls over Staff Resources, Report No. 11-P-0136, February 22,
2011.
20 EPA OIG, EPA Has Initiated Strategic Planning for Priority Enforcement Areas, but Key Elements Still Needed,
Report No. 08-P-0278, September 25, 2008.
Conclusions
              EPA has not implemented a nationally consistent enforcement program. In our
              opinion, regions do not consistently take action when states do not enforce the law
              according to EPA's policies and the regulations established under federal laws. In
              states where enforcement actions are lacking, citizens may be exposed to
              inequitable health risks compared to states where EPA or state agencies take
              timely and appropriate enforcement action. Inconsistent enforcement can result in
              unabated pollution in one state causing environmental and public health damage
              and cleanup costs in another. Furthermore, inconsistent stringency of enforcement
              by some states may give firms in those states a substantial competitive advantage
              over firms in the same regulated industries in other states.

              A centralized national enforcement program could reduce overhead costs for EPA
              by consolidating some functions and decision-making activities. EPA could make
              more effective use of its $372 million in regional enforcement FTEs by directing
              a single national workforce instead of 10 inconsistent regional enforcement
              programs.21 It could also use those FTEs more effectively by targeting decisive
              interventions in states where enforcement problems require the most attention.

              In addition, reduced pollution due to enforcement actions can lead to large
              monetary health benefits. EPA calculated that enforcement actions and the
              resulting pollution reduction would have saved an estimated $3.8 billion in 2007
              and $35 billion in 2008 in avoided health and lost work costs, including reduced
              hospital and emergency room visits, avoided premature death due to heart or lung
              disease, and reduced cases of bronchitis, heart attack, and asthma.22 When neither
              states nor EPA takes enforcement actions when needed, these health benefits are
              not realized and premature deaths and illnesses are not prevented to the extent that
              they could be. As a result, EPA cannot assure that Americans in all states are
              equally protected from the health effects of pollution or that enforcement of
              regulated entities is consistent nationwide.

              In our opinion, EPA should  act decisively when states do not enforce authorized
              programs so that EPA can implement a consistent national enforcement program
              that protects all  citizens. However, EPA lacks clear, nationwide lines of authority
              for enforcement, including authority over resource allocation and use. EPA's
              current mode of operation has resulted in inconsistent implementation of
              environmental regulation and protection. EPA has not managed its enforcement
              resources efficiently to allow effective intervention in states where enforcement
              practices do not meet its expectations. Additionally, EPA has not provided
              standards or consolidated enforcement policies so that state governments and the
              regulated community understand EPA enforcement expectations. Particularly
              important is establishing a clear escalation policy that shows states when and how
              EPA will intervene if state enforcement is deficient under EPA's policies. EPA
              can improve its  national enforcement through more efficient and effective use of
              its resources and authorities to ensure that all citizens and industries enjoy
              consistent environmental protection.
21 Dollar amount represents personnel compensation, travel, general, information technology, and contracts and
grants expenses.
22 EPA, OECA FY 2007 Accomplishments Report, May 2008; and EPA, OECA FY 2008 Accomplishments Report,
December 2008.
Recommendations

             We recommend that the Deputy Administrator:

                 1.  Give OECA authority for all nationwide enforcement resources and
                    workforce allocation.

             We recommend that the Assistant Administrator for Enforcement and Compliance
             Assurance:

                 2.  Cancel outdated guidance and policy documents, and consolidate and
                    clarify remaining guidance into EPA documents that are publicly and
                    easily accessible on the EPA Civil Enforcement website.

                 3.  Establish clear and consistent national enforcement benchmarks
                    throughout CAA, CWA, and RCRA guidance and policies so that EPA's
                    enforcement expectations are clear and consistent for state governments
                    and the regulated community.

                 4.  Establish a clear and credible escalation policy for EPA intervention in
                    states that provides steps that EPA will take when states do not act to
                    ensure that the CAA, CWA, and RCRA are enforced.

                 5.  Establish procedures to reallocate enforcement resources to intervene
                    decisively when appropriate under its escalation policy.

                 6.  Develop a state performance scorecard to publicly track state enforcement
                    activities and results from year to year.

Agency Comments and OIG Evaluation

             In its response to the draft report on this subject, the Agency agreed with the
             overall finding that state enforcement performance varies widely across the
             country. The Agency also agreed that EPA headquarters and regions can take
             steps to improve national consistency.

             The Agency expressed concern about the metrics we chose and the methods we
             used to summarize enforcement metrics. The Agency was concerned that our
             evaluation relied too heavily on the state enforcement activity metrics we
             collected from EPA to compare state performance. The Agency argued that
             enforcement is a complicated process that ideally relies on analysis of multiple
             factors related to state goals and performance. The Agency was concerned that
             our choice of metrics to compare state performance could unintentionally
             encourage states to focus only on those three metrics: inspection coverage,
             identification of SNC/HPVs, and assessing penalties.
             The Agency also expressed concern about the quality of the underlying data for
             the three enforcement activity metrics. EPA said that the data and goals on which
             we based our analysis were unreliable.

             We utilized several sources of information to reach our conclusions. Although the
             Agency's comments did not acknowledge it, we utilized EPA's own enforcement
             data, and our analysis is similar to analyses that EPA itself conducts. EPA has
             acknowledged that these data have several significant limitations, and we took
             steps to overcome and describe those limitations. We also used several other
             sources of information in our evaluation, such as descriptions of states contained
             in the SRF reports and interviews with officials and staff of EPA and the states.
             Our conclusions rest on information contained in all of these sources.

             We chose the three measures of state activity after discussions with OECA staff
             and officials and other stakeholders. The use of these measures is supported by
             statements contained in OECA's own documents, including the SRF, and EPA
             displays the data on its publicly available enforcement website. However, because
             no single set of measures is definitive, we corroborated our findings through
             numerous interviews across six regions and through document review. EPA's
             agreement with the substance of our findings provides further support for the
             validity of our approach. See chapter 1 and appendix A for details about our
             methodology.

             In response to EPA's comments, we reemphasized our use of multiple  sources in
             our evaluation in chapter 1 of the final report, and included additional details on
             our methodology in report appendix A. We also determined that known
             inaccuracies did not prevent using the data. To clarify this in our report, we added
             language describing that the OIG was aware of EPA enforcement data  issues from
             prior reviews, and describing our own  accuracy assessments of the data. Based on
             data reliability assessments and  comparison of "raw" data with data verified by
             states for 2008 and 2009, we believe that the data quality is high enough to
             provide reliable  indicators of state performance. We also believe that the overall
             quality of the data submitted to the database will only improve if it is publicly
             used.

             Specifically, the Agency said that using SNC rates was not a reliable way to
             gauge performance. The response stated:

                    Available data suggests that some states significantly under-report
                    significant violators. Therefore, it is likely that low rates of
                    identified significant violators do not reflect actual low rates of
                    violation, but rather incomplete or mischaracterized data. If this is
                    correct, then higher reported rates of violation are more likely to be
                    accurate, and states with such higher rates likely to be higher
                    performing states, if identification and disclosure of violations is
                    the metric.
             In response, we clarified in our "Scope and Methodology" section that our
             method for comparing state performance, likewise, counted higher SNC rates as
             indicators of improved performance.

             EPA commented that states are not required to report CWA penalties to EPA, so
             there is no way to determine whether a state reporting zero penalties assessed zero
             or did not report. In response, we reanalyzed CWA data without the penalty
             metric. We also reanalyzed the CWA data by including the penalty metric for
             states that choose to report their penalties and did not include it for states that did
             not choose to report it, in the same way OECA analyzes CWA data on its
             Enforcement and Compliance History Online (ECHO) website. The second
             method produced more reliable results.

             The Agency also disagreed with our method of summarizing enforcement metrics
             across statutes. EPA wrote:

                    Complete data and valid, meaningful measures are key to
                    understanding state performance. Gaps in our current data make it
                    difficult to develop measures that tell a complete story across  all
                    media and regulated sectors. Limited resources at both the state
                    and federal levels make it more difficult to address these gaps.
                    Measures based on the data that EPA does have may not focus on
                    the right things. In our opinion, an  effective way to address these
                    issues is to set a new direction in the enforcement program that
                    takes advantage of the advances in information and monitoring
                    technologies. This will enable us to get more complete data
                    through the efficiencies of electronic reporting and field
                    equipment. This data will allow EPA to more completely identify
                    regulated sectors, monitor and assess compliance, target resources
                    more effectively, and measure the overall performance of federal
                    and state regulators in a more meaningful way.

             In response, we eliminated our use of a summary metric for each state. However,
             we disagree that presenting a simplified method for comparing state enforcement
             performance is invalid. Repeated assessment of indicators allows management to
             determine where it should direct additional EPA attention. As with any other
             performance measure, the measurement only takes on meaning when it is
             compared with benchmarks that clearly outline success. The overarching message
             of this report is that EPA must establish clear benchmarks for success so that
             states, EPA regions, EPA headquarters, the public, and the regulated community
             all understand what is required to ensure a safe and healthy environment.  The
             report emphasizes that the states we ranked highest are not necessarily
             succeeding, but rather that "performance is low across the board." With clear
             benchmarks, EPA, states, and the public can clearly understand how states
             perform, and EPA can take deliberate steps to improve performance where steps
             are most needed.

             We agree that ideal performance data are repeatable, reliable, and relevant.
             However, we also believe that EPA underuses its vast stores of state
             enforcement data, which it has collected for decades. As part of our evaluation
             process, we conducted a data reliability assessment to determine whether known
             data inaccuracies prevented us from relying on data in EPA enforcement
             databases. In our "Scope and Methodology" section and in appendix A, we
             defined the dates when we drew data from the database.

             The Agency did not agree with recommendation 1, generally agreed with
             recommendations 2 through 4, and neither agreed nor  disagreed with
             recommendation 5. EPA suggested an additional recommendation, which is
             included above as recommendation 6. Because the Agency did not provide a
             detailed plan for implementing recommendations 2-5, we consider all
             recommendations unresolved, pending our receipt of the Agency's corrective
             action plan.
                         Status of Recommendations and
                             Potential Monetary Benefits

 Rec. No. 1 (page 21): Give OECA authority for all nationwide enforcement resources and
     workforce allocation.
     Status:1 U. Action official: Deputy Administrator.

 Rec. No. 2 (page 21): Cancel outdated guidance and policy documents, and consolidate and
     clarify remaining guidance into EPA documents that are publicly and easily accessible on
     the EPA Civil Enforcement website.
     Status:1 U. Action official: Assistant Administrator for Enforcement and Compliance Assurance.

 Rec. No. 3 (page 21): Establish clear and consistent enforcement benchmarks throughout CAA,
     CWA, and RCRA guidance and policies so that EPA's enforcement expectations are clear and
     consistent for state governments and the regulated community.
     Status:1 U. Action official: Assistant Administrator for Enforcement and Compliance Assurance.

 Rec. No. 4 (page 21): Establish a clear and credible escalation policy for EPA intervention in
     states that provides steps that EPA will take when states do not act to ensure that the CAA,
     CWA, and RCRA are enforced.
     Status:1 U. Action official: Assistant Administrator for Enforcement and Compliance Assurance.

 Rec. No. 5 (page 21): Establish procedures to reallocate enforcement resources to intervene
     decisively when appropriate under its escalation policy.
     Status:1 U. Action official: Assistant Administrator for Enforcement and Compliance Assurance.

 Rec. No. 6 (page 21): Establish a state performance scorecard to publicly track state enforcement
     activities and results from year to year.
     Status:1 U. Action official: Assistant Administrator for Enforcement and Compliance Assurance.

 Potential monetary benefits (in $000s): claimed amount $372,000.2

 1 O = recommendation is open with agreed-to corrective actions pending
   C = recommendation is closed with all agreed-to actions completed
   U = recommendation is unresolved with resolution efforts in progress

 2 This dollar amount represents resources that EPA will put to better use when the recommendation is implemented.
                                                                         Appendix A

                Details on Scope and Methodology

We applied the objective questions to three major environmental enforcement programs: the
CAA Title V program, the CWA NPDES program, and the RCRA Subtitle C program. We
conducted our evaluation from September 2010 to July 2011. We included all 50 states and all
10 EPA regions in our evaluation.

To answer the evaluation objectives, we utilized EPA enforcement data and reviewed
environmental laws, regulations, and EPA guidance (enforcement response policies, CMSs, and
other relevant OECA enforcement policies and guidance). We also surveyed EPA's 10 regional
enforcement directors and conducted interviews with enforcement officials in selected EPA
regions and states. We also interviewed OECA officials at EPA headquarters. In addition, we
considered EPA resources for FY 2000 through FY 2010, and assessed EPA's knowledge of the
workload that the regulatory universe presents.

As we analyzed EPA enforcement data, we were specifically interested in states where
enforcement activities began low compared with other states' and remained low for the duration
of the study period (2003-2009). In these cases, we sought supplemental documentary and
testimonial evidence to determine what factors led to the low enforcement activity levels,
whether EPA undertook efforts to improve those states' performance, and whether EPA efforts
led to improved enforcement activities. In every case, EPA agreed with our assessments of which
states displayed consistently low enforcement activity; this general agreement validated our use
of these metrics to compare state performance. In every case, the EPA regions also engaged the
states in an effort to improve their performance. However, those efforts did not improve
performance in all of the states.

Interview Methodology

For this evaluation, we selected states and associated regions for interviews based on which
states appeared to have persistent problems meeting national enforcement goals related to the
SRF. To identify these states, the OIG used three information sources:

       EPA Enforcement Data Analysis: To assess state performance using enforcement data,
       we used EPA's Online Tracking Information System (OTIS) Management Report tool.23
       We developed a ranking system that used three enforcement metrics relevant to the SRF:
       percentage of facilities inspected, significant noncompliance identification rate, and
       percentage of formal actions with penalties. We queried the management reports for
       FYs 2003-2006, to correspond to the data years of the first round  of SRF reviews, for
       each statute and metric. An additional  query of information from FYs 2007-2009 was
       included to determine whether state performance levels were consistent over time, or
       whether they had improved since the original SRF data years. We analyzed the data using
       Microsoft Excel.

23 According to the OTIS Tool Guide, the most appropriate tool to "determine how your region or state is doing in
relation to other regions and states in regard to inspection coverage, discovery of violations, enforcement actions,
and average penalties" is the Management Report tool, which we chose to use in this analysis.

       For the programs of interest, we looked at state enforcement performance for the CAA,
       CWA, and RCRA in FYs 2003-2009 and identified states with low results for these three
       metrics in at least two of the three statutes throughout the period.

       Round 1 SRF reports: In addition to the data analysis, we reviewed Round 1  SRF reports
       for the states and provided an assessment of each state's performance based on the
       reports. We compared state results with national averages and national goals, assessed the
       overall portrayal of the state in the report, and assessed the comprehensiveness of the
       report.

       Interviews and Corroborating Evidence: We also corroborated our analysis through
       interviews with EPA headquarters and third-party organizations, including ECOS, the
       Environmental Integrity Project, and the Environmental Working Group. In addition, we
       considered recent petitions filed with EPA asserting that states were underperforming as
       further evidence of underperformance, and we collected and examined such citizen
       petitions when available.

From this analysis, we chose to conduct site visits to Regions 4, 5, 6, 7, 8, and 10, and to the
states of South Carolina, Illinois, Louisiana, Iowa, Colorado, North Dakota, and Alaska. We
chose five of the region/state combinations because the states emerged as persistent
underperformers over the analysis period. We chose the two other states because their
performance stood out. In one case, the state (Colorado) performed poorly in the first analysis
period but significantly improved in the second. In the other, we visited a state (South Carolina)
in a region whose states tended to perform better than states in other regions, according to our
analysis, to determine why states in that region were performing better.

Data  Analysis Methodology

To characterize state performance for FYs 2003-2009 for each of the three programs, we used
the data from the EPA enforcement database (OTIS Management Tool, offered to the public
through ECHO at www.echo.epa.gov). The metrics used for assessment were (1) percentage of
facilities inspected by the state (calculated as the number of state facility inspections divided by
the total number of state facilities),24 (2) percentage of SNCs/HPVs identified per inspection by
the state,25 and (3) percentage of formal actions with penalties.26 Because states are not required
to report penalty data for the CWA, any state reporting 0 percent was considered a nonreporting
state (even though its actual penalty rate might truly be 0 percent), and that metric was not taken
into consideration in its performance average. Appendix C presents the results of the
information downloaded from the EPA database in August 2010.

24 Actual name of metric as queried in OTIS management reports was the same for all programs: "Total # Facil Insp
State".
25 Actual name of metric as queried in OTIS management reports: CWA, "% Facilities in SNC"; CAA, "New State
HPVs Per State Insp Facil"; RCRA, "New State SNCs Per State Insp Facil".
26 Actual name of metric as queried in OTIS management reports was the same for all programs: "% State Actions
With Penalty".
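
The following is a minimal sketch, in Python with the pandas library, of how a state's average
score for one statute could be computed from the three metrics under the adjustments described
above (capping the penalty metric at 100 percent and, for the CWA, excluding a reported
0 percent as nonreporting). The column names and example values are hypothetical; the actual
analysis was performed in Microsoft Excel on OTIS management report downloads.

    import pandas as pd

    # Hypothetical OTIS-style extract: one row per state for a single statute (here, the CWA).
    data = pd.DataFrame({
        "state": ["State A", "State B", "State C"],
        "pct_facilities_inspected": [42.0, 18.5, 77.0],   # metric 1
        "pct_snc_per_inspection": [12.0, 3.8, 25.0],      # metric 2
        "pct_actions_with_penalty": [55.0, 0.0, 130.0],   # metric 3, as reported
    })

    def state_score(row, statute="CWA"):
        # Average the three metrics, applying the adjustments described above.
        metrics = [row["pct_facilities_inspected"], row["pct_snc_per_inspection"]]
        penalty = min(row["pct_actions_with_penalty"], 100.0)  # cap rates above 100 percent
        if not (statute == "CWA" and row["pct_actions_with_penalty"] == 0.0):
            metrics.append(penalty)  # a reported 0 percent under the CWA is treated as nonreporting
        return sum(metrics) / len(metrics)

    data["avg_score"] = data.apply(state_score, axis=1)
    print(data[["state", "avg_score"]])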

       Data Limitations and Considerations

       The use of performance indicators is essential to managing government programs. While
       all performance measures are imperfect in some ways, their repeated measurement over
       time allows government program managers to gauge changes in performance. Even if the
       measure itself does not provide a definitive, precise answer to the question of
       performance, it provides a way to gauge performance. In the case of this evaluation, we
       looked at nine standard measures (three for each of three statutes) over a period of
       seven years to discern performance trends.

              EPA Database Considerations

              Over time, the OIG has reported about data quality and reliability concerns  with
              EPA databases.27 Because of our preexisting concerns, the OIG conducted a data
              reliability assessment prior to using EPA data for this evaluation.

       Data Reliability Assessment

       In accordance with generally accepted government auditing standards, we conducted a
       data reliability  assessment to determine whether the EPA data were sufficiently reliable
       for the purposes of this project. The team assessed the reliability of EPA's enforcement
       data from the Air Facility System, the Permit Compliance  System, the Integrated
       Compliance Information System-National Pollution Discharge Elimination System, and
       the RCRAInfo database (accessed through OTIS) using GAO's data reliability
       assessment guidance and Government Auditing Standards (the Yellow Book). We found the
       EPA enforcement data to be sufficiently reliable for the purposes of the current project.
       While there are known issues with the data, we accounted  for these in the analysis, and
       took steps to diminish the impact of potential data issues. We supported data analysis
       results using interviews with EPA and external  sources, and document reviews.

       Our data reliability assessment indicated that the limitations of EPA data systems should
       not prevent us from using the data to compare states.

       Considerations About Performance Measures

       EPA databases contain enforcement data submitted by state and local agencies and
       facilities, and the metrics we chose provide a snapshot of state enforcement activities.
       Understanding some underlying assumptions may clarify the use of these data to assess
       performance. The metrics we used included the entire facility universe for all three
       statutes.

27 See, for example, EPA Needs to Improve Its Recording and Reporting of Fines and Penalties, Report No.
10-P-0077; EPA Could Improve RCRAInfo Data Quality and System Development, Report No. 11-P-0096; ECHO
Data Quality Audit - Phase I Results: The Integrated Compliance Information System Needs Security Controls to
Protect Significant Non-Compliance Data, Report No. 09-P-0226; and ECHO Data Quality Audit - Phase 2 Results:
EPA Could Achieve Data Quality Rate With Additional Improvements, Report No. 10-P-0230.

          1.   Percent of facilities inspected. EPA establishes goals for facility inspections. The
              goals differ from statute to statute, and some states negotiate lower goals with
              EPA in exchange for conducting additional activities in other areas. In addition,
              EPA databases include the number of permits rather than the number of facilities,
              and EPA has not historically frozen facility universe counts. However, OIG
              discussions with EPA contractors who manage the databases indicated that, as a
              correlate for facility counts, EPA typically uses the permit counts, and that using
              the same universe for every year of analysis would not significantly change the
              results because the universe does not change very much from year to year. An
              important limitation to this measurement is that some permits may require much
              more time and expertise to inspect. However, we determined that dividing the
              number of inspections by the number of facilities/permits provided a general
              method for comparing a state with fewer permits with a state that has many
              permits.

              Because states are not required to report data on universe tracking for some
              nonmajors under CWA and CAA, states reporting more than just major facilities
              could possibly appear to be doing worse in this category than they actually are. A
              visual assessment of the CWA and CAA facility universe data suggested that this
              might be the case for CWA.  This is another caveat of the CWA data that users
              should consider when interpreting the performance of a given state under that
              program.

              Because EPA does not require or expect that states inspect 100 percent of their
              permitted facilities annually, the OIG did not expect that any state would achieve
              a score of 100 percent in this area. However, by  comparing state inspection
              percentages, we  gained a perspective on average inspection coverage, and where
              states exceeded these  averages. We probed further to understand what factors led
              to that state's performance.

           2.   Percent of inspections identifying SNCs and HPVs. This metric does not correlate
              specific inspections with SNC or HPV identification. Instead, it divides the
              number of facilities with a new state-identified SNC or HPV by the number of
               facilities inspected. Because the CWA did not have a metric for SNC identification
               by states, we used the CWA "% Facilities in SNC" metric instead. We conducted
              sensitivity analysis using this and other metrics,  found that it yielded comparable
              results, and therefore determined it to be an appropriate substitute.

              Identifying SNC/HPVs in an inspection may constitute a failure of enforcement
              because it means a facility failed to comply with a regulation. However, EPA
              more frequently  views state identification of an SNC/HPV as a success because it
              indicates a rigorous targeting and inspection protocol. We adopted EPA's view by
              considering a higher value of this metric as an indicator of better state
              enforcement performance. As with inspection coverage, the OIG did not expect
             that any state would identify SNC/HPVs at 100 percent of inspections, which
             would have indicated that the state had either a perfect targeting strategy or a
             regulated community that did not ever comply with regulations. Instead, this
             metric offered the OIG a perspective on average SNC/HPV identification, and
             allowed us to further probe outliers to determine why those states exceeded or
             underperformed compared with their peers.

          3.  Percent of state actions that included a penalty. This metric also does not track a
             specific inspection or violation identification to a penalty. Instead, it divides the
             number of state actions that included penalties by the total number of state formal
             actions to offer a relative measure of how frequently the state addresses a
             violation using a penalty.

             There are two considerations regarding this measure for the CWA. First, and most
             importantly, states are not required to report penalty information. Therefore, it
             was not possible to determine whether a zero in EPA's database is a true value or
             a nonresponse. There were 18 states with a zero value. We adjusted for this in our
             analysis by utilizing the penalty metric only  for (the 28) states that had a greater
             than zero value for penalties.

             The second consideration is that for two states, the state data contained in  the
             EPA database were greater than 100 percent. This does not  occur in any other
             metric or for any other states, and is caused by the way EPA's data system
             computes totals. [The system calculates this  metric by dividing the number of
             state actions with penalties (numerator) by the number of state formal actions
             (denominator). However, the database counts one category of formal actions in
             the numerator but not in the denominator ("AO Stipulated Penalty 1"). The result
             is that the percent of formal actions with  penalties can be greater than 100 percent
             if a state issued several "AO stipulated penalty 1" actions.] We adjusted for this in
             our analysis by capping the metric at 100 percent (thereby assessing the two states
             reporting > 100 percent at 100 percent).
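
       A minimal numeric illustration of the second consideration follows; the counts are invented
       and serve only to show how the raw rate can exceed 100 percent and how the cap was applied.

           # Hypothetical counts showing why the raw CWA penalty rate can exceed 100 percent.
           formal_actions = 10           # denominator: state formal actions
           actions_with_penalty = 12     # numerator: includes "AO Stipulated Penalty 1" actions
           raw_rate = 100.0 * actions_with_penalty / formal_actions   # 120.0 percent
           capped_rate = min(raw_rate, 100.0)                         # adjustment used in our analysis
           print(raw_rate, capped_rate)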

       Comparison of State Performance to National Goals and Averages

       Using the SRF tool in OTIS, we compared actual state performance to the national goals
       presented alongside the data metrics (as set in EPA guidance and policies). The SRF data
       in OTIS were the most appropriate data for this analysis since they provide national goals
       and averages, when available, along with state data. We used  this tool to assess CAA,
       CWA, and RCRA performance for 2010 to have a current understanding of how state
       performance compared to the national goals and averages.  We eliminated all metrics that
       did not include a national goal, and then eliminated  all metrics that were not state-only.
       From the remaining metrics, we chose those that were most similar to the metrics used
       for the state ranking (inspections, SNCs/HPVs, and  penalties). We counted all states that
       did not meet the national goal for each metric, and counted all states that did not achieve
       the national average for each metric.28 We used the result to calculate a percentage of
       reporting states that did not meet the national goal and national average.
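
       A minimal sketch of this goal and average comparison follows, in Python with pandas; the
       metric values, national goal, and national average are invented, and states with missing
       values are excluded from the counts, consistent with the approach described above.

           import pandas as pd

           # Hypothetical SRF-style extract: one row per state for a single metric.
           srf = pd.DataFrame({
               "state": ["State A", "State B", "State C", "State D"],
               "value": [35.0, None, 12.0, 60.0],   # state result for the metric; None = missing
           })
           national_goal = 50.0
           national_average = 40.0

           reported = srf.dropna(subset=["value"])   # exclude states with missing values
           share_below_goal = (reported["value"] < national_goal).mean()
           share_below_avg = (reported["value"] < national_average).mean()
           print(f"{share_below_goal:.0%} of reporting states did not meet the national goal")
           print(f"{share_below_avg:.0%} of reporting states did not meet the national average")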

Regional Survey Methodology

We surveyed the 10 regional enforcement directors across EPA's regions to solicit their opinions
on the most effective strategies for addressing issues in chronically underperforming state
enforcement programs. We designed the survey as a preliminary research indicator of how
regions intervened in states. We designed the questions to elicit the type and importance of
information sources the regions use for state program assessment, as well as the frequency of
oversight tool usage and barriers to using those tools.  We also requested that the regional
enforcement directors describe their most effective strategies. The team requested and reviewed
comments on the survey questions from EPA's lead regional enforcement coordinator, and
OECA's Office of Compliance.

We issued the survey to EPA's 10 regional  enforcement directors. Nine of the 10 regions
responded. The team contacted the unresponsive region (Region 6) on nine occasions to request
a response. Region 6 enforcement officials disagreed with the design of the survey instrument
and declined to respond to all of the survey questions (the region eventually responded to
32 percent of the survey  questions). We consider Region 6 as nonresponsive, and as such, we
consider Region 6's lack of participation to be a scope limitation. However, we analyzed all
numerical and narrative survey results, and we believe that the region's lack of response does not
significantly affect our findings and conclusions.
28 States with missing values for the metric were not included in the total count of states.
                                                                       Appendix B

             Prior GAO and EPA OIG Assessments

GAO and the EPA OIG identified EPA oversight of states as a management challenge in
FYs 2008, 2009, and 2010. In our 2010 description of EPA management challenges, the
Inspector General said:

      While EPA is renewing its attention on the oversight of programs delegated to
      states, much remains to be done because the issues are complex and changeable.
      Effective oversight of delegations to states is a continuous management
      challenge that requires an agile organization, accurate data, and consistent
      interpretations  of policy.

Both GAO and the EPA OIG have frequently reported on problems with the EPA-state
enforcement relationship, noting key issues such as data quality, identification of violations,
issuing enforcement penalties and other enforcement actions in a timely and appropriate manner,
and general oversight issues. In our most recent report on state oversight, we found that outdated
agreements between EPA and states reduce EPA's ability to maintain a consistent water
enforcement program.  A list of selected GAO  and EPA OIG reports follows.

GAO

Environmental Protection Agency: Major Management Challenges, GAO-11-422T, March 2,
2011.

Clean Water Act: Longstanding Issues Impact EPA's and States' Enforcement Efforts,
GAO-10-165T, October 15, 2009.

Environmental Enforcement: EPA Needs to Improve the Accuracy and Transparency of
Measures Used to Report on Program Effectiveness, GAO-08-1111R, September 18, 2008.

Environmental Protection: Collaborative EPA-State Effort Needed to Improve Performance
Partnership System, GAO/T-RCED-00-163, May 2, 2000.

Environmental Protection: More Consistency Needed Among EPA Regions in Approach to
Enforcement, GAO/RCED-00-108, June 2000.

Environmental Protection: EPA's and States' Efforts to Focus State Enforcement Programs on
Results, GAO/RCED-98-113, May 27, 1998.

Water Pollution: Many Violations Have Not Received Appropriate Enforcement Attention,
GAO/RCED-96-23, March 20, 1996.
EPA OIG

Congressional Testimony Statement of Arthur A. Elkins, Jr., Inspector General, Before the
Subcommittee on Interior, Environment, and Related Agencies Committee on Appropriations
U.S. House of Representatives, "Major Management Challenges at the Environmental Protection
Agency," March 2, 2011.

EPA Needs Better Agency-Wide Controls Over Staff Resources, Report No. 11-P-0136,
February 22, 2011.

EPA Could Improve RCRAInfo Data Quality and System Development, Report No. 11-P-0096,
February 7, 2011.

ECHO Data Quality Audit - Phase 2 Results: EPA Could Achieve Data Quality Rate  With
Additional Improvements, Report No. 10-P-0230, September 22, 2010.

EPA Should Revise Outdated or Inconsistent EPA-State Clean Water Act Memoranda of
Agreement, Report No. 10-P-0224, September 14, 2010.

EPA Needs to Improve Its Recording and Reporting of Fines and Penalties, Report No.
10-P-0077, March 9, 2010.

EPA Oversight and Policy for High Priority Violations of Clean Air Act Need Improvement,
Report No. 10-P-0007, October 14, 2009.

ECHO Data Quality Audit - Phase I Results: The Integrated Compliance Information System
Needs Security Controls to Protect Significant Non-Compliance Data, Report No. 09-P-0226,
August 31, 2009.

Better Enforcement Oversight Needed for Major Facilities with Water Discharge Permits in
Long-Term Significant Noncompliance, Report No. 2007-P-00023, May 14, 2007.

EPA Performance Measures Do Not Effectively Track Compliance Outcomes, Report No. 2006-
P-00006, December 15, 2005.

Limited Knowledge of the Universe of Regulated Entities Impedes EPA's Ability to Demonstrate
Changes in Regulatory Compliance, Report No. 2005-P-00024, September 19, 2005.

EPA Region  6 Needs to Improve  Oversight of Louisiana's Environmental Programs,
Report No. 2003-P-00005, February 3, 2003.

State Enforcement of Clean  Water Act Dischargers Can Be More Effective,
Report No. 2001-P-00013, August 14, 2001.

Enforcement - Compliance with Enforcement Instruments,  Report No. 2001-P-00006, March 29,
2001.
North Carolina NPDES Enforcement and EPA Region 4 Oversight, Report No. 2000-P-00025,
September 28, 2000.
                                                                         Appendix C

           State Performance Analysis and Results

Using the methodology described in appendix A, we utilized EPA state enforcement data to
determine overall performance and level of consistency from state to state and region to region.
Our analysis of state performance showed that all of the regions included at least one state that
performed in the bottom quartile in the different program analyses (8 for CWA, 6 for CAA, and
7 for RCRA).

Figures C-1 through C-3 show how state performance quartiles under the enforcement metrics
for inspections, SNC/HPVs identified, and penalties varied both across statutes and
geographically. The figures cover the FYs 2003-2009 period and divide states with authorized
programs into performance quartiles. The range of average performance was wide, at
approximately 41 percentage points for CWA, 50 percentage points for CAA, and 32 percentage
points for RCRA. This indicates high variability from state to state. The highest average
performance for CAA was over 60 percent (Alabama) and the lowest was almost 11 percent
(North Dakota). The mean performance was 16 percent (out of 100) for CWA, 39 percent for
CAA, and 18 percent for RCRA, which indicates that performance was relatively low across the
board. We calculated rough performance quartiles using EPA data for each statute (table C-1,
below), and modified the quartiles so that they followed natural breaks. The maps presented
herein offer a rough estimation of nationwide performance in conducting inspections, identifying
serious violations, and addressing violations with penalties.
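
The quartile grouping described above can be illustrated with a brief sketch. The state names,
rates, and the simple even split into four groups below are hypothetical; the actual analysis used
OTIS data for FYs 2003-2009 and adjusted quartile boundaries to follow natural breaks (see
appendix A).

    # Illustrative sketch only: composite scoring and quartile assignment for one statute.
    # State names and rates are hypothetical, not actual OTIS values.
    import statistics

    states = {
        # state: (inspection rate, SNC/HPV identification rate, actions-with-penalty rate)
        "State A": (0.77, 0.08, 0.97),
        "State B": (0.32, 0.00, 0.00),
        "State C": (0.53, 0.17, 0.80),
        "State D": (0.25, 0.09, 0.53),
    }

    # Average the three activity metrics to obtain a rough composite score per state.
    scores = {s: statistics.mean(vals) for s, vals in states.items()}

    # Rank states and split the ranking into four roughly equal groups (quartiles).
    ranked = sorted(scores, key=scores.get, reverse=True)
    quartile = {s: (rank * 4) // len(ranked) + 1 for rank, s in enumerate(ranked)}

    for s in ranked:
        print(f"{s}: score {scores[s]:.0%}, quartile {quartile[s]}")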

Figure C-1: State CAA performance, FYs 2003-2009, as measured by rates of inspections,
penalties assessed, and HPVs

[Map of the United States shading each state by performance quartile. Legend: No Program;
1 Top Quartile; 2 Second Quartile; 3 Third Quartile; 4 Bottom Quartile; # EPA Region]

Source: OIG analysis of OECA data.

Figure C-2: State CWA performance, FYs 2003-2009, as measured by rates of inspections,
penalties (where states voluntarily reported them), and SNCs

[Map of the United States shading each state by performance quartile. Legend: No Program;
1 Top Quartile; 2 Second Quartile; 3 Third Quartile; 4 Bottom Quartile; # EPA Region]

Source: OIG analysis of OECA data.

Figure C-3: State RCRA performance, FYs 2003-2009, as measured by rates of inspections,
penalties assessed, and SNC

[Map of the United States shading each state by performance quartile. Legend: No Program;
1 Top Quartile; 2 Second Quartile; 3 Third Quartile; 4 Bottom Quartile; # EPA Region]

Source: OIG analysis of OECA data.
Figure C-4 collates the results of figures C-1 through C-3 by assessing which states fell into the
bottom quartile in two of three enforcement programs over the 7-year period of our analysis, and
thus warrant additional EPA oversight and intervention (shown in dark red). The figure also
identifies states that fell into the top quartiles for at least two of three programs (shown in dark
green). Even though this assessment does not indicate that these states are achieving EPA goals
or performing to other EPA expectations (as described in chapter 2), EPA may choose to reduce
its oversight and intervention in these states to refocus resources on states in the bottom quartiles.
To divide states into the categories below, we calculated the most common quartile across the
three statutes. When the three enforcement programs fell into three separate quartiles, we
categorized them as having no consistent pattern (shown in gray).
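
The categorization rule can be illustrated with a brief sketch. The quartile combinations shown
and the mapping of second- and third-quartile majorities to the "top half" and "bottom half"
categories are illustrative assumptions, not a reproduction of the actual computation.

    # Illustrative sketch only: categorizing a state by its most common quartile
    # across the three programs (CWA, CAA, RCRA), as in figure C-4.
    from collections import Counter

    def categorize(quartiles):
        """quartiles: per-program quartile numbers, e.g., (CWA, CAA, RCRA)."""
        value, count = Counter(quartiles).most_common(1)[0]
        if count == 1:                          # three different quartiles
            return "No consistent pattern"
        if value == 1:
            return "2 or 3 programs in top quartile"
        if value == 4:
            return "2 or 3 programs in bottom quartile"
        return "Most programs in top half" if value == 2 else "Most programs in bottom half"

    print(categorize((1, 1, 2)))   # 2 or 3 programs in top quartile
    print(categorize((4, 4, 2)))   # 2 or 3 programs in bottom quartile
    print(categorize((1, 2, 3)))   # No consistent pattern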

Figure C-4: Majority top-tier and majority bottom-tier states as measured by rates of inspections,
penalties assessed, and HPVs

[Map of the United States (with inset for Hawaii, Region 9) shading each state by category.
Legend: 2 or 3 programs in top quartile; Most programs in top half; No consistent pattern;
Most programs in bottom half; 2 or 3 programs in bottom quartile; # EPA Region]

Source: OIG analysis of OECA data.

Table C-1 shows the average performance calculation for each state and each statute over the
FYs 2003-2009 analysis period based on information contained in EPA's enforcement database.

Table C-1: CWA, CAA, and RCRA state enforcement performance, FYs 2003-2009

                CWA NPDES (avg 2003-2009)           CAA Title V (avg 2003-2009)         RCRA Subtitle C (avg 2003-2009)
State  Region   Insp.   SNC     Penalty   Avg       Insp.   SNC    Penalty   Avg        Insp.   SNC    Penalty   Avg
AL     4        14%     4%      100%      39%       77%     8%     97%       60%        5%      2%     44%       17%
AK     10       1%(a)   1%(a)   NA        1%(a)     28%     3%     76%       36%        EPA     EPA    EPA       EPA
AZ     9        18%     2%      NA        10%       30%     11%    68%       37%        1%      6%     22%       10%
AR     6        12%     11%     75%       33%       61%     3%     72%       45%        2%      11%    72%       29%
CA     9        15%     1%      NA        8%        62%     24%    85%       57%        1%      6%     65%       24%
CO     8        6%      6%      27%       13%       28%     6%     37%       24%        4%      2%     61%       22%
CT     1        15%     4%      NA        10%       30%     7%     56%       31%        3%      10%    78%       30%
DE     3        27%     2%      NA        15%       59%     5%     70%       45%        4%      2%     0%        2%
FL     4        3%      0%      53%       19%       56%     6%     90%       51%        4%      6%     74%       28%
GA     4        10%     11%     56%       26%       36%     5%     93%       45%        6%      5%     51%       21%
HI     9        2%      0%      NA        1%        72%     8%     89%       56%        3%      9%     32%       15%
ID     10       EPA     EPA     EPA       EPA       13%     10%    84%       35%        4%      4%     47%       18%
IL     5        21%     3%      8%        11%       35%     8%     12%       19%        2%      1%     25%       9%
IN     5        20%     16%     NA        18%       57%     4%     86%       49%        4%      2%     37%       14%
IA     7        20%     11%     39%       23%       28%     2%     45%       25%        EPA     EPA    EPA       EPA
KS     7        8%      <1%     31%       13%       52%     2%     91%       48%        3%      6%     84%       31%
KY     4        15%     5%      90%       37%       41%     6%     50%       32%        11%     2%     48%       20%
LA     6        9%      3%      22%       11%       31%     15%    18%       21%        2%      2%     8%        4%
ME     1        19%     17%     NA        18%       35%     6%     94%       45%        2%      9%     82%       31%
MD     3        9%      1%      40%       17%       45%     6%     40%       30%        1%      3%     11%       5%
MA     1        EPA     EPA     EPA       EPA       13%     9%     83%       35%        2%      8%     64%       25%
MI     5        9%      5%      NA        7%        30%     2%     92%       41%        2%      2%     40%       15%
MN     5        25%     8%      36%       23%       33%     11%    72%       39%        1%      4%     7%        4%
MS     4        7%      3%      39%       16%       34%     5%     85%       41%        4%      4%     94%       34%
MO     7        7%      1%      15%       8%        68%     2%     92%       54%        3%      1%     47%       17%
MT     8        4%      5%      5%        4%        33%     9%     49%       30%        13%     1%     25%       13%
NE     7        13%     19%     NA        16%       51%     4%     22%       25%        3%      4%     37%       15%
NV     9        7%      0%      NA        4%        50%     10%    80%       47%        17%     <1%    32%       17%
NH     1        EPA     EPA     EPA       EPA       25%     9%     53%       29%        0%      3%     64%       22%
NJ     2        5%      3%      28%       12%       23%     15%    93%       44%        4%      7%     58%       23%
NM     6        EPA     EPA     EPA       EPA       18%     5%     60%       28%        3%      2%     64%       23%
NY     2        19%     7%      47%       24%       15%     9%     97%       40%        1%      3%     51%       18%
NC     4        23%     1%      100%      41%       84%     3%     81%       56%        9%      4%     22%       12%
ND     8        36%     1%      NA        19%       32%     0%     0%        11%        3%      0%     5%        3%
OH     5        23%     21%     55%       33%       33%     19%    53%       35%        4%      3%     69%       25%
OK     6        7%      10%     2%        6%        25%     12%    84%       40%        2%      2%     36%       13%
OR     10       6%      <1%     NA        3%        43%     6%     83%       44%        3%      2%     42%       16%
PA     3        11%     1%      NA        6%        78%     5%     94%       59%        4%      1%     70%       25%
RI     1        4%      3%      3%        3%        26%     17%    80%       41%        2%      7%     45%       18%
SC     4        15%     3%      63%       27%       66%     4%     86%       52%        4%      7%     78%       30%
SD     8        12%     11%     19%       14%       73%     1%     0%        25%        2%      0%     29%       10%
TN     4        12%     23%     70%       35%       75%     5%     88%       56%        4%      10%    13%       9%
TX     6        4%      13%     NA        8%        33%     43%    96%       57%        3%      1%     48%       17%
UT     8        9%      2%      16%       9%        53%     3%     83%       46%        2%      4%     59%       22%
VT     1        18%     4%      NA        11%       40%     1%     23%       21%        2%      4%     24%       10%
VA     3        33%     1%      75%       36%       39%     2%     83%       41%        3%      3%     54%       20%
WA     10       <1%     <1%     14%       5%        71%     7%     83%       53%        2%      2%     50%       18%
WV     3        5%      4%      14%       8%        51%     6%     44%       34%        7%      1%     44%       17%
WI     5        33%     18%     NA        26%       20%     8%     27%       18%        2%      1%     44%       15%
WY     8        16%     0%      NA        8%        39%     8%     63%       37%        6%      3%     18%       9%
Source: The data used in this assessment are from the OTIS Management Report tool. We downloaded the data in August 2010 and
queried OTIS for each program and each metric for FYs 2003-2009.

Notes:  All percentages in this appendix are rounded.
       "NA" indicates that the state did not report penalties to EPA. CWA penalty reporting is voluntary.
       "EPA" indicates that the average for this statute was not applicable as a measure of state performance because EPA directly
       implemented the  program.

       a  EPA directly implemented the Alaska CWA program for FYs 2003-2007 (average CWA = 0.82%), and Alaska directly
          implemented the program in FYs 2008 and 2009 (average CWA = 0.00%).
Table C-2 compares the national performance average for several metrics with the corresponding
national goal, and indicates what percentage of the reporting states fell below the goal and below
the average, based on information contained in EPA's enforcement database.

Table C-2: Comparison of national goals and averages to actual state performance
Statute  Metric                                                                 Total      National                National  # below  % below  # below  % below
                                                                                reporting  goal                    avg       goal     goal     avg      avg
CAA      Percent of violations that are considered significant                 50         <= 50%                  46%       22       44%      28       56%
CAA      Percent of HPVs entered less than 60 days after designation           47         100%                    35%       45       96%      24       51%
CAA      Percent of major facilities inspected every 2 years                   50         100%                    89%       42       84%      15       30%
CAA      HPVs identified per major source                                      50         >= 1/2 the national avg 6%        15       30%      32       64%
CAA      Percent of actions on HPVs that include a penalty                     47         >= 80%                  89%       9        19%      13       28%
CWA      Actions taken at major facilities in violation                        31         >= 80%                  57%       14       45%      14       45%
CWA      Percent of major facilities inspected annually                        50         100%                    61%       48       96%      22       44%
RCRA     Percent of toxic storage disposal facilities inspected every 2 years  49         100%                    87%       25       51%      15       30%
RCRA     Percent of large quantity generators inspected every 5 years          50         100%                    62%       48       96%      10       20%
RCRA     SNCs identified per inspection                                        47         >= 1/2 the national avg 3%        13       28%      27       57%
RCRA     Percent of final formal actions that include a penalty                45         >= 1/2 the national avg 81%       3        7%       15       33%
Source: OTIS State Review Framework tool. We downloaded the data in March 2011 and queried the database for each statute
and region for FY2010.
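
The "# below goal" and "% below goal" columns in table C-2 can be derived from per-state results
as sketched below; the state values and the goal shown are hypothetical and are not drawn from
the OTIS data.

    # Illustrative sketch only: counting reporting states that fall below a national goal.
    def below_goal(state_values, goal):
        reporting = [v for v in state_values if v is not None]   # drop non-reporting states
        below = [v for v in reporting if v < goal]
        return len(reporting), len(below), len(below) / len(reporting)

    values = [0.95, 0.80, 1.00, 0.60, None, 0.75]        # hypothetical per-state results
    total, count, share = below_goal(values, goal=1.00)  # e.g., 100% of majors inspected
    print(f"{count} of {total} reporting states below goal ({share:.0%})")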
                                                                       Appendix D
               Agency Comments on Draft Report
                           and OIG Responses
MEMORANDUM

SUBJECT:   Inspector General's (OIG) July 28, 2011 Draft Evaluation Report, Project No.
             OPE-FY10-0022, "EPA Must Improve Oversight of State Enforcement"
FROM:      Robert Perciasepe
             Deputy Administrator

             Cynthia Giles
             Assistant Administrator
             Office of Enforcement and Compliance Assurance

TO:         Arthur A. Elkins, Jr.
             Inspector General

      We appreciate the opportunity to review and comment on the Office of Inspector
General's (OIG) July 28, 2011 draft Evaluation Report entitled: "EPA Must Improve Oversight
of State Enforcement" (Project No. OPE-FY10-0022). We thank you for tackling this
challenging topic, which is critically important to the protection of human health and the
environment and to achieving compliance, as states are the "front lines" of implementing federal
environmental laws.

Summary comments
      EPA agrees with your overall finding that state enforcement performance varies widely
across the country. We also agree that there are steps EPA Headquarters and regional offices can
and should take to strengthen our oversight and address longstanding state performance issues.
We strongly support making state performance information publicly available in an easy-to-
understand format to help apply pressure to improve both federal and state government
performance.

      EPA's principal concern with the draft Report is the limited number of metrics and the
associated methodology relied on by the OIG to assess state performance. To fairly evaluate a
state enforcement program is a complex undertaking that requires a thorough analysis of multiple
factors considered within the context of a state's overall enforcement activities. Unfortunately,
the methodology of state evaluation adopted by the OIG for this Report - averaging performance
across programs using a very limited number of metrics - oversimplifies what, of necessity, must
be a more comprehensive review process. We are concerned that publication of the data using
these limited metrics could have the unintended  consequence of moving states toward a more
simplistic, and less protective, enforcement program. Moreover, within the metrics utilized by
the OIG, data errors and misunderstanding about information in (or missing from) the data have
occurred such that the data and goals presented and relied upon by the OIG are unreliable. EPA
is requesting that these issues be addressed before the Report is released so that the important
conclusions of the Report and the need for further action remain the focus.
 OIG Response 1: While generally agreeing with our findings, EPA took issue with some
 aspects of our methodology. EPA's extensive comments contained here reflect these concerns.
 In summary, EPA was concerned with how we used information about state activities to
 compare state program quality to screen states for further in-depth review. Although not
 mentioned in its comment, we utilized EPA's own enforcement data within our review, and our
 analysis is similar to those that EPA itself conducts. These data possess several significant
 limitations that EPA has acknowledged and which we took steps to overcome and describe.
 However, we used several other sources of information in our evaluation, such as descriptions
 of states contained in the SRF reports and interviews with officials and staff of EPA and states.
 Our conclusions rest on information contained in all of these sources.

 We affirmed our choice of the three measures of state activity based on discussions with OECA
 staff and officials, and other stakeholders. Their use is supported through statements contained
 in OECA's own documents including the SRF, and EPA displays the data on its publicly
 available enforcement website. However, because we realize that no single set of measures is
 individually definitive, we corroborated our findings through numerous interviews across six
 regions and document review. EPA's agreement with the substance of our findings provides
 further support for the validity of our approach. Details of our methodology are presented in
 chapter 1 and appendix A.	
       We appreciate your acknowledgement that significant work has been done by EPA, with
more underway, to improve our oversight of state enforcement programs within a larger
framework of federal and state shared accountability in the enforcement of our nation's
environmental laws. And yet, new work has occurred since FY2009 that is not well described in
the Report, such as the publication of our state review documents and a web-based state
performance and comparison tool, which have increased transparency and public accountability
of state enforcement programs.  Additionally, in June 2010, the Office of Water (OW) and the
Office of Enforcement and Compliance Assurance (OECA) jointly issued guidance to the
regional offices on ensuring consistent enforcement by states of Clean Water Act permits.
Lastly, and importantly, EPA is in the midst of significant advances that could dramatically
change the effectiveness and accountability of federal and state enforcement programs. By way
of example, recent actions include developing a proposed rule  for electronic reporting of
National Pollutant Discharge Elimination System (NPDES) permitting and compliance
information, and a new pollutant loadings targeting tool. This work demonstrates EPA's
commitment to the principles that underlie your Report and our willingness to pursue innovative
measures to secure better performance in the future.
 OIG Response 2: We describe many of EPA's efforts to improve state performance in the
 beginning of chapter 2 of the report.	
Methodology, metrics and data

        Rarely do states under-perform across the board in all media programs. More
commonly, states perform well in one program area and not well in others. Sometimes
performance issues are related to particular sectors. These layers of performance are difficult to
measure and convey through simple data metrics.

       As noted above, the metrics relied on  by the Inspector General in this draft Report are
overly simplistic, and in some cases inaccurate, thereby resulting in erroneous conclusions
regarding individual state enforcement performance. The use of limited data presents an
incomplete picture of state enforcement programs, and fails to provide an accurate evaluation of
the quality or other contextual aspects of complex state enforcement performance. Although we
sympathize with the desire to keep it simple, we are concerned the Report, as  currently
presented, will give the public a false impression of state performance by publishing both
inaccurately positive and inaccurately negative state evaluations.

OIG Response 3: As described in chapter 2, in response to EPA's comments, we eliminated our
use of a summary metric for each state. However, we disagree that presenting a simplified
method for comparing state enforcement performance is invalid. Repeated assessment of
indicators allows management to determine where EPA should focus additional attention. As
with any other performance measure, the measurement only takes on meaning when it is
compared with benchmarks that clearly outline success. The overarching message of this report
is that EPA must establish clear benchmarks for success so that states, EPA regions, EPA
headquarters, the public, and the regulated community all understand what is required to ensure a
safe and healthy environment. The report emphasizes that the states we ranked highest are not
necessarily succeeding, but rather that "performance is low across the board." With clear
benchmarks, according to these or other performance indicators, EPA, states,  and the public can
clearly understand performance, and EPA can take  deliberate steps to improve performance
where it is needed.	

       Our specific comments on the metrics and methodologies used by the  Inspector General
and the text of the draft Report are listed in the Attachment to this memorandum, but some
overall points are listed below.

       1. The metrics chosen by OIG overemphasize major facilities and inspections.

       Large facilities often are significant sources of pollution. Consequently, compliance at
these facilities is, and remains, a priority. However, EPA increasingly is realizing that large
cumulative impacts and locally significant pollution is associated with violations occurring
collectively at smaller facilities. In the water program, for  example, stormwater and agricultural
sources are often the most significant sources of water quality impairment.  In RCRA, there is
increasing evidence that smaller quantity generators are disproportionately responsible for
violations that could pose a risk of release.  In addition, pollution from large facilities not
technically classified as "majors" can be very important. An  example from the water program is
mining facilities, which often contribute large pollution loads but are not captured in the
definition of majors. In our work on state oversight we have been pushing for greater
accountability for these other categories of sources, and moving away from the historic and
nearly exclusive focus on majors.  States that have responded to this shift in emphasis and
refocused resources toward the most significant pollution problems would fare badly on the
metric OIG uses, which may include some minor source data, but in fact relies almost entirely on
data about majors. This will give an inaccurate and unfairly negative picture of these states'
performance, potentially harming the overall effectiveness of their enforcement programs.

       As the compliance program has matured, and our understanding of pollution problems
has grown, we have acknowledged that in addition to inspections there are other ways to
ascertain compliance.  As we have discussed, we are moving toward a new paradigm where self-
monitoring and electronic reporting will be increasingly  important tools to supplement
inspections.  Government resources will be targeted to the most serious problems and we will
have a much better idea overall about the compliance picture than we do today. We can be more
effective and efficient, both at the federal and state level, through better monitoring and reporting
and public disclosure of pollution and compliance information.  These approaches also
encourage better compliance performance through the power of public accountability. While it
is fair to evaluate states based on their performance in inspecting facilities as required in the
various  program compliance monitoring strategies, we are concerned that the Inspector General's
overly heavy emphasis on inspections of majors as a measure of state performance weights the
scale in the wrong direction.
 OIG Response 4: We note again that the metrics the OIG utilized are the same ones that OECA
 uses to manage its programs, and we selected them in coordination with OECA. We agree that
 enforcement should be targeted at the most significant sources of pollution, whether these
 sources are majors or a collection of minors. We would use information about nonmajors if
 reliable information on this were available. However, EPA does not consistently collect
 information about compliance and enforcement at nonmajor sources. The OIG has reported on
 data quality numerous times in the past and it is an acknowledged weakness of EPA reporting
 systems. As such, it is extremely difficult to assess how well states ensure compliance and
 conduct enforcement at smaller facilities.
       2.  Rates of identification of significant violations are not an informative metric.
       In the abstract, comparing rates of violation would be an excellent metric for evaluating
state performance. However, available data suggests that some states significantly under-report
significant violators.  Therefore, it is likely that low rates of identified significant violators do not
reflect actual low rates of violation, but rather incomplete or mischaracterized data.  If this is
correct, then higher reported rates of violation are more likely to be accurate, and states with
such higher rates likely to be higher performing states, if identification and disclosure of
violations is the metric. As a consequence, self-reported rates of significant violation by states is
an ambiguous metric at best, since either low rates of violation or high rates of violation could be
associated with a high performing state compliance program and vice versa.  If the Inspector
General uses this metric as one of just three, this significantly oversimplifies an otherwise
complex evaluation. In our judgment, greater reliance on self-monitoring, electronic reporting,
and public disclosure of enforcement data will result  in more complete and credible compliance
information in the years ahead, greatly increasing  public transparency and the deterrent effect of
compliance oversight. With the addition of the new data mentioned above, the Agency should
be able to use this metric with greater confidence in future years.
 OIG Response 5: As with any measurement, SNC identification and reporting requires
 additional information to provide comprehensive understanding of performance. However, SNC
 rates are a valid measure of state performance, and EPA uses them as one measure of
 performance. Our analytical method recognized underreporting as a potential issue, and as such,
 we used higher SNC identification rates as an indicator that the state targeted inspections and
 identified violations. We came to this decision based, in part, on conversations with EPA
 enforcement staff.

 Moreover, it is uncertain whether a greater reliance upon self-reporting will improve data
 quality. EPA's recent comparison of randomly generated compliance rates with results of
 targeted inspections and self-reported violations indicated that self-reported violations were
 underreported.	
       3.  Some of the data presented by OIG appears to be incorrect
       EPA's regions have been unable to replicate the OIG data pulls, and it appears that OIG
researchers have misunderstood some of the data, leading to inaccurate statements regarding the
meaning and significance of EPA's data on state enforcement. The draft OIG Report also
contains incorrect descriptions of several specific instances of state oversight by EPA, e.g., the
Agency's response to a 2001 petition to withdraw the Louisiana NPDES program, the Agency's
reaction to program performance issues in Illinois, and EPA's review and approval of the NPDES
permitting program in Alaska. A full discussion of these cases, along with the corrected
information, appears in the Attachments.
 OIG Response 6: In response to this comment, we expanded the description of our
 methodology and quality assurance steps in appendix A. EPA's enforcement database allows
 corrections to be made by states. Because we downloaded the data used in our analysis from
 EPA databases in August 2010, if states subsequently modified their data, the specific results
 may vary slightly if the analysis were repeated today.

 We reviewed the additional information about Louisiana's NPDES program provided in the
 supplied attachment and determined that it did not change our overall interpretation of events.
 We based our descriptions of specific instances of state oversight on several analytical methods,
 including interviews with the regions and the states, and data and document review. We believe
 our interpretation of the events accurately portrays the information that we gathered from all
 stakeholders. However, where appropriate we updated the narrative to elaborate and accurately
 reflect the additional information provided to us. In fact, this new information more starkly
 demonstrates the ineffectiveness of EPA intervention in the state. Despite the region and
 headquarters involvement, the state performance has not improved  substantially. This failure
 underlines the importance of recommendations 1 and 5, which would allow OECA to direct a
 national workforce to respond to enforcement crises like those experienced in Louisiana.
 For responses regarding Louisiana, Illinois, and Alaska, see subsequent OIG responses.
       We appreciate OIG's willingness to meet to help us better understand the information
utilized in the OIG analyses. It is important that data errors be identified and corrected, and
information presented consistently for all states. For example, there are several instances where
percentage scores for states exceeded 100%, clearly an error.  This and other examples of data
inaccuracies in the draft Report, as identified in this memorandum and its attachment, should be
corrected. Importantly, the metrics identified by the OIG should accurately reflect the data in
EPA's information systems, because this data is central to the conclusions that OIG reaches in
the draft Report.
 OIG Response 7: The information in the report accurately reflects the data contained in EPA's
 information systems. In terms of the example provided, we discovered instances in EPA's
 database where state data exceed 100 percent for a category of enforcement action. This is an
 artifact of how EPA's enforcement data system calculates this measure. [The database
 calculates this metric automatically by dividing the number of state actions with penalties
 (numerator) by the number of state formal actions (denominator). However, the database counts
 one category of formal actions in the numerator but not in the denominator ("AO Stipulated
 Penalty  1"). The result is that the percent of formal actions with penalties can be greater than
 100 percent if a state issued a lot of "AO  stipulated penalty 1" actions.] Because this calculation
 method was consistent across states, in our draft report we used the number as presented in the
 database.
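
A brief numeric sketch of the calculation artifact described above; the counts used here are
hypothetical and are not drawn from any state's actual data.

    # Illustrative sketch only: why "percent of formal actions with penalties" can exceed 100%.
    # "AO Stipulated Penalty 1" actions are counted in the numerator but not the denominator.
    formal_actions = 40        # denominator: formal actions (excludes "AO Stipulated Penalty 1")
    penalty_actions = 46       # numerator: includes "AO Stipulated Penalty 1" actions

    percent = 100 * penalty_actions / formal_actions
    print(f"{percent:.0f}% of formal actions include a penalty")   # 115%, i.e., over 100 percent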
EPA's new approaches to enforcement and state evaluation

       Complete data and valid, meaningful measures are key to understanding state
performance. Gaps in our current data make it difficult to develop measures that tell a complete
story across all media and regulated sectors. Limited resources at both the state and federal
levels make it more difficult to address these gaps. Measures based on the data that EPA does
have may not focus on the right things. In our opinion, an effective way to address these issues is
to set a new direction in the enforcement program that takes advantage of the advances in
information and monitoring technologies.  This will enable us to get more complete data through
the efficiencies of electronic reporting and field equipment. This data will allow EPA to more
completely identify regulated sectors, monitor and assess compliance, target resources more
effectively, and measure the overall performance of federal and state regulators in a more
meaningful way.

       An example of this approach is the Clean Water Act Action Plan.  In the NPDES
program, EPA has focused its resources  and attention on the biggest facilities, the majors.
Despite the focus on majors, compliance at those facilities is not what it should be. In addition,
over time, wet weather and other sources have been identified as having significant impacts on
water quality. EPA needs to expand its knowledge and data of these smaller regulated sources
and target attention on addressing the environmental damage and health impacts they cause. By
utilizing electronic reporting, better monitoring, and new compliance strategies, the Agency will
eventually build the capacity to develop  new performance measures that include these sources.

       EPA has begun to implement this new strategic vision by taking near-term actions such
as the development of new pollutant loadings targeting tool, electronic reporting of NPDES
Discharge Monitoring Reports (NetDMR), and piloting the use of third party vendors to build
electronic reporting tools for facilities to report DMRs. The Agency also has initiated longer
term actions such as developing the aforementioned NPDES e-reporting proposed rule, building
e-reporting and new compliance strategies into rules, and integrating permit and enforcement
annual planning and reviews. This vision will take some years to achieve. Until then, we must
recognize the limitations of the data and metrics we currently have and not put so much weight
on what is an admittedly incomplete,  and in some cases potentially misleading picture.
 OIG Response 8: We agree that ideal performance data are repeatable, reliable, and relevant.
 However, we also believe that EPA underrelies on its vast stores of state enforcement data,
 which it has collected for decades.  Our scope and methodology descriptions in chapter 1 and
 appendix A describe the steps we took to ensure the data we used were reliable for our purpose.
       EPA supports the use of national maps to compare state enforcement performance. This
past spring, EPA released Clean Water Act state dashboards that display data in a similar fashion
across states. EPA is, however, concerned that, by not using the right metrics and data for
comparison purposes, the OIG draft Report will lead to false conclusions regarding state
performance. For instance, states that have had deficiencies noted in State Review Framework
evaluations,  based on extensive file reviews and other contextual information beyond the data in
EPA's  systems, may look good in the limited data set pulled and displayed by the Inspector
General.  In this case, the unintended consequences are that the careful reviews done by the
regions may be undermined by the analysis presented by the Inspector General, and the public
may be falsely reassured that the state program is stronger than it in fact is. Another possible
unintended consequence is that there may be states that perform well across their regulated
universes, but our data system only contains required reporting on majors, making these states
appear to have performance problems. Another issue is the Report's failure to note that the
flexibility provided in our Compliance Monitoring Strategies allows states to negotiate
alternative compliance monitoring plans annually. These plans are  specifically negotiated to
ensure  that states address their most important sources of pollution and noncompliance. The
Agency is working on how to make these alternative plans transparent to the public so states are
held accountable to their plans. It is important for us to recognize these issues as we work to
improve consistency, transparency, and performance.
 OIG Response 9: The OIG reviewed the OECA dashboard and map system under
 development. At the time of our evaluation, this information was under development for the
 CWA and not yet available for the CAA or RCRA programs, which were included in the scope
 of this evaluation.  The OIG applauds EPA's efforts in finding new approaches to address state
 enforcement needs. We did not base our analysis on the data alone, but also on several
 additional information sources, described in our detailed methodology.

 At the same time, in response to EPA's concerns we eliminated our use of a summary metric for
 each state. However, we disagree that presenting a simplified method for comparing state
 enforcement performance is invalid. Repeated assessment of indicators allows management to
 determine where EPA should direct the most attention. As with any other performance measure,
 the measurement only takes on  meaning when it is compared with benchmarks that clearly
 outline success. The overarching message of this report is that EPA must establish clear	
 benchmarks for success so that states, EPA regions, EPA headquarters, the public, and the
 regulated community all understand what is required to ensure a safe and healthy environment.
 The report emphasizes that the states we ranked highest are not necessarily succeeding, but
 rather that "performance is low across the board." With clear benchmarks, EPA, states and the
 public can clearly understand performance, and EPA can take deliberate steps to improve
 performance where it is needed.	
   EPA's ability to measure and its oversight of state program performance is evolving as the
Agency continues to stress the need for transparency and a focus on the most important sources
of pollution affecting environmental quality. We are working to develop better tools and
approaches to improve the consistency of regional oversight of states and address known
performance issues. Examples include:
   •   The State Review Framework (SRF) has been a big  step forward in establishing a
       structured review process for state enforcement programs and promoting regional
       consistency in conducting oversight. Nevertheless, there are still  inconsistencies in how
       regions draft reports  and recommendations, and in their responses to performance issues.
       To improve the SRF, we are streamlining the data metrics,  verifying annual data sets,
       improving reporting, and integrating the SRF with NPDES permit quality reviews.
   •   Increasing use of new web products that provide easy-to-use summary information using
       national maps comparing state enforcement and permit program performance. The
       Agency posted a CWA state dashboard in the spring, and is working on similar maps for
       the CAA, RCRA, SDWA and state pesticide programs. We think this information goes a
       long way toward the kind of useful comparison and  transparency  that the  OIG is
       promoting in this Report, with significantly less opportunity for mischaracterization.
   •   The CWA Action Plan has a detailed implementation plan  covering four primary changes
       in how we do business:
          o  Mandating electronic reporting from regulated facilities for DMRs, notices of
             intent to discharge, and various program reports in a rule.
          o  Reviewing and revising CWA guidance and  policies to ensure the ability of EPA
             and states to address the most significant environmental problems, and address
             them in an appropriate manner using a tiered approach to addressing violations.
          o  Building new compliance approaches into rules such as electronic reporting;  self
             monitoring, certification and reporting; use of new technology to monitor; and
             third party certification programs.
          o  Integrating permit quality and enforcement annual planning and reviews.
 OIG Response 10: We describe many of EPA's efforts to improve state performance in the
 beginning of chapter 2 of the report.	
       The regions have a range of tools and approaches to address state performance that go
       beyond the SRF, and which are important in looking at issues not contemplated in the
       SRF. Region 5's actions in Illinois (which are publicly available at
       http://www.epa.gov/Region5/illinoisworkplan) are one example, as they address both
       permitting and enforcement issues and go well beyond the analyses and
       recommendations under SRF. These actions have already yielded significant results and
       meaningful improvements to Illinois' program and are a direct result of the Region's
       active engagement with the state.
 OIG Response 11: We commend EPA's efforts to improve Illinois' program, and described
 our knowledge of the process in chapter 2 of the report.
Responses to specific recommendations

    Despite  serious concerns with the metrics and methodology of the draft Report, and the
unintended consequences should these be included in the final Report, EPA agrees that states
should be held accountable for addressing state performance issues that are identified using the
SRF and other available tools while the Agency develops and implements a new enforcement
paradigm. Specific comments on the recommendations made in the draft Report are as follows:

Recommendation 1: Give OECA authority for all nationwide enforcement resources and
workforce allocation.

       EPA does not agree that centralizing resources and workforce allocation will address the
concerns raised by the Inspector General concerning national inconsistency in state performance
and regional oversight. OECA currently exercises significant central authority for enforcement
resources and general workforce allocation, provides national direction through the Strategic
Plan, National Program Managers Guidance and the ACS commitment process, and holds
regular meetings and calls with regions to discuss performance and oversight. The degree of
control over a dispersed workforce anticipated under this recommendation would not be
substantially changed from what currently exists and would not lead to the improvements
envisioned by the Inspector General.

       The national perspective, necessary to achieve a level playing field for businesses and
states and equal protection for the public, needs to be balanced against the value of applying
local, on-the-ground knowledge and necessary, ongoing relationships to solving specific state
and regional environmental issues. Regions need the discretion to tailor the national approaches
to their and their states' unique and individual needs. OECA cannot appropriately be positioned
to maintain the level of specific knowledge across the country necessary to ensure that each state
is addressing its most important sources and to tailor fixes to identified problems that make
sense.
 OIG Response 12: Maintaining oversight of a national enforcement program requires a
 centralized capability to manage a national enforcement workforce. EPA does not currently
 possess this capability because most enforcement resources are managed by the 10 regional offices
 rather than the OECA. Because of this, EPA does not adjust resources to meet changes in EPA's
 enforcement mission and goals. This leads to inefficiency, which is one of EPA's future challenges
 if resource constraints increase. In addition, dispersed decision-making leads to disparities in state
 expectations and consequently in their performance. These are concerns that EPA shares as
 evidenced in its comments to our draft report. These disparities limit the protection of the public
 from environmental pollution in states and regions where standards are not the same. As a result,
 we disagree with EPA's response and consider recommendation 1 unresolved.	
Recommendation 2: Cancel outdated guidance and policy documents, and consolidate and
clarify remaining guidance into EPA documents that are publicly and easily accessible on
the EPA Civil Enforcement website.

       EPA agrees that it can separate current guidance and policy from historic policy that may
still need to be accessible to regulators and the public, and clarify what is currently applicable for
state programs. We reserve judgment until further review as to how to do this and where to post
appropriate documents for public access.
 OIG Response 13: We look forward to reviewing the Agency's action plan for addressing
 recommendation 2 in the final report response.	
Recommendation 3: Establish clear and consistent national enforcement benchmarks
throughout CAA, CWA, and RCRA guidance and policies so that EPA's enforcement
expectations are clear and consistent for state governments and the regulated community.

       EPA agrees that we should have clear national enforcement goals and benchmarks so
regions and states are clear on what expectations apply. OECA's national expectations are laid
out in current compliance monitoring, enforcement and penalty guidance and policies. These
expectations are generally identified as program goals, with reviewers of state performance
looking at how individual states match up both to the goals and to comparison performance
across states. As EPA refines and streamlines the metrics for the third round of the State Review
Framework, we will  seek ways to ensure that these goals and benchmarks are clear and balance
national consistency with regional and state flexibility. As we move to the new paradigm, we
will work collaboratively with regions and states to develop appropriate expectations and
benchmarks.

       As part of our improvements in increasing public access and understanding of
government performance, EPA will explore how to make specific expectations, benchmarks and
commitments visible to the public through our state dashboards and other public web sites. Part
of the lack of clarity of guidance is that while guidance changes, the evaluation of performance is
done retroactively against the policies in place at the time of the performance. This may lead to
confusion as to which guidance the state is being held accountable for in any given year.  EPA
agrees to try and clarify which guidance applies to each performance year as reviews continue
and to make that information  available to the public.

       EPA seriously questions the value of codifying guidance applicable to state enforcement
programs in rules. Unlike rules, guidance does not have the force and effect of law, is intended
to be more flexible in its  application, and can be modified and improved more quickly to meet
changing needs and circumstances. In contrast, the rulemaking process is both resource and time
intensive. A rule may be  promulgated, and modified, only in accordance with well-defined
public comment and hearing procedures. The rulemaking process, thus, is ill-suited for dynamic,
case-specific situations like state enforcement program reviews. The use of guidance in the
context of state oversight is particularly appropriate in that it allows the EPA to quickly respond
to state requests for clarification on various enforcement-related issues, identify data and
information the Agency will rely on as part of the review process, and provide insight as to
EPA's priorities when undertaking individual state enforcement reviews.
 OIG Response 14: We believe the Agency is on the right track when it comes to addressing
 recommendation 3. We look forward to seeing the complete action plan containing the steps the
 Agency intends to take to address this recommendation in the final report response.	
Recommendation 4: Establish a clear and credible escalation policy for EPA intervention
in states that provides steps that EPA will take when states do not act to ensure that the
CAA, CWA, and RCRA are enforced.

       EPA agrees that a national strategy that describes escalating actions for regions to take to
address state performance issues would be beneficial. We want to be clear that the purpose of
escalation is to resolve performance issues and improve state programs, and to do so at the
lowest management level possible. We do not agree that when state performance is a problem
that EPA's response should always be to step in and initiate an enforcement action. Our goal is
to work to strengthen state performance so that the state can again administer a strong
enforcement program. Moreover, enforcement cannot be viewed in isolation from the
performance of a state's delegated or authorized programs as a whole. The ability to run an
effective state enforcement program is greatly compromised where state media programs,
standards, plans or permits are weak or otherwise ineffectual.
       While program withdrawal may be seen as a necessity in cases of severe  program
deficiencies, program withdrawal is not the goal of escalating oversight.  The environmental
laws of this country clearly envision the states be in the forefront of day-to-day program
implementation. While not the goal, we recognize that program withdrawal is a necessary tool in
the oversight toolbox.
 OIG Response 15: We believe that EPA was nonresponsive to recommendation 4. We agree
 that program withdrawal should be a tool in the oversight toolbox. We will expect the Agency
 to address this recommendation by providing a plan of action in the final report response.
Recommendation 5: Establish procedures to reallocate enforcement resources to intervene
decisively when appropriate under its escalation policy.

       This is a concept that warrants further Agency consideration.  The budget constraints
EPA is currently under present a challenge to implementing this recommendation. EPA believes,
however, that if program withdrawal is warranted because of significant and severe issues across
a delegated or authorized program, the Agency needs to have the capacity to carry it out.   The
Agency will explore further what options may be available regarding program withdrawal when
it is truly in order to make it a credible feature of EPA oversight.
 OIG Response 16: We believe that EPA was nonresponsive to recommendation 5. Contrary to
 EPA's point of view, we believe that adopting the procedures noted here, especially at the
 present time, will enhance budgetary efficiency. We consider the recommendation unresolved, and expect the
 Agency to address this recommendation by providing a plan of action in the final report
 response.	
Conclusion

       EPA agrees with the OIG that state enforcement performance varies significantly and that
EPA Headquarters and regional oversight can be improved. EPA generally agrees with the
direction and recommendations in the draft Report that seek to update enforcement guidance and
policy, clarify expectations and identify escalating steps for regions to take to address
performance issues. OECA has already taken steps to improve national consistency in the
enforcement program across regions and states:
             •   Developing and implementing the State Review Framework (SRF)
             •   Making SRF metrics and documents public
             •   Developing state performance dashboards, currently for the CWA but
                 including CAA and RCRA in the future
             •   Discussing state performance at national meetings with regions and states and
                 having state oversight as a top priority goal for enforcement
             •   Working with regions to identify steps to address state performance issues

       EPA acknowledges its limitations in establishing meaningful measures for evaluating
performance. With this in mind, the Agency is pursuing a new paradigm for enforcement through
the use of 21st Century technology to improve data collection and to monitor compliance.  Once
e-reporting, the use of new technologies in compliance monitoring and new compliance
strategies in our rules are in  place, we are confident the Agency will have more and better data
with which to evaluate state enforcement performance. Until that time, we are in transition and
need to be cautious about putting too much weight on the limited metrics for which we do have
data. We need to avoid the unintended consequences that occur when making inaccurate,
misinterpreted or confusing  data available to the public without adequately explaining the
limitations  of the data and its significance.  For these reasons, EPA has serious concerns with the
metrics and methodology utilized by the Inspector General in conducting this review.

       EPA agrees more needs to be done. More complete data and better measures will help us
do a better job of holding states accountable and portraying the complexities of state
performance to the public. This is a process of continual learning and improvement, the fruits of
which may not be evident for a number of years. Regardless, it is critically important that we
take steps now if we are to achieve national consistency and shared accountability in the
enforcement of the nation's environmental laws by EPA Headquarters, the regions and our state
partners.

       Should you have any questions or concerns regarding this response, please contact
OECA's  Audit Liaison, Gwendolyn Spriggs, at 202-564-2439.

   Attachments
                                                                       Appendix E

   EPA Attachments to Draft Comment Memorandum
                          and OIG Responses
         Attachment: Detailed Comments on OIG Draft Report OPE-FY10-0022
             "EPA Must Improve Oversight of State Enforcement"

Substantial Issues with the State Performance Analysis Methodology
EPA feels that the key conclusions reached in the OIG analysis - that there is demonstrable
inconsistency in state performance and regional oversight, and that EPA should take action to
raise state performance where it is poor - are valid conclusions supportable with pre-existing
EPA data and analysis. However, there are a number of issues associated with the data, metrics,
and methodology used by the OIG to produce the results shown in Table B-1 in Appendix C of
this Report. EPA feels that the map on page 7, the regional comparison chart on page 9, and
Table B-1 all provide an inaccurate picture of state performance.
 OIG Response 17: The OIG stands by its methodology. In response to EPA's comments, we
 provided additional detail about our overall methods in chapter 1 and appendix A. We described
 in chapter 1 how the metrics allow for state-to-state performance comparison based on priority
 issues to EPA. However, without corroboration they do not offer reliable details about
 individual state performance. In addition, we made some modifications to the presentation of
 data in this final report. These changes are described in chapter 2, and below, in additional OIG
 responses.	
Issues: First, there are errors in the data collection and presentation that cast many of the raw
figures in the table into doubt. Second, the three metrics chosen to support the overall analysis
are ambiguous when examined individually and without additional context: each has several
variables which affect their meaning and create uncertainty for the purpose of being able to draw
conclusions or inferences. Third, rolling up metric averages into one average obscures
differences and hides complexity;  and, averaging the averages across programs — which treat
those program elements differently - further compounds the errors. Lastly, assessing program
performance is complex. Data  alone  cannot adequately describe performance. EPA uses a
number of metrics, both federal and state data, file reviews, and interviews with the states to
understand state programs.
 OIG Response 18: In response to this comment, we expanded the description of our
 methodology and quality assurance steps in appendix A. EPA's enforcement database allows
 corrections to be made by states. We also eliminated our use of a summary metric for each state.
On August 31, 2011, OECA and the OIG met to discuss these significant issues with the draft
Report identified by OECA and the regions. OECA felt that the discussion was very constructive
and that the OIG was engaged and receptive to the feedback. EPA requests that the OIG address
these issues by working with experts in OECA's Office of Compliance (OC) on more accurate
data pulls and utilization and display of the data the OIG is interested in using. While OECA
recognizes the independence of the OIG in doing their reviews and analyses, OC staff are
available to the OIG to help produce and share accurate data, demonstrate current tools and data
displays, and provide knowledge and experience regarding the metrics chosen in the draft
Report.
 OIG Response 19: We appreciate OECA discussing its concerns with our report. However, this
 criticism is unfounded. Subsequent to the August 31 meeting, the OIG requested the data
 mentioned in the OECA comment. The data that OECA provided to us covered 2 years for only
 one statute (CWA data for 2008-2009). Because the scope of this evaluation covers 7 years
 (2003-2009) and CAA and RCRA statutes in addition to CWA, this was not a comparable
 substitute for the years of data contained in EPA databases. Moreover, for the limited data
 provided (the 2008-2009  CWA data), we conducted statistical analyses and determined that an
 assessment of state performance using the new information was materially no different from the
 data we used in our analysis.	
Inaccurate Data
EPA regions have commented that they were unable to reproduce the data or analyses as
presented in the Report. EPA recommends that OIG and the regions work together to understand
how the data was collected and if there are errors that can be corrected.
 OIG Response 20: In response to EPA's comments, we included additional details about how
 we collected and summarized EPA enforcement data in appendix A.	
Average Percentage of Facilities Inspected ('03-'09)
It is not possible to draw valid conclusions from the inspection coverage metric as presented in
Table B-1 because of the sheer number of variables that affect the data. Some examples of the
types of data errors in the Report include:
   •   The calculation method used for the percentage of facilities inspected misrepresents state
       inspection performance. For each statute that was reviewed in this Report (CWA, CAA,
       and RCRA), it is not clear how the data was pulled and if it included the entire universe
       for each media in the corresponding EPA data system. Universe numbers in both the
       CWA and RCRA Subtitle C are not complete or accurate in EPA systems.
          •   In the NPDES program, states are only required to report detailed information
              about major facilities, which represent about 6,700 facilities out of a universe of
              close to a million. Yet, some states that use the federal system for their own
              database enter in  additional sectors that may have been mistakenly included in the
              totals used for averaging. This could introduce inconsistency into the analyses of
              these numbers across states, and could serve as a disincentive for states to report
              information above what is required.
          •   The RCRA C universe is known to fluctuate greatly as facilities may periodically
              change status from a Large Quantity Generator (LQG) to a Small Quantity
              Generator (SQG). This introduces a margin of error that distorts analyses without
              correction.
          •   The OIG did not control for the inspection frequency goals that relate to facility
              size, choosing to lump together facilities - big, medium and small. Because the
              overwhelming number of facilities are small, the inspection percentages are
              skewed and do not dovetail with inspection goals.
 OIG Response 21: We note again that some of the metrics OIG utilized are the same ones that
 OECA uses to manage its programs, and we selected them after consultation with OECA. We
 agree that enforcement should be targeted at the most significant sources of pollution, whether
 these sources are majors or a collection of minors. We would use information about nonmajors
 if reliable information on this were available. However, EPA does not consistently collect
 information about compliance and enforcement at nonmajor sources. The OIG has reported on
 data quality numerous times in the past, and the issue is an acknowledged weakness of EPA
 reporting systems. As such, it is extremely difficult to assess how well states ensure compliance
 and conduct enforcement at smaller facilities. The metrics factor in the size of a state's
 regulated universe by using percentage of facilities inspected. In response to EPA's comments,
 we included additional details about how we collected and summarized EPA enforcement data
 in appendix A.
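
For illustration only, the coverage metric referenced above reduces to dividing the number of
facilities a state inspected by the size of its regulated universe and averaging that percentage
over the review period. The short Python sketch below shows that arithmetic; the facility counts
and field names are hypothetical and are not drawn from EPA data systems.

    # Illustrative sketch of the inspection coverage metric: percentage of the
    # regulated universe inspected, averaged over several fiscal years.
    # All counts are invented placeholders, not figures from EPA data systems.
    yearly_counts = {
        2003: {"inspected": 120, "universe": 800},
        2004: {"inspected": 135, "universe": 810},
        2005: {"inspected": 150, "universe": 805},
    }

    def coverage_percent(inspected, universe):
        """Percent of the regulated universe inspected in one fiscal year."""
        return 100.0 * inspected / universe if universe else 0.0

    yearly = [coverage_percent(v["inspected"], v["universe"]) for v in yearly_counts.values()]
    average_coverage = sum(yearly) / len(yearly)
    print(f"Average percentage of facilities inspected: {average_coverage:.1f}%")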
Percent of Significant Noncompliance Identified per Inspection
    •   For the CWA program, most SNC is identified via self-reports rather than inspections.
       This is not an appropriate comparison.
 OIG Response 22: We described in chapter 1 how the metrics allow for state-to-state
 performance comparison based on priority issues to EPA. However, without corroboration they
 do not offer reliable details about individual state performance. The database calculates the
 SNC metric via self-reports for states, so this was a consistent metric to use for a cross-state
 comparison.	
Percentage of Formal Actions with Penalties
    •   CWA penalties are not required to be entered by states at this time. Eighteen states were
       listed at 0%, though it cannot be determined if there are actually no penalties assessed or
       if the data was not reported.
    •   Some of the numbers appear to be erroneous (e.g., Alabama has 165% of formal
       enforcement with penalties, which is not possible and inflates their performance score).
    •   The appropriateness of penalties (amount assessed and whether there was a return to
        compliance) was not analyzed using the OIG methodology.
 OIG Response 23: In response to EPA's comments, we clarified our scope and methodology.
 There are two considerations regarding this measure for the CWA. First, and most importantly,
 states are not required to report penalty information. Therefore, it was not possible to determine
 whether a zero in EPA's database is a true value or a nonresponse. There were 18 authorized
 states with a zero value. We adjusted for this in our analysis by utilizing the penalty metric only
 for the 28 authorized states that had a greater than zero value for penalties.

 The second consideration is that the EPA database shows that several states exceed  100 percent.
 This is caused by the way EPA's data system computes totals. [The system calculates this	
 metric by dividing the number of state actions with penalties (numerator) by the number of state
 formal actions (denominator). However, the database counts one category of formal actions in
 the numerator but not in the denominator ("AO Stipulated Penalty 1"). The result is that the
 percent of formal actions with penalties can be greater than 100 percent if a state issued several
 "AO stipulated penalty 1" actions.] We adjusted for this in our analysis by capping the metric at
 100 percent (so, we assessed the two states reporting > 100% at 100% in our analysis).

 The scope of our evaluation did not include penalty appropriateness.	
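
For illustration only, the two adjustments described in this response can be expressed as a
minimal sketch, assuming the metric arrives as a simple mapping of state to percent of formal
actions with penalties; the state names and values below are hypothetical.

    # Hypothetical sketch of the two penalty-metric adjustments described above:
    # (1) drop authorized states reporting a zero value, since a zero may be a true
    #     value or simply unreported data, and (2) cap values above 100 percent,
    #     which can arise when "AO Stipulated Penalty 1" actions are counted in the
    #     numerator but not the denominator.
    raw_metric = {"State A": 42.0, "State B": 0.0, "State C": 165.0}  # illustrative values only

    adjusted = {
        state: min(value, 100.0)   # cap at 100 percent
        for state, value in raw_metric.items()
        if value > 0.0             # exclude zeros (true zero vs. nonresponse is unknown)
    }
    print(adjusted)  # {'State A': 42.0, 'State C': 100.0}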
Metrics Selected are Ambiguous Without Additional Information and Context
The three metrics selected all require additional information to interpret.

Average Percentage of Facilities Inspected ('03-'09)
The goals in the CWA, CAA, and RCRA Compliance Monitoring Strategies mostly reflect major
facilities and inspections, while allowing the negotiation of alternative compliance monitoring
plans that include non-major inspections for which there generally are no specific goals. This
makes it difficult to analyze performance against one goal. States that are appropriately utilizing
our policies to focus on the most important sources of pollution and noncompliance will appear
to be performing poorly under the analyses in the draft Report.
           •   Under the CAA CMS, states can negotiate lower coverage of major sources in
              exchange for greater coverage of minor sources that are contributing to priority
              environmental problems.
           •   Under the CWA CMS, we encourage states to shift inspection resources from
              major sources to non-majors that pose significant environmental harm or have
              known compliance issues.
           •   Under RCRA, a large percentage of the universe can shift between SQG and LQG
              in any given year so it is not possible for EPA to determine the precise universe.
              Beyond statutorily defined coverage frequencies for Treatment, Storage and
              Disposal facilities, states can negotiate different coverage levels according to
              options in the RCRA CMS.
 OIG Response 24: The EPA's first two examples, above, underscore the confusion states
 expressed to the OIG about EPA benchmarks and goals for states. Because there are not clear,
 consistent benchmarks for nationwide environmental protection, EPA holds different states to
 different standards. As we stated in the report, this leads to confusion on the part of states, and
 impedes the public's ability to discern whether states meet national goals. The OIG concurs that
 flexibility is valuable; however, EPA's performance measures should accurately reflect what
 EPA expects from all states. This improves transparency for state performance for the American
 public, Congress, states, and the regulated community.

 In response to EPA's comments, we added an explanation to chapter 1. The metrics allow for
 state-to-state performance comparison based on priority issues to EPA. However, without
 corroboration they do not offer reliable details about individual  state performance.	
Percent of Significant Noncompliance Identified per Inspection
By itself, there is no way to know whether a low or high SNC rate is the sign of good or poor
program performance. A low SNC rate can indicate that a program is not identifying SNC that it
should and is, therefore, performing poorly. On the other hand, low SNC numbers can mean a
high rate of facility compliance as a result of a program performing well. In this instance, a state
could have a high field presence and be active in taking enforcement actions that are creating
high deterrence (which may or may not relate to numbers of enforcement actions). To understand
the significance of the SNC rate, you need to understand the quality of the  state's targeting,
inspections, and enforcement.
 OIG Response 25: Based on EPA's comment, we provided additional explanation in
 appendix A.	
Percentage of Formal Actions with Penalties
The percentage of penalties per action does not provide a real indicator of program quality.
While the number of actions is important, it is also important that the actions are appropriate,
meaning that they bring a facility back into compliance and provide deterrence. States with a
high number of actions with penalties, but insignificant penalty amounts "score" high under the
OIG analysis, but may not actually result in facility compliance or general or specific deterrence.
 OIG Response 26: We agree that the ultimate success of an enforcement action is bringing a
 facility back into compliance and providing deterrence to future noncompliance. The metrics we
 selected for this evaluation are not comprehensive, but rather serve as important indicators of
 state enforcement activity. Assessing the effectiveness of enforcement actions at achieving
 sustained compliance was outside the scope of this evaluation.	
Roll-up of Data into "Overall Performance Averages" to Characterize State Performance
To fully appreciate the meaning of a percentage, one first needs to understand the universe and
how variations might affect the percentage. In the OIG analysis, there does not appear to be any
consideration of large versus small states. Looking within the universe of regulated facilities of a
state can be informative as to the quality of the job they are doing in terms of coverage or dealing
with violations. The relevant universe for a metric is also important. A state with a small
universe may be capable of dealing with its universe in a more thorough manner than a state with
a very large universe. For example, inspecting 100% of a total universe of eight facilities is much
easier than inspecting 100% of a total universe of 8,000 facilities.

The program performance analysis throughout the Report, and featured in the national map on
page 7, is the result of rolling up the problematic figures in each of the metrics by media program
and then rolling them up again across the programs. This approach obscures some errors or
anomalies and compounds others. The end result provides an inaccurate impression of program
performance for any particular state, which does not comport with EPA's more in-depth
understanding of state performance.

The variability produced by this approach also does not help to distinguish performance. This
methodology results in an array of numbers that largely cluster between 20 and 30 percent with
two obvious outliers at 9 and 46 percent. One of the outliers, Alabama, achieves its high
score based on an incorrect data result: 165% of formal actions with penalties - which is
compounded through the averaging process. It is not clear what these percentages describe,
whether a high or low score is better, or what the distribution of scores means.
 OIG Response 27: EPA raised two points in this comment. Its first point is that comparing all
 states masks the different circumstances faced by large and small states. However, this criticism
 does not account for our choice of metrics. In response to the comment, we provided additional
 information in chapter 1 regarding our choice of metrics. We said, "Each of these metrics comes
 with several caveats about the underlying data. (Many of these caveats are explained in
 appendix A, and were previously reported by OIG.29)" However, we conducted a data reliability
 assessment and determined that utilizing the metrics offers a valid method for comparing state
 activities that took into consideration several key factors. Among them:

    •   The metrics factor in the size of a state's regulated universe by using percentage of
       facilities inspected.
    •   Because we drew the metrics and their values from EPA databases, they provide the
       same information the public sees when viewing state enforcement performance on EPA
       websites.
    •   The three metrics represent two priority issues EPA identified as national weaknesses in
       its initial SRF reviews.
    •   The metrics allow for  state-to-state performance comparison based on priority issues to
       EPA. However, without corroboration they do not offer reliable details about individual
       state performance.

 EPA's second point was that we made an error by comparing states by a grouped measure of the
 three statutes. We conducted this comparison to determine which states performed poorly across
 the board. The maps contained in appendix C demonstrate the results of this exercise.	
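
For context, the roll-up approach at issue in this exchange amounts to averaging the three metrics
within each statute and then averaging those program scores into a single number per state. The
sketch below illustrates that arithmetic with invented values; it does not reproduce the OIG's
actual computation or data.

    # Hypothetical illustration of the roll-up approach: average the three metrics
    # within each program, then average across programs to obtain one overall
    # number per state. All values are invented.
    state_metrics = {
        "CWA":  {"coverage": 55.0, "snc_per_inspection": 12.0, "actions_with_penalties": 40.0},
        "CAA":  {"coverage": 70.0, "snc_per_inspection": 8.0,  "actions_with_penalties": 35.0},
        "RCRA": {"coverage": 45.0, "snc_per_inspection": 10.0, "actions_with_penalties": 25.0},
    }

    program_averages = {
        program: sum(metrics.values()) / len(metrics)
        for program, metrics in state_metrics.items()
    }
    overall_average = sum(program_averages.values()) / len(program_averages)
    print(program_averages, round(overall_average, 1))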
Program Complexity Can Not be Captured Using Three Metrics Drawn from Existing
Data
Gaps in our current data make it difficult to develop measures that tell a complete story across
the regulated universes in each program. EPA has developed its measures for state oversight
based on the data that we have, knowing that our current data may not focus on the right things.
While EPA also seeks more straightforward means to portray program performance to the public,
we had to begin to assess programs based on the information that is available.

As described above and  stated in the transmittal memo for these comments, the 3 metrics used by
the OIG are overly simplistic, sometimes inaccurate and/or misleading, and are not adequate for
drawing conclusions  about program performance. The full suite of metrics used by OECA is a
set of indicators that  serve as a place to start to evaluate performance. EPA also looks at state
data, file reviews and interviews with state management and staff to help inform reviewers as to
whether issues exist that warrant more examination. Each metric is not used in isolation to draw
a conclusion. If a metric warrants cause for concern, then the issue is explored before any
inference and recommendation for improvement is made. The OIG's use of only three metrics
without the more in-depth review and context does not lead to accurate conclusions of program
performance.

29 See, for example, EPA Needs to Improve Its Recording and Reporting of Fines and Penalties, Report No.
10-P-0077; EPA Could Improve RCRAInfo Data Quality and System Development, Report No. 11-P-0096; ECHO Data
Quality Audit - Phase I Results: The Integrated Compliance Information System Needs Security Controls to Protect
Significant Non-Compliance Data, Report No. 09-P-0226; and ECHO Data Quality Audit - Phase 2 Results: EPA
Could Achieve Data Quality Rate With Additional Improvements, Report No. 10-P-0230.
 OIG Response 28: We agree that enforcement programs are complex. We employed these
 metrics the same way as EPA does: "as a place to start to evaluate performance." As described
 in our scope and methodology descriptions in chapter 1 and appendix A, we also looked at,
 "state data, file reviews, and interviews with state management and staff." For more
 information, please see OIG Response 1.	
EPA Recommends Modifying the State Performance Analysis
As stated in EPA's memorandum, the Agency agrees with the OIG findings that state
performance varies widely across the country and that regional oversight can be strengthened
and improved. This conclusion is supported by previous evaluations, many cited in Appendix B,
and by EPA's own oversight experience. EPA feels that the methodology used by the OIG in this
evaluation takes away from these primary findings and  conclusions because of the many issues
surrounding the metrics and the averaging across programs and states. Because EPA's own data
support the conclusions, it is not necessary to create confusion and unintended consequences by
publishing the problematic methodology. EPA requests the OIG modify its analysis and display
of results to address these issues.

Because the findings are not in dispute or lacking in evidence, because of the extensive issues
associated with the data utilized, the limited selection of metrics and the method of analysis, and
because of the incorrect and misleading conclusions drawn from the analysis, EPA requests that
this analysis and references to it be modified before the OIG publishes this Report. This request
applies to:
    1.   The discussion on page 3 of the Scope and Methodology section;
    2.   The "State Enforcement Programs are Underperforming" section on pages 6 and 7,
       including the map on page 7;
    3.   The "State Programs Frequently Do Not Meet National Goals" section on pages 7 and 9;
    4.   The "State Enforcement Is Inconsistent" section on pages 8 and 9, including the graphic
       on page 9;
    5.   The state examples provided on pages  14 and 15;
    6.   The "Data Analysis and Methodology" section on page 23; and
    7.   Appendix C

Alternative 1: Utilize Existing State Performance Data  and Maps
OECA believes that developing overall state performance metrics is something to strive for, but
recognizes that current metrics are not sufficiently developed. OECA has begun setting the stage
for this in the development of state comparative maps and dashboards, which are now available
to the public for the CWA. OECA has also provided comparative national graphs for any
individual SRF metric within the Enforcement and Compliance History Online (ECHO) database
so the public can look at individual performance indicators and compare across states. During the
early phases of the evaluation, OECA provided the OIG with several draft tables under
development. Any of these sources of information (state dashboards, comparative graphs in
ECHO, or the comparison tables under development) could provide a starting place for the
OIG's analysis of state performance and consistency.
 OIG Response 29: Because the data in the SRF system come from the same databases as the
 tool used in our analysis, the results of this change would be substantially the same. In addition,
 we conducted a reanalysis utilizing the data that OECA provided to us, and the difference in the
 results was not statistically significant.	
Alternative 2: OIG could direct EPA to develop a data-based scorecard that better reflects the
complexity of state performance
The OIG could use EPA's experience and their own analysis to identify the complexities of
determining and communicating state performance to the public, and direct EPA to develop a
comparison score card that relays the data caveats and context necessary while making this
information transparent.
 OIG Response 30: We agree that this would be a preferable exercise to our periodic assessment
 of public data. Based on OECA's suggestion, we have added this as recommendation 6.
Alternative 3: Potential Re-Analysis of Three Metrics

If the OIG determines it must retain the three metrics selected, EPA recommends that the
information for each of the programs analyzed be independently displayed, rather than rolled into
a single average across programs. While this would not address some of the more significant
issues raised above, it would reduce the confusing results created by averaging across programs
and put the focus on individual programs. EPA has found that program quality can vary
significantly within a state and, if performance is poor, EPA generally addresses issues on a
program by program basis rather than a state by state basis.
 OIG Response 31: In response to EPA's comments, we presented the data for each program
 separately while also including a general assessment of where states performed poorly across
 programs.	
Page by Page Detailed Comments

Page 1       Background - "EPA is responsible...." We recommend changing the sentence to
read "EPA is responsible for ensuring that federal environmental programs are implemented by
EPA regions and states consistent with the requirements of the CAA, CWA and RCRA. EPA
establishes goals as a means to monitor the implementation of these programs, including
enforcement."

 OIG Response 32: We retained the original wording of this sentence.	

Page 1       Background - "National Consistency ensures that all Americans..." Throughout
the document the word "consistency" does not acknowledge that flexibility is built into national
policy and guidance to accommodate differences in regions and states. For example, the CAA
allows a state to submit implementation plans so long as EPA agrees that the state's program
meets the minimum requirements of the CAA. States may also have programs that are more
stringent than required by federal rules.
 OIG Response 33: We added a statement to explain that national consistency exists in
 conjunction with some measure of flexibility.	
Page 1        "EPA Can Authorize States...." We recommend changing the sentence to read:
"Most states have acquired this authority for many programs (not all states are authorized or
delegated for all programs)."

 OIG Response 34: We retained the original wording of this sentence.	

Page 2       Please add text: "... authority for enforcement that begins with Congress, extends
to the EPA Administrator, then to the Office of Enforcement and Compliance Assurance
Assistant Administrator, EPA Regional Administrators...
 OIG Response 35: We changed the text to reflect the joint authority of OECA and EPA
 Regional Administrators.	
Page 2        "EPA Oversees States	" We recommend changing the first sentence to read:
"Although most states have received authorization or delegation to administer most programs..."

 OIG Response 36: As requested, we added "most" to the sentence.

Page 2        The Report does not articulate a full appreciation of the relationship between a
state program and the federal program in the enforcement area. When a state is "delegated"
authority or authorized to run a federal environmental program, it enforces using its own state
authorities, and EPA does not relinquish its enforcement authorities. Most effectively, the two
authorities partner on how to assure compliance and enforce in the regulated community, and
such partnerships tend not to be uniform in all respects.

 OIG Response 37: We retained the original wording for this portion.	

Page 2        The Report also does not seem to appreciate that program withdrawal is an
appropriate response when there are broad and pervasive problems across a program, generally
extending beyond just the enforcement program. Enforcement concerns  can be an important
component of a withdrawal process, but generally, in and of themselves, are not the sole reason
behind a program  withdrawal.
 OIG Response 38: We understand that program withdrawal should be a last resort for failing
 programs. However, it should be available for EPA to use at some point. We do not believe that
 is presently the case. We understand enforcement is just one component of a successful
 program. However, enforcement is one of the most critical aspects for ensuring program
 effectiveness. Regulations clearly state that if states do not take appropriate enforcement action,
 EPA retains the authority to withdraw state program authority.	
Page 4       EPA Established the State Review Framework - Please modify the language in
the opening sentence: "In 2004, EPA worked closely with states to establish the SRF... Under
this system, EPA regions evaluate states on 12 nationally consistent..." Element 13 is optional
and not nationally consistent. On the footnote to this sentence, please add "...the OIG used 3 of
EPA's many SRF metrics as proxies..."
OIG Response 39: Because the evidence indicates that EPA worked predominately with ECOS,
we added, "worked with the Environmental Council of the States (ECOS)" to this sentence. We
revised the SRF sentence to read "12 nationally consistent elements." We revised the footnote to
read, "used three of EPA's SRF metrics." Many of the metrics were not data metrics, but file
review metrics, so we could not include them in our data analysis. For example, one of the
metrics looks at whether or not files were complete with penalty calculations, and that would not
have been reasonable for us to consider in this analysis. However, we reviewed all of the SRF
reviews and utilized them  in developing our findings.	
Page 5       EPA Undertook Additional Efforts - please add "In this way, EPA hopes to enlist
the public... stronger accountability from the regulated community and from government."

 OIG Response 40: We made this requested change.	

Page 6-7     See EPA comments on Methodology above. EPA does not agree with averaging
the data in each metric, averaging data across programs, the rolling up of averages into one
number per state and the use of a color-coded map of averages. EPA does support the use of
maps for numeric metrics, with appropriate caveats and context for the data provided. EPA does
not agree with many of the state-specific conclusions reached by the OIG.
 OIG Response 41: We appreciate these comments and made the following changes to the final
 report. We removed the average across three statutes, replaced the single map with four maps,
 and augmented the discussion of our methodology in chapter 1 and appendix C.	
Pages 6-9    As indicated in EPA's comments on Methodology above, OIG's analysis of state
program performance against inspection goals for each of the media misapplies those metrics:
   •   Data may not reflect full regulated universes or all inspections conducted. Data is focused
       on majors in all three programs reviewed, and states are not required to fully report
       information on regulated non-major facilities. Non-major information is inconsistently
       reported in EPA data systems.
 OIG Response 42: In response to EPA's comment, we included additional information in our
 scope and methodology to describe what portions of the regulated universe are contained in the
 metrics we use.
   •   Each of the three programs have Compliance Monitoring Strategies (CMSs) that establish
       goals (not requirements) to reflect the diverse range of industries, facilities and state
       conditions. CMS guidelines are not equally applicable across all circumstances. For
       example, in the CAA, some states (and regions) have a greater number of mega sites. The
       guidance in the existing CMS establishes goals for state/local agencies (not requirements)
       that a Full Compliance Evaluation (FCE) should be conducted once every two federal
       fiscal years at all Title V major sources except those classified as mega-sites for which
       the minimum evaluation frequency is once every three federal fiscal years. The CMS also
       allows for alternative frequencies to be established through CMS alternative plans, with
       consideration of the following: facilities on tribal lands, major source universes, major
       source universe inaccuracies, classification changes within the CMS cycle, and number
       of majors classified as mega-sites.
 OIG Response 43: We considered the CMS guidelines for all three programs in our analysis of
 enforcement activities, goals, and accomplishments. Though individual state goals may vary,
 we believe it is reasonable to compare state accomplishments against national goals.	
   •   For the vast majority of SRF reviews, the regions have concluded that the delegated
       agencies are successful regarding evaluation coverage and are meeting their CMS
       commitments. OECA is encouraging states to expand coverage to non-majors that cause
       environmental harm or have known compliance issues.

   •   The Report does not address the planning processes in which regions discuss inspection
       coverage and commitments with states. EPA can and does step in, where possible, to
       improve inspection coverage where states are unable to meet regulatory  requirements or
       address their most significant sources.  The objectives of the planning discussions are to
       fully utilize limited resources while tailoring inspection priorities to the range of source
       universes and environmental issues unique to individual state circumstances.
 OIG Response 44: We acknowledge that enforcement activities are part of a larger program,
 which includes an annual planning process. However, the planning process is outside the scope
 of this assignment.
   •   For example, the October 2007 CWA CMS has a goal of 100% coverage every two years
       to inspect NPDES major facilities, but allows that percentage to be adjusted based on
       state coverage of non-major facilities of interest, such as CAFOs or stormwater. The
       analysis used by the OIG measured performance with a rigid 100% minimum standard. A
       specific example is the data on the inspection of majors related to Alabama and a number
       of other Region 4 states. Since many of the Region 4 states negotiated a NPDES CMS
       that was less than the 100% (national goal) inspection of majors (for Alabama it was
       50%), it is inappropriate to evaluate them against the national goal of 100%. The
       reduction in the inspection of NPDES majors was offset by the increase in inspections of
      NPDES minors and this negotiated agreement was part of the 106 Work plan. Alabama
      exceeded the negotiated inspection amount by performing inspections at 54% of their
      NPDES major universe. The Report states that 13 states inspected fewer than 50% of
      major CWA facilities in 2010, but does not explain that states may have conducted
      inspections at non-major facilities or other resource trade-offs available to states through
      the flexibility provisions of the CWA CMS.
 OIG Response 45: EPA's comment demonstrates the type of scenario states describe when
 expressing confusion over EPA expectations. In chapter 2 of the report, we describe the
 confusion, saying, "The absence of consistent benchmarks creates confusion about what EPA
 sees as a good state program and contributes to inconsistent performance. Staff and officials in
 EPA regions told us that it would be helpful if OECA clarified its national policy and guidance,
 and codified national enforcement standards in regulations."	
    •   The statistic about RCRA Subtitle C performance is confusing and not accurate. The
       ACS commitment requires inspections at 20% of the large quantity generators in each
       state per year (100% every five years). However, this is a combined state and EPA goal.
       As stated in the NPM guidance, "at least 20% of the LQG universe should be covered by
       combined federal and state inspections unless an alternative plan is approved under the
       RCRA CMS." EPA questions whether the 62% cited by OIG includes EPA inspections.
       Also, as the sentence quoted above indicates, the CMS allows for deviations from the
       20% per year goal to allow states to do more small quantity generator inspections because
        of suspected compliance issues within that universe.
 OIG Response 46: To clarify, the CMS limits EPA inspections to fewer than 10 percent of the
 required 20 percent, so if a state is required to complete 40 inspections, only four of those could
 be EPA inspections. Our assessment looked at state-only inspection rates. The EPA database
 reports state inspections for RCRA. We used  this metric. We are not arguing for inflexibility,
 but rather for ensuring that all states meet a minimum standard. Above the minimum standard,
 flexibility in achieving overall goals may be appropriate.	
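
The arithmetic behind this response can be illustrated with a brief sketch; the LQG universe size
below is an assumption chosen only to reproduce the 40-inspection example in the text.

    # Illustrative arithmetic for the RCRA LQG inspection goal discussed above.
    lqg_universe = 200                                       # assumed number of large quantity generators
    required_inspections = int(0.20 * lqg_universe)          # combined state and EPA goal: 40
    max_epa_inspections = int(0.10 * required_inspections)   # EPA share limited to about 10 percent: 4
    state_minimum = required_inspections - max_epa_inspections  # inspections the state itself must cover: 36
    print(required_inspections, max_epa_inspections, state_minimum)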
    •   This can be found in the RCRA SRF Plain Language guide for Metric 5c (Five-year
       inspection coverage - Large Quantity Generators) which states "...NPM guidance states
       100% of LQGs should be inspected every five years. The BR [Biennial Reporting]
       universe, while not perfect, in many cases offers the most accurate LQG count. However,
       because this is a difficult universe to gauge, the region and state may agree upon and
       substitute an alternative universe  count. In addition, states with approved plans may
       substitute other facility inspections for LQGs per the Guidance for FY08 RCRA Core
       LQG Pilot Projects; commitments that vary from the national goal may be reviewed
       under Element 4. Further review is needed when states do not meet the goal, although
       due to universe changes, coming close to but not reaching the 100% goal is not cause for
       concern." Given that the LQG universe changes on a monthly basis and that only 60 to
       70% of the LQGs appear in each BR report over a 3 BR reporting cycle, it is unlikely a
       state will reach 100% of the universe unless it is a small state with a stable generator
        universe. The closer a state is to 100%, the less the concern; the further a state is from
        100%, the more the issue should be investigated.
 OIG Response 47: OECA may explore using this detailed information when a state scorecard
 (per report recommendation 6) indicates that there are issues with its RCRA program. However,
 the detail was outside the scope of this OIG evaluation.	
    •   None of the OIG analyses take into account state resource limitations, while regional
       resources are considered. This doesn't seem reasonable.
 OIG Response 48: We understand that states fund their environmental protection programs to
 varying degrees and that in some instances funding may not be adequate to meet agreed-upon
 responsibilities. However, when this happens it is EPA's responsibility to step in and ensure
 that the state protects its citizens from the resulting pollution by enforcing the law. EPA must be
 able to adjust resource allocations to respond to these deficiencies in order to fulfill its statutory
 responsibility as the steward for environmental protection.	
Page 8        "Based on an overall average of the three enforcement measures... we found that
the state performance varied across the country almost 40 percentage points from lowest to
highest performing state." Per the discussion in EPA's comments on Methodology above, EPA
recommends that this comparison be removed from the Report. However, if not, the results of
this array of averages on page 9 also show that, other than the two regions with the highest and
lowest averages, the remaining eight regions all fall within a narrow range of about 8 percentage
points, which would suggest a high degree of consistency among regions.
 OIG Response 49: We modified this section. Specifically, we removed the figure comparing
 regional performance. We also modified the report to discuss state performance by statute,
 rather than overall. However, our analysis using EPA data and interviews across six EPA
 regions indicates significant variation in regional approaches to state oversight.	
Page 9        "Because regional approaches to state oversight varied, state performance varied
depending, in part on the region overseeing the state" is not accurate. State performance varies
because of a range of issues including state resources, whether the state agency has independent
administrative authority, influence of regulated community within the state, etc. These factors
have greater influence on state performance than regional oversight. The Report does not provide
an analysis of the relative importance of regional oversight against these other factors. We
recommend changing the sentence to read, "Our evaluation indicated that state performance
varied depending, in part, on the region overseeing the state."
 OIG Response 50: Our analysis of EPA data and interview results across six EPA regions and
 seven states does not support EPA's comment that other factors have a greater influence than
 regional oversight. We found that some states with resource constraints or states lacking
 administrative penalty authority performed well, while others did not. The laws clearly state
 that EPA has a responsibility to assist states or step in when states are not adequately
 performing their enforcement programs, be it from resource constraints, penalty limitations, or
 other factors. Therefore, regional oversight is critical. We did not change the text as we believe
 the existing sentence in the report is sufficient.	
Page 9       EPA does not agree that rolling up state averages is a legitimate way of evaluating
regional oversight performance.

 OIG Response 51: We removed this from the final report.

Page 9       "Instead EPA's 10 Regional Administrators have authority over state
enforcement..." is not correct: RAs do not have authority over state enforcement, rather they
have oversight authority.
 OIG Response 52: We made this change in the final report.

Page 10             EPA does not believe that a (one) national baseline or standard is
appropriate for judging state or regional performance. Performance is complex and must be
viewed with appropriate context. National expectations are laid out in the goals that are set
through our national guidance, which allow flexibility to balance national consistency with
regional, state and local conditions.
 OIG Response 53: We describe the importance of national benchmarks in chapter 2 of the
 report, saying, "State enforcement programs are complex and varied. However, as the national
 steward of environmental enforcement, EPA must set national benchmarks that establish the
 enforcement expectations that states and EPA agree to meet. EPA has not set consistent
 benchmarks for state performance. As a result, EPA cannot hold EPA regions accountable for
 ensuring that states meet a national standard, and headquarters does not objectively know which
 states require immediate intervention. If state performance exceeds the national benchmarks,
 states and regions can then address local priorities."	
Page 11             Headquarters Policies and Guidance Not Clear — EPA has consolidated all
current guidance that serves as the basis for an SRF review on the SRF OTIS web site. This is
available to both EPA and states. We acknowledge that the public web site is not as clear.

Page 11             "We ... reviewed hundreds of associated EPA guidance and policy
documents. We found that none of these information sources outlined clear and consistent
baselines." It is not clear what standard the OIG is applying for "clear and consistent baselines."
While EPA agrees that guidance is numerous and can be improved, we have substantial
documentation that does establish clear baselines and we continue to make improvements when
issues arise. For example, under the CAA program, the CAA CMS has been updated to: 1)
clarify minimum frequencies based on federal fiscal years; 2) incorporate the Federally
Reportable Violations Clarifications memo; 3) address new guidance focusing on the array of
new air toxics area source rules; 4) update Stack Testing Guidance and Section 112(r) Guidance.
In addition, the HPV Policy and the CAA Penalty Policy are both currently under review to
update them consistent with changes in the CAA program.
 OIG Response 54: Our interviews with seven states and six EPA regions indicated that despite
 all of these documents, enforcement personnel did not clearly understand how benchmarks and
 flexibility worked to ensure that all states met the same basic requirements. The OIG is not
 arguing for inflexibility, but rather for ensuring that all states meet a minimum standard. Above
 the minimum standard, flexibility in achieving overall goals may be appropriate. The OIG
 report emphasizes that these information sources have not explicitly outlined performance
 benchmarks.
Page 12             In the first paragraph, the OIG appears to have used an old inspection
frequency benchmark in determining the appropriate percentage for NPDES major coverage
for 2010. EPA recognizes that the plain language guide used a previous version of the
CMS. This was because the new CMS went into effect in October 2008 (FY2009), after the start
of SRF Round 2. Reviews done using FY2008 data would use the old CMS while reviews using
the FY2009 data would use the new CMS. EPA appreciates that this could cause some
confusion, but this practice reflects prior agreement with states that they are held accountable to
guidance and policy in effect during the year of performance. The plain language guide for
Round 3 is being updated to reflect current policy.
 OIG Response 55: We described this scenario in a footnote in both the draft and final reports.
 We used the goal that EPA presented alongside the SRF data, but included the
 footnote because of the discrepancy between the goal presented with the data and the CMS.
 This provides another example of how benchmarks are not clear. However, in response to this
 comment, we compared state performance to this goal based on 2006 data (prior to the new
 CMS), as well as 2010 data. The results are presented in chapter 2.	
Page 12             "In another example, a state official pointed out a discrepancy between
EPA's RCRA Enforcement Response Plan and the SRF review process. Specifically, the RCRA
Enforcement Response Plan allows states to miss a timeliness requirement 20% of the time. In
contrast, the SRF review process does not allow this. Therefore, the SRF report could potentially
indicate that the state did not meet program requirements if the state did not meet the 100%
timeliness goal." This is inaccurate. The state official has either misunderstood the requirement
or the SRF review process. The RCRA Enforcement Response Policy flexibility allowing
exceedance of timeliness requirements for 20% of cases per year is for cases involving unique
factors and is not a blanket exemption. Nevertheless, the associated SRF data metric has a goal
of 80% timeliness to most closely reflect the policy. This is noted on the data metrics report and
in the Plain Language Guide for the RCRA SRF reviews.
 OIG Response 56: We rephrased the statement to clarify that the state officials expressed
 confusion about whether the goals under the SRF and Enforcement Response Policy were
 coordinated.
Page 12             Independent Interpretations — States may interpret the regulations
differently for RCRA since they adopt their own regulations (even if they use EPA language)
and as long as they are not less stringent than EPA.
 OIG Response 57: This EPA comment supports the OIG argument about ensuring all states
 meet national benchmarks, above which there may be flexibility in establishing additional
 goals. We are not arguing for inflexibility, but rather for ensuring that all states meet a
 minimum standard. Above the minimum standard, flexibility in achieving overall goals may be
 appropriate. Our report emphasizes that these information sources have not explicitly outlined
 performance benchmarks.	
Page 12             The IG suggests that, by codifying policy in regulations, EPA would
operate a national enforcement program that consistently interprets and enforces environmental
laws. While this would support consistency, it may not allow sufficient flexibility to focus on
state and regional priorities for addressing the most serious water quality issues. Also, the
rulemaking process is resource-intensive and time-consuming. By the time a rule would be final,
it is likely that the policy or guidance would need to be modified or changed.
 OIG Response 58: The report does not recommend codifying policy in regulations; however,
 we heard from state and external sources that codification would make it easier for the states to
 implement the enforcement programs.	
Page 13             The IG incorrectly uses report length as an indicator of the quantity and
quality of information. Report length varies for a number of simple reasons — such as the number
of attachments, the size and complexity of the state program, the number of issues identified —
and does not correlate with the quality of the report. In addition, the specific examples provided -
Region 7's lengthy reports versus other regional reports - are not comparable as the Region 7
reports are comprehensive media program reviews (including permitting), which incorporate the
SRF elements for reviewing the enforcement components of the overall program.
 OIG Response 59: Our description of SRF report inconsistencies does not focus on the length
 of the reports. We say, "Regions' SRF reports differed in both quantity and quality of
 information about the states." In addition, we point out EPA's own conclusions about
 inconsistency, saying, "After receiving the first 25 Round 1 reports, OECA issued a 'Guide to
 Writing SRF Reports (Interim Final)' in April 2007. In the accompanying transmittal memo,
 OECA said, 'To date, inconsistencies in the information available in the SRF reports inhibit our
 ability to determine if the reviews are being carried out in a consistent manner.'"	
Page 14             See EPA comments on Methodology above. Because of the
methodological errors, the analyses of state performance in this section are incorrect and
misleading and should be removed from the Report.
 OIG Response 60: Our methodology was sound and based on EPA data systems. Please see
 OIG Response 1 for additional details.	
      For example, North Dakota's NPDES and RCRA inspection rates far exceed both the
      national average and the national goals. North Dakota ranks second in the nation for
      percentage coverage of inspections at facilities classified at majors, and first for
      percentage inspection coverage at those facilities with reportable minor permits. North
      Dakota has inspected 100% of its TSDs over the last 7 years and 68% of its LQG's, yet
      according to the Report,  their inspection coverage percentage is 3%.
 OIG Response 61: Our analysis included all facilities. This means that it included all of these
 inspections. As described in chapter 2, North Dakota's inspection rate does not reflect the
 state's enforcement activity when it finds a violation.	
      For Louisiana: the OIG states "In 2001, citizens filed a petition with EPA urging a
      withdrawal of the state's CWA NPDES program authority for many reasons, including
      lack of adequate enforcement. The region responded by conducting audits in the state.
      Even though the region found multiple parts of Louisiana's NPDES program to be
      deficient, it decided not to withdraw the program. [Footnote omitted.] The state's poor
      performance has persisted."

      •   This paragraph implies that the Agency did not take significant action in response to
          the petition, and that program withdrawal is the action EPA should have taken; we
          disagree with both points.

      •   However, as a threshold matter, it is important to recognize that at the time of the
          OIG inquiry, Region 6 was unable to provide OIG with copies of relevant documents
          from the 2001 period, and thus OIG did not have access to the full set of facts. The
          Agency regrets and apologizes for that omission. We have now located the relevant
           documents (attached), and want to share them so that the IG's recommendations can
          be based upon the fullest and most accurate information, and so that the public is well
          informed.

      •   The record shows that although the Agency decided not to withdraw Louisiana's
           NPDES program (the "LPDES" program), EPA undertook an extensive process to
          evaluate, oversee, and improve the State's program.

       •   After the citizens' petition was filed in 2001, the Regional Administrator and
           Governor Foster of Louisiana decided to jointly review the administration of the
          LPDES program. The Governor convened a special task force, and EPA began an
          informal investigation pursuant to 40 CFR §123.64(b). Region 6's informal
          investigation included on-site reviews of LPDES files, interviews with LDEQ
          management and staff, and an evaluation of information and data concerning program
          implementation. The Governor's task force findings were sent to the EPA
          administrator, who responded with a letter on February 10, 2003. (See Attachment 2,
          Enclosure 1.) Both the special task force and EPA's informal investigation found
          deficiencies in the LPDES program. The EPA, through its then Assistant
          Administrator for Water, and the Assistant Administrator for Enforcement and
          Compliance Assurance,  subsequently responded by detailing a list of seven
          performance measures that needed to be completed by LDEQ within a specified
          timeframe in order to address the program deficiencies. (See Attachment 2, Enclosure
          2.) Region 6 also sent a letter commenting on the findings of the Governor's special
          task force. (See Attachment 2, Enclosure 3.) In reply, the Governor committed to
          complete the seven performance measures, and Louisiana submitted a revision to its
          NPDES program that was published in the Federal Register on August 13, 2004. In
          May 2004, Region 6 found that Louisiana had successfully completed all seven
          performance measures (see Enclosure 4), and in  June 2004,  the Region performed a
          follow-up review of LDEQ's processes and procedures outlined in the revised
          LPDES program authorization documents. The Region found that LDEQ was
          implementing the changes agreed to, and the Louisiana NPDES program showed
          marked improvement. The EPA then approved the revisions to the LPDES program
          which was published in the Federal Register on January 5, 2005.

          Therefore, contrary to its description in the draft OIG Report as a failure of EPA
          oversight, Agency action in this case actually demonstrates the effectiveness of EPA
           oversight in resolving state program deficiencies.
 OIG Response 62: We appreciate the additional information provided by EPA, since Region 6
 was unable to provide the information during the course of our evaluation. We reviewed and
 analyzed the additional information provided, and modified the text in chapter 2 related to
 Louisiana accordingly. We said, "The region found several deficiencies and required the state to
 change some policies and develop new measures. Although the state completed the
 recommended actions, the state's poor performance persisted; our analysis found that Louisiana
 has the lowest enforcement levels in Region 6 and is ranked in the lower half for CWA and the
 lowest quartile for CAA and RCRA."	
        For Alaska, the OIG uses data from 2003-2009. EPA ran the full NPDES program in
        Alaska until October 2008. In 2009, the State DEC administered the program for only
       part of the NPDES universe, so the SRF metrics run for that year are confused by this
       shared lead. An SRF review of the State's newly authorized NPDES program has not
       been done.
 OIG Response 63: In response to this request, we added a sentence clarifying Alaska's 2003-
 2007 performance, when EPA was directly implementing the program. The continuously poor
 performance record spanning EPA and Alaska's implementation across the 2003-2009 period
 underscores the challenges Alaska faces in operating an effective enforcement program.	
Page 15             While the Report's methodological errors also apply to the analysis of
Louisiana, Alaska and Illinois, in each of these states EPA has identified issues in its SRF reports
and in other public documents and, most importantly, is taking extensive action to address them.
The Report should acknowledge these actions.
 OIG Response 64: We present the actions regions took in the report.	

   •  In the aftermath of Hurricane Katrina, Louisiana worked closely with Region 6 to re-
      evaluate environmental priorities and redirect staff to deal with a multitude of unforeseen
      environmental problems. The IG appears to have disregarded the workforce and
      workload implications of these facts.
 OIG Response 65: We agree that many external factors create and increase resource strain, in
 particular the disasters affecting Louisiana and Region 6. However, we reiterate that when states
 are not adequately operating their enforcement programs, EPA should take corrective actions.
 We intended report recommendations 1 and 5 to allow OECA to move resources at a national
 level when crises like these occur so that EPA can respond to a compounded enforcement crisis
 in a state like Louisiana using the weight of its national workforce.	
    •  Region 10 had continuing concerns about Alaska's capacity and performance in
      enforcement at the time of authorization in 2008 and shared these concerns with the
      State. It was determined that Alaska's program did meet the minimum regulatory and
      statutory requirements for program authorization despite those concerns. The phased
      approach was itself a way to help the State build capacity in permit and enforcement
      work and for EPA to assess  progress as the State started a new NPDES permit and
      enforcement program from the ground up. As expected, the State has encountered a
      number of challenges. Even absent an SRF review, the Region has both formally and
      informally informed the State of improvements that need to be made in both their permits
       and enforcement work. Region 10 also proposed, in the Federal Register, a delay of one
      year for the final phase of the NPDES universe.
 OIG Response 66: We understand that Region 10 has sought to help Alaska build capacity.
 However, the actions taken by the region have thus far not brought about improved performance
 in the state program. We intended report recommendations 1 and 5 to allow OECA to move
 resources at a national level when crises like these occur so that EPA can respond to a
 compounded enforcement crisis in a state like Alaska using the weight of its national workforce.
    •  Region 5's actions in Illinois (which are publicly available at
      http://www.epa.gov/Region5/illinoisworkplan) address both permitting and enforcement
      issues and go well beyond the analyses and recommendations under SRF; and these
       actions have already yielded significant results and meaningful improvements to IEPA's
      program which go beyond the scope of SRF. Specific results to date include:
      •   Air Enforcement - IEPA has completed the following measures pursuant to its Work
          Plan commitments: a new Compliance Monitoring Report has been drafted and is
          currently being field tested by IEPA inspectors for completion of inspection reports,
          three committees  are now consulting with field inspectors prior to and following
          inspections planned for the upcoming quarter, IEPA is issuing Violation Notices
          containing a recommended technical remedy to resolve violations, and an EPA
          review of HPV cases was conducted with IEPA on March 11, 2011. Additionally,
          IEPA has referred sixteen cases to the Illinois Attorney General this year. This
           compares with fourteen for FY 2010, and appears to be on track to meet the
          required twenty-five percent increase. Currently, IEPA is meeting or is on track to
          meet all of its enforcement related Work Plan requirements.

        •  Title V Permit Issuance - To date, IEPA has met every Clean Air Act permitting-
           related commitment in the Work Plan, including efforts to issue timely permits, hire
           additional staff to work on Title V, and evaluate its application completeness
           process. On June 30, 2011, IEPA submitted its strategy for reducing the Federally
           Enforceable State Operating Permits backlog for sources that would otherwise be
           subject to Title V. IEPA has met both November 30, 2011, commitments to issue six
           final permits and to public notice eleven permits in an effort to reduce the Title V
           backlog. IEPA has hired nine additional permit analysts to work on Title V permitting
           in the last six months.

       •  Water Program: CAFO Inspections - IEPA is on course to meet the enforcement and
          inspection commitments specified in the Concentrated Animal Feeding Operation
          (CAFO) work plan. IEPA conducted 38 CAFO inspections, surpassing its
          commitment to complete 25. IEPA met the hiring commitment by hiring two
          additional inspectors. EPA and IEPA collaborated to provide inspection training for
          IEPA field inspectors and enforcement staff. IEPA drafted and submitted a long-term
          CAFO National Pollutant Discharge Elimination System (NPDES) inspector training
           curriculum. IEPA also developed and submitted a plan to create and maintain a
           comprehensive inventory of large CAFOs and is on schedule to submit that
           inventory by the end of calendar year 2011. IEPA developed a citizen complaint
           standard operating procedure (SOP) and a database for animal feeding operations
           and CAFOs. IEPA prepared and submitted a CAFO NPDES inspection SOP and a
           revised Enforcement Response Guide. Since February 2011, IEPA has issued
           approximately 10 violation notices, received and investigated 4 complaints of spills
           and releases, and sent 3 referrals to its Attorney General's Office (IL AGO). IEPA,
           the IL AGO, and EPA participate in quarterly docket review conference calls to help
           ensure that all referred CAFO matters are moving forward.

        •  Water Program: Priority Permits Issuance - IEPA has done a good job on priority
           permits issuance. Performance is 114% of the goal. The State has exceeded its
           commitment of 42 issuances for FY 2011.
 OIG Response 67: We drew our analysis from available EPA data. We applaud Region 5 and
 Illinois for increasing enforcement oversight efforts in Illinois. At some future point, we may
 revisit the progress made in Illinois and evaluate how well the Region 5 intervention
 program worked.
Page 15             The OIG Report concludes that "regardless of the cause," when a state
cannot fully operate its program, EPA should step in and fill the gap. It is important to
understand that when a state is overwhelmed by the need to address a national disaster — as
Louisiana was during Hurricane Katrina — the Regional Office was similarly required to devote
significant resources to the response activities (with the help of all of the other Regions and HQ).
It should also be noted that in the wake of a natural disaster like Hurricane Katrina, many
industrial operations were curtailed for months, during which time inspections and enforcement
actions would logically decrease.

 OIG Response 68: Please see OIG Responses 65 and 66.

Page 16             "OECA lacks significant control over EPA enforcement resources
nationwide." OECA centrally manages the national enforcement program through the Strategic
Plan, GPRA national program measures, EPA's Enforcement Goals, the NPM Guidance and
ACS Commitment system, semi-annual regional progress and planning meetings during which
state oversight is a major topic, and regular meetings, conference calls and video conferences.
OECA generally directs enforcement resources to the regions for broad purposes, with regions
having some discretion as to how those resources are deployed. Additional transparency on how
regions utilize enforcement resources would be helpful.
 OIG Response 69: We do not dispute the many actions that OECA takes within the constraints
 of the present organizational structure. In fact, we point many of them out in the report. Our
 general finding was that OECA was taking actions within the current structure and that only
 through a structural change along the lines of our recommendations can EPA improve its
 oversight of state enforcement.
Page 17             The workload ratio of permits issued to enforcement employees is not a
valid measure of enforcement workload. EPA would need to know which permits were included
in the analysis to better understand the OIG's approach. Does it include both state and federally-
issued permits? Permits are issued by media program staff in both the regions and in states.
Permits vary in complexity, and not all permits are required to be reviewed or lend themselves to
federal enforcement. Regions generally take actions based on national enforcement initiatives or
regional priorities.
 OIG Response 70: EPA's data dictionary indicates that the count received from the database
 includes all permits. We agree that permits vary in complexity and the scope of the report does
 not include detailed analysis of permits. Creating a ratio of all permits to FTEs brings awareness
 of general workload levels across regions where there were gaps in that knowledge previously.
 As an aside, OECA suggested this metric. We applied it across all states, so the permit count
 includes the same information nationwide.
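 For illustration only (the figures in this example are hypothetical and are not drawn from the
 report's data), the metric divides the total number of permits in a state by the enforcement
 FTEs assigned to that state: a state with 1,200 permits and 15 enforcement FTEs would show a
 workload ratio of 80 permits per FTE, while a state with the same number of permits and
 30 FTEs would show 40 permits per FTE, indicating a comparatively lighter general workload.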
Attachments
-------
                                                                       Appendix F

                                Distribution

Office of the Administrator
Deputy Administrator
Assistant Administrator for Enforcement and Compliance Assurance
Agency Follow-Up Official (the CFO)
Agency Follow-Up Coordinator
General Counsel
Associate Administrator for Congressional and Intergovernmental Relations
Associate Administrator for External Affairs and Environmental Education
Associate Administrator for Policy
Audit Follow-Up Coordinator, Office of Enforcement and Compliance Assurance
-------