U.S. Environmental Protection Agency
OFFICE OF INSPECTOR GENERAL
Catalyst for Improving the Environment
Evaluation Report
EPA Performance Measures Do
Not Effectively Track Compliance
Outcomes
Report No. 2006-P-00006
December 15, 2005

Report Contributors:	Katie Butler
Gabrielle Fekete
Jeff Hart
Ben Webster
Abbreviations
CAA	Clean Air Act
CCDS	Case Conclusion Data Sheets
CWA	Clean Water Act
EMP	Environmental Management Practices
EMS	Environmental Management System
EPA	Environmental Protection Agency
GAO	Government Accountability Office
GPRA	Government Performance and Results Act
ICIS	Integrated Compliance Information System
NEPA	National Environmental Policy Act
NETI	National Enforcement Training Institute
NPMS	National Performance Measures Strategy
OECA	Office of Enforcement and Compliance Assurance
OIG	Office of Inspector General
OMB	Office of Management and Budget
PART	Program Assessment Rating Tool
RCRA	Resource Conservation and Recovery Act
SEP	Supplemental Environmental Project
SNC	Significant Noncompliance

U.S. Environmental Protection Agency
Office of Inspector General
At a Glance
2006-P-00006
December 15, 2005
Catalyst for Improving the Environment
Why We Did This Review
We did this review to
determine (1) how the Office
of Enforcement and
Compliance Assurance
(OECA) measures and reports
enforcement and compliance
effectiveness and progress,
and (2) how well OECA's
performance measures
characterize changes in
compliance or other
outcomes, and provide
transparency.
Background
Performance measures allow
the U.S. Environmental
Protection Agency (EPA) to
chart its progress against its
goals. Ensuring compliance
with environmental laws and
regulations is critical to
accomplishing EPA's mission.
EPA must publicly report its
progress in the most
transparent way possible so
stakeholders can determine
whether OECA's strategies,
policies, and programs are
effective.
For further information,
contact our Office of
Congressional and Public
Liaison at (202) 566-2391.
To view the full report,
click on the following link:
www.epa.gov/oig/reports/2006/
20051215-2006-P-00006.pdf
EPA Performance Measures Do Not
Effectively Track Compliance Outcomes
What We Found
In response to our first objective, we found that OECA primarily measures
progress in ensuring compliance using output measures. OECA uses several types
of internal performance reports to monitor enforcement and compliance progress
throughout the year, and reports progress to Congress and the public in several
ways. Through these reports, OECA has stated it generally met its annual
performance goals.
In response to our second objective, we found that OECA's 2005 publicly-
reported GPRA performance measures do not effectively characterize changes in
compliance or other outcomes because OECA lacks compliance rates and other
reliable outcome data. In the absence of compliance rates, OECA reports proxies
for compliance to the public and does not know if compliance is actually going up
or down. As a result, OECA does not have all of the data it needs to make
management and program decisions. The largest gap is information about
compliance rates. OECA cannot demonstrate the reliability of
other measures because it has not verified that estimated, predicted, or facility
self-reported outcomes actually took place. Some measures do not clearly link to
OECA's strategic goals. Finally, OECA frequently changed its performance
measures from year to year, which reduced transparency.
What We Recommend
We recommend that the Assistant Administrator for Enforcement and Compliance
Assurance:
•	Design and implement a pilot project to verify estimated, predicted, and
facility self-reported outcomes, and report on the pilot's results to
demonstrate the reliability of such performance measures;
•	Improve the linkage/relationship of OECA's goals and measures in EPA
strategic and budgetary documents to improve external understanding and
internal usefulness; and
•	Continue to improve enforcement and compliance performance measures,
while continuing to publicly report key measures annually to provide the
public, Congress, and other specific stakeholders a minimal amount of
comparable trend data.
EPA agreed with all of our report recommendations. We also made other
revisions based on EPA's comments as we determined appropriate.

UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460
OFFICE OF
INSPECTOR GENERAL
December 15, 2005
MEMORANDUM
SUBJECT:
EPA Performance Measures Do Not Effectively Track Compliance
Outcomes
Report No. 2006-P-00006
FROM:
Jeffrey K. Harris /s/
Director for Program Evaluation, Cross Media
TO:
Granta Y. Nakayama
Assistant Administrator
Office of Enforcement and Compliance Assurance
This is our final report on the subject evaluation conducted by the Office of Inspector General
(OIG) of the U.S. Environmental Protection Agency (EPA). This evaluation report contains our
findings that describe the problems we have identified and corrective actions we recommend.
This evaluation report represents the opinion of the OIG and the findings contained in this report
do not necessarily represent the final EPA position. EPA managers will make final
determinations on matters in this report in accordance with established procedures.
We met with Office of Enforcement and Compliance Assurance managers on November 22,
2005, to discuss our preliminary findings, and provided our official draft report on September 9,
2005. EPA agreed with all of our recommendations. We have included EPA's official written
comments in their entirety as Appendix F. EPA's attachments to its response are available on
our Web site along with the report. Appendix G includes our detailed evaluation of EPA's
response.
Action Required
EPA Manual 2750 requires you as the action official to provide this office with a written
response to this report within 90 calendar days of the final report date. Your response should
address all recommendations and must include your concurrence or nonconcurrence with all
recommendations. For corrective actions planned but not completed by the response date, please
describe the actions that are ongoing and provide a timetable for completion. If you do not
concur with a recommendation, please provide alternative actions addressing the findings
reported. For your convenience, this report will be available at http://www.epa.gov/oig/.

Attachment
cc: Lyons Gray, Chief Financial Officer, OCFO
Kimberly Terese Nelson, Assistant Administrator and Chief Information Officer, OEI
Phyllis Harris, Principal Deputy Assistant Administrator, OECA
Michael M. Stahl, Director, Office of Compliance, OECA
Walker B. Smith, Director, Office of Civil Enforcement, OECA
Greg Marion, Audit Followup Coordinator, OECA

	Table of Contents	
At a Glance
Chapters
1	Introduction 		1
Purpose 		1
Background		2
Scope and Methodology		5
2	OECA Reported Outputs and Reported Meeting Performance Goals		7
OECA Performance Measurement Activities		7
OECA Primarily Measures Outputs		8
OECA Reported Progress in Various Ways 		8
OECA Reported That It Generally Met Its Performance Goals 		9
3	OECA's Public Measures Do Not Effectively Characterize Changes
in Compliance or Other Outcomes		11
OECA Lacks Compliance Rates Among Its Public Measures		11
Unverified Estimated, Predicted, and Facility Self-Reported Public
Measures May Be Unreliable		14
Some Public Measures Are Not Linked to Goals		17
Frequent Changes in Public Measure Reporting Reduce Transparency		20
Recommendations 		21
Further Evaluation Needed		22
Agency Response and OIG Evaluation		23
Appendices
A Detailed Scope and Methodology		24
B Internal Performance Management and Reporting		27
C Criteria for Effective Performance Measurement		28
D OECA Fiscal 2005 Performance Goals, Measures, and Targets		29
E OECA Performance Measures, Fiscals 1999-2004		31
F Agency Comments		34
G OIG Evaluation of Agency Comments		48
H Distribution		56

Chapter 1
Introduction
Purpose
Compliance is at the heart of any regulatory agency's mission, and the U.S.
Environmental Protection Agency (EPA) cannot be effective without a strong
enforcement and compliance program. Ensuring compliance with environmental
laws and regulations is critical to accomplishing EPA's mission.
The overarching goal of EPA's Office of Enforcement and Compliance Assurance
(OECA) is to maximize compliance with environmental regulations to protect
human health and the environment. Environmental laws and regulations can
achieve their purpose only when those in the regulated community comply with
requirements. Performance measures allow OECA to chart its progress against its
compliance, environmental, and other goals. OECA must publicly report its
progress in the most transparent way possible so that EPA staff, the public, and
the Congress can determine whether OECA's strategies, policies, and programs
are effective. Transparency requires performance changes be easily detected and
readily understood.
To evaluate the effectiveness of EPA's enforcement and compliance efforts, we
plan to evaluate several interrelated issues. This report builds upon our prior
evaluation of OECA's regulated universe1 by examining how OECA measures
and reports its performance.
The intent of this report is to inform EPA's leadership and interested stakeholders
regarding the extent to which OECA can measure the impact and effectiveness of
its enforcement and compliance assurance activities. Specifically, this report
answers the following questions:
•	How does OECA measure and report enforcement and compliance effectiveness
and progress? (Chapter 2)
•	How well do OECA's performance measures characterize changes in
compliance or other outcomes, and provide transparency? (Chapter 3)
1 See EPA OIG report, Limited Knowledge of the Universe of Regulated Facilities Impedes EPA's Ability to
Demonstrate Changes in Regulatory Compliance, Report No. 2005-P-00024, September 19, 2005.
Background
Compliance and Environmental Results is the Goal
A key element of Administrator Johnson's 500-day plan is to "make compliance
our enforcement objective." At the Administrator's May 23, 2005, swearing-in
ceremony, President Bush emphasized he wanted results - real environmental
improvements and vigorous enforcement - when he said, "...we will continue our
enforcement strategy which focuses on achieving real environmental
improvements that benefit everyone.... We'll continue to vigorously enforce our
environmental laws...and we will focus on results."
EPA's fiscal 2005 enacted budget included approximately $453 million and about
2,672 staff-years to improve compliance. The fiscal 2006 President's budget
request included approximately $487 million and about 2,715 staff-years to
improve compliance.
Reliable Compliance Information Is Essential
Reliable compliance information is essential for a regulatory agency to establish
baselines, set goals, monitor progress, serve as evidence to support enforcement
actions, and ultimately demonstrate results.2 The Agency states that it uses
compliance data to:
•	Identify problems in need of EPA or State attention;
•	Monitor program performance; and
•	Improve program effectiveness.
OECA can use compliance information to inform Agency staff and external
stakeholders on compliance levels, and to demonstrate OECA's progress in
achieving its goals. Compliance rates are among the Agency's most important
performance measures.
Performance Measurement Defined
Performance measurement is the monitoring and reporting of program
accomplishments, particularly progress toward pre-established goals.
Performance measures may address the type of program activities conducted, the
direct products and services delivered by a program (outputs), or the results of
those products and services (outcomes). Table 1.1 further defines these
performance measurement terms.
2 See Appendix C for a comprehensive definition of reliability and other performance measurement criteria.
Table 1.1: Performance Measurement Terminology

Input: Personnel, funds, and other resources that contribute to an activity

Output: Quantitative or qualitative measures of activities, work products, or
actions (example: enforcement cases completed)

Intermediate Outcomes: Changes in knowledge, behavior, or conditions that
result from program activities and are needed to achieve the end outcome
(example: compliance)

End Outcomes: The ultimate outcomes of program activities (example: improved
human health and environmental conditions)
Using measures to actually manage and improve a program necessitates a mix of
output and outcome measures to determine what outputs produce the most
important outcomes. Agencies must balance their ideal performance
measurement systems against real-world considerations such as the cost and effort
involved in gathering and analyzing data.
Effective Performance Measurement and Reporting
The purpose of performance measurement is to support resource allocation and
other policy decisions to improve service delivery and program effectiveness. It
can also serve as an early warning system of program management or
performance problems, and as a vehicle for improving accountability to the
public. Performance measures are also an essential element of an effective
internal or management control structure and an important aspect of managing an
organization. Effective internal controls are essential for reliable performance
reporting.3
Effective performance measurement enables an agency to establish baselines;
identify and prioritize compliance problems; and evaluate, promote, manage,
control, adapt, and improve programs in response to incoming performance
information. Performance measurement enables decision-makers to maximize
environmental and health benefits by focusing efforts on the most successful
enforcement and compliance activities and programs.
A good performance measurement and reporting system is transparent and holds
an organization accountable. It also improves outcomes by increasing awareness,
sharpening focus, motivating improved performance, and encouraging innovation.
Externally reporting on the results of performance measurement enables the
public to make educated decisions about their surrounding environment and on
EPA's effectiveness in protecting human health and the environment.
3 "Internal control" (also referred to as "management control") comprises the plans, methods, and procedures used
to meet missions, goals, and objectives and, in doing so, supports performance-based management. This includes
the processes and procedures for planning, organizing, directing, and controlling program operations, and the system
established for measuring, reporting, and monitoring program performance.
The President's Management Agenda stresses the need for clear performance
measurement and reporting. It states that:
The American people should be able to see how government
programs are performing and compare performance and cost
across programs. The lack of a consistent information and
reporting framework for performance...obscures this necessary
transparency.
Strategic Planning, Measurement, and Reporting Required by Law
The 1993 Government Performance and Results Act (GPRA) prompted renewed
focus on internal control to support results-oriented management. GPRA required
Federal agencies to:
•	clarify their missions;
•	set strategic and annual performance goals; and
•	measure and report annually on actual performance compared to goals.
Specifically, GPRA required agencies to:
•	develop plans for what they intend to accomplish;
•	measure how well they are doing;
•	make appropriate decisions based on the information they gathered; and
•	communicate information about their performance to Congress and to the
public.
GPRA required agencies to develop a 5-year strategic plan including:
•	a mission statement and long-term goals and objectives;
•	annual performance plans with annual performance commitments toward
achieving the goals and objectives presented in the strategic plan; and
•	annual performance reports that evaluate an agency's progress toward
achieving performance commitments.
In general, EPA's strategic plan outlines the Agency's five long-term goals and
guides in establishing the annual goals that must be met along the way. To fulfill
its five strategic goals, the plan includes a series of more specific goals in the
form of objectives and sub-objectives. Each of these objectives has associated
performance measures designed to demonstrate progress in achieving the
objective and, eventually, the strategic goal. The annual performance plan defines
the Agency's budget and associated goals and objectives in greater detail and ties
the annual budget to the 5-year strategic plan. Finally, EPA issues an annual
performance and accountability report as required by GPRA. This report
highlights the Agency's environmental, programmatic, and financial performance
for the fiscal year.
The long-range strategic plan, annual performance plans, and annual performance
reports forge links between several activities, including:
•	measuring performance to assess progress and link resources
actually used to results achieved; and
•	reporting performance to present progress achieved and impacts on
future efforts.
Scope and Methodology
Our review primarily focused on the public enforcement and compliance
measures as described in EPA's Fiscal 2005 Annual Plan related to EPA goal 5,
Compliance and Environmental Stewardship. EPA changed its public
enforcement and compliance measures for fiscal 2005, and will not report on
these measures until sometime after the end of fiscal 2005. Therefore, we were
unable to assess how EPA reported on those new performance measures.
However, we did assess some elements of EPA's Fiscal 2004 Annual Report.
OECA's planned fiscal 2005 performance measures and goals are detailed in
Appendix D.
To determine how OECA measured and reported enforcement and compliance
effectiveness and progress, we reviewed various internal EPA documents, plans,
and reports, and Office of Management and Budget (OMB) communications. We
also reviewed relevant reports by the U.S. Government Accountability Office
(GAO), National Academy of Public Administration, and International Network
for Environmental Compliance and Enforcement.
To determine how well OECA's performance measures characterize changes in
compliance or other outcomes and provide transparency, we determined and
applied essential criteria for evaluating a performance measurement and reporting
system. We used our professional judgment in applying these criteria in
evaluating OECA's performance measures. These criteria include
relevance, reliability, validity, comparability, and feasibility, and are described in
greater detail in Appendix C. We also met with representatives from OECA,
OMB, and other external stakeholders.
Our evaluation was a review of performance measures, an essential element of
effective internal or management control. Effective internal controls are essential
for reliable performance reporting, and we have identified several issues
regarding OECA's performance measurement and reporting system.
We did not identify any previous audit or evaluation reports specifically
addressing EPA's enforcement and compliance performance measurement and
reporting system. However, we identified some EPA Office of Inspector General
(OIG) and GAO reports related to performance measurement, performance data,
and OECA performance in general. Please see Appendix A for more details on
our scope and methodology including prior audit and evaluation coverage.
We conducted our evaluation fieldwork on EPA's enforcement and compliance
performance measurement and reporting between January and June 2005. We
performed our evaluation in accordance with Government Auditing Standards,
issued by the Comptroller General of the United States.
Chapter 2
OECA Reported Outputs and Reported Meeting
Performance Goals
OECA primarily measures progress in ensuring compliance using output
measures. OECA uses several types of internal performance reports to monitor
enforcement and compliance progress throughout the year, and reports progress to
Congress and the public in several ways. Through these reports, OECA has stated
it generally met its annual performance goals.
OECA Performance Measurement Activities
OECA's formal performance measurement activities date back to the mid-1990s,
soon after OECA was established. OECA's 1997 National Performance
Measures Strategy (NPMS) was the first important step in improving its
performance measurement system. The project produced principles to help guide
OECA in developing a set of improved measures, and many suggestions about
specific measures that OECA should consider.
Several experts consider OECA an international leader in developing and
improving performance measurement for enforcement and compliance programs.
For example, one expert explained that OECA was clearly ahead of the States in
that few U.S. States used environmental outcome measures at all. In fact, no State
was using measures throughout its environmental program. Another expert
explained that OECA was also a leader among its Federal regulatory colleagues in
developing measures. The expert said that no other Federal program measured
outcomes, so OECA could not look to other Federal agencies as models in
outcome measurement. Senior OECA officials have also spoken on performance
measurement at international conferences. As States and other countries look to
OECA for guidance, OECA must be able to demonstrate successful, results-
oriented approaches for others to emulate.
Notwithstanding OECA's efforts to improve its performance measurement and
reporting, and its reputation as a leader in the field, OMB found in its
2002 Program Assessment Rating Tool (PART) assessment that EPA had four
major weaknesses in its civil enforcement program:
1.	lack of meaningful outcome measures;
2.	weak management that did not target resources based on workload
analysis;
3.	data quality issues; and
4.	lack of adequate noncompliance rates.
EPA's lack of meaningful outcome measures led to a "Results Not Demonstrated"
characterization in both 2002 and 2003. However, OMB's 2004 PART
assessment found that the program had followed through on original PART
findings by undertaking development of a measures implementation plan and
rated the program as "adequate."
OECA Primarily Measures Outputs
OECA has focused primarily on measuring outputs, such as "number of
enforcement actions," also called activity counts. OECA and other regulatory
agencies have traditionally relied on activity counts because of the difficulty in
demonstrating a direct cause and effect relationship between specific enforcement
and compliance activities, and compliance or end outcomes.
We characterized OECA's performance measures for the most recent complete
year for which OECA has reported results. We based our characterization on both
OECA's fiscal 2004 annual performance report and OECA's fiscal 2004
accomplishments press release. We characterized OECA's measures as inputs,
outputs, intermediate outcomes, or end outcomes, and found that most
performance measures focused on outputs, as shown in Table 2.1.
Table 2.1: Characterization of Fiscal 2004-Reported Measures

                                         Total     Outputs   Intermediate  End
                                         Measures  Reported  Outcomes      Outcomes
Performance Report                       Reported            Reported      Reported
Fiscal 2004 Annual Performance Report    13        11        2             0
Fiscal 2004 Press Release                33        25        8             0
OECA Reported Progress in Various Ways
In the most recent reporting cycle, fiscal 2004, OECA reported its performance to
the public in two documents: (1) an Annual Report, which communicated overall
EPA performance to Congress and the public under GPRA; and (2) an annual
Accomplishments Press Release, which OECA issued to communicate the same
GPRA enforcement and compliance results in EPA's Annual Report, as well as
additional accomplishments not included in that report. OECA posted both
documents on its website, and issued press releases to the media to encourage
news organizations to report OECA accomplishments.
OECA issued a third annual report, an Annual Accomplishments Report, each year
from fiscals 1988-2002 (except for 2000) describing results in greater detail (e.g.,
using case studies). OECA did not publish such a report for fiscals 2000, 2003, or
2004. An OECA official explained that OECA discontinued publishing its report
with fiscal 2000 because it duplicated EPA's overall annual report. However,
subsequent Assistant Administrators started and again discontinued publishing
OECA's annual report in later years.
Besides the reporting methods mentioned above, OECA officials stated they
managed programs throughout the year using the internal reporting mechanisms
described in detail in Appendix B.
OECA Reported That It Generally Met Its Performance Goals
In EPA's annual performance reports for fiscals 1999-2004, OECA said that it
generally met its performance goals. Over those 6 years, OECA reported results
for 105 total measures. Of these, OECA reported intended goals for 95 measures.
For the remaining 10 measures reported, OECA did not provide established goals.
As shown in Table 2.2, for the 95 publicly-reported GPRA performance measures
with related goals, 89 percent (or 85 measures) met their goals.
Table 2.2: Number of Publicly-Reported GPRA Performance
Measures Meeting Goals, Fiscals 1999-2004

Fiscal  Measures  Measures with   Measures       Measures Not   Percentage of
Year    Reported  Reported Goals  Meeting Goals  Meeting Goals  Measures Meeting Goals
1999    25        15              15             0              100%
2000    14        14              10             4              71%
2001    22        22              18             4              82%
2002    19        19              18             1              95%
2003    13        13              12*            1              92%
2004    12        12              12*            0              100%
TOTAL   105       95              85             10             89%

* Included measures with data lag, listed as "to be reported"
OECA often not only met but exceeded its annual performance goals. For the
seven measures OECA consistently reported for
fiscals 2000-2004, OECA frequently exceeded annual goals. For example, Figure
2.3 illustrates how OECA consistently exceeded its goals for planning and
accomplishing civil investigations.
Figure 2.3: Number of Civil Investigations Planned
and Accomplished for Fiscals 2000-2004
[Line chart comparing annual goals with accomplishments for civil
investigations (0-700) across fiscal years 2000-2004; accomplishments
exceed the goal in each year.]
Chapter 3
OECA's Public Measures Do Not Effectively
Characterize Changes in Compliance or Other
Outcomes
OECA's fiscal 2005 publicly-reported GPRA performance measures do not
effectively characterize changes in compliance or other outcomes because OECA
lacks compliance rates and other reliable outcome data. Four issues reduce the
effectiveness of OECA's performance measures. First, OECA has not developed
effective compliance rates; instead, OECA reports proxies for compliance and
does not know if compliance is actually going up or down.4 As a result, OECA
does not have all of the data it needs to make management and program decisions.
The largest gap is information about compliance rates.
Second, OECA cannot demonstrate the reliability of many measures because it
has not verified that measured actions actually took place. Third, some public
measures5 do not clearly link to OECA's strategic goals. Fourth, OECA
frequently changed its performance measures from year to year, which reduced
transparency.
OECA Lacks Compliance Rates Among Its Public Measures
OECA has not publicly reported compliance rates for two primary reasons. First,
OECA chose not to invest the resources necessary to produce statistically valid
rates on a broad scale because that might impact its ability to inspect known or
suspected significant violators. Second, other existing compliance rates are either
unreliable or biased; in place of compliance rates, OECA reports proxies for
compliance.
4	While OECA did not use the word "proxy," a top OECA executive did tell us that OECA used these measures
because they would lead to compliance. "Proxy" is our characterization and we believe it is accurate, i.e., the
compliance-related measures currently reported are as close to real compliance rates as OECA can get at the
present time.
5	We use the term "public measures" interchangeably with "GPRA measures" referring to those measures reported
in EPA's annual performance plan required under GPRA. According to OECA, while not part of its public GPRA
measures, OECA has published a compliance rate for Combined Sewer Overflows on its website in 2002 and 2004.
OECA also stated that it plans to publish RCRA compliance rates for foundries in the next 60 days.
OECA Does Not Report Statistically Valid Compliance Rates or Other
Compliance Rates
OECA lacks compliance rates among its publicly-reported performance measures.
OECA conducted pilot studies to develop statistically valid compliance rates on a
small scale, but has not invested the resources necessary to produce statistically
valid rates on a broad scale. According to OECA, these pilots resulted in the
development of statistically valid compliance rates for seven small segments of
the regulated community based on inspections, and for five small segments of the
regulated community based on facility self-reported information. OECA also
plans to develop statistically valid rates for its national enforcement priority areas.
However, according to OECA, it is not practical for OECA to determine
statistically valid compliance rates for the entire regulated universe.6 A senior
OECA executive said that OECA does not have the resources to either inspect
every facility to determine the true state of compliance across programs, or
randomly sample facilities to determine compliance rates, without sacrificing
compliance monitoring of known significant violators.
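The resource tradeoff the executive describes can be made concrete. The
sketch below is ours, not OECA's methodology: assuming simple random
sampling within a segment and the normal approximation to the binomial, it
shows roughly how many inspections a statistically valid rate requires. All
figures in it are hypothetical illustrations.

import math

# A minimal sketch, assuming simple random sampling of facilities within a
# segment; all counts below are hypothetical, not OECA data.

def required_sample_size(margin, z=1.96, p_guess=0.5):
    """Inspections needed to estimate a compliance rate within +/- margin
    at roughly 95 percent confidence (worst case p = 0.5)."""
    return math.ceil(z ** 2 * p_guess * (1 - p_guess) / margin ** 2)

def compliance_rate_ci(compliant, inspected, z=1.96):
    """Point estimate and ~95 percent confidence interval for the rate."""
    p = compliant / inspected
    half = z * math.sqrt(p * (1 - p) / inspected)
    return p, max(0.0, p - half), min(1.0, p + half)

# About 385 randomly chosen inspections bound the estimate within
# +/- 5 percentage points; each is an inspection not spent on a known
# or suspected significant violator.
print(required_sample_size(margin=0.05))                  # 385
print(compliance_rate_ci(compliant=310, inspected=385))   # ~0.81 +/- 0.04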
OECA generates other compliance rates (e.g., significant noncompliance
information) and internal reports (e.g., watch lists for noncompliance7) based on
targeted inspections. OECA chose not to publicly report such information
because:
•	these compliance rates are based on universes known to be incomplete and
the rates are therefore unreliable; and
•	internal reports are based on targeted inspections at facilities suspected to
be likely violators, and are therefore biased in that the results may indicate
a higher level of noncompliance than might be present in the regulated
community as a whole.
However, because OECA does not report compliance rates, the public, Congress,
and other specific stakeholder groups cannot determine whether OECA is
successfully achieving its primary goal of maximizing compliance.
As reported in our September 2005 report, Limited Knowledge of the Universe of
Regulated Entities Impedes EPA's Ability to Demonstrate Changes in Regulatory
Compliance, OECA lacks an accurate characterization of the universe of
regulated entities. Better understanding of the composition of the regulated
universe will allow OECA to reliably estimate compliance for segments of the
regulated universe.
6	For additional discussion on statistically valid compliance rate computation and methods, see the section "Further
Evaluation Needed" on page 22.
7	Please see Appendix B for detailed descriptions of these and other examples of internal OECA performance
management and reporting.
To reliably estimate compliance for a segment of the regulated community,
OECA needs an accurate characterization of the number of regulated facilities in
that segment. OECA does have a methodology to develop statistically valid
noncompliance rates. However, we have not reviewed this methodology, and
therefore cannot comment on it at this time.
In the OIG report mentioned above, we recommended that OECA:
•	Biennially update publicly released universe figures by tracking and
recording the number of entities over which it has oversight and
primary regulatory responsibility; and
•	Develop an objective of having the most up-to-date and reliable data on
all entities that fall under its regulatory responsibility.
With reliable information about the regulated universe, OECA can divide the
regulated universe into manageable categories and develop a sampling procedure
for inspections. OECA can categorize the regulated universe based on many
parameters, including:
•	environmental risk to the public;
•	industry sector;
•	compliance history;
•	geography;
•	regulated substance;
•	potential for exposure; or
•	number of people affected or potentially affected.
This will allow OECA to focus compliance and enforcement resources and
efforts on particular categories and plan inspections based on the selected
parameters. OECA can choose inspection sites using a number of approaches,
including targeted, random, stratified, or weighted sampling.
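As one illustration of these approaches, the sketch below implements
proportional stratified sampling, one of the options named above. It is a
sketch under stated assumptions, not OECA's procedure; the sectors, facility
identifiers, and inspection budget are hypothetical.

import random

# A minimal sketch of proportional stratified sampling, assuming a facility
# inventory keyed by stratum (here, hypothetical industry sectors).

def stratified_sample(universe, budget):
    """Allocate an inspection budget across strata in proportion to stratum
    size, then draw facilities at random within each stratum."""
    total = sum(len(facilities) for facilities in universe.values())
    plan = {}
    for stratum, facilities in universe.items():
        n = round(budget * len(facilities) / total)
        plan[stratum] = random.sample(facilities, min(n, len(facilities)))
    return plan

universe = {
    "chemical plants": [f"chem-{i}" for i in range(120)],
    "foundries":       [f"fndy-{i}" for i in range(40)],
    "dry cleaners":    [f"dc-{i}"   for i in range(240)],
}
plan = stratified_sample(universe, budget=40)
for stratum, picks in plan.items():
    print(stratum, len(picks))   # 12, 4, and 24 inspections, respectively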
Experts agreed that compliance rates developed for segments of the regulated
universe would provide useful performance information. One expert said that
calculating statistically valid compliance rates on a sector or geographic basis
would suffice for identifying sector-based or geographically-focused compliance
problems.
Each of these categories and sampling approaches has advantages and
limitations. As we explain in the final section of Chapter 3, we will explore the
benefits and disadvantages of likely approaches in future OIG evaluations.
OECA has developed statistically valid compliance rates in ten small populations
that OECA officials say do not tie in well with OECA priorities, and OECA can
do little with the results. OECA officials want to overcome the resource, policy,
and methodological hurdles to developing additional statistically valid
compliance rates. For example, OECA officials would like to use statistically
valid compliance rates for national priority areas.
OECA Reported Proxies Instead of Compliance Rates
Instead of measuring and publicly reporting compliance rates, OECA relies on
other compliance-related measures of activities that result from compliance and
enforcement actions. These proxy compliance measures include:
•	corrected violations;
•	compliance assistance results; and
•	facility self-audit data.
OECA publicly reported some measures of recidivism in fiscals 2001 and 2002,8
but did not report these measures in subsequent years because of concerns about
whether the measures were meaningful, and whether they might overstate
recidivism.
OECA included three compliance-related measures among the 69 performance
measures contained in annual performance plans and reports from fiscals 1999-
2005. OECA reported on these three measures a total of five times, from fiscals
1999-2004 as follows:
1.	Percentage of automotive service and repair industry reaching targeted
compliance level;9
2.	Percentage increase over fiscal 2000 proportion of facilities in significant
noncompliance (SNC) returning to compliance within two years;10 and,
3.	Percentage reduction in SNCs for the Clean Air Act, Clean Water Act, and
Resource Conservation and Recovery Act from fiscal 2000.11
Unverified Estimated, Predicted, and
Facility Self-Reported Public Measures May Be Unreliable
OECA's dependence on unverified estimated, predicted, and facility self-reported
measurement data decreases the reliability of its performance measurement and
reporting system. OECA measures pollutant reductions using estimated data, and
reports anticipated future pollutant reductions using predicted data. OECA bases
these performance measures on data that OECA did not verify. Therefore, OECA
cannot know if these measures provide reliable information about outcomes.
8	OECA publicly reported "Percent increase over 2000 proportion of SNCs [facilities in significant noncompliance]
returning to compliance within two years" and "percent reduction in significant noncompliance for CAA, CWA and
RCRA from 2000" in fiscals 2001 and 2002.
9	Reported in fiscal 1999, see Appendix E, measure 12.
10	Reported in fiscals 2001 and 2002, see Appendix E, measure 21.
11	Reported in fiscals 2001 and 2002, see Appendix E, measure 22.
As an Agency, EPA has specifically avoided using estimated performance
measurement data in the past. For example, EPA specifically chose to use
recorded observations and values rather than estimated data in its 2004 Draft
Report on the Environment to prevent ambiguity and potential problems with data
reliability. The use of and dependence on unverified estimated, predicted, or
facility self-reported data reduces the reliability of OECA's performance
measures as accurate indicators of compliance. Collecting monitoring data and
tracking actual values would:
•	provide internal and external stakeholders with a more accurate portrayal
of OECA's results;
•	increase the reliability of OECA's performance measures; and
•	allow OECA to more effectively characterize actual changes in
environmental conditions and human health.
OECA also relied on self-reported data from regulated entities. Because regulated
entities are required to comply with laws and regulations, OECA cannot rely on
regulated entities as objective or reliable sources of compliance data.
OECA Bases Nearly All 2005 Measures on Estimated, Predicted, or
Facility Self-Reported Performance Data
OECA measures and reports pollutant reductions using estimated, predicted, and
facility self-reported data that may not reliably demonstrate progress. Table 3.1
shows OECA's fiscal 2005 performance goals, measures, and the basis for the
related measures. We characterized each measure as "Unverified, Facility Self-
Reported Data," "Unverified Estimates or Predictions," or "Actual Count of
Activities," and found that 15 of 16 planned measures were based on facility self-
reporting or estimates and predictions. Only one measure, number 16, was based
on an actual count.12
Although OMB has recommended that OECA verify emissions reductions
actually took place, OECA did not plan to verify self-reported or estimated data.
In EPA's fiscal 2005 annual plan, OECA described pollutant reductions or
eliminations as estimates of what may be achieved if the facility or defendant
carried out the requirements of a voluntary settlement agreement, and said the use
of estimates limits its measurement data.13 OECA officials said they expect that
companies will fulfill the requirements of their consent decrees14 even without
12	Please see Appendix D for a detailed list of all of OECA's fiscal 2005 annual performance goals, associated
performance measures, and targets.
13	EPA's fiscal 2005 annual plan described pollutant reductions or eliminations reported on the Case Conclusion
Data Sheets (CCDS) as estimates of what may be achieved if the facility or defendant carried out the requirements
of a voluntary settlement agreement, and said that this limits the Integrated Compliance Information System (ICIS) data.
14	Consent decrees are judicial decrees that sanction voluntary agreements between parties in dispute.
any verification. OECA believes that if these controls are effective, they would
increase the likelihood that terms of the settlement agreements will be carried out
and the pollution reductions actually achieved.
Table 3.1: OECA Goals and Basis of Related Performance Measures (fiscal 2005)

Goal Area: Compliance Assistance

Performance Goal: Improve Understanding of Regulations
1. Percentage of entities seeking assistance from EPA-sponsored compliance
assistance centers and clearinghouse reporting improved understanding
(Basis: Unverified, Facility Self-Reported Data)
2. Percentage of entities receiving direct compliance assistance from EPA
reporting improved understanding (Basis: Unverified, Facility Self-Reported Data)

Performance Goal: Improve Environmental Management Practices (EMP)
3. Percentage of entities seeking assistance from EPA-sponsored compliance
assistance centers and clearinghouse reporting improved EMP
(Basis: Unverified, Facility Self-Reported Data)
4. Percentage of entities receiving direct compliance assistance from EPA
reporting improved EMP (Basis: Unverified, Facility Self-Reported Data)

Performance Goal: Reduce Pollutants
5. Percentage of entities seeking assistance from EPA-sponsored compliance
assistance centers and clearinghouse reporting pollution reductions
(Basis: Unverified, Facility Self-Reported Data)
6. Percentage of entities receiving direct assistance from EPA reporting
pollution reductions (Basis: Unverified, Facility Self-Reported Data)

Goal Area: Compliance Incentives

Performance Goal: Increase percentage of facilities using incentive policies
to conduct environmental audits or other actions that reduce, treat, or
eliminate pollution or improve EMP
7. Percentage of audits resulting in pollution reduction and ecosystem
protection (Basis: Unverified Estimates or Predictions)
8. Percentage of audits resulting in improved EMP
(Basis: Unverified Estimates or Predictions)
9. Pounds pollution reduced as a result of audits
(Basis: Unverified Estimates or Predictions)
10. Dollars invested in EMP as a result of audits
(Basis: Unverified Estimates or Predictions)

Goal Area: Monitoring and Enforcement

Performance Goal: Increase Complying Actions
11. Percentage of entities taking complying actions as a result of on-site
inspections/investigations (Basis: Unverified Estimates or Predictions)

Performance Goal: Increase Pollutant Reduction/Treatment
12. Estimated pounds of pollution to be reduced/treated as result of
concluded enforcement actions (Basis: Unverified Estimates or Predictions)
13. Percentage of concluded enforcement cases requiring pollutant reduction
and ecosystem protection (Basis: Unverified Estimates or Predictions)

Performance Goal: Improve Environmental Management Practices
14. Percentage of concluded enforcement cases requiring improved EMP
(Basis: Unverified Estimates or Predictions)
15. Dollars invested in improved EMP or environmental performance as a
result of enforcement actions (Basis: Unverified Estimates or Predictions)

(No Goal)
16. Number of inspections and investigations conducted
(Basis: Actual Count of Activities)
Because OECA tracks nine of the fiscal 2005 performance measures using
estimated or predicted results, OECA reports pollutant reductions, improvements
in environmental conditions, or other results that may not actually occur.
As shown in Table 3.1, regulated facilities provide self-reported data for all six
OECA compliance assistance-related measures. These measures all depend upon
unverified, facility self-reported data.
While OECA's fiscal 2005 annual plan states that OECA expects estimates will
be prudently underestimated, the annual plan provides no basis for this
expectation.
OECA Does Not Verify Estimated, Predicted, or Facility Self-Reported
Data to Ensure Reliability
Although performance measurement experts stress verifying estimated, predicted,
or facility self-reported data,15 OECA does not verify such data for key outcomes
such as:
•	pollution reduced;
•	protection of populations or ecosystems; or
•	environmental management practices improved or employed.
A senior OECA manager agreed that OECA could potentially validate estimated
numbers such as predicted pollution reductions through a pilot verification study.
Conducting such a study would allow OECA to ascertain if estimated, predicted,
and facility self-reported outcomes actually occurred.
External stakeholders and performance measurement experts cited the lack of
actual monitoring data used in OECA's performance measurement system as a
concern. OMB officials suggested that if actual outcomes cannot be reported or
estimates verified, OECA should clearly identify and label such outcomes as
"planned" emissions reductions. OECA could also describe pollution reductions
as estimated, predicted, facility self-reported, or actual/verified reductions.
Some Public Measures Are Not Linked to Goals
OECA's fiscal 2005 performance measures for some of its most important
outcomes do not clearly link to OECA's goals and objectives. As a result, OECA
is unable to clearly or effectively communicate and report on the extent to which
it is accomplishing these important goals. In our opinion, this lack of linkage
15 In reporting on performance measure values, performance measurement experts and internal stakeholders
encouraged using actual numbers instead of estimates or facility self-reported data or, alternatively, verifying any
such data used in performance measurement.
obscures OECA's goals and makes it difficult for the public, Congress, or even
EPA staff to discern OECA's progress in accomplishing its goals.
Table 3.2 shows OECA's fiscal 2005 goals, associated measures, and goal-
measure relationship discrepancies. We assessed the goals and measures to
determine if they agreed, and found discrepancies for 8 of 16 measures. Some
OECA performance goals did not include any relevant measures linked to the goal
(see measures 7 through 11 in Table 3.2). The "Compliance Incentives"
performance goal is to "Increase the percentage of facilities...that reduce, treat, or
eliminate pollution or improve EMP [environmental management practices]," but
none of the four measures under this goal is designed to measure the "percentage
of facilities." (We added emphasis in both quotes.) While this
goal is titled "Compliance Incentives" in OECA's fiscal 2005 annual plan, none
of the four measures under this objective measures true "compliance," or
conformity with environmental laws and regulations.
OECA's measure for another performance goal (see measure 11 in Table 3.2) was
not designed to measure exactly what the goal described. This measure could
easily mask real decreases in complying actions and mislead the public.
Specifically, OECA aims to "Increase Complying Actions," while measuring the
"Percentage of entities taking complying actions as a result of on-site compliance
inspections/evaluations." To clearly articulate progress toward increasing
complying actions, OECA should measure the change in the actual number of
complying actions from one year to the next. Measuring only the percentage
could lead to reporting an increase from one year to the next, even if the number
of complying actions substantially decreased.
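A brief numeric illustration (our hypothetical figures, not OECA data)
makes the masking effect concrete:

# Hypothetical figures illustrating how a rising percentage can mask a
# falling count of complying actions.
year1_entities, year1_complying = 1000, 500   # 50 percent complying
year2_entities, year2_complying = 400, 240    # 60 percent complying

rate1 = year1_complying / year1_entities
rate2 = year2_complying / year2_entities
print(f"Rate rose from {rate1:.0%} to {rate2:.0%}, yet complying actions "
      f"fell from {year1_complying} to {year2_complying}.")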
Table 3.2: OECA Performance Goals, Related Performance Measures,
and Goal/Measure Relationship Discrepancies (fiscal 2005)

Goal Area: Compliance Assistance

Performance Goal: Improve Understanding of Regulations
1. Percentage of entities seeking assistance from EPA-sponsored compliance
assistance centers and clearinghouse reporting improved understanding
(No discrepancy)
2. Percentage of entities receiving direct compliance assistance from EPA
reporting improved understanding (No discrepancy)

Performance Goal: Improve Environmental Management Practices
3. Percentage of entities seeking assistance from EPA-sponsored compliance
assistance centers and clearinghouse reporting improved EMP (No discrepancy)
4. Percentage of entities receiving direct compliance assistance from EPA
reporting improved EMP (No discrepancy)

Performance Goal: Reduce Pollutants
5. Percentage of entities seeking assistance from EPA-sponsored compliance
assistance centers and clearinghouse reporting pollution reductions
(No discrepancy)
6. Percentage of entities receiving direct assistance from EPA reporting
pollution reductions (No discrepancy)

Goal Area: Compliance Incentives

Performance Goal: Increase percentage of facilities using incentive policies
to conduct environmental audits or other actions that reduce, treat, or
eliminate pollution or improve EMP
7. Percentage of audits resulting in pollution reduction and ecosystem
protection (Discrepancy: measure reports percent audits, not percent facilities)
8. Percentage of audits resulting in improved EMP
(Discrepancy: measure reports percent audits, not percent facilities)
9. Pounds pollution reduced as a result of audits
(Discrepancy: measure does not report percent facilities)
10. Dollars invested in EMP as a result of audits
(Discrepancy: measure does not report percent facilities)

Goal Area: Monitoring and Enforcement

Performance Goal: Increase Complying Actions
11. Percentage of entities taking complying actions as a result of on-site
inspections/investigations (Discrepancy: measure does not demonstrate
increase or decrease in complying actions from year to year, and may mask
changes)

Performance Goal: Increase Pollutant Reduction/Treatment
12. Estimated pounds of pollution to be reduced/treated as result of
concluded enforcement actions (No discrepancy)
13. Percentage of concluded enforcement cases requiring pollutant reduction
and ecosystem protection (Discrepancy: goal does not include ecosystem
protection)

Performance Goal: Improve Environmental Management Practices
14. Percentage of concluded enforcement cases requiring improved EMP
(No discrepancy)
15. Dollars invested in improved EMP or environmental performance as a
result of enforcement actions (Discrepancy: measure does not equate dollars
with improvements)

(No Goal)
16. Number inspections, investigations conducted (Related to the Goal Area
of Monitoring and Enforcement, though not to a specific goal)
Frequent Changes in Public Measure Reporting Reduce Transparency
OECA frequently changed its public performance measures from year to year,
which reduced transparency. Since 1999, OECA officials have changed many
publicly reported performance measures to improve them, according to OECA.
OECA also changed performance measures to comply with changes in EPA's
strategic plan, and to respond to OMB recommendations. OECA reported some
measures for up to five consecutive years, and publicly reported trend data for
some measures as recently as in its fiscal 2004 annual report. However, OECA
changed the wording for all publicly-reported measures in fiscal 2005. Therefore,
unless OECA continues tracking and publicly reporting at least some of its pre-
2005 measures, the public cannot compare enforcement and compliance
performance over time.
OECA frequently changed its publicly-reported performance measures and
reported on the majority of the 69 different measures used between fiscals 1999-
2005 for only a single year. OECA used 46 measures once, and 23 for two years
or more. OECA reported or plans to report on 69 different performance
measures16 in EPA's annual performance reports for fiscals 1999-2005,17
providing information on an average of 18 measures per year.
OECA consistently reported information on two measures over 6 consecutive
years (fiscalsi999-2004):
1.	Number of inspections; and
2.	Pounds of pollutants required to be reduced through enforcement actions.
OECA consistently reported information on five additional measures over 5 years
between fiscal 1999 and 2004:
1.	Number of criminal investigations;
2.	Number of civil investigations;
3.	Number of EPA-assisted inspections conducted;
4.	Number of regulated groups ("populations") served by valid compliance
rates or other indicators of compliance; and
5.	Number of entities voluntarily disclosing and correcting violations.
Although OECA changed its publicly-reported GPRA performance measures over
time, officials stated they continue to track raw data for most modified or
16	Please see Appendix E for a complete listing of these measures.
17	EPA changed its public enforcement and compliance measures in fiscal 2005, and will not report on these
measures until sometime after the end of fiscal 2005. Therefore, we were unable to assess how EPA reported on
these new performance measures. However, we have included the planned 2005 performance measures as described
in EPA's latest strategic and annual plans.
discontinued measures in an electronic database and could choose to compile and
report historical performance data in OECA's annual press releases.
Fiscal 2005 marked the beginning of EPA's new strategic plan, with OECA's
activities reorganized as part of the fifth of five EPA goals, "Compliance and
Environmental Stewardship." EPA established annual performance goals to:
•	increase compliance with environmental regulations;
•	reduce and treat pollutants; and
•	improve environmental management practices at regulated facilities.
To more closely demonstrate progress toward achieving these goals, OECA
officials said they revised their publicly-reported GPRA performance measures
for the 2005-2008 EPA strategic plan (see Appendix D).
OECA included 16 measures in its fiscal 2005 annual plan, and all measures
differed from past years' publicly-reported measures. In some cases, the 2005
measures represented a combination of two or three past measures. For example,
measure number 23 in Appendix E, "# inspections, civil investigations and
criminal investigations conducted," combined measures 1, 13, and 14. In other
cases, OECA reworded measures used in the past. For example, measure 42
(Appendix E), "% regulated entities receiving direct CA [compliance assistance]
from EPA...reporting that they increased their understanding of environmental
requirements as a result of EPA assistance," is a rewording of measure 35
(Appendix E), "% Participants Improved Understanding of Regulations."
OECA made changes to the fiscal 2005 publicly-reported GPRA measures,
choosing to use percentages in tracking some measures in fiscal 2005. OECA
presented some trend information in past annual performance reports, and OECA
officials said they intend to continue this practice for fiscal year 2005. However,
OECA officials also acknowledged that other EPA offices have sometimes
modified OECA's annual planning and reporting submissions in the past.
Recommendations
We recommend that the Assistant Administrator for Enforcement and Compliance
Assurance:
3.1 Design and implement a pilot project to verify estimated, predicted, and
facility self-reported outcomes, and report on the pilot's results to demonstrate
the reliability of such performance measures. Until OECA verifies these data,
OECA should clearly and prominently describe all measures as estimated,
predicted, or facility self-reported.
3.2	Improve the linkage/relationship of OECA's goals and measures in EPA
strategic and budgetary documents to improve external understanding and
internal usefulness. In addition to clarifying the language of its annual
performance goals, this action should include developing measures that more
clearly and directly link to those goals.
3.3	Continue to improve enforcement and compliance performance measures,
while continuing to publicly report key measures annually to provide the
public, Congress, and other specific stakeholders a minimal amount of
comparable trend data.
Further Evaluation Needed
As mentioned on page 13 of this report, methods for producing statistically valid
compliance rates come with advantages and limitations. Further evaluation of EPA's
previous and potential use of statistically valid compliance rate measures will
determine the feasibility and effectiveness of developing statistically valid rates and
the most beneficial method for doing so; OECA has requested additional assistance
from us in this area.
As stated in our September 2005 report,18 OECA does not have accurate information
about the universe of regulated entities for five of six programs we sampled in that
evaluation. OECA's ability to randomly select facilities and produce statistically
valid compliance rates will also be hampered by documented data quality problems.
It may be possible for OECA to produce statistically valid compliance rates on a
larger scale, e.g., the Safe Drinking Water Act program. Indeed, OECA would like to
expand its use of these measures and make them "a more integral part of our planning
and program assessment activities." Depending on the sampling scheme that OECA
chooses to use to develop statistically valid compliance rates, resources could be
drawn away from known significant environmental violators.
Further evaluation is necessary of the potential generation and use of large scale
statistically valid compliance rates. Among the topics that could be evaluated are:
•	The statistically valid compliance rate pilot projects undertaken by OECA
between fiscals 1999-2004;
•	The tradeoffs of different sampling strategies, given resource considerations.
This study could also include an analysis of the complexities of coordinating with
States and EPA regions using the various approaches;
•	The true environmental costs and benefits of a neutral-based inspection
approach (random sampling) to generate statistically valid compliance rates;
18 See EPA OIG report, Limited Knowledge of the Universe of Regulated Facilities Impedes EPA's Ability to
Demonstrate Changes in Regulatory Compliance. Report 2005-P-00024, September 19, 2005.
•	An analysis of inspection-based and facility self-reported statistically valid
compliance rates; and,
•	OECA's intended use of statistically valid compliance rates and how their
management approaches and practices may change as a result.
Agency Response and OIG Evaluation
OECA agreed with all of our draft report recommendations as described in its
comments attached as Appendix F. However, OECA stated, "... advocating a
strict adherence to the use of recorded observations and values will set an
impossibly high standard for data collection and will have a chilling effect on
initiatives to improve outcome measures...." OECA also characterized our
observations and recommendations as having "...marginal value and relevance,
and left unaddressed the requests for assistance on the crucial issue of developing
meaningful, statistically valid compliance rates...." We disagree, and have
addressed OECA's criticisms in detail in Appendix G. As described in
Appendix G, we also modified recommendation 3.2 to ensure OECA clearly
understood our intent.
OECA's comments also included additional information that it believed would
correct certain facts or provide additional context to the report. We have
addressed each of these specific comments in detail in Appendix G. We made
revisions in our final report based on their comments as we determined
appropriate.
23

-------
Appendix A
Detailed Scope and Methodology
To determine how the U.S. Environmental Protection Agency's (EPA's) Office of Enforcement
and Compliance Assurance (OECA) measured and reported enforcement and compliance
effectiveness and progress toward its goals, we reviewed internal OECA documents, EPA Office
of the Chief Financial Officer (OCFO) reports and plans, and Office of Management and Budget
communications. We reviewed EPA's 2003-2008 Strategic Plan: Direction for the Future,
fiscal 2004 and 2005 annual performance plans, and the Agency's fiscal 2004 annual report. We
also reviewed relevant reports by the U.S. Government Accountability Office (GAO), National
Academy of Public Administration, and International Network for Environmental Compliance
and Enforcement.
To determine how well OECA's performance measures characterize changes in compliance or
other outcomes, and provide transparency, we determined the essential criteria for evaluating a
performance measurement system. We used our professional judgment in applying these criteria
in evaluating OECA's performance measures. These criteria include relevance, reliability,
validity, comparability, and feasibility, and are described in greater detail in Appendix C.
Specifically, we reviewed approximately 100 pieces of academic and public policy literature and
interviewed a variety of performance measurement experts. We selected documents that
included, and interviewed experts about, criteria for developing and assessing performance
measures. We summarized the criteria identified in each document, and then grouped similar
criteria to develop the comprehensive list of criteria summarized in Appendix C. We also met
with representatives from OECA, the Office of Management and Budget (OMB), and other
external stakeholders. We evaluated OECA's measures to determine how OECA demonstrated
its progress in achieving compliance and environmental and human health goals, focusing
primarily on publicly-reported measures. EPA changed its public enforcement and compliance
measures in fiscal 2005, and will not report on these measures until sometime after the end of
fiscal 2005. Therefore, we were unable to assess how EPA reported on these new performance
measures. However, we did consider the planned 2005 performance measures as described in
EPA's fiscal 2005 annual plan. See Chapter 3 for further details on these changes.
Prior Audit and Evaluation Work
We did not identify any previous audit or evaluation reports specifically evaluating EPA's
OECA performance measurement and reporting system. However, we identified EPA/OIG and
GAO reports listed below with findings on performance measurement, performance data, and
OECA performance.
24

-------
GAO: Environmental Indicators: Better Coordination Is Needed to Develop
Environmental Indicator Sets That Inform Decisions, GAO-05-52, November 17,
2004.
GAO found a number of challenges with developing environmental indicator sets to
inform decisions. Key among those was obtaining sufficient environmental data to report
conditions and trends related to the indicators selected. GAO also found problems in
linking specific environmental management actions and program activities to changes in
environmental conditions and trends. Developers assembling environmental indicator
sets to improve the performance of environmental management programs reported
difficulty (1) accounting for relationships between management actions and other factors
beyond the agency's control that can potentially affect environmental changes, and (2)
addressing the time lag between management actions and achieved results. GAO stressed
that EPA should place priority on developing indicators to guide the agency's priority setting,
strategic planning, and resource allocation. GAO found that EPA has not initiated or
planned an institutional framework with clear lines of responsibility and accountability
for developing and using environmental indicators, and that no processes, procedures, or work
plans exist to link the results with EPA's strategic planning and performance reporting
cycle. GAO recommended that, building on EPA's initial efforts on indicators and
evaluating the purposes that indicators might serve, the EPA Administrator establish clear
lines of responsibility and accountability among EPA's various organizational
components and identify specific requirements for developing and using environmental
indicators.
EPA OIG: EPA Needs to Improve Tracking of National Petroleum Refinery
Compliance Program Progress and Impacts, Evaluation Report No. 2004-P-00021,
June 22, 2004.
We found that OECA's performance measurement and reporting approach for the
national petroleum refinery program had not provided useful and reliable information
necessary to effectively implement, manage, evaluate, and continuously improve program
results. OECA had not established and communicated clear goals, systematically
monitored refinery program progress, reported actual outcomes, or tracked progress
toward achieving consent decree goals. EPA delays during consent decree implementation
may have postponed emissions reductions and compromised compliance. We found
that OECA must resolve planning issues and delays, and begin to measure outcomes, to
ensure timely emissions reductions and to optimally protect human health and the
environment, especially for people living in the vicinity of refineries.
GAO: Performance Reporting: Few Agencies Reported on the Completeness and
Reliability of Performance Data, GAO-02-372, April 26, 2002.
GAO found that only 5 of the 24 Chief Financial Officer Act agencies' fiscal year 2000
performance reports included assessments of the completeness and reliability of their
performance data in their transmittal letters. EPA was not among those five agencies.
None of the agencies identified any material inadequacies with their performance data in
their performance reports. However, concerns about the quality of performance data
were identified by the agencies' inspectors general as either a major management
25

-------
challenge or included in the discussion of other challenges for 11 of the 24 agencies.
Agencies are not required to discuss in their performance reports the standard or method
used to assess the completeness and reliability of their performance data. However, such
information can provide helpful contextual information to decision makers on the
credibility of the reported performance data. GAO noted that EPA's performance report
also provides a useful discussion of data quality. The agency discusses the source and
quality of the data associated with each performance goal.
EPA OIG: Compliance with Enforcement Instruments, Audit Report No.
2001-P-00006, March 29, 2001.
We found that OECA's performance measures were not sufficient to determine the
program's actual accomplishments. Consequently, we determined Congress had less
useful performance data upon which to base its decision-making. We also found that
EPA regions did not always adequately monitor compliance with enforcement
instruments (e.g., consent decrees) or always consider further enforcement actions. We
attributed ineffective monitoring primarily to the lack of (1) guidance detailing how or
when to monitor enforcement instruments, and (2) emphasis OECA placed on
monitoring. Consequently, OECA risked continued violations that would contribute to
human and environmental health impacts, thus decreasing EPA's deterrent effect. In
response, OECA concurred that it and the regions can and should improve tracking and
enforcing compliance with requirements in enforcement instruments. At that time, we
concluded that OECA had begun to take the steps necessary for us to close out the report.
GAO: Managing for Results: Assessing the Quality of Program Performance Data,
GAO Letter Report B-285312, May 25, 2000.
GAO determined the following key dimensions to consider when producing and
analyzing program performance data:
•	Accuracy—the extent to which the data are free from significant error;
•	Validity—the extent to which the data adequately represent actual performance;
•	Completeness—the extent to which enough of the required data elements are
collected from a sufficient portion of the target population or sample;
•	Consistency—the extent to which data are collected using the same procedures and
definitions across collectors and times;
•	Timeliness—whether data about recent performance are available when needed to
improve program management and reporting to Congress;
•	Ease of Use—how readily intended users can access data, aided by clear data
definitions, user-friendly software, and easily used access procedures.
26

-------
Appendix B
Internal Performance Management and Reporting
Office of Enforcement and Compliance Assurance (OECA) officials told us they managed
programs throughout the year using six internal management and reporting techniques:
1.	Monthly management reports to regions about key outputs and outcomes
provide snapshots of how regions are performing. These reports allow OECA to
compare current regional progress with past years' so that regions can adjust as
needed. For example, in fiscal 2003, OECA noticed that Department of Justice
referrals were down from the same period in the prior year. OECA officials said
they discussed the need to look into referrals, and as a result they were able to
improve results by year's end.
2.	Region performance profiles ("Trip Reports" developed by OECA staff in
preparation for Assistant Administrators' planned visits to a region) provide
OECA's Assistant Administrator with information about a region's progress with
the National Priority Areas.19 Because OECA only began establishing specific
goals for priority areas in 2005, OECA has little experience using the reports for
this purpose. These profiles are the major performance reports OECA uses
internally.
3.	Periodic in-depth performance analyses for specific measures provide OECA
with information on progress for certain activities or priorities. For example,
OECA did a more in-depth analysis of the National Pollutant Discharge
Elimination System permit system several years earlier. A senior OECA executive
said that such performance data allow them to look at a particular slice of the
program and to have a standard format for addressing weaknesses. For example,
in reviewing the National Pollutant Discharge Elimination System, OECA
officials said they found follow-up problems with Significant Non-Compliers
(SNCs), and were able to accelerate the use of the Watch List (see item number 6
below) to address this problem.
4.	National Priority Area data are used to determine if and how the strategies for these
priorities need to be adjusted. An OECA official said that because the strategies
have only been in place for a few months, the organization could not yet say
anything definitive about their use in managing programs. However, the official
said OECA believed the organization would use these priority area updates more
in the future.
5.	Mid- and end-of-year GPRA data show whether programs are on track to
meet their annual performance goals.
6.	Quarterly Watch Lists identify noncompliance priorities, i.e., those facilities
that remain out of compliance after a notice of violation was issued.
19 OECA identified enforcement priorities as National Priority Areas. Regions and headquarters focused efforts on
these areas, which had specific goals.
27

-------
Appendix C
Criteria for Effective Performance Measurement
Organizations should periodically evaluate performance measures to determine whether they are
providing the information for which the measures were developed. Evaluating performance
measures also illustrates whether other measures exist that could better measure progress toward
goals. By using well-defined criteria to choose, revise, and use performance measures, program
operators can manage programs based on results to ensure they use the best techniques and
achieve the best possible outcomes.
We identified five criteria for evaluating performance measures: relevance, reliability, validity,
comparability, and feasibility. We also determined that assessing reporting mechanisms
provided important information on performance measure clarity and public accountability.
Relevance: A performance measure should be pertinent for its intended use. It should
also include aspects of program performance applicable for the intended use. A
performance measure should be relevant to EPA's goals, objectives, and priorities, and
to the needs of external stakeholders.

Reliability: A performance measure should be consistent and have high quality data.
Samples should be large enough to yield reliable data, repeated measurements should
yield the same results, and data from different offices or organizations should be based
on similar definitions and data collection procedures.

Validity: A performance measure should accurately represent the condition or
phenomenon that it is purporting to represent.

Comparability: A performance measure should be able to be compared to existing and
past measures of conditions to develop trends and define variation. It should also
provide a clear frame of reference for assessing performance over time to demonstrate
performance trends.

Feasibility: A performance measure should be "collectable." Information for the
measure should be available or able to be obtained with reasonable cost and effort and
provide maximum information per unit of effort. The cost of collecting data should not
outweigh their value.
Performance reports should clearly portray performance measures with appropriate comparisons
to show trends and the adequacy of the measure itself. Program managers should provide
enough information for users to correctly understand results, including information about how
present performance compares with past performance, and explanations of results.
28

-------
Appendix D
Office of Enforcement and Compliance Assurance
Fiscal 2005 Annual Performance
Goals, Measures, and Targets
ANNUAL PERFORMANCE GOAL 1: Through monitoring and enforcement actions, Environmental
Protection Agency (EPA) will increase complying actions, increase pollutant reduction or
treatment, and improve environmental management practices.

Associated performance measures and fiscal 2005 targets:
1. Pounds of pollution estimated to be reduced, treated, and eliminated as a result of
concluded enforcement actions. Target: 300 million pounds.
2. Percentage of concluded enforcement cases (including Supplemental Environmental
Projects, SEPs) requiring that pollutants be reduced, treated, or eliminated and protection
of populations or ecosystems. Target: 30 percent.
3. Percentage of concluded enforcement cases (including SEPs) requiring implementation
of improved environmental management practices. Target: 60 percent.
4. Number of inspections, civil investigations, and criminal investigations conducted.
Target: 18,500.
5. Dollars invested in improved environmental performance or environmental management
practices as a result of concluded enforcement actions (i.e., injunctive relief and SEPs).
Target: $4 billion.
6. Percentage of regulated entities taking compliance actions as a result of compliance
monitoring. Target: 10 percent.

ANNUAL PERFORMANCE GOAL 2: Through self-disclosure policies, EPA will increase the
percentage of facilities reducing pollutants or improving environmental management practices.

Associated performance measures and fiscal 2005 targets:
7. Percentage of audits or other actions that result in the reduction, treatment, or
elimination of pollutants, and the protection of populations or ecosystems. Target: 5 percent.
8. Percentage of audits or other actions that result in improvements in environmental
management practices. Target: 10 percent.
9. Pounds of pollutants reduced, treated, or eliminated as a result of audit agreements or
other actions. Target: 25 million pounds.
10. Dollars invested in improving environmental management practices as a result of audit
agreements or other actions. Target: $2 million.

ANNUAL PERFORMANCE GOAL 3: Through compliance assistance, EPA will increase the
understanding of regulated entities, improve environmental management practices, and reduce
pollutants.

Associated performance measures and fiscal 2005 targets:
11. Percentage of regulated entities seeking assistance from EPA-sponsored compliance
assistance centers and clearinghouse reporting that they improved environmental
management practices as a result of their use of the centers or clearinghouse. Target: 60 percent.
12. Percentage of regulated entities receiving direct compliance assistance from EPA (e.g.,
training, on-site visits) reporting that they improved environmental management practices
as a result of EPA assistance. Target: 50 percent.
13. Percentage of regulated entities seeking assistance from EPA-sponsored compliance
assistance centers and clearinghouse reporting that they reduced, treated, or eliminated
pollution as a result of that resource. Target: 25 percent.
14. Percentage of regulated entities seeking assistance from EPA-sponsored compliance
assistance centers and clearinghouse reporting that they increased their understanding of
environmental requirements as a result of their use of the resources. Target: 75 percent.
15. Percentage of regulated entities receiving direct compliance assistance from EPA (e.g.,
training, on-site visits) reporting that they increased their understanding of environmental
requirements as a result of EPA assistance. Target: 65 percent.
16. Percentage of regulated entities receiving direct compliance assistance from EPA (e.g.,
training, on-site visits) reporting that they reduced, treated, or eliminated pollution as a
result of EPA assistance. Target: 25 percent.
30

-------
Appendix E
Office of Enforcement and Compliance Assurance
(OECA) Performance Measures,
Fiscal Years 1999-2005

Each measure is listed with the fiscal year annual reports in which it was reported (or, for
fiscal 2005, planned) and the total number of years reported.

ENFORCEMENT MEASURES
1. # Inspections (reported FY 1999-2004; 6 years)
2. lbs of pollutants required to be reduced through enforcement actions settled in the FY
(reported FY 1999-2004; 6 years)
3. % enforcement actions resulting in improvements in use or handling of pollutants
(reported FY 1999-2000; 2 years)
4. # enforcement actions (reported FY 1999; 1 year)
5. # entities regulated (reported FY 1999; 1 year)
6. # planning and community right to know enforcement actions (reported FY 1999; 1 year)
7. # planning and community right to know inspections (reported FY 1999; 1 year)
8. $ value of concluded enforcement actions FY98-FY03 (reported FY 1999; 1 year)
9. % concluded enforcement actions resulted in improvements in facility management
practices and information collection (reported FY 1999; 1 year)
10. % formal enforcement actions by States (reported FY 1999; 1 year)
11. % inspections conducted by States (reported FY 1999; 1 year)
12. % of automotive service and repair industry achieving targeted compliance level
(reported FY 1999; 1 year)
13. # civil investigations (reported FY 2000-2004; 5 years)
14. # criminal investigations (reported FY 2000-2004; 5 years)
15. # EPA-assisted inspections conducted (reported FY 2000-2004; 5 years)
16. # reports produced on civil and criminal enforcement actions initiated and concluded
(reported FY 2000-2002; 3 years)
17. # baselines established to measure % recurring significant violations within 2 years
(reported FY 2000; 1 year)
18. # baselines established to measure average length of time for significant violators to
return to compliance or enter plans/agreements (reported FY 2000; 1 year)
19. % inspections and investigations at priority areas (reported FY 2000; 1 year)
20. % concluded enforcement actions requiring physical action that will result in pollutant
reductions and/or changes in management or information practices (reported FY 2001-2003;
3 years)
21. % increase over 2000 proportion of SNCs returning to compliance w/in 2 years
(reported FY 2001-2002; 2 years)
22. % reduction in SNC for CAA, CWA, and RCRA from 2000 (reported FY 2001-2002; 2 years)
23. # inspections, civil investigations and criminal investigations conducted
(planned for FY 2005; 1 year)
24. $ invested in improved environmental performance or improved EMP as a result of
concluded enforcement actions (i.e., injunctive relief and SEPs) (planned for FY 2005; 1 year)
25. % concluded enforcement actions resulting in physical action and/or improvements in
practices (reported FY 2004; 1 year)
26. % concluded enforcement cases (including SEPs) requiring implementation of improved
environmental management practices (planned for FY 2005; 1 year)
27. % concluded enforcement cases (including SEPs) requiring that pollutants be reduced,
treated, or eliminated and protection of populations or ecosystems (planned for FY 2005; 1 year)
28. lbs of TRI pollutants released, disposed of, treated, or combusted for energy recovery
in previous year (data lag) (reported FY 2003-2004; 2 years)
29. lbs pollution estimated to be reduced, treated, and eliminated as a result of concluded
enforcement actions (planned for FY 2005; 1 year)

COMPLIANCE ASSISTANCE MEASURES
30. # entities (facilities) voluntarily disclosing and correcting violations
(reported FY 1999-2000 and 2002-2004; 5 years)
31. # compliance assistance centers in operation (reported FY 1999; 1 year)
32. # self disclosures (reported FY 1999; 1 year)
33. # user sessions (reported FY 1999; 1 year)
34. # visits to compliance assistance centers' internet sites (reported FY 1999; 1 year)
35. % participants' improved understanding of regulations (reported FY 1999; 1 year)
36. % participants taking actions (reported FY 1999; 1 year)
37. # settlements with facilities to voluntarily disclose and correct violations
(reported FY 2001; 1 year)
38. # students trained (reported FY 2001-2002; 2 years)
39. # training modules provided to tribal governments by NETI (reported FY 2001-2002; 2 years)
40. # EMS tools developed (reported FY 2001-2002; 2 years)
41. % regulated entities receiving direct assistance from EPA (e.g., training, on-site visits)
reporting that they reduced, treated, or eliminated pollution as a result of EPA assistance
(planned for FY 2005; 1 year)
42. % regulated entities receiving direct CA from EPA (e.g., training, on-site visits)
reporting that they increased their understanding of environmental requirements as a result
of EPA assistance (planned for FY 2005; 1 year)
43. % regulated entities receiving direct compliance assistance from EPA (e.g., training,
on-site visits) reporting that they improved EMP as a result of EPA assistance
(planned for FY 2005; 1 year)
44. % regulated entities seeking assistance from EPA-sponsored CA centers and clearinghouse
reporting that they reduced, treated, or eliminated pollution as a result of that resource
(planned for FY 2005; 1 year)
45. % regulated entities seeking assistance from EPA-sponsored CA centers and clearinghouse
reporting that they improved EMP as a result of their use of the centers or the clearinghouse
(planned for FY 2005; 1 year)
46. % regulated entities seeking assistance from EPA-sponsored CA centers and clearinghouse
reporting that they increased their understanding of environmental requirements as a result
of their use of the resources (planned for FY 2005; 1 year)
47. % regulated entities taking complying actions as a result of compliance monitoring
(planned for FY 2005; 1 year)
48. lbs pollutants reduced, treated, or eliminated as a result of audit agreements or other
actions (planned for FY 2005; 1 year)
49. $ invested in improving environmental management practices as a result of audit
agreements or other actions (planned for FY 2005; 1 year)
50. % audits or other actions that result in improvements in environmental management
practices (planned for FY 2005; 1 year)
51. % audits or other actions that result in the reduction, treatment, or elimination of
pollutants; and the protection of populations or ecosystems (planned for FY 2005; 1 year)
52. # entities reached through compliance assistance (reported FY 2003-2004; 2 years)

INTERNAL MEASURES
53. # courses provided to State and Tribal officials (reported FY 1999-2002; 4 years)
54. # data system improvement designs (reported FY 1999 and 2001; 2 years)
55. # priority areas identified (reported FY 1999; 1 year)
56. populations served by valid compliance rates or other indicators of compliance
(reported FY 2000-2004; 5 years)
57. # import and export notices filed and reviewed (reported FY 2000; 1 year)
58. % operational efficiency for existing 14 info systems (reported FY 2001-2003; 3 years)
59. phases of modernization of Permit Compliance System (reported FY 2001-2004; 4 years)
60. # quality mgmt plans completed for additional data systems (reported FY 2001; 1 year)
61. # tribal personnel trained by EPA (reported FY 2002; 1 year)
62. % transboundary notices reviewed and responded to (reported FY 2001-2002; 2 years)
63. # tribal personnel trained by NETI (reported FY 2001; 1 year)
64. % homeland security support to federal, state, and local entities (reported FY 2002; 1 year)

MISC MEASURES
65. # facilities with performance information (reported FY 1999; 1 year)
66. % NEPA concerns voluntarily addressed (reported FY 1999; 1 year)
67. % significant federal actions (NEPA) reviewed (reported FY 1999; 1 year)
68. # data analyses of environmental problems in tribal lands - Tribal Baseline Assessment
Project (reported FY 2001 and 2003; 2 years)
69. % reduced from 1991 levels of priority list chemicals (reported FY 2003-2004; 2 years)

TOTAL Measures Reported Per Fiscal Year: 25 (FY 1999), 14 (FY 2000), 22 (FY 2001),
20 (FY 2002), 14 (FY 2003), 12 (FY 2004), 16 (planned FY 2005)
AVERAGE Measures Reported Per Fiscal Year: 18
AVERAGE Years Measure in Use: 2
ALL Measures Reported for Fiscal Years 1999-2004: 107
ALL Measures Reported and Planned for Fiscal Years 1999-2005: 123







33

-------
Appendix F
Agency Comments
OFFICE OF ENFORCEMENT AND COMPLIANCE ASSURANCE (OECA)
Response to
OFFICE OF INSPECTOR GENERAL (OIG) DRAFT EVALUATION REPORT
ON OECA PERFORMANCE MEASUREMENT AND REPORTING
October 12, 2005
General Comments
The Office of Enforcement and Compliance Assurance (OECA) appreciates the efforts of
the Office of the Inspector General (OIG) to evaluate the performance measurement and
reporting practices of the national enforcement and compliance assurance program. OECA
understands the value of program evaluation as a tool for improving program effectiveness, and
has conducted its own series of program evaluations over the past four years to examine a
number of program performance issues.
When OECA suggested performance measurement as a possible evaluation topic in
response to a solicitation from the OIG, it was in the hope that the OIG could provide an
objective analysis of OECA's measurement and reporting practices and recommend solutions to
some of the challenges confronted by OECA as it has developed and used performance measures
over the last eight years.
To that end, OECA took the extraordinary step of submitting a memo from Phyllis
Harris, OECA's Principal Deputy Assistant Administrator, to Jeff Harris, Director, Cross-Media
Issues, Office of Program Evaluation in the OIG on September 29, 2004 entitled, "Request for
the IG's Assistance to Improve and Expand OECA's Use of Outcome-Based Performance
Measures." (See Attachment 1) In that memo OECA described two primary performance
measurement challenges for which it requested assistance from the OIG. The first of these
challenges was to enhance the current measure for pounds of pollution reduced from
enforcement actions by adding some sense of the impact of these reductions on hazard and
exposure. The memo also posed a series of specific questions for the OIG about this challenge.
The second challenge was to help OECA find a way to expand the use of statistically valid
noncompliance rates. OECA had developed a methodology for combining inspections targeted
at suspected violators with randomly selected inspections to produce representative samples on
which to base statistically valid rates. Unfortunately, the additional increment of random
inspections reduced the number of targeted inspections that could be performed and so the
methodology was applied through pilot projects to relatively small segments of the regulated
universe. The memo then provided a list of barriers that needed to be overcome, a set of options
34

-------
for moving forward with statistically valid rates, and a set of specific requests for OIG assistance
that would move OECA forward on the continuum of outcome measures.
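A purely illustrative sketch of the budget arithmetic behind that dilemma follows; the split function and fractions are our assumptions, not OECA's actual allocation method, and the fixed budget figure is borrowed from the fiscal 2005 inspection target in Appendix D.

def split_inspection_budget(total: int, random_fraction: float):
    """Split a fixed inspection budget between a random sample (usable
    for statistically valid rates) and inspections targeted at known
    or suspected significant violators; one comes at the other's cost."""
    random_n = round(total * random_fraction)
    return random_n, total - random_n

# 18,500 is the fiscal 2005 inspections/investigations target (Appendix D);
# the fractions devoted to random sampling are hypothetical.
for fraction in (0.00, 0.05, 0.10):
    random_n, targeted_n = split_inspection_budget(18_500, fraction)
    print(f"{fraction:.0%} random: {random_n} random, {targeted_n} targeted")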
Now more than one year later, the OIG evaluation does not respond to this request and
provides no assistance in addressing either of these challenges. Instead, the OIG has chosen to
offer a critique of OECA's use of outcome measures, in some instances utilizing various issues
OECA itself provided in its September 29, 2004 memo.
This would seem to be at odds with the OIG's own Strategic Plan for FY 2004-2008.
The "vision statement" from that Plan (provided as Attachment 2) reads as follows:
"We are catalysts for improving the quality of the Environment and
Government through problem prevention and identification, and
cooperative solutions. " (italics added)
Given the opportunity to conduct a program evaluation to find a
"cooperative solution" to a significant problem in OECA's performance
measurement practice, the OIG was perhaps incapable, unwilling, or
hostage to a prevailing audit mentality. [See OIG Response in Appendix G, Note 1]
In this evaluation report, the OIG has focused on various other limitations in OECA's use of
outcome measures, offered observations and recommendations of marginal value and relevance,
and left unaddressed the requests for assistance on the crucial issue of developing meaningful,
statistically valid compliance rates. Their three recommendations are of little benefit — one
urges OECA to continue improving its measures, a second amounts to
editorial changes in a document, and the third urges OECA to address a
data problem not as dire as portrayed by the OIG. The evaluation report
is a significant disappointment to the OECA managers and staff
responsible for performance measurement.
[See OIG Response in Appendix G, Notes 2 and 3]
In this response we will suggest corrections to errors and misrepresentations, provide
additional context that we believe will benefit the report, and respond to each of the three
recommendations.
Chapter 1 - Introduction
Background
On page 2 of the report, the OIG quotes selectively from President Bush's remarks at the
May 23, 2005 swearing-in ceremony of Administrator Stephen Johnson.
Here is the section of the report that includes the excerpted remarks of the President:
"At the Administrator's May 23, 2005 swearing-in ceremony, President Bush
35

-------
emphasized that he wanted results - real environmental improvements and vigorous
enforcement - when he said, '... we will continue our enforcement
strategy which focuses on achieving real environmental improvements that benefit
everyone.... We'll continue to vigorously enforce our environmental
laws ... and we will focus on results.'"
And here is the entire quote of that portion of the President's remarks that dealt with
enforcement:
"And finally, we will continue our enforcement strategy which focuses on achieving real
environmental improvements that benefit everyone. Since 2001, the EPA has increased
compliance inspections by 19 percent, and civil investigations by 24 percent. And last
year the agency provided compliance assistance to over 730,000 individuals and
businesses.
Our strategy is working. Last year we obtained commitments to reduce future pollution
by an estimated 1 billion pounds, an increase of 50 percent over the 2001 level. And I
want to thank all the EPA employees who work in the field
on this collaborative effort.
As Steve leads the EPA, he will maintain our common-sense approach of collaborating
with leaders and volunteers at the local level to find the very best solutions to meet our
national goals. We'll continue to vigorously enforce our environmental laws. We'll
encourage good stewardship of natural resources and we will focus on results."
Note that the excerpt from the IG report deletes: three references to program activities
expressed as output measures (the increase in inspections and investigations, and the number of
individuals provided compliance assistance); the reference to estimated pollution reductions; and
the statement that our strategy is working.
It appears that these references are omitted from the report because they are incompatible with
OIG observations presented later in the report. The report focuses on the need for outcomes,
never acknowledging that a mix of outputs and outcomes is necessary to
manage the program, and implies that OECA is using too many outputs.
The report criticizes the use of pollution reduction measures that are based
on estimates. And the report creates the impression that OECA's strategy
for managing the national program is deficient in significant ways.
This practice - selectively using only information that supports their
observations, and ignoring incompatible information or important context - is one OECA
has seen in other recent OIG evaluation reports. (See, for example,
OECA's comments on the 9/19/05 OIG report on data about regulatory
universes.) We will point out other instances of troubling practices in this
OIG report elsewhere in this response. [See OIG Response in Appendix G, Notes 4 and 5]
36

-------
Also on page 2, the OIG describes the value of compliance rates and characterizes them
as among the Agency's most important performance measures. OECA agrees that such
measures are important, and that is why we developed a methodology (with an external statistical
consultant), conducted several pilot projects of our methodology, and continue to use the
methodology to develop rates on key populations. That is also why we requested assistance
from the OIG to help determine how the use of statistically valid rates
could be expanded. We do not agree, however, that without statistically
valid noncompliance rates we cannot effectively manage the program.
Such rates are one of many tools that, taken together, comprise a
comprehensive system of performance measures. [See OIG Response in Appendix G, Note 6]
On page 3, the OIG discusses "effective performance measurement and reporting,"
including its purposes and benefits, especially its contribution to transparency. This section is
very similar to an article by Robert Behn of Harvard University (Public Administration Review,
September/October 2003, Vol. 63, No. 5) in which he identifies eight distinct purposes for
measuring performance - evaluate, control, budget, motivate, promote, celebrate, learn, and
improve. In his article, Behn observes that serving these purposes requires a variety of types of
performance measures, including both outputs and outcomes. This is an observation that is not
made in the OIG report. Using measures to actually manage and improve a program will
necessitate a mix of output and outcome measures to determine what outputs produce the most
important outcomes. This should be acknowledged by the OIG in this report. Similarly, the OIG
report should recognize the existence of practical constraints on performance
measurement. The Government Accountability Office (GAO) has noted
that agencies "must balance their ideal performance measurement systems
against real-world considerations such as the cost and effort involved in
gathering and analyzing data." (U.S. GAO, 1996a, p. 24) [See OIG Response in Appendix G, Note 7]
Scope and Methodology
On page 5, the report states that the OIG "determined and applied essential criteria,"
but does not identify those criteria or point out that the criteria are listed in
Appendix C. More importantly, the report does not provide any detail
about how the criteria were applied to the individual measures used by
OECA, nor does it describe what judgments were made based on
reviewing the measures against the criteria. [See OIG Response in Appendix G, Note 8]
Also, there are two additional criteria that OECA and other compliance and enforcement
organizations have used, and these should be considered by the OIG for its list. The first is what
some of the public management literature calls functionality, i.e., does the measure encourage or
provide an incentive for the right kind of behavior among the regulated
universe and the internal staff of the agency? The second criterion that
should be added to the list is comprehensiveness. This criterion applies to
37

-------
the set or system of measures being used by a program or organization. In evaluating a set of
measures, program managers and other users of performance information will need to determine
if the measures cover all or most of the important activities and results of the program.
[See OIG Response in Appendix G, Note 9]
Also, on this page, the OIG states that the "field work" for the report occurred between
January and June 2005. If the definition of "field work" includes making requests of OECA staff
and meeting or otherwise interacting with OECA staff and managers about
the report, field work continued well beyond June, until days before the draft report was
received by OECA. In fact, one of the reasons the report is so disappointing to OECA staff
and managers is that it seems like a very small return on a significant investment of OECA
staff time spent working with the OIG evaluation team. [See OIG Response in Appendix G, Note 10]
Chapter 2 - Outputs and Goals
This chapter begins on page 7 with an acknowledgement that various experts cited
OECA's leadership in developing and using performance measurement for enforcement and
compliance programs. We would suggest that, in addition to the opinions of experts, the OIG
also include information about the various contributions OECA staff and
managers have made to the practice of performance measurement.
Attachment 3 contains a list of various publications and tools that OECA
has developed and distributed to advance performance measurement in state
environmental agencies and environmental ministries of other nations.
[See OIG Response in Appendix G, Note 11]
The report then focuses on a short discussion of the Office of Management and Budget's
(OMB) reviews of the civil enforcement program under the Program Assessment Rating
Tool (PART), and in doing so makes factual errors and provides an incomplete account of those
reviews. On page 8, the report states that OECA received a "Results Not Demonstrated" rating
in "both 2002 and 2004." The correct account is that such a rating was received by OECA for
PARTs done in 2002 (in preparation for the FY 2004 budget) and 2003 (in preparation for the
FY 2005 budget). The report then confuses the account even more by saying the PART
assessment for "the fiscal 2006 budget found that the program had followed through on original
PART findings by undertaking development of a measures implementation plan." The PART for
the FY 2006 budget was actually conducted in 2004, and that review led to an improvement to
"adequate" in the PART rating, a fact that the IG neglects to mention. The report should be
revised to reflect the following: the PART conducted in 2002 (for the FY 2004 budget) and the
PART conducted in 2003 (for the FY 2005 budget) gave OECA a "Results Not Demonstrated"
rating; the PART conducted in 2004 (for the FY 2006 budget) gave OECA an "Adequate" rating.
Furthermore, the improved rating resulted not just from OECA's undertaking a measurement
improvement plan, but because OMB in 2004 categorized pounds of pollution reduced as an
outcome measure rather than an output measure, finally aligning its view of outcome
measurement with that of performance measurement experts. [See OIG Response in Appendix G, Note 12]
38

-------
OECA notes that the omission of the improved PART rating is another example of the
OIG practice of selectively using only information that supports their observations, and ignoring
incompatible information or important context.
Also on page 8, Table 2.1 characterizes FY 2004 OECA measures by sorting them as
outputs, intermediate outcomes, or end outcomes. But the table does not identify which
individual measures have been put in each respective category, leaving the OECA reviewers
unable to check the accuracy of the table. Also, there seems to be an implication that the number
of output measures is out of balance, though the report never explicitly
makes that claim. If that is the conclusion of the OIG, it should be
stated, along with the rationale or standard that the OIG is using to
determine what the appropriate mix of output and outcome measures
should be. [See OIG Response in Appendix G, Note 13]
Chapter 3 - Characterizing Changes in Compliance or Other Outcomes
Both the title of this chapter and the title of this report itself create the false impression
that OECA lacks sufficient data about outcomes to manage its programs. This practice of stating
findings or conclusions that are much broader than the supporting
evidence on which they are based is one the OIG has employed in other
recent evaluations of the enforcement program. (See the OECA
comments on the 9/19/05 OIG evaluation report on data about regulated
universes at www.epa.gov/oig/reports/xmedia.htmr)
See OIG Response
in Appendix G,
Note 14
OECA fully recognizes the need to improve specific aspects of its outcome measurement and
reporting. That is why OECA has been working to develop statistically valid noncompliance
rates, and that is why OECA submitted its memo entitled, "Request for the IG's Assistance to
Improve and Expand OECA's Use of Outcome-Based Performance Measures," over a year ago.
We suggest that the OIG revise the titles of both the report and this
chapter to reflect the need for specific improvements in outcome
measurement and reporting rather than the broader (and false) statement
that our measures cannot track or characterize outcomes. [See OIG Response in Appendix G, Note 15]
OECA developed its publicly reported GPRA measures to track progress toward
achieving the compliance objective and sub-objectives under Goal 5 in the Agency Strategic
Plan. The Goal 5 architecture, including the performance measures associated with the objective
and sub-objective, are provided as Attachment 4. These measures have been designed to track
whether the various tools used by the program (compliance assistance, compliance incentives,
inspections and investigations, and civil and criminal enforcement) are producing specific
outcomes (e.g., improvements in understanding regulatory requirements, implementation of
improved environmental management practices at facilities, and reduction of pollution emissions
and discharges). Moreover, to increase the value of these measures as a management and
accountability tool, OECA established numeric targets for increasing
these outcomes over a set period of time. [See OIG Response in Appendix G, Note 16]
39

-------
Also on page 11, the report includes a discussion about OECA's history on compliance
rates that contains further inaccuracies and simplifications that need to be corrected or revised.
The statement that "OECA has not publicly reported compliance rates ..." is inaccurate. While
OECA has not reported its statistically valid rates as part of its GPRA measures, rates have been
reported on the OECA website for CSO compliance with nine minimum
control requirements in 2002 and 2004, and a recently-completed rate
study on foundry compliance with RCRA regulations will be posted in
the next 60 days. [See OIG Response in Appendix G, Note 17]
The statement that "OECA chose not to invest the resources necessary to produce statistically
valid rates on a broad scale," is misleading because it implies that sufficient resources were
available but OECA invested them somewhere else. The situation is more complex than the OIG
chose to portray. As we have explained on may occasions and in many ways to the OIG, the
dilemma OECA is facing is that inspection resources are finite and the additional increment of
random inspections necessary to produce meaningful, representative, statistically valid rates for
large segments of the regulated population would mean conducting fewer overall inspections
targeted at known or suspected significant violators. We request that this section be revised to
correct these inaccuracies and suggest that the language on page 12
regarding "sacrificing compliance monitoring of known significant	See ®IG Response
*i .. 1 . f T ^. i •	in Appendix G)
violators, be given more prominence in the OIG s explanation or	Note 18
OECA's record on developing and using compliance rates.
On pages 12 and 13 the OIG report discusses information about the regulated universe
and how that information can contribute to developing compliance rates. Citing a previous OIG
report, this section states that "OECA lacks an accurate characterization of the universe of
regulated entities" and that a "better understanding of the composition of the regulated universe
will allow OECA to reliably estimate compliance for segments of the regulated universe."
These statements reflect a lack of understanding about the true impediments to
developing statistically valid rates, and are perhaps an attempt to inflate the value of the
recommendations from the previous OIG report on universe data. In the work OECA has done
in the last several years on compliance rates (see Table 1), we have been able to develop an
accurate characterization for that portion of the regulated universe associated with a particular
rate. In developing rates, OECA has been able to categorize and parse the regulated population
in the various ways suggested by the OIG. These suggestions from the OIG are not new insights
that will enable OECA to have a breakthrough in developing more compliance rates. The major
impediment to expanding OECA's use of rates - as we explained to the OIG in our memo from
one year ago, and in many subsequent meetings since - is finding the resources necessary to
conduct the necessary additional random inspections without reducing significantly the
inspections targeted at known or suspected violators.
[See OIG Response in Appendix G, Note 19]
Also in this section (on page 12) the report states that to reliably
estimate compliance for a segment of the regulated community, OECA
needs "a sampling method that can produce statistically valid generalizable compliance
40

-------
information for that segment of the regulated community." OECA
already has such a sampling method, has used it throughout the
compliance rate pilot projects over the last several years, and has
provided it to the OIG as part of their research for this report.
[See OIG Response in Appendix G, Note 20]
The next section (on pages 13 and 14) about reporting proxies instead of rates implies
that OECA has described its current measures as proxies for compliance rates. We have never
done so. Our current measures are designed to track the extent to which our various tools
(assistance, incentives, inspections, and enforcement) produce important behavior changes such
as improved environmental practices at facilities and changes to the environment such as
pollutant reductions. Even if OECA were able to develop and use dozens of statistically valid
compliance rates each year, we would continue to use and report our current measures because
they provide valuable information about the results we are producing through our activities.
Calling these measures proxies for compliance rates is an invention of this OIG report, not the
approach OECA has adopted. [See OIG Response in Appendix G, Note 21]
41

-------
Table 1. Statistically Valid Noncompliance Rates for Selected Populations

FY 2000-2002: Petroleum refining - ammonia, zinc, and lead violations with more than 20%
over NPDES limit. Method: self-reported Discharge Monitoring Report (DMR) data.
FY 2000-2002: Iron and Steel - ammonia, zinc, and lead violations with more than 20% over
NPDES limit. Method: self-reported DMR data.
FY 2000-2002: Municipalities - biological oxygen demand (BOD) and total suspended solids
(TSS) violations with more than 40% over NPDES limit. Method: self-reported DMR data.
FY 2001: Organic Chemical Manufacturing - RCRA Small Quantity Generator compliance.
Method: statistically valid inspections.
FY 2001: Iron and Steel and Metal Services - DMR accuracy audit. Method: statistically
valid inspections.
FY 2002: Ethylene Oxide Manufacturers - Maximum Achievable Control Technology (MACT)
compliance. Method: statistically valid inspections.
FY 2002: Combined Sewer Municipalities - Combined Sewer Overflow (CSO) Nine Minimum
Control Policy compliance (baseline). Method: statistically valid inspections.
FY 2004: Combined Sewer Municipalities - CSO Nine Minimum Control Policy compliance
(reevaluation). Method: statistically valid inspections.
FY 2004/2005: RCRA Foundries - compliance with RCRA regulations. Method: statistically
valid inspections.
FY 2005/2006: Compliance with TSCA 1018 Lead-Paint Disclosure rule in St. Louis, Missouri.
Method: statistically valid site visits.
The OIG report then turns to a discussion (beginning on page 14) about issues associated
with the use of estimated, predicted, or facility self-reported data, stating that the reliance on
such data "reduces the reliability of OECA's performance measures as accurate indicators of
compliance." The report states that, "As an agency EPA strives to avoid using estimated
performance measurement data," and cites as the lone example EPA's 2004 Draft Report on the
Environment in which EPA "chose to use recorded observations and values rather than estimated
data."
These assertions are misleading in at least three ways. First, EPA reports its performance
measurement data in its Annual Performance Report, not in the Draft Report on the Environment
where it reports data about environmental effects and conditions. Citing the Draft Report as an
example of how the Agency uses performance measurement data is incorrect. Second, by using
42

-------
the Draft Report as its example and stating that the Draft Report only uses recorded observations
and values rather than estimated data, the OIG report creates the false impression that the use of
recorded observations and values is the rule rather than the exception at EPA. On the contrary,
many EPA programs are highly dependent on estimated, predicted, and self-reported data for
analyzing and justifying proposed regulations, assessing the risks associated with substances and
products, identifying emerging or existing environmental problems, and determining whether
industries and facilities are complying with myriad requirements and carrying out voluntary
agreements. Moreover, there are statutes that require regulated entities to report all manner of
data - is it the view of the OIG that all such data categorically are too inaccurate to be used for
the purposes prescribed by the statute? Third, the claim that using estimated, predicted, and self-
reported data reduces the reliability of OECA's performance measures is
based on the general principle that such data is always inferior to
recorded observations and values for every purpose, and not on an
actual analysis of the data OECA is using for performance measurement
purposes.
See OIG Response
in Appendix G,
Note 22
This practice - drawing conclusions based on a general principle
with no accompanying analysis - is one we have seen all too frequently
in recent OIG evaluation reports. The program evaluation function of
the OIG and the programs being evaluated by the OIG are both
important enough to warrant a higher standard of analysis than is
evident at many points in this report.
[See OIG Response in Appendix G, Note 23]
There are other troubling issues associated with the OIG's views about OECA and the
Agency moving from estimated or self-reported data to recorded observations and values. At a
time when the OIG and others are pressuring programs to develop and
use more outcome measures, advocating a strict adherence to the use of
recorded observations and values will set an impossibly high standard
for data collection and will have a chilling effect on initiatives to
improve outcome measures.
[See OIG Response in Appendix G, Note 24]
Gathering recorded observations and values will often necessitate establishing monitoring systems
that can be very expensive, have long implementation periods, and require collection of
significant amounts of new information from external parties. There are formidable resource
barriers that will not likely be overcome in an era of steadily declining budgets. Urging that
outcome measures be based on recorded observations and values will mean that most prospective
performance measures will almost always fail one of the OIG's own criteria listed in Appendix
C: feasibility. As the OIG report states, feasibility means:
"A performance measure should be "collectable." Information for the measure should be
available or able to be obtained with reasonable cost and effort and provide maximum
information per unit of effort. The cost of collecting data should not outweigh their
value." (italics added)
43

-------
This practice - failing to weigh the benefits of their ideas and
recommendations against the resources they entail and the competing
needs and demands that will be displaced - is all too common in recent
OIG evaluation reports about enforcement issues.
On page 15, the report introduces Table 3.1 which purports to show that of the 16 FY
2005 GPRA performance measures, 15 of them rely on unverified estimates, predictions, or
facility self-reported data. This analysis is incorrect. Actually, 8 of the
16 measures rely on actual counts of activities, not on estimates or
predictions. (Items 7, 8, 10, 11, 13, 14, and 15 were categorized
incorrectly by the OIG.) [See OIG Response in Appendix G, Note 25]
Also on page 15, the OIG raises a concern about pollutant reductions from enforcement
cases not being verified and only being "achieved if the facility or defendant carried out the
requirements of [the] voluntary settlement agreement." This characterization understates the
likelihood that the pollutant reductions will be carried out. [See OIG Response in Appendix G, Note 26]
Pollutant reductions can be reasonably estimated based upon the type of injunctive relief
required by the terms of a settlement. Federal enforcement of environmental laws focuses on
major sources of pollution found to be in significant violation of the environmental laws;
resolution of these violations usually requires installation of appropriate control technology
and/or significant upgrades of extant controls. Because this technology is very rarely so
innovative as to be untested, EPA engineers and technical consultants are able to estimate, to a
strong degree of certainty, the amount of pollution that will be released by the facility from the
production line or unit in question when the control device is installed and operating correctly.
This number, compared to the amount of pollution emitted or discharged prior to the
enforcement action, is the basis for EPA's pollution reduction figures.
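Stated as arithmetic, the estimate OECA describes is simply the pre-enforcement emissions minus the engineering estimate of post-control emissions. A minimal sketch follows; the function name and all figures are entirely hypothetical, not drawn from any EPA case.

def estimated_reduction(baseline_lbs: float, post_control_lbs: float) -> float:
    """Annual pollution reduction credited to an enforcement action:
    emissions before the action minus the engineering estimate of
    emissions once the required controls operate correctly."""
    return baseline_lbs - post_control_lbs

# Hypothetical facility: 2,400,000 lbs/yr before settlement; controls
# are expected to bring the unit down to 300,000 lbs/yr.
print(f"{estimated_reduction(2_400_000, 300_000):,.0f} lbs/yr estimated reduction")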
EPA review ensures that the upgrades have been made, control technology installed and
permit levels achieved. Federal consent decrees resolving environmental violations include
deadlines by which compliance must be achieved. Typically, the more extensive injunctive
relief provisions (e.g., those that require installation of complex pollution control systems) also
include milestones that must be met prior to achieving compliance. Defendants certify to EPA
that they have met such milestones and deadlines. The certifications are usually made by
licensed Professional Engineers who are either employees of the defendant or hired as
consultants, and the certifications are accompanied by detailed engineering reports.
As the control devices come on line, the results of performance tests are typically
reported to EPA; some decrees include immediate notification of malfunctions or permit
violations. EPA monitors compliance with both the terms of the consent decrees and
with pollution permit levels by reviewing these reports. The Agency has found that the
engineering reports have a high degree of reliability. Because the court can enforce the
milestones and deadlines, defendants take them very seriously. When EPA detects or has
reason to suspect an irregularity in the reports, it may inspect the facility. At the
44

-------
Agency's discretion, it may also inspect the on-site progress of the defendant in meeting
the terms of its agreement.
OECA suggests that the description of pollution reductions currently
in the report be revised to reflect that there are controls that increase
the likelihood that terms of the settlement agreements will be carried
out and the pollution reductions actually achieved. [See OIG Response in Appendix G, Note 27]
On page 17 the OIG states:
OECA's fiscal 2005 performance measures for some of its most important
outcomes do not clearly link to OECA's goals and objectives. As a result, OECA
is unable to clearly or effectively communicate and report on the extent to which
it is accomplishing these important goals.
The conclusions drawn here about the lack of linkage and its impact are incorrect.
Among other things, GPRA requires the Agency to: 1) develop a strategic plan, which has
objectives and sub-objectives that describe performance targets covering the life of the plan; and
2) produce annual performance goals and corresponding annual performance measures. When
the most recent Agency Strategic Plan was developed, OECA made a conscious effort to ensure
that annual performance measures aligned with, and enabled us to report progress on, OECA's
Objective and Sub-objectives in the Agency Strategic Plan. Attachment 4 arrays the FY 2005
Annual Performance Measures under OECA's Sub-objectives, and makes clear the
correspondence between the two. The language of the Annual Performance Goals was written
as a summary of the corresponding Sub-objective language, and the OIG is correct in noting that
there are discrepancies between the wording of the FY 2005 Annual Goals and the corresponding
measures. OECA will reword the Annual Goals to make clearer the existing linkage between the
performance measures and OECA's Objective and Sub-objectives in the Agency Strategic Plan.
[See OIG Response in Appendix G, Note 28]
The OIG's comments on page 18 about the compliance incentives measure - that it does not
measure true compliance or conformity with environmental laws and regulations - reflect a lack
of understanding of the EPA self-audit policy that is the basis of the incentives measure. To
qualify for using the audit policy, a facility or company must certify that the audit agreement
will bring it into compliance. Thus, one of the consequences of an approved audit agreement is
facility/company compliance. For compliance incentives, OECA measures other important
outcomes, since compliance is a prerequisite for an approved audit agreement.
[See OIG Response in Appendix G, Note 29]
Another point raised by the OIG on this page concerns whether OECA should measure the
number of complying actions resulting from certain activities or the percentage of entities
taking compliance actions. Counting and reporting the number of complying actions alone could
be misleading: within a two-year period, the total number of complying actions could double
from 1,000 to 2,000 while the number of entities taking the actions grew from 100 to 200, or
from only 20 to 21. OECA believes that tracking the percentage of entities is the more useful
measure for a national program because it demonstrates whether an increasing or decreasing
percentage of the entities is achieving outcomes resulting from the EPA activity.
[See OIG Response in Appendix G, Note 30]
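To make the contrast concrete, here is a minimal Python sketch built on the hypothetical
figures in the paragraph above (the universe size of 1,000 entities is an added assumption);
it shows how the raw count of complying actions can double while the share of entities
complying barely moves.

    def action_count(actions_by_entity):
        # Total number of complying actions across all entities.
        return sum(actions_by_entity.values())

    def entity_pct(actions_by_entity, universe_size):
        # Percentage of the regulated universe taking at least one action.
        acting = sum(1 for n in actions_by_entity.values() if n > 0)
        return 100.0 * acting / universe_size

    # Year 1: 20 entities (of a hypothetical universe of 1,000) each
    # take 50 complying actions.
    year1 = {f"entity_{i}": 50 for i in range(20)}
    # Year 2: total actions double, but only one new entity acts.
    year2 = dict(year1, entity_20=1000)

    for label, data in (("Year 1", year1), ("Year 2", year2)):
        print(label, action_count(data), f"{entity_pct(data, 1000):.1f}%")
    # Year 1: 1000 actions, 2.0%   Year 2: 2000 actions, 2.1%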
On page 20 of the report, the OIG begins a discussion regarding how changes made by
OECA to its measures over the six-year period of 1999-2005 have reduced transparency. OECA
believes that measures can be changed and improved without diminishing transparency, and
provides explanatory information when such changes are made. The report points out that
OECA changed the wording for all its measures in FY 2005, but does not mention until the next
page that this change was in conjunction with the development of a new EPA Strategic Plan that
was very different from the previous plan. The OIG also neglects to mention that the measures
put in place in FY 2005 for the new strategic plan were changed largely to improve OECA's
ability to measure the outcomes it was producing through its various activities.
[See OIG Response in Appendix G, Note 31]
As part of this report, the OIG painstakingly tracks the changes in individual OECA
performance measures for each year from 1999-2005, displays these changes in a three-page
Appendix D, and attempts to discern some trends in these changes. But this section of the
report focuses on the wrong question: the important question is not whether the measures are
different over time, but whether they have improved over time. That would be valuable
information for a program manager to have, and it is the type of question OECA hoped would be
addressed when it originally suggested that the OIG evaluate OECA performance measures.
[See OIG Response in Appendix G, Note 32]
On page 22 of the report, the OIG discusses the need for further evaluation of "EPA's
previous and potential use of statistically valid compliance rate measures." For the first
time in the report, the OIG makes reference to the fact that "OECA has requested assistance
from us in this area." However, the report makes no reference to the contents of the
September 2004 memo sent by OECA to the OIG to request assistance, nor to the fact that the
memo was received by the OIG over a year ago.
[See OIG Response in Appendix G, Note 33]
Recommendations
The OIG has recommended that the Assistant Administrator for Enforcement and Compliance
Assurance:
3.1	Design and implement a pilot project to verify estimated, predicted, and facility self-
reported outcomes, and report on the pilot's results to demonstrate the reliability of such
performance measures. Until OECA verifies these data, OECA should clearly and
prominently describe all measures as estimated, predicted or facility self-reported.
OECA Response: Concur. Although OECA believes that the OIG has exaggerated the
seriousness of estimated, predicted and self-reported data, OECA will begin exploring
how to conduct a pilot project to verify estimated, predicted, or self-reported data for
those measures in which such data are used. While the pilot project is ongoing, OECA will
increase the prominence of caveats associated with these measures.
[See OIG Response in Appendix G, Note 34]
3.2	Improve the linkage/relationship of OECA's goals and measures in EPA strategic and
budgetary documents to improve external understanding and usefulness.
OECA Response: Concur. OECA will revise the language of
its FY 2005 Annual Performance Goals so that they conform
with the language of the relevant sub-objective in the Agency
Strategic Plan.
[See OIG Response in Appendix G, Note 35]
3.3	Continue to improve enforcement and compliance measures, while continuing to publicly
report key measures annually to provide the public, Congress, and other specific
stakeholders a minimal amount of trend data.
OECA Response: Concur. OECA is continually making efforts to improve its
performance measures and improve the practice of using performance information to
manage, assess and improve program performance. In light of the OIG's reluctance to
assist OECA during the past year wit [sic] expanding the use of statistically valid rates,
OECA will move forward on this key issue by turning to an
external institution (e.g., the National Academy of Public
Administration) and a statistical consultant for assistance.
OECA will review its current practices for annual reporting of
data and consider modifying or expanding that reporting.
[See OIG Response in Appendix G, Note 36]
Appendix B
In Appendix B, the OIG report provides a description of various management reports used by
OECA to examine various aspects of program performance. As mentioned previously to the OIG,
item #7, "Monthly Deputy Regional Administrator Conference Calls," is incorrect and should be
deleted. Although the Agency holds a regularly scheduled call with the DRAs, and OECA will
occasionally be on the agenda for those calls, it is not accurate to consider them an OECA
tool for internal performance management and reporting.
[See OIG Response in Appendix G, Note 37]
Appendix G
OIG Evaluation of Agency Comments
1. We have a strong track record of consulting with, and considering input from, the Office
of Enforcement and Compliance Assurance (OECA) as we plan and implement our
evaluations. We solicited comments from OECA in our multi-year planning activities for
both the 2003-05 and 2006-08 cycles. And, contrary to OECA's assertions, we
repeatedly addressed requests by OECA to assist them with compliance rate generation.
To clarify what seems to be confusion on the part of OECA, we describe these efforts
below.
As we develop our program evaluation agenda, we routinely seek input and feedback
from Agency offices on potential evaluation topics. Stakeholder input is crucial to
identifying and preventing problems in order to improve the environment and human
health, and we consistently sought comments from OECA on evaluation topics. For
example, we developed our original approach for evaluating the effectiveness of EPA's
enforcement activities in consultation with OECA. In early 2003, when developing the OIG's
Multi-Year Plan Fiscal 2003-2005, we met with senior OECA officials to solicit comment on our
four-phased evaluation strategy.[20] In response to OECA concerns
about the feasibility of the plan, we agreed to pilot our approach in a regulated sector
selected in consultation with OECA. The resulting evaluation, EPA Needs to Improve
Tracking of National Petroleum Refinery Compliance Program Progress and Impacts,
Report No. 2004-P-00021, June 22, 2004, validated our multi-year evaluation strategy.
Again, in 2005, as we were compiling our multi-year plan for fiscal 2006-2008, we solicited
input and ideas from OECA.
OECA's claim that we "left unaddressed" OECA requests for assistance is inconsistent
with the facts. Our January 10, 2005, memorandum responding to OECA's September
29, 2004, memorandum requesting OIG assistance stated that: (1) thoroughly
researching or evaluating either of the two major issues would require more time and
staff resources than we currently have available; (2) the proposed evaluation of hazard
and exposure risk characterization would require a cross-media approach and scope that
would not necessarily focus solely on enforcement and compliance assurance; and (3)
each question would be considered for inclusion in the OIG's next multi-year plan. In
November 2005, we initiated preliminary research on a question directly related to
OECA's request: What are the costs and benefits of targeted, random, and stratified
random sampling approaches to generate compliance rates? OECA management will
[20] To evaluate whether EPA's enforcement approaches were optimized to ensure compliance
with environmental rules and regulations, the OIG developed a plan of four reviews that would
characterize the regulated universe; assess the Agency's coordination and prioritization
processes; critique implementation strategies; and evaluate compliance assurance measurement
and data quality.
also recall that OIG staff and OECA managers discussed the request, and the status of the
request, during meetings held on December 9, 2004; January 12, 2005; and July 5, 2005.
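The research question above can be illustrated with a small simulation. In the minimal
Python sketch below, the universe composition, strata, and compliance rates are invented for
illustration only; it contrasts a simple random sample with a stratified random sample as
ways of estimating an overall compliance rate.

    import random

    random.seed(0)

    # Hypothetical universe: 9,000 "small" facilities (95% compliant)
    # and 1,000 "large" facilities (70% compliant).
    universe = ([("small", random.random() < 0.95) for _ in range(9000)] +
                [("large", random.random() < 0.70) for _ in range(1000)])

    def simple_random_rate(pop, n):
        # Compliance rate estimated from a simple random sample of n facilities.
        sample = random.sample(pop, n)
        return sum(ok for _, ok in sample) / n

    def stratified_rate(pop, n_per_stratum):
        # Sample each stratum separately and weight by stratum size.
        estimate = 0.0
        for stratum in ("small", "large"):
            members = [ok for s, ok in pop if s == stratum]
            sample = random.sample(members, n_per_stratum)
            estimate += (len(members) / len(pop)) * (sum(sample) / n_per_stratum)
        return estimate

    print(f"Simple random (n=200): {simple_random_rate(universe, 200):.3f}")
    print(f"Stratified (100+100):  {stratified_rate(universe, 100):.3f}")

The cost side of the question is visible in the sample sizes: every randomly sampled
facility is an inspection that could otherwise have targeted a known or suspected violator.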
2.	OECA is correct in that we temporarily set aside the other important issues discussed in
note 1 above, and chose to focus on OECA's current use of outcome measures. It was
and remains our opinion that we could best contribute, in the short term, by identifying
improvements OECA could make in its current performance measurement and reporting
approach. We also believe that our approach appropriately establishes a baseline and
effectively sets the stage for our planned future work in this area.
3.	We do not agree that the recommendations in this report are of little value. Rather, we
find the recommendations crucial to the integrity and effectiveness of the Agency's
performance measurement system for three reasons. First, in the absence of empirical
evidence, EPA cannot ensure the validity and reliability of any environmental
performance measure - including enforcement. We consider our recommendation to
pilot selected data validations as a measured first step for OECA to establish a process to
enhance the credibility of OECA's reported results. Second, a fundamental component of
public accountability includes establishing and reporting on measures that can determine
programmatic goal accomplishment. Public accountability suffers when program
managers cannot use their measures to determine success - namely whether the program
has achieved its goals. We consider the ability to clearly and logically connect measures
and goals to outcomes to be essential. Third, we recommend that EPA report the
compliance measures that it has. While the Agency's enforcement and compliance
assurance measures need improvement, that does not mean they are without value.
Tracking results over time can add transparency in performance trends even when the
precise outcomes remain in doubt. We consider the transparency of Agency performance
- with the best data available - to be of considerable benefit. We believe these three
reasons clearly demonstrate the importance and value of our recommendations to the
Agency's enforcement and compliance measurement approach.
4.	Our quote clearly indicated that it was selective by accurately using ellipses, and OECA's
quote is also selective (but without the proper use of ellipses). We chose to quote the
parts of the President's speech we did because we believe those parts fairly summed up
his message, i.e., "President Bush emphasized he wanted results - real environmental
improvements and vigorous enforcement - when he said, '... we will continue our
enforcement strategy which focuses on achieving real environmental improvements that
benefit everyone.... We'll continue to vigorously enforce our environmental laws...and
we will focus on results.'" We believe "real environmental improvements" are best
characterized by outcomes, e.g., compliance leading to improvements in human health
and the environment. We do not see "inspections and investigations, and the number of
individuals provided compliance assistance" as "real environmental improvements," and
we doubt OECA does either. It appears that OECA's problem may also be with our
scope. Our scope was specifically focused on outcomes as clearly described by our
objectives which focused on bottom-line enforcement and compliance effectiveness, and
on how well OECA's performance measures characterized changes in compliance or
other outcomes. We included information that was relevant to our scope and objectives.
However, we, and many experts we spoke with, certainly agree that a mix of outputs and
outcomes is necessary and desirable, and we believe we have made that point clear in our
final report.
5.	We sought and included all contextual and other information that we determined would
help answer our objectives and assist us in communicating a fair and accurate answer to
those objectives.
6.	We did not state that "without statistically valid noncompliance rates [OECA] cannot
effectively manage the program." We are encouraged that OECA agrees, "Compliance
rates are among the Agency's most important performance measures." While we agree
that OECA can and has managed the enforcement and compliance program without
statistically valid rates, it is also our conclusion and well-considered opinion that,
"Ensuring compliance with environmental laws and regulations is critical to
accomplishing EPA's mission." We cannot be certain that including such rates will
markedly improve Agency management, because such improvement entirely depends on
what decisions management makes with such information. Nonetheless, we believe that
valid compliance rates are essential for an Agency that has a goal to "improve
compliance."
7.	We agree that a mix of measures is desirable, and as stated earlier, we made this point
explicit in our final report. We also included the statement that agencies "must balance
their ideal performance measurement systems against real-world considerations such as
the cost and effort involved in gathering and analyzing data."
8.	We modified the text to clearly state that we used our professional judgment in applying
the five criteria mentioned in Appendix A and detailed in Appendix C. We also included
in our final report that, "We determined and used our professional judgment in applying
criteria to assist us in evaluating OECA's performance measures. These criteria include
relevance, reliability, validity, comparability, and feasibility, and are described in greater
detail in Appendix C."
9.	We shared our proposed criteria with top OECA managers in numerous meetings and as
part of a written document, and those managers never mentioned these additional criteria
OECA now offers in its comments. Nonetheless, these other criteria OECA offers are
covered within the criteria we used. We consider "functionality" as part of "feasibility,"
and "comprehensiveness" as part of both "reliability" and "validity."
10.	By the traditional OIG definition, "fieldwork" ends when most substantive fieldwork
has been completed and draft report writing begins, i.e., June 2005. There is always a
certain amount of follow-up with Agency staff that occurs during the draft report writing
process.
11.	We have not included such information because it is not essential to answering our
objectives, and does not seem to provide essential additional context.
12.	We clearly acknowledged in our official draft report that, "the fiscal 2006 budget found
that the program had followed through on original PART findings by undertaking
development of a measures implementation plan." We have also clearly indicated in our
final report that OECA received a rating of "Adequate" as a result of the 2004 PART.
The relevant OMB examiner told us that the primary reason the PART rating was
changed from "Results Not Demonstrated" to "Adequate" was OECA's undertaking
development of a measures implementation plan.
13.	This portion of the report was only intended to be descriptive, i.e., reporting "what is".
We do not believe that we are in a position at this point to determine if OECA's mix is, in
fact, out of balance. We provided OECA with a list clearly indicating how we
categorized each measure, and OECA did not provide any comments on our list.
14.	We believe the report findings and conclusions regarding OECA performance measures
are supported by the evidence we collected and the analyses we performed.
15.	We did not say that OECA's measures cannot track or characterize outcomes "at all."
We said OECA cannot track compliance outcomes effectively, and then explained in some
detail what we meant by "effectively." We think the report title, EPA Performance
Measures Do Not Effectively Track Compliance Outcomes, and the Chapter 3 title,
"OECA's Public Measures Do Not Effectively Characterize Changes in Compliance or
Other Outcomes," are accurate.
16.	We compared OECA's Attachment 4 to OECA's most recent expression of its goals and
measures we had received before issuing our official draft report. We found that OECA
made some changes in goals and targets. We also noted that OECA eliminated what it
had called "Annual Performance Goals," and renamed what it used to call "Performance
Measures" as "Annual Performance Goals." While OECA renamed its measures as
goals, we believe that these newly labeled goals are really measures. We found no other
substantive changes between the two documents expressing OECA's goals and measures.
Therefore, rather than casting new light on its strategic architecture, we found that this
latest expression of OECA's goals actually served to further confuse the issues.
We therefore stand by our analysis.
17.	In our scope and methodology section on page 5 of our draft report we clearly stated that,
"Our review primarily focused on the public enforcement and compliance measures as
described in EPA's Fiscal 2005 Annual Plan related to EPA goal 5, Compliance and
Environmental Stewardship." In every case where we state, "publicly reported
compliance rates" or "publicly reported measures," we intended for it to be understood as
those rates or measures reported under GPRA, and usually said so explicitly, e.g.,
"publicly-reported GPRA performance measures," as used on page 9 and elsewhere in
the draft report. We did not feel a need to repeat "GPRA" in each and every case.
However, to provide additional clarity, we have included a footnote in the first place we
use the term "public measures" in the final report defining that these measures refer
specifically to those measures reported in EPA's annual performance plan required under
GPRA. We have also added a footnote in our final report that states, "According to
OECA, while not part of its public GPRA measures, OECA has published a compliance
rate for Combined Sewer Overflows on its website in 2002 and 2004. OECA also stated
that it plans to publish RCRA compliance rates for foundries in the next 60 days."
18.	It is, in fact, EPA's choice about how and where to expend its enforcement resources. It
is EPA management's role, and not OIG's role, to identify whether EPA has adequate
resources for any given activity. We understand that resources are finite, and
OECA itself recognizes in its comments that we acknowledged that "...the additional
increment of random inspections necessary to produce meaningful, representative,
statistically valid rates for large segments of the regulated population would mean
conducting fewer overall inspections targeted at known or suspected significant
violators...." We also said at the bottom of page 11 of our draft report that "A senior
OECA executive told us that OECA does not have the resources to either inspect every
facility to determine the true state of compliance across programs, or randomly sample
facilities to determine compliance rates, without sacrificing compliance monitoring of
known significant violators." OECA asks only that we give "more prominence" to this
fact. We have given this issue more prominence by modifying the subject sentence in the
introductory summary paragraph on page 11 of our report to say, "OECA chose not to
invest the resources necessary to produce statistically valid rates on a broad scale because
that might impact its ability to inspect known or suspected significant violators."
19.	We believe that data quality is very important, and that verification is necessary. We are
not saying that OECA has not been able to develop an accurate characterization for that
portion of the regulated universe associated with the particular statistically valid
compliance rates (SVCRs) OECA has already developed for small segments of the
regulated universe. We are saying that OECA needs a better understanding of the
composition of the universe for each and every portion of the universe for which an SVCR
is desirable, i.e., OECA needs a reliable denominator to get a reliable compliance rate.
Furthermore, the first bullet on page six of OECA's September 29, 2004, memo to us
states that, "Regulated universes are unknown."
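The dependence of a rate on its denominator is easy to show numerically. In this minimal
Python sketch (all figures hypothetical), the same verified count of compliant facilities
yields very different "compliance rates" depending on the assumed size of the regulated
universe.

    # Hypothetical: 850 facilities verified as compliant.
    compliant_facilities = 850

    # With an unreliable universe count, the rate is unreliable too.
    for assumed_universe in (1000, 1500, 2000):
        rate = compliant_facilities / assumed_universe
        print(f"universe={assumed_universe}: rate={rate:.1%}")
    # 85.0%, 56.7%, 42.5% -- the numerator is identical in each case.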
20.	We believe that OECA may have missed our point here as to what kind of "method" we
are talking about. We have modified our statement so as to reflect that OECA does have
a method to develop SVCRs, though we did not assess their method and therefore cannot
comment on it at this time.
21.	We never said that "OECA has described its current measures as proxies for compliance
rates...." To clarify, we added the following footnote to the final report: "While OECA
did not use the word 'proxy,' a top OECA executive did tell us that OECA used these
measures because they would lead to compliance. 'Proxy' is our characterization, and
we believe it is accurate, i.e., the compliance-related measures currently reported are as
close to real compliance rates as OECA can get at the present time." We also are not
intending to imply that OECA should discontinue any particular measures even if OECA
were able to develop and report on statistically valid compliance rates on a broad scale
each year.
22.	We consider "data about environmental effects and condition" as one type of
performance measure, and disagree that "Citing the Draft Report [on the Environment] as
an example of how the Agency uses performance measurement data is incorrect...."
Second, we did not intend to create the impression that the use of recorded observations
and values is the rule rather than the exception at EPA. The extent to which EPA
programs are dependent on estimated, predicted, and self-reported data outside of the
context of performance measurement was outside the scope of this evaluation. Third, we
did not set forth any "general principle" or state that using estimated, predicted, and self-
reported data is always inferior to recorded observations and values for every purpose.
We suggested that OECA conduct an actual analysis of the data OECA is specifically
using for performance measurement purposes by designing and implementing a pilot
project to verify estimated, predicted, and facility self-reported outcomes, and report on
the pilot's results to demonstrate the reliability of such performance measures.
Additionally, the pilot program that we suggested was vetted by and agreed upon by
OECA management.
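As an illustration of the kind of analysis such a pilot could perform, the minimal Python
sketch below compares facility self-reported outcomes against independently verified values
and summarizes their agreement. The data, field layout, and 10-percent tolerance are
assumptions made for illustration, not elements of the recommended pilot's actual design.

    # Hypothetical (self_reported, verified) pollutant reductions, in tons.
    pilot_sample = [(100.0, 98.0), (250.0, 240.0), (80.0, 95.0), (60.0, 0.0)]

    TOLERANCE = 0.10  # treat reports within 10% of the verified value as reliable

    def within_tolerance(reported, verified, tol=TOLERANCE):
        if verified == 0:
            return reported == 0
        return abs(reported - verified) / verified <= tol

    reliable = sum(within_tolerance(r, v) for r, v in pilot_sample)
    print(f"{reliable}/{len(pilot_sample)} self-reports within "
          f"{TOLERANCE:.0%} of verified values")

Reporting a reliability figure of this kind alongside each measure would let readers judge
how much weight estimated, predicted, or self-reported outcomes deserve.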
23.	We disagree that we were "drawing conclusions based on a general principle with no
accompanying analysis..." As stated immediately above in note 22, in this specific case,
we suggested that OECA conduct the actual analysis in the form of a pilot.
24.	We believe OECA is referring to prospective outcome measurement initiatives, i.e., in
response to outside parties (e.g., OMB and OIG) telling them to develop better outcome
measures, rather than referring to any current OECA initiative. We included OECA's
comment (i.e., "...advocating a strict adherence to the use of recorded observations and
values will set an impossibly high standard for data collection and will have a chilling
effect on initiatives to improve outcome measures....") in our summary of OECA's
comments.
25.	As stated above in notes 22 and 23, in this specific case, we suggested that OECA
implement a pilot project to verify estimated, predicted, and facility self-reported
outcomes, and report on the pilot's results to demonstrate the reliability of such
performance measures. If such outcomes can be demonstrated to be reliable, then OECA
would not necessarily need to be restricted to recorded observations and values. Perhaps
most importantly, we believe that measures must be reliable, whether based on sampling
or entirely on recorded observations and values.
26.	We stand by our analysis. To the best of our knowledge, the source of data for these
measures is the Integrated Compliance Information System (ICIS), which OECA states
includes estimated data. Based on our review of the relevant measures, they all appear to
be based on EPA agreements made with industry (e.g., consent decrees). OECA has told
us that those agreements are not verified.
27.	We have added the following statement in our report: "According to OECA, if these
controls are effective, they would increase the likelihood that terms of the settlement
agreements will be carried out and the pollution reductions actually achieved."
28.	OECA seems to have agreed with at least the first part of our finding here, i.e.,
"performance measures for some of its most important outcomes do not clearly link to
OECA's goals and objectives." If the first part is true, we believe the second part also
must be true, i.e., "OECA is unable to clearly or effectively communicate and report on
the extent to which it is accomplishing these important goals."
29.	OECA is taking exception here to one sentence/example, i.e., "While this goal is titled
'Compliance Incentives' in OECA's fiscal 2005 annual plan, none of the four measures
under this objective measures true 'compliance,' or conformity with environmental laws
and regulations." At a minimum, this is an example of unclear communication by
OECA. Nowhere in its public performance reporting (e.g., in a footnote) does OECA
explain that an approved self-audit policy equals compliance with all applicable
environmental laws and regulations. This is also another example of OECA's reliance on
self-reporting/certification. We believe these issues are addressed by our
recommendations 3.1 and 3.2.
30.	To eliminate any possibility of miscommunication, we believe that the only appropriate
approach is to report on both the number of complying actions resulting from certain
activities, and the percentage of entities taking compliance actions.
31.	We believe we adequately described OECA's explanations for the changes in the
wording of OECA's measures, and attributed the explanations to OECA. The important
point to us is that the measures changed frequently.
32.	We believe it is important for OECA to both improve its measures, and handle changes in
a way that maximizes comparability and transparency. That is why we specifically
prefaced our related recommendation to, "Continue to improve enforcement and
compliance measures, while continuing to publicly report key measures annually to
provide the public, Congress, and other specific stakeholders a minimal amount of trend
data."
33.	See our response to note 1.
34.	See our response to note 2. We disagree that we have "exaggerated the seriousness of
estimated, predicted and self-reported data." We have elaborated in note 2 what our
recommendations are trying to accomplish.
35.	See our response to note 2. We have also added language to our final report
Recommendation 3.2 to clarify that this action should include developing measures that
more clearly link to OECA's stated goals. It is more than simply making language
consistent between two documents.
36.	See our response to note 2.
37.	We shared the draft of this list with a top OECA executive at the end of our fieldwork
and modified it at that time as requested to make it fully accurate. The official did not
request that this item be removed at that time. However, we have removed this item from
our list based on OECA's latest request.
Appendix H
Distribution
Office of the Administrator
Assistant Administrator for Enforcement and Compliance Assurance
Chief Financial Officer (Agency Followup Official)
Assistant Administrator for Environmental Information
Principal Deputy Assistant Administrator for Enforcement and Compliance Assurance
Director, Office of Compliance, Office of Enforcement and Compliance Assurance
Director, Office of Civil Enforcement, Office of Enforcement and Compliance Assurance
Audit Followup Coordinator, Office of Enforcement and Compliance Assurance
Agency Followup Coordinator
General Counsel
Associate Administrator for Congressional and Intergovernmental Relations
Associate Administrator for Public Affairs
Inspector General