Office of Inspector General
Audit Report
Consolidated Report on OECA's
Oversight of Regional and
State Air Enforcement Programs
E1GAE7-03-0045-8100244
September 25, 1998
Inspector General Division
Conducting the
Consolidated Audit:
Mid-Atlantic Audit Division
Philadelphia, PA
Regional Reports:
Eastern Audit Division
Boston, MA
Mid-Atlantic Audit Division
Philadelphia, PA
Central Audit Division
Dallas, TX
Western Audit Division
San Francisco, CA
Regions Covered:
Regions 1, 3, 6, and 10
Program Offices Involved:
Office of Enforcement and
Compliance Assurance
Region 1 Office of
Environmental Stewardship
Region 3 Air, Radiation and
Toxics Division
Region 6 Compliance
Assurance and Enforcement
Division
Region 10 Office of Air Quality
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460
OFFICE OF
THE INSPECTOR GENERAL
September 25, 1998
MEMORANDUM
SUBJECT: Consolidated Report on OECA's Oversight
of Regional and State Air Enforcement Programs
Audit Report Number E1GAE7-03-0045-8100244
FROM: Michael Simmons
Deputy Assistant Inspector General
for Internal Audits (2421)
TO: Steven A. Herman
Assistant Administrator
for Enforcement and Compliance Assurance (2201A)
Attached is our final consolidated audit report on EPA's Oversight of Regional and
State Air Enforcement Programs. This report consolidates national issues
identified during our audits of Regions 1, 3, 6, and 10. The overall objective of this
audit was to determine the functions that the Office of Enforcement and
Compliance Assurance (OECA) needs to perform in order to resolve the deficiencies
that we identified during our regional audits. This report contains findings and
recommendations that are important to EPA.
This audit report contains findings that describe problems the Office of Inspector
General (OIG) has identified and corrective actions the OIG recommends. This
audit report represents the opinion of the OIG. Final determinations on matters in
this audit report will be made by EPA managers in accordance with established
EPA audit resolution procedures. Accordingly, the findings contained in this audit
report do not necessarily represent the final EPA position, and are not binding upon
EPA in any enforcement proceeding brought by EPA or the Department of Justice.
ACTION REQUIRED
In accordance with EPA Order 2750, you, as the action official, are required to
provide us a written response to the audit report within 90 days of the date of this
report. Your response should address all recommendations, and include milestone
dates for corrective actions planned, but not completed. This will assist us in
deciding whether to close this report.
We have no objections to the further release of this report to the public. Should
your staff have any questions about this report, please have them contact Patrick
Milligan at 215-814-2326 or Ernie Ragland at 202-260-8984.
Attachment
EXECUTIVE SUMMARY
Purpose
The Office of Inspector General (OIG) performed six audits
on EPA's oversight of the states' air enforcement data. The
six states reviewed were located in four of EPA's ten regions.
We identified national issues during these audits and have
used that work as a basis for this report. The objective of
this audit was to determine the functions that the Office of
Enforcement and Compliance Assurance (OECA) needs to
perform in order to resolve the deficiencies identified by the
previous audits of state enforcement data.
Results-In-Brief

Air enforcement audits disclosed fundamental weaknesses
with state identification and reporting of significant violators
of the Clean Air Act. This occurred because states either did
not want to report violators or the inspections were
inadequate to detect them. Without information about
significant violators, EPA could neither assess the adequacy
of the states' enforcement programs, nor take action when a
state did not enforce the Clean Air Act. Numerous
significant air pollution violators went undetected, and many
of those identified were not reported to EPA.
States and even EPA regions disregarded Agency
requirements, or were uncertain whether enforcement
documents were guidance or policy. As a result, the
effectiveness of air enforcement programs suffered.
Moreover, these violators were not made known to the
general public. This occurred in large part because EPA and
the states did not adhere to EPA's Timely and Appropriate
Enforcement policy (TAE) and its Compliance Monitoring
Strategy (CMS).
For EPA's oversight system to work properly, OECA should
oversee EPA regions, which are responsible for working with
state agencies to promote an effective enforcement program.
In response to our six individual audits, states and regions
agreed to corrective actions to improve enforcement, and
OECA should ensure that they fulfill their commitments.
In addition, OECA needs to undertake actions to address the
concerns discussed in this report. OECA had not assigned
internal responsibility for the oversight and implementation
of CMS. EPA regions did not always know who to contact in
OECA for clarification of enforcement issues. OECA did not
routinely analyze enforcement data to detect trends and
problem areas, and its regional reviews did not always assess
the adequacy of regional oversight to identify violators. Air
grants did not include specific amounts for enforcement,
which resulted in EPA's loss of leverage to ensure state
compliance.
Recommendations

We recommend that the Assistant Administrator for the
Office of Enforcement and Compliance Assurance:
1. Continually reinforce EPA regional compliance
with the TAE and CMS.
2. Assign oversight responsibility for the CMS.
3. Work with the Office of Air and Radiation
(OAR) to earmark Section 105 grant funds to
enforcement.
4. Perform analysis and quality assurance of
enforcement data.
5. Conduct evaluations of regional air enforcement
programs that assess regional compliance with
the TAE and CMS.
6. Improve communications with the EPA regions.
7. Establish focal points within OECA so that
states and EPA regions can obtain clarification
of Agency enforcement directives.
EPA RESPONSE

In its August 12, 1998 response to the OIG draft report,
OECA agreed with the report's recommendations. OECA
stated that many of the findings validated issues it was
aware of, and contributed to its strategies to address
them.
OIG EVALUATION

We concur with the Agency's response and the corrective
actions that were already taken or proposed. OECA offered
some clarifying suggestions and we revised the report
accordingly.
TABLE OF CONTENTS

EXECUTIVE SUMMARY

ABBREVIATIONS

CHAPTER 1
INTRODUCTION
     PURPOSE
     BACKGROUND
     SCOPE and METHODOLOGY

CHAPTER 2: FINDINGS and RECOMMENDATIONS

Part 1
STATES NOT REPORTING SIGNIFICANT VIOLATORS TO EPA
     Six OIG Audits Found States Did Not Report SVs
     OECA Also Found States Did Not Report SVs
     Reasons States Did Not Report Significant Violators
     States Need To Comply With TAE

Part 2
INADEQUATE INSPECTIONS CAUSED UNDERREPORTING OF SIGNIFICANT VIOLATORS
     Level 2 Inspections Not Always Performed
     Four States Did Not Adequately Document Inspections
     States Need To Comply With CMS

Part 3
EPA REGIONAL OFFICES ALLOWED STATES TO UNDERREPORT SIGNIFICANT VIOLATORS
     Regions Working To Have States Report SVs
     One Region's New Approach To EPA Inspections

Part 4
LAX OVERSIGHT BY OECA CONTRIBUTED TO UNDERREPORTING OF SIGNIFICANT VIOLATORS
     Improvements Needed By OECA
     Changes Needed By OECA
     EPA And State Partnership
     Recommendations

APPENDIX 1
PRIOR AUDIT COVERAGE

APPENDIX 2
EPA'S RESPONSE TO DRAFT REPORT

APPENDIX 3
DISTRIBUTION
ABBREVIATIONS
CAA      Clean Air Act, as amended in 1990
CMS      Compliance Monitoring Strategy
EPA      United States Environmental Protection Agency
FMFIA    Federal Manager's Financial Integrity Act
NOV      Notice of Violation
OAR      Office of Air and Radiation
OECA     Office of Enforcement and Compliance Assurance
OIG      Office of Inspector General
RCRA     Resource Conservation and Recovery Act
SSCD     Stationary Source Compliance Division
SVs      Significant Violators
TAE      Timely and Appropriate Enforcement Response to Significant
         Air Pollution Violators
VOC      Volatile Organic Compounds
CHAPTER 1
INTRODUCTION
PURPOSE
The Office of Inspector General (OIG) performed six audits
on EPA's oversight of the states' air enforcement data. The
six states reviewed were located in four of EPA's ten regions.
We identified national issues during these audits and have
used that work as a basis for this report. The objective of
this audit was to determine the functions that the Office of
Enforcement and Compliance Assurance (OECA) needs to
perform in order to resolve the deficiencies identified by the
previous audits of state enforcement data.
BACKGROUND

Section 105 of the Clean Air Act (CAA) provided the initial
authority for federal grants to help state and local agencies
prevent and control air pollution. The EPA regions award
Section 105 grant money so that states can operate their air
programs in accordance with the grant agreements executed
with EPA. Before EPA awards each grant, it negotiates a
work program with the state which contains specific work
commitments the state agrees to perform. The work program
should encompass activities such as inspections, monitoring,
permitting, and enforcement, which includes identifying and
reporting significant violators (SVs).
Inspection Types

EPA guidance lists five different levels of inspections that
can be performed at air pollution facilities. Level 0,
commonly called a "drive by," is the most basic inspection.
EPA does not consider this level of inspection to be an
acceptable compliance assurance method. A Level 4
inspection is the most thorough and time consuming. This
type is generally done only when developing a legal case
against the facility. To adequately evaluate a facility's
compliance with the CAA, the Section 105 grants awarded by
EPA often required each state to comply with the
Compliance Monitoring Strategy (CMS).
Agency Enforcement Procedures
To adequately evaluate a facility's compliance with the Clean
Air Act, the CMS explains that states need to perform at
least a Level 2 inspection at major stationary sources. The
CMS defines in detail the necessary tests and evaluations
that must be performed to constitute a Level 2 inspection.
The inspection must include an assessment of the compliance
status of all sources within the facility. The CMS also
requires that states annually submit Comprehensive
Inspection Plans to EPA. These plans should include a list of
those facilities that would receive a Level 2 inspection. Some
states inspect major facilities on an annual basis, while
others inspect major facilities on a multiple year cycle.
According to Section 110 of the Clean Air Act, when a
violation is identified, the inspector should issue the facility
a Notice of Violation (NOV), which specifies the type of
violation and the regulations the facility violated. If the
violation meets EPA's definition of a significant violator, the
state should report the facility to EPA for placement on the
Agency's Significant Violator List, as required by EPA's
February 7, 1992 Timely and Appropriate Enforcement
Response to Significant Air Pollution Violators (TAE).
According to the TAE, a significant violator is any major
stationary source of air pollution that is violating a
federally-enforceable regulation. The TAE requires states to
report significant violators to EPA within one month of the
violation, and to maintain the facility on EPA's list until it
achieves compliance. After the violation is reported, the
state and EPA should monitor the source until it achieves
compliance. This includes determining an appropriate time
schedule for achieving compliance and assessing a penalty, if
necessary. To resolve violations expeditiously, EPA stresses
to each state the importance of identifying and reporting
significant violators promptly.
Each month EPA and the states are responsible for updating
the air enforcement data in the Agency's database known as
the Aerometric Information and Retrieval System. The TAE
also requires states to participate in teleconferences with the
EPA regions to discuss new and existing significant
violators. EPA intends for this communication to promote a
greater degree of teamwork between the Agency and the
states. However, if EPA is dissatisfied with a state's
enforcement action, the Agency has the authority to override
the state and assume the lead in resolving the violation.

OECA's Roles And Responsibilities

We performed our audit work for this capping report at two
of OECA's divisional offices: the Enforcement Planning,
Targeting, and Data Division and the Air Enforcement
Division. The roles and responsibilities of these divisions as
they relate to the issues we identified in this report include:
1) establishing inspection priorities; 2) developing and
assessing accurate compliance measures; 3) providing
technical assistance to regions, states and the regulated
community; 4) developing national enforcement policies and
guidance; 5) performing evaluations of national, regional and
state enforcement programs; and, 6) performing oversight of
EPA regional and state enforcement actions.
State Audits

To determine if underreporting of SVs was occurring, we
initially conducted an audit of Pennsylvania's air
enforcement program during fiscal year 1996. This review
found that Pennsylvania was not reporting significant
violators to EPA.
OECA's Assessments

Because of concerns that other states were also not reporting
SVs, during November 1996, the Assistant Administrator for
OECA requested that each regional administrator conduct an
assessment of SV reporting by the states within their region.
At that time, he also requested the OIG to determine if there
were comparable conditions in other states. Subsequently,
all 10 EPA regions responded to OECA that states were
underreporting significant violators. Moreover, two regions
visited states to review files in order to determine the extent
of underreporting.
Because of the significant interest in this issue, we initiated
audits in five additional states. The primary objectives of
these audits were to determine the extent of the
underreporting by the states, and also to determine whether
states were performing sufficient inspections to identify
violators. During the time these audits were ongoing, OECA
established a workgroup consisting of state, EPA regional,
and Headquarters personnel to determine if the TAE needed
revision. One revision contemplated was to change the
definition of a significant violator.
SCOPE and METHODOLOGY

We performed this audit according to the Government
Auditing Standards (1994 Revision) issued by the
Comptroller General of the United States as they apply to
program audits. The audit included tests of the program
records and other auditing procedures we considered
necessary.
This report consolidates the audit work completed in Regions
1, 3, 6, and 10. The six states where we performed audits
were: Arkansas, Maryland, Massachusetts, New Mexico,
Pennsylvania, and Washington. We also conducted a limited
examination in Region 4 to evaluate their reviews of the
states. Region 4 staff examined enforcement files in its eight
states to determine the extent of SV underreporting:
Alabama, Florida, Georgia, Kentucky, Mississippi, North
Carolina, South Carolina, and Tennessee. Personnel in
Region 2 also performed a similar review at the State of New
York. We considered the results of these reviews during our
audit of OECA.
To accomplish our objective, we performed our review at
OECA's: Enforcement Planning, Targeting, and Data
Division; Manufacturing, Energy, and Transportation
Division; and its Air Enforcement Division. We held
numerous meetings with officials from these divisions to
discuss OECA's responsibilities for the specific issues we
identified.
We reviewed management controls and procedures
specifically related to our objective, but we did not fully
review the internal controls associated with the input and
processing of information into EPA's database or any other
automated records system. We also reviewed Headquarters
reports prepared to comply with the Federal Manager's
Financial Integrity Act (FMFIA) and found that none of the
weaknesses cited during this audit were disclosed in these
reports. Moreover, these reports contained no issues that
impacted our audit.
We reviewed documentation to determine OECA's
responsibilities for overseeing regional and state air
enforcement programs. This documentation included, among
others, OECA's: 1) Enforcement Accomplishments Report,
2) Data Quality Survey, 3) draft Strategic and Tactical
Automation Management Plan, 4) regional memorandum of
agreement, 5) significant violator and inspection trend
analyses, and 6) National Performance Measures Strategy.
We also analyzed the evaluations OECA performed at
regional offices, and the draft Strategy For Addressing The
Status Of State Under-Identifying And Under-Reporting Of
CAA "Significant Violators."
We conducted our fieldwork for this report between October
1, 1997, and February 17, 1998. We met with senior OECA
officials on January 20, 1998 to discuss the results of our
audit work.
We issued the draft report on May 13, 1998. EPA submitted
its response to us on August 12, 1998. Based on this
response, we made minor modifications to our report. EPA's
response to our findings and our evaluation of the response
are summarized at the end of Chapter 2, Part 4. EPA's
complete response is included in Appendix 2.
See Appendix 1 for prior audit coverage.
CHAPTER 2: FINDINGS and RECOMMENDATIONS
Part 1
STATES NOT REPORTING
SIGNIFICANT VIOLATORS TO EPA
Although EPA required information about violators, the
states did not report significant violators to EPA. Without
information about significant violators, EPA could neither
assess the adequacy of the states' enforcement programs, nor
take action when a state did not enforce the Clean Air Act.
In effect, the states hindered EPA's ability to oversee the
states' air enforcement programs by not providing
information concerning significant violators.
In fiscal year 1996, EPA provided the states $160 million in
grants to carry out the Agency's priorities for enforcing the
Clean Air Act. One such priority was to report significant
violators; another was to perform inspections. Despite these
priorities, the audits we performed in six states disclosed
states underreported SVs. This occurred because states
either did not want to report violators or the inspections
performed were inadequate and did not detect violators. The
assessments that each EPA region performed at the direction
of OECA confirmed that the states' lack of reporting was a
nationwide condition.
Six OIG Audits Found States Did Not Report SVs
Despite performing more than 3,300 inspections during the
fiscal year reviewed, the six states we audited reported a
total of only 18 significant violators to EPA. In contrast,
while reviewing only a small portion of these 3,300
inspections, we identified an additional 103 SVs the states
did not report. More specifically, we reviewed state
enforcement files for 430, or 13 percent of the major facilities
in these states and identified an additional 103 SVs that the
states did not report:
                         SVs          Facilities   OIG-Identified
State     Inspections    Reported     Reviewed     SVs
MD             722          3            60              4
PA           2,000          6           270*            64
AR             418          2            23             10
WA             106          7            42             17
MA              39          0             7              3
NM              45          0            28              5
Total        3,330         18           430            103

*For PA, we reviewed 45 facilities and 225 Notices of Violation.
OECA Also Found States Did Not Report SVs
While our audits disclosed underreporting by six states in
four regions, the regional offices' responses to OECA's
request for an assessment of SV reporting further
corroborated our findings. In response to OECA's
assessment request, all ten regions agreed that some of their
states were underreporting SVs. The responses received
from the six regions where we did not audit a state
illustrated the seriousness and extent of the underreporting
by states:
/ Region 5 responded there was at least one state that
appeared to be underreporting. Ohio, which had more
than 1700 major sources, reported only four SVs
during a two-year period. Ohio officials agreed that it
was underreporting and that it would be reasonable to
expect a large industrial state such as Ohio to identify
more than four SVs in two years.
/ Region 8 replied that several states may not be
reporting all SVs. The Region was concerned that the
underreporting may be caused by inadequate state
inspections that did not identify SVs, rather than the
state not wanting to report SVs.
/ Regions 7 and 9, along with some of their states,
criticized the EPA definition of an SV contending that
it was too all-inclusive. In some instances, Region 7
agreed with its states that a particular source did not
need to be designated as an SV, even though it met the
definition contained in EPA's TAE document.
/ Regions 2 and 4 performed reviews at a total of nine
state offices and confirmed that there was substantial
underreporting of SVs.
More Than 150 SVs Not Reported By New York

During one of these reviews, Region 2 personnel
examined 73 of New York's inspections performed at
major sources. This review
identified nine SVs that were not reported to EPA.
According to EPA's database, New York had more than 2,300
major sources and reported no SVs during fiscal year 1996,
and only five during the previous fiscal year. As a result of
Region 2's review, the State revised its procedures, trained
its staff, and emphasized SV reporting. Within a few months
of the review by EPA, New York reported 152 SVs in EPA's
database.
The results achieved in New York also confirmed the
Secretary of Pennsylvania's contention that his state was not
the only one disregarding the TAE. In his response to that
audit, he explained that most states disagreed with EPA's
definition, and disregarded it because EPA did not require
compliance with the TAE.
Region 4 staff examined almost
1,200 inspection files in its eight
states to determine if states were
reporting SVs. These states had
more than 5,000 major sources
and EPA's database showed only 99 SVs. The Region's
review identified more than 300 SVs not reported in EPA's
database.
Forty-six SVs were violators the states had not identified,
indicating possible differences between what the states and
EPA considered to be SVs. Through conference calls or by
submitting copies of NOVs, these states had previously
notified the Region of another 259 SVs. However, the SV
status for these sources was not entered into EPA's database
because there was confusion about who was supposed to
enter the SV data. Region 4 believed the states entered the
SVs, and the states thought EPA did it. As a result, EPA's
database was incomplete because it listed fewer significant
violators of the CAA than actually existed.
Reasons States Did Not Report Significant Violators

In response to our audits and OECA's request, state air
enforcement officials cited many reasons why significant
violators were not reported to EPA. While state officials did
not offer specific examples to support their contentions, they
indicated that:
/ They did not understand EPA's SV definition.
Some state personnel believed they were identifying
all SVs when actually they were not. When we asked
Massachusetts personnel why violations we identified
as significant were not reported, they told us the TAE
was unclear. However, the State did not request
clarification from EPA until we raised this issue
during our audit.
Contrary to EPA's TAE, Washington did not report
SVs at the time the violations were first detected.
Instead, the State reported SVs only after penalties
were assessed. When SVs were not assessed a penalty, they
were not reported as SVs.
/ EPA involvement caused delays. Even though
required by the TAE, Massachusetts and Kentucky did
not conduct teleconferences with EPA to discuss SVs.
Officials from another state indicated they often did
not want EPA involved in the resolution of a violation.
They contended EPA's involvement delayed the
process; however, EPA's assistance was requested
whenever it was necessary. They also said facilities
remained on EPA's list for an excessive amount of
time.
/ Timely resolution negated reporting of SVs.
When state officials believed violations were resolved
timely, they often thought it was not necessary to
report the violations to EPA. However, states and
EPA did not always agree on what was considered
timely resolution. State officials claimed they
preferred not to discuss these types of violations
during monthly SV meetings with the region.
/ States not responsible for designating SVs. State
and EPA personnel in Regions 4 and 6 told us they
were confused about who was responsible for
designating SVs in EPA's database. In one case,
Region 4 had not authorized the states to enter this
data, and therefore needed to clarify the reporting
responsibilities of the states.
/ Compliance with the TAE was not necessary.
Arkansas personnel argued that the TAE was only
guidance and allowed states flexibility to achieve their
goals. As a result, they believed compliance with the
TAE was not required. They also expressed
dissatisfaction with the TAE for not meeting its
intended purpose and suggested that EPA needed to
revise it. It appears that this belief was also held by
the personnel in at least one EPA region. This region
did not require the states to comply with the TAE as a
condition of the Section 105 grants it awarded to the
states.
/ They disagreed with EPA's SV definition. Some
state personnel believed EPA's definition required
them to report minor violations as SVs. As a result,
they did not always use the EPA definition of a
significant violator. For example, in Pennsylvania, a
facility installed and operated a large boiler without a
permit. The boiler emitted nitrogen dioxide, a
pollutant regulated under the CAA. However, State
officials did not report this facility as an SV. They
believed boilers were not environmentally hazardous
and did not report situations such as these to EPA.
Disagreement with EPA's SV definition was
illustrated in another Pennsylvania inspection report
for a facility that manufactured automotive carpet and
interior trim. This facility had a history of opacity
violations for almost four years. Opacity violations
occur when the plume of smoke from a stack exceeds
an allowable density, indicating that the facility is
emitting excess pollution. Here, the source was a
boiler. The State assessed this manufacturer a civil
penalty of $4,000, but decided not to place the facility
on EPA's list of significant violators. This example
shows there were occasions when Pennsylvania
recognized that a facility was a violator and took
enforcement action against the facility. However,
despite assessing a penalty, the State did not consider
these violations severe enough to report the facility to
EPA, even though the violation met the SV definition.
States Need To Comply With TAE Requirements To Report SVs

It is essential that states report SVs to EPA. In response to
one audit, the Regional Administrator wrote that without
consistent, timely, and reliable information on violators,
determinations cannot be made on the best course of action
to bring violators into compliance. EPA must rely on states
for most of its information concerning the compliance status
of the regulated community. This is the reason for specifying
in the Section 105 grant agreement that states identify and
report SVs in accordance with the TAE that was negotiated
between the states and EPA. If the grant requirements were
objectionable, states should have refused the grant. Grant
recipients cannot be allowed to disregard requirements in an
executed agreement because they believe portions of the
grant to be "rigid," "unrealistic," or "defies any common sense
understanding."
CHAPTER 2: Part 2
INADEQUATE INSPECTIONS CAUSED
UNDERREPORTING OF SIGNIFICANT VIOLATORS
The quality of state inspections affected the number of SVs
identified and ultimately reported to EPA. In four of the
states we audited, inspectors did not always complete the
tests required for a Level 2 inspection. In other cases, it was
not possible to determine whether inspectors in these same
states did enough to identify significant violators because the
inspectors did not document inspections properly. As a
result, EPA was not assured that facilities were in
compliance with the Clean Air Act, and that states identified
all significant violators. Details concerning inspections that
either did not fulfill Level 2 requirements or were not
documented properly are shown below:
                        Inspections Without
            Files       Required
State       Reviewed    Tests          Documentation    Total
PA             81           6                8            14
MD             60          12               14            26
WA             20          11                2            13
MA              6           1                4             5
Total         167          30               28            58
Level 2 Inspections Not Always Performed
Because inspections were either inadequately done or
documented poorly, SVs were undetected for long periods.
This condition continued because the EPA regions did not
ensure state inspectors completed the tests and evaluations
that CMS required for Level 2 inspections.
For example, at a facility in Maryland that painted diesel
truck engines, there were two Volatile Organic Compounds
(VOC) sources installed in 1991 without a construction
permit. The State
did not identify these violations for five years despite
performing other inspections during this time. During fiscal
year 1996 alone, the State conducted five inspections at this
facility, but did not identify either source that was operating
without a permit. It was not until fiscal year 1997, that the
State identified the violations and requested the facility to
apply for a permit.
This facility also had two paint spray booths under one
registration number since 1979. Maryland was also not
aware of this condition. Rather, the State believed that the
new spray booth had replaced the existing booth. However,
the facility operated both spray booths for 18 years before the
State discovered the additional booth. During an October
1996 inspection, the inspector indicated that the two spray
booths needed individual registration numbers.
An adequate Level 2 inspection should have identified any
new or unreported sources since the time of the last
inspection. Had the inspectors compared the facility's permit
to the equipment in the plant, as required by a Level 2
inspection, the two VOC sources and the spray booths would
have been identified more timely and the SV reported to
EPA.
We reviewed inspections performed at 20 major facilities in
the State of Washington. These inspections were reported
to EPA as Level 2 inspections. However, the State's
inspectors did not inspect all of the sources at 11 of these
facilities. As a result, the inspectors did not complete the
tests EPA required for Level 2 inspections. For example, a
Level 2 inspection conducted at a lime manufacturing
company contained results for a heat exchanger and
hydrator scrubber. However, State inspectors did not inspect
the facility's baghouse and its coal firing system, including a
review of fuel records to determine the amount of sulfur and
ash content in the fuel. Without inspecting these, the
inspector did not perform a Level 2 inspection and did not
assure the facility complied with the CAA.
Our audits disclosed numerous other examples and reasons
why inspectors did not perform Level 2 inspections. For
example, on some inspections the state did not:
/ Follow up on conditions identified during inspections.
A Pennsylvania inspector identified broken gauges on
one facility's equipment. These gauges were to be
used to ensure that pollution was being captured. The
facility's permit required that the gauges operate
properly. It also required that the facility take
periodic readings from the gauges and record these
readings.
Because the gauges were broken, the facility's records
were incomplete, and the State could not tell if the
facility's emissions complied with the CAA. More than
eight months after the inspection, Pennsylvania had
not determined if the gauges had been repaired. State
officials informed us that the facility was conscientious
and did not believe a second visit was necessary to
verify that the gauges were repaired. It appears that
the State placed too much reliance on the facility and
should have conducted a follow-up inspection to be
sure the facility repaired the gauges.
/ Require facilities to correct violations timely.
An inspection at one facility showed that the
equipment being used did not agree with the permit.
It was not until 16 months later that State personnel
met with facility representatives to correct the
permitting issues.
/ Have qualified inspectors conduct a Level 2
inspection. Tennessee's inspectors did not make
compliance determinations because they were not
adequately trained to perform the mass balance
calculations. As a result, there was a backlog of over
two years for determining facilities' compliance.
Four States Did Not Adequately Document Inspections

In Washington, the inspection reports for two facilities did
not contain enough information to determine if the inspector
completed the necessary tests. In addition to these two
inspections, inspectors did not satisfy Level 2 requirements
for another 11 inspections. As a result, some State officials
planned to work with Region 10 to ensure their inspectors
received training for conducting Level 2 inspections.
Some Inspection Reports Totaled Only Five Handwritten Lines

Inspectors did not thoroughly document inspections, as
illustrated during an inspection at a Maryland facility
that used four melting furnaces to produce
glass. At the time of the inspection, two of these furnaces
were operating. The inspector looked for visible emissions,
and noted temperature readings and production rates for the
day of the inspection. Although required for a Level 2
inspection, we saw no documentation that the State inspector
reviewed the facility's annual operating parameters. The
parameters the inspector should have reviewed included
items such as hours of operation, operating temperatures,
and fuel usage. Moreover, the information that was recorded
on the inspection report and reviewed by the inspector was
only for one day and not for the intervening period since the
last inspection as required by the CMS. The operating
conditions for the day of the inspection comprised the entire
inspection report, which totaled only five handwritten lines.
States Need To Comply With CMS

The CMS includes critical aspects of an enforcement
program. These include which facilities should be targeted
for inspection, and the tests and evaluations that need to be
done during a Level 2 inspection to determine compliance
with the CAA. When states do not complete the tests and
evaluations required by CMS, a Level 2 inspection is not
performed, violations are not detected, and enforcement
action cannot be taken against polluters. Personnel in one
region told us that anything less than a Level 2 inspection is
largely inadequate to determine compliance. Inspections are
the front line of an enforcement program upon which all
other aspects of the program are built. If inspections are
deficient, the entire enforcement program suffers. EPA can
help remedy this situation by enforcing state compliance
with the CMS.
CHAPTER 2: Part 3
EPA REGIONAL OFFICES ALLOWED STATES
TO UNDERREPORT SIGNIFICANT VIOLATORS
In the previous sections, we discussed many of the reasons
that states did not report significant violators. In this
section, we address reasons that EPA regional offices allowed
the underreporting by states to continue, and mention
various steps that EPA regions are now taking to improve
oversight of state enforcement activities and reporting.
SV Data Not Analyzed

Although our audits showed that state input into EPA's
reporting systems often was incomplete, EPA regions did not
adequately review the data that was reported. Had regional
offices performed more quality assurance and trend analyses
of the data, they would have identified the underreporting of
SVs and the over reporting of inspections. Closer attention
to data trends would have alerted EPA that the number of
SVs being reported by the states was declining, and that
some of the largest industrial states in the nation were not
reporting any SVs.
It was not clear to many
EPA's Message Unclear- I personnel in the states,
Policy Or Guidance? I regions, and even EPA
_^^^^^^^^^^^^^^^^J Headquarters whether the
TAE and CMS were policy
or guidance. There is a considerable difference between these
terms: policy must be adhered to, while complying with
guidance is more flexible. More specifically, guidance allows
states and
regions to use alternatives for accomplishing specified
requirements.
Although the word guidance is mentioned throughout the
TAE, including the title, EPA personnel in one region, as
well as in Headquarters, regarded it as policy. Conversely,
personnel from the states and other EPA regions viewed the
TAE as guidance. The CMS document was confusing
because it is referred to as both policy and guidance within
the document itself. Because some regional personnel
viewed the TAE and CMS as guidance which did not
absolutely have to be adhered to, they did not enforce state
compliance with the reporting and inspection requirements.
Specifying a requirement in
guidance does not make
compliance optional or
relieve the necessity for
fulfilling the requirements.
We do agree that deviations
from guidance with proper justification are allowable.
However, the regions did not provide us justification when
states underreported SVs, did not conduct teleconferences, or
performed inadequate Level 2 inspections.
It is also important to note that while the CMS may be
viewed by some as only a guidance document for developing
an inspection targeting strategy, it also includes the Level 2
inspection requirements for adequately determining a
facility's compliance. This substantially increases the
importance of the CMS and the need for EPA to mandate
state compliance with the CMS.
Section 105 Grants Not Used As Leverage

The Section 105 grant is one of the most effective tools a
region can use to ensure
states comply with the TAE
and CMS. Grant
agreements provide the regions with leverage to withhold
funds when states do not adhere to requirements specified in
the grants. However, OECA personnel told us that prior to
the Pennsylvania audit, none of the regions took action to
withhold funds for underreporting SVs of the Clean Air Act.
Moreover, Region 4 did not require compliance with the TAE
and CMS as a condition of receiving EPA's grant funds.
Ineffective Communication Between EPA And States

Regions 1 and 4 were not conducting SV conference calls
with some of their states, contrary to what was required in
the TAE.
Massachusetts officials claimed they would contact EPA on
an as-needed basis, such as when they needed to add or
delete information. We considered this breakdown in
communication between Region 1 and Massachusetts as the
one overriding cause of the State's failure to identify SVs.
Communication breakdowns also caused Regions 4 and 6 to
believe states were entering SV information into EPA's
database; however, the states believed the Regions were
entering the SVs. More effective communication would have
prevented this underreporting.
Regions Working To Have States Report SVs

The regional responses to our audits indicated that the
regions plan to improve their oversight of SV reporting and
inspections. EPA regions have either completed or planned
to:
/ Withhold grant funds until the region is satisfied the
state is reporting all required enforcement information.
/ Improve procedures for reporting SVs. One state
agreed to generate a report separating the most
important significant violations from those of lesser
interest to EPA.
/ Require states, through the 105 grants, to provide the
regions with copies of all notices of violation and
noncompliance determinations.
/ Improve communications with states regarding the
responsibility of entering SV data, and verify that
significant violators reported in the states' database
are also reported into EPA's database.
/ Evaluate the adequacy of training for state inspectors.
/ Develop, with state assistance, a Level 2 inspection
checklist to be used for training state inspectors and
for conducting facility inspections.
/ Develop a plan to review a sample of state case files on
a yearly basis to determine if inspection reports show
that Level 2 inspections were accomplished and
adequately documented. This will be a grant
commitment.
/ Improve coordination procedures with states and
provide SV training for state personnel so that they
can adequately identify SVs.
While these corrective actions are a composite of the
responses received for the six audits we performed, the
regions and states often did not agree with our SV
determinations. To confirm our determinations, we
requested OECA's interpretation as to whether many of the
significant violators we identified met the definition of an
SV. OECA personnel generally concurred with our SV
designations. Despite differences on whether some violations
were SVs, most states ultimately agreed to improve SV
reporting or inspections. In this regard, one region has
devised a new approach to performing inspections.
One Region's New Approach To EPA Inspections

During past inspections, the EPA inspector would ensure the
facility complied with all permit requirements. However,
this type of inspection did not identify violations of
regulations not included in the permit. To remedy this
omission, Region 3 revised its approach to inspections and
does not inspect only to the facility's permit. It also focuses
on: 1) changes in facility capacity, 2) construction projects
and expansions, 3) technical information, and 4) industry
trends.
Thus far, Region 3 has performed seven of the revised
inspections and has identified six significant violators.
According to EPA officials, state personnel are receptive to
the new approach because they do not have the resources to
consistently perform these in-depth inspections. Moreover,
they believe this new approach complements state inspection
programs which may not detect the violators EPA is finding.
CHAPTER 2: Part 4
LAX OVERSIGHT BY OECA CONTRIBUTED TO
UNDERREPORTING OF SIGNIFICANT VIOLATORS
Improvements Needed By OECA
EPA's oversight system, when applied appropriately, should
ensure the Agency's enforcement priorities are accomplished.
For the system to work properly, OECA should oversee EPA
regions, which are responsible for working with state
agencies to promote an effective enforcement program.
We found inconsistent implementation of Agency directives,
and in other cases, the states or EPA regions disregarded the
Agency's requirements. As a result, the effectiveness of air
enforcement programs suffered. Numerous significant air
pollution violators went undetected and many of those
identified were not reported to EPA. Moreover, these
violators were not made known to the general public. This
occurred in large part because EPA and the states did not
adhere to requirements of the TAE and CMS. The states and
regions have both agreed to improve enforcement. OECA
needs to ensure that the regions and states fulfill their
commitments. Moreover, OECA needs to improve several of
the functions it performs. These are discussed below.
Perform CMS Oversight

There was no one within OECA responsible for the oversight
and implementation of CMS since OECA was
established in 1994. OECA was created to consolidate
enforcement under one Assistant Administrator from
program offices such as Air, Water, as well as the Resource
Conservation and Recovery Act (RCRA). Prior to this time,
each EPA Headquarters program office had responsibility for
enforcement, including the oversight of EPA directives such
as TAE and CMS. OECA officials told us that assigning
responsibility for CMS was overlooked during the
reorganization. While OECA personnel said that it was not
their intention to abandon CMS, this was the perception we
found among regions and states.
Communicate With Regions

The four regions we audited expressed concern because they
did not always know the appropriate OECA person to contact
to clarify requirements.
OECA personnel voiced the same concerns about the regions.
Two of the four regions made correcting communications a
priority during OECA's most recent evaluations of the
regions. According to regional personnel, the ineffective
communication was partly due to OECA's organization as a
sector-based office. Part of OECA is arranged by industry
sectors such as manufacturing, chemical, and transportation.
Within these sectors is a mix of expertise from the EPA
programs. For example, in each sector there are people with
expertise in programs such as Air, Water, and RCRA. These
same regional personnel told us that previously the
Stationary Source Compliance Division (SSCD) handled air
enforcement. If regional air enforcement personnel needed
some Headquarters air expertise, they would contact SSCD
to obtain assistance. Under the sector approach, the regions
are unsure where to obtain the air expertise within OECA.
Adding to the confusion, other parts of OECA are organized
by program office.
A few regions have reorganized their enforcement offices to
be more closely aligned with OECA's sector approach. This
was done by establishing one enforcement office within each
region for all EPA programs. However, six of the ten regions
are still organized by program office. OECA required these
regions to establish an Enforcement Coordination Office that
coordinates enforcement activity and information. OECA
personnel said that because regions are organized
differently, they also have experienced some communication
barriers and are sometimes unsure of the appropriate
regional person to contact.
We are not advocating either the sector-based approach or
the program office organizations. However, since OECA and
regional personnel have voiced concerns about
communications, both organizations could clarify their lines
of communication.
OECA evaluates the enforcement programs of about three
regions each year. These evaluations did not emphasize
state and regional compliance with the TAE and the CMS.
OECA's oversight role in this area is crucial. If OECA placed
more priority on the TAE and the CMS, regions and states
would have known consistent implementation of these
enforcement documents was important. This emphasis could
have taken place during OECA's regional reviews. For
example, OECA should have determined if the regions
ensured that states were performing adequate inspections to
identify violators.
OECA personnel told us they do not routinely analyze
enforcement data such as the number of SVs identified and
inspections performed to detect trends
and identify problem areas. Instead, analysis is done mainly
for management reporting and targeting resources. At the
time of our audit, one person was responsible for gathering,
tracking, assembling, and reporting the data for both the Air
and Water Programs. Previously this function was staffed
with four or five people for each EPA program.
Our audits demonstrated the need for OECA to analyze
enforcement data, which would have alerted OECA that the
number of SVs was declining. It would also have shown
some larger states were not reporting SVs.
Analyzing the number of inspections performed by states
also would have been worthwhile. For example, Maryland
and New Mexico had about 200 major sources each, but
reported a vastly different number of Level 2 inspections.
New Mexico reported 45 Level 2 inspections for one year.
Further analysis disclosed the State performed Level 2
inspections at only half its major sources during a six-year
period.
Maryland reported 722 Level 2 inspections for the same year,
or about four inspections at each facility. Our review
disclosed that the State over reported the number of Level 2
inspections performed. We estimated that only 20 percent of
the inspections performed were Level 2 inspections. The
results of our analysis highlighted two issues that OECA and
Regions 3 and 6 should have been concerned with.
Analyzing the data effectively would have alerted the Agency
that New Mexico was not performing enough Level 2
inspections, while Maryland was over reporting Level 2
inspections.
Changes Needed By OECA

Revise Section 105 Grant

When EPA established OECA, enforcement became less
involved in the grant funding process at the Headquarters
level. This adversely affected
the regions' ability to leverage the states to comply with
enforcement priorities. Before the reorganization, an EPA
program office such as the Office of Air and Radiation (OAR)
had both the ability and the incentive to emphasize the
enforcement aspect of the program through the Section 105
grant. For example, if EPA highlighted certain enforcement
or compliance initiatives as priority, the regions and states
would clearly understand the message through how the
grant was structured and how much money was designated
for these initiatives.
Under the existing arrangement, OECA has no active role in
the allocation of the grant funds. Because enforcement no
longer exists in each EPA program office, it is less of a
priority for OAR and does not command much attention in
the allotment of grant funds. Moreover, there is much less
incentive for the program offices to include enforcement on
their agendas since enforcement is presently administered
under one office—OECA. Without grant funds specified for
enforcement, the grants awarded to the states by the regions
also did not have a specified amount for enforcement. As a
result, when states do not perform adequate enforcement,
the regions do not have specified funds to withhold.
Complete SV Workgroup

Clarifying the definition of a significant violator is OECA's
responsibility. OECA personnel told us they were
unaware the states and regions had questions about the SV
definition. Moreover, they told us the states and regions did
not request clarification.
As a result of the Pennsylvania audit and OECA's
assessment, the EPA regions, states, and OECA began
discussions on whether: 1) the TAE needed clarification and
2) the definition of an SV needed revision. Subsequently,
OECA established a workgroup to evaluate these
contemplated changes.
The workgroup consists of air enforcement personnel from
state and local agencies, EPA regions and OECA. This
workgroup began studying these issues in July 1997. OECA
personnel participating in the workgroup told us that the
group's work would be completed by the end of 1997. Since
October 1997, when we began our work at OECA, little
progress was made by the workgroup. More than seven
months after the workgroup's targeted completion date, their
efforts were still not complete.
Establish Focal Points

Some of the reasons for not identifying and reporting SVs
indicate the need for better communication between the
states, the regions, and OECA. For example, to address
issues such as the cited uncertainty about the definition of
an SV, OECA should consider establishing national focal
points for interpretation of enforcement policies. These focal
points could provide the states and regions a central place to
obtain a clear understanding of requirements, and foster a
more consistent implementation of EPA enforcement policy
and guidance.
EPA And State Partnership

EPA is committed to working in partnership, both internally
between headquarters and regional components, and
externally with state and local districts, to achieve
environmental goals. For these partnerships to work well,
there must be mutually agreed-upon enforcement objectives
and expectations, clear understanding of each partner's
responsibilities, and complete and accurate reporting of
enforcement data. In the state air enforcement audits we
conducted over the last several years, we found:
/ A lack of agreement on basic definitions and
enforcement approaches;
/ Incomplete and inaccurate reporting of
enforcement data; and
/ Uncertainty regarding what is optional
guidance and what is mandatory policy.
These deficiencies work against effective partnership in
making environmental progress.
We found that there is a cascading effect in air enforcement
when one partner does not fully carry out its role. If OECA,
for instance, is unclear about whether a state reporting
requirement is optional or mandatory, EPA regional offices
may or may not enforce it. Likewise, states may view the
reporting requirement as subject to their own interpretation
of what the requirement really entails. We are not
advocating more "command and control" measures, but
rather that OECA work to reach clearer agreement with its
partners on the necessary elements of a sound air
enforcement program.
We believe that complete and accurate reporting of
significant violators by the states, using a commonly
accepted definition of what constitutes a significant violation,
is necessary for all partners to be able to carry out their
respective roles. With complete reporting from the states,
EPA regions can work with their state partners to bring
about a return to compliance for recalcitrant facilities.
Likewise, OECA can bring the weight of its authority to bear
in those cases which require it.
Recommendations

We recommend that the Assistant Administrator for the
Office of Enforcement and Compliance Assurance:
1. Ensure that the initiatives undertaken by the
SV workgroup are completed timely. This is
needed so that states, EPA regions, and
Headquarters will have a clear understanding
of what constitutes an SV, and what needs to be
reported to EPA.
2. Continually reinforce to the EPA regions that
they must comply with the TAE and CMS. This
should include emphasizing that grant funds
could be withheld when states do not comply
with these directives.
3. Assign oversight responsibility for the CMS
within OECA to the appropriate functional
authority.
4. Work with OAR to earmark Section 105 grant
funds to enforcement. This will enable the EPA
regions to reinforce to the states that
enforcement is an Agency priority.
5. Perform quality assurance of enforcement data
through increased analyses of regional and
state performance measures. This is needed to
detect trends and initiate corrections in a timely
manner.
6. When conducting evaluations of regional air
enforcement programs, improve oversight of
regions by assessing:
a. Regional compliance with the TAE and
CMS.
b. Whether regional reviews of states
determine if states comply with TAE and
CMS. The reviews should include an
evaluation of the adequacy of state
inspections and whether states are
identifying and reporting SVs.
7. Improve communications with the EPA regions.
Steps should be taken to ensure that regional
personnel know where in OECA technical
expertise can be obtained. Conversely, OECA
should determine who in each region interacts
with state enforcement personnel. Interaction
with these regional personnel would help
ensure OECA is aware of state performance as
well as problem areas.
8. Establish focal points so that states and EPA
regions know where within OECA to obtain
clarification of Agency enforcement directives
such as TAE and CMS.
EPA RESPONSE
In its August 12, 1998 response to the draft report, OECA generally agreed
with the report's recommendations. OECA stated that although it had implemented
a successful program to evaluate regional oversight of state programs, the OIG's
close look at certain CAA enforcement activity provided a welcome supplement to
OECA's ongoing efforts. Further, many of the findings validated issues that OECA
was aware of, and contributed to OECA's strategies to address them.
Regarding Recommendation Number 1, OECA agreed that there must be a
clear understanding of what constitutes an SV. OECA stated it has met several
times with state and local air enforcement representatives to discuss their respective
proposals and has reached a tentative agreement with them on a new definition of
significant violator. Further workgroup discussion on the timely and appropriate
aspects of the TAE guidance will follow.
Concerning the recommendation to continually reinforce regional compliance
with the TAE and CMS, OECA said it will continue to require implementation of the
TAE guidance and make an effort to include the CMS guidance as well. OECA
stated it has placed the focal point for TAE oversight in the Air Enforcement
Division, and that the Manufacturing, Energy, and Transportation Division has taken the
lead for CMS oversight.
Pertaining to Recommendation Number 4, OECA agreed that enforcement
and compliance priorities should be reflected in the Section 105 state grant
guidance. OECA stated it will work with OAR to modify the grant guidance to
incorporate the enforcement priorities.
Regarding the recommendation to perform quality assurance of enforcement
data, OECA said it recently completed an extensive trend analysis of enforcement
data for reporting of SVs and inspections during FY 1993 through 1997. The
analysis identified a number of possible reporting problems. OECA stated it will
consider institutionalizing this trend analysis as a regular Headquarters effort.
In response to Recommendation Number 6, OECA said it is considering
potentially significant modifications to the Regional Evaluation Program to better
reflect its priorities and to focus its attention on areas where there are known or
suspected problems. Adherence to the TAE and CMS guidances, and regional
assessment of state inspection programs, are areas of known concern that OECA
said will receive continued attention.
Concerning the recommendation to improve communications with EPA
regions, OECA stated it has taken a number of steps to ensure that the regions have
specific Headquarters contacts on various program areas. OECA said it will
continue to provide the regional offices updated material on its outreach efforts and
inform the regional offices about its capabilities and responsibilities as they change.
Regarding Recommendation Number 8, OECA partially agreed and said it
has identified focal points for the TAE and the CMS guidance. However, to the extent
that the OIG is recommending that it identify a single office as a
point of contact for all enforcement policy, OECA disagreed, noting that responsibilities for
developing and interpreting enforcement and compliance policies are spread among
several offices. OECA said it will continue to communicate the appropriate staff
contacts and their responsibilities to the Regions.
OIG EVALUATION
We concur with the Agency's response and the corrective actions that were
already taken or proposed. It was not our intention to recommend that there be only
one point of contact for all enforcement policies. Therefore, we agree with OECA's
decision to assign separate focal points for the TAE and CMS. We modified the
report and the recommendation to more accurately reflect our position that multiple
focal points are acceptable.
OECA officials also offered some comments to clarify the report and we
revised the report accordingly.
-------
APPENDIX 1
PRIOR AUDIT COVERAGE
The OIG performed six audits on EPA's oversight of states' air enforcement data.
These audits addressed topics similar to those discussed in this report and are
shown below.
Validation of Air Enforcement Data Reported to EPA by Pennsylvania: The audit
covered Region 3's Air, Radiation and Toxics Division and the Pennsylvania
Department of Environmental Protection. The fieldwork was performed from
November 2, 1995 to September 15, 1996. The final report (Report No. 7100115)
was issued on February 14, 1997.
EPA Region 3's Oversight of Maryland's Air Enforcement Data: The audit covered
the Region's Air, Radiation and Toxics Division and the Maryland Department of the
Environment. The fieldwork was performed from November 27, 1996 to June 30,
1997. The final report (Report No. 7100302) was issued on September 29, 1997.
Validation of Air Enforcement Data Reported to EPA by Massachusetts: The audit
covered Region 1's Office of Environmental Stewardship and the Massachusetts
Department of Environmental Protection. The fieldwork was performed from
January 24, 1997 to July 31, 1997. The final report (Report No. 7100305) was
issued on September 29, 1997.
Region 6's Oversight of Arkansas Air Enforcement Data: The audit covered the
Region's Compliance Assurance and Enforcement Division, the Multimedia
Planning and Permitting Division, and the Arkansas Department of Pollution
Control and Ecology. The fieldwork was performed from January to June 1997. The
final report (Report No. 7100295) was issued on September 26, 1997.
Region 6's Oversight of New Mexico Air Enforcement Data: The audit covered the
Region's Compliance Assurance and Enforcement Division, the Multimedia
Planning and Permitting Division, and the State of New Mexico's Environment
Department. The fieldwork was performed from June to October 1997. The final
report (Report No. 8100078) was issued on March 13, 1998.
Region 10's Oversight of Washington's Air Enforcement: The audit covered the
Region's Office of Air Quality and four local Air Quality Authorities in the State of
Washington. The fieldwork was performed from April 14, 1997 to November 21,
1997. The final report (Report No. 8100094) was issued on March 30, 1998.
-------
APPENDIX 2
EPA'S RESPONSE TO DRAFT REPORT
-------
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460
OFFICE OF
ENFORCEMENT AND
COMPLIANCE ASSURANCE
AUGUST 12, 1998
MEMORANDUM
SUBJECT: Comments on Draft Internal Audit Report: E1GAE7-03-0045,
"Consolidated Report of Audit on EPA's Oversight of State Air Enforcement
Data"
FROM: Sylvia K. Lowrance /s/
Principal Deputy Assistant Administrator
Office of Enforcement and Compliance Assurance
TO: Michael D. Simmons
Deputy Assistant Inspector General for Internal Audits
Office of the Inspector General
Attached please find the Office of Enforcement and Compliance Assurance's response to
the subject draft report. These comments incorporate the views of our Office of Compliance and
Office of Regulatory Enforcement.
We thank the Inspector General for the time, effort, and thoughtfulness put into evaluating
SV identification and the rigor of air inspections. Although the EPA Office of Enforcement and
Compliance Assurance has implemented a successful program to evaluate Regional oversight of
State programs, your close look at certain CAA enforcement activity provides a welcome
supplement to our ongoing efforts. Many of the findings validate issues that we are aware of, and
contribute to our strategies to address them.
Should you have any questions about our response, please contact Carolyn Hardy,
OECA's OIG Audit Liaison, at 564-2479.
Attachment
cc: Elaine Stanley
Eric Shaeffer
Carolyn Hardy
Frederick F. Stiehl
John Rasnic
Bruce Buckheit
Ernie Ragland
Patrick Milligan
/s/ As signed by Sylvia K. Lowrance on August 12, 1998.
-------
OECA Comments on Draft IG Report No. E1GAE7-03-0045
"Consolidated Report of Audit on EPA's Oversight of State Air Enforcement Data"
With one possible exception, we agree with the recommendations in the Office of Inspector
General (OIG) draft report. We do have several clarifying suggestions.
Report Title
The draft report title implies a focus on data. The report goes well beyond data issues, and
addresses issues fundamental to effective enforcement programs such as surveillance programs,
violation response programs, and State-EPA communication on these programs. The title should
better reflect the report's substance.
Scope and Methodology
The draft report mentions that interviews included staff from the Enforcement Planning
Targeting and Data Division (EPTDD) and the Air Enforcement Division (AED). It should also
reflect OIG interviews of staff in the Manufacturing, Energy and Transportation Division
(METD).
Chapter 2, Part 1, States not Reporting Significant Violators to EPA
This section discusses why States did not identify and report significant violators. We are
concerned that some wording in this chapter might be interpreted to give credence to frivolous
excuses for program gaps. We were particularly concerned about quoted State complaints that
'EPA meddling caused delays' and that the 'SV&T policy defies any common sense and is
unrealistic.' No evidence is cited to support these complaints. They should be portrayed as
views only. EPA involvement and oversight in State enforcement is entirely appropriate given
EPA responsibility to ensure national compliance with environmental law and EPA responsibility
to assure State performance in accordance with grant agreements.
Other comments are grouped by recommendation:
1. Ensure that the initiatives undertaken by the SV workgroup are completed timely. This is
needed so that states, EPA regions, and Headquarters will have a clear understanding of what
constitutes an SV, and what needs to be reported to EPA.
Response: We agree that this is important and are working to that end. In July 1997, we formed a
workgroup consisting of 9 state and local air directors who are members of STAPPA/ALAPCO (S/A)
and first and second line supervisors from each Region and from OECA (AED and METD),
with an objective to "streamline" and clarify the Significant Violator Timely and Appropriate
(SVT&A) guidance. We have met several times with S/A representatives to discuss our separate
proposals and have reached a tentative agreement with them on the definition of Significant
Violator. Our State commissioners' enforcement group was briefed on this agreement in its June
meeting and the response was generally positive. Further discussion on the timely and
appropriate aspects of the SVT&A guidance will follow.
2. Continually reinforce to the EPA regions that they must comply with the TAE and CMS.
Response: We concur with this recommendation. In our Regional Reviews and the MOA
process, OECA will continue to require implementation of the TAE guidance (or as we call it the
Significant Violator Timely and Appropriate, SVT&A, guidance). We will make an effort to
include the Compliance Monitoring Strategy (CMS) guidance as well. See item 3 below.
This should include emphasizing that grant funds could be withheld when states do not comply
with these directives.
Response: Agree. See item 4 below.
3. Assign oversight responsibility for the TAE and CMS within OECA to the appropriate
functional authority.
Response: Agree. The SVT&A focal point is in ORE/AED. The Manufacturing Energy and
Transportation Division (METD) has, for the time being, taken the lead for the CMS guidance.
4. Work with OAR to earmark Section 105 grant funds to enforcement. This will enable the
EPA regions to reinforce to the states that enforcement is an Agency priority.
Response: Agree. We believe that enforcement and compliance priorities should be reflected in
the Section 105 State grant guidance. OECA will work with OAR to modify the grant guidance
to incorporate the enforcement priorities. Primary responsibility for the actual administration of
the grant program, though, remains vested in the Regional offices.
5. Perform quality assurance of enforcement data through increased analyses of regional
and state performance measures. This is needed to detect trends and initiate corrections in a
timely manner.
Response: Agree. The Targeting and Evaluation Branch within the Office of Compliance (OC)
recently completed an extensive trend analysis of enforcement data, by Region and by State, for
reporting of CAA SVs and inspections during FY 1993 through 1997. This analysis was
transmitted to each Region in a May 7, 1998 memorandum from Frederick F. Stiehl. The
analysis identified a number of possible reporting problems in particular States and requests that
each Region provide answers to various questions about apparent trends and anomalies in the SV
and inspection coverage data. The memorandum also requests that each Region provide a
detailed update of the status of Regional efforts to address reporting problems in their States.
OECA will consider institutionalizing this trend analysis as a regular Headquarters effort.
In addition, EPA is implementing a set of performance measures for its enforcement and
compliance assurance program which will enhance its ability to evaluate the effectiveness of its
program. These measures will help identify the relationship between program activities and
environmental results. Part of the implementation includes reviewing key elements of
enforcement data for accuracy and utility. EPA is working with states to adopt similar measures
for their enforcement and compliance assurance programs.
6. When conducting evaluations of regional air enforcement programs, improve oversight of
regions by assessing:
a. Regional compliance with the TAE and CMS.
b. Whether regional reviews of states determine if states comply with TAE and CMS. The
reviews should include an evaluation of the adequacy of state inspections and whether states are
identifying and reporting SVs.
Response: Agree. As part of the Regional Evaluation Program, OECA has normally included an
evaluation of a Region's air enforcement program, which has included a review of compliance
with the SVT&A and CMS policies. OECA is presently considering potentially significant
modifications to the Regional Evaluation Program to better reflect OECA priorities and to focus
our attention on areas where there are known or suspected problems. Adherence to the SVT&A
and CMS guidances, and Regional assessment of State inspection programs, are areas of known
concern and will receive our continued attention.
7. Improve communications with the EPA regions. Steps should be taken to ensure that
regional personnel know where in OECA technical expertise can be obtained. Conversely,
OECA should determine who in each region interacts with state enforcement personnel.
Interaction with these regional personnel would help ensure OECA is aware of state
performance as well as problem areas.
Response: Agree. OECA has taken a number of steps to improve communications with the
Regional Offices including ensuring that they have specific Headquarters contacts on various
program areas. For general enforcement and compliance assurance activities both OC and ORE
have provided the Regions with detailed directories of staff's areas of responsibility, and have
communicated office roles and responsibilities to the Regions on several occasions. OECA will
continue to provide updated material on our outreach efforts and inform the Regional Offices
about additional Headquarters staff capabilities and OECA organizational responsibilities as they
change or arise. One specific cross-cutting program that has caused some difficulty and concern
among the Regional Offices is the role of enforcement and compliance assurance programs in
negotiating Performance Partnership Agreements (PPAs) and Performance Partnership Grants
(PPGs) under the National Environmental Performance Partnership System (NEPPS). Specifically,
the problem has been the lack of a substantive discussion of enforcement and compliance in many
of the agreements, which is generally attributed to the late inclusion of Regional enforcement and
compliance assurance program managers and staff at the initial stages of the negotiating process.
OECA expects that the enforcement and compliance assurance programs will be at the table from
the beginning of the process and has worked with the Regional Offices on this issue since the
inception of NEPPS. Most recently, to emphasize this point, and
to facilitate consistent communications on NEPPS, Steve Herman issued on May 13, 1998 a
memorandum titled Enforcement and Compliance Assurance Program and the Performance
Partnership Agreement/Grant Process (copy attached). The memo requested that enforcement
and compliance assurance programs participate in the initial NEPPS negotiations and encouraged
the use of the OECA Accountability Measures in the PPAs and PPGs. This memorandum also
identifies staff within the Office of Planning and Policy Analysis (OPPA) who will serve to
facilitate communications on these issues. OPPA is the OECA contact point for general
State enforcement and compliance oversight issues.
8. Establish a focal point so that states and EPA regions know where within OECA to obtain
clarification of Agency enforcement directives such as TAE and CMS.
Response: Partially agree. OECA has identified Linda Lay in the Air Enforcement Division
(AED) as a focal point within ORE for the SVT&A. OECA has designated Mamie Miller of the
Manufacturing Energy and Transportation Division (METD) as the point of contact for the CMS
guidance. To the extent, however, that the OIG is recommending that OECA identify a single
office as a point of contact for all enforcement policy, we disagree. Under the 1994
reorganization of OECA, responsibilities for developing and interpreting enforcement and
compliance policies are spread among several offices. As we develop and issue enforcement
guidance, specific staff contacts are identified to assist our regional offices with questions of
interpretation and application. We will continue to communicate the appropriate staff contacts
and their responsibilities to our Regions.
-------
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460
OFFICE OF
ENFORCEMENT AND
COMPLIANCE ASSURANCE
MAY 26, 1998
MEMORANDUM
SUBJECT: Enforcement and Compliance Assurance Program and the Performance
Partnership Agreement/Grant Negotiations Process
FROM: Steven A. Herman, Assistant Administrator /s/
Office of Enforcement and Compliance Assurance
TO: Deputy Regional Administrators
Regional Counsels
Regional Enforcement Coordinators
During the past three years, EPA Regional Offices have made strides with many of their
state partners in implementing the National Environmental Performance Partnership System
(NEPPS). I want to take this opportunity to acknowledge the efforts of the Regions'
enforcement and compliance programs in negotiating enforcement and compliance priorities with
states. I also want to emphasize the importance of the State/EPA relationship and the need to
continue to work with the states to make it as effective a process as possible.
The purpose of this memorandum is to emphasize the importance of integrating enforcement
and compliance priorities into Performance Partnership Agreements (PPAs) and Performance
Partnership Grants (PPGs) as they are negotiated for FY 1999, and to stress that enforcement and
compliance personnel must be involved throughout the negotiation process to achieve this goal.
Also provided is a list of contacts in the Office of Planning and Policy Analysis (OPPA) assigned to
assist the Regions throughout the PPA/PPG negotiations process. I
encourage you to consult with OPPA early in the negotiations process when significant issues arise.
Incorporating Enforcement and Compliance Assurance into PPAs/PPGs
The Regions should continue to incorporate enforcement and compliance goals, priorities,
and measures into PPAs/PPGs. To accomplish this result, enforcement and compliance personnel
need to be involved throughout the PPA/PPG process. Specifically, it is essential for enforcement
and compliance personnel to initiate and lead joint planning and priority setting discussions on
enforcement and compliance. Also, they should ensure that state enforcement and compliance
programs are incorporated in PPAs/PPGs.
/s/ As signed by Steven A. Herman on May 26, 1998.
-------
Measures of Success
The OECA Accountability Measures are still valid and should be used when negotiating
PPAs/PPGs with states for FY 1999. These measures, developed in 1997 in cooperation with the
Environmental Council of the States, are a set of outcome and output measures designed to
measure the performance of state enforcement and compliance programs.
Coordination with OPPA
OPPA is the OECA lead on enforcement issues and state oversight arising under NEPPS.
To assist in resolving enforcement and compliance issues that may arise in PPA/PPG negotiations,
the Regional Offices should bring these issues to the attention of OPPA at the earliest possible
moment. Please share draft agreements with OPPA contacts at an early stage so that they may
have an opportunity to review and comment on the enforcement and compliance components of
these agreements. OPPA will provide guidance and support to regional staff in negotiating
national enforcement and compliance priorities with states, while providing flexibility to state
programs.
Attached for your reference is a list of OPPA contacts assigned to assist you with
PPA/PPG activities in your Region. The listed individuals also will be initiating communications
with you in the near future. If you have any general questions, contact Mimi Guernica, leader of
the State Programs and Compliance Incentives Team in OPPA, at (202) 564-7048.
-------
ATTACHMENT
Office of Planning and Policy Analysis Contact List
Region 1     Mimi Guernica      (202) 564-7048
Region 2     Wendy J. Miller    (202) 564-7102
Region 3     Amanda Gibson      (202) 564-4239
Region 4     Wendy J. Miller    (202) 564-7102
Region 5     Art Horowitz       (202) 564-2612
Region 6     Amanda Gibson      (202) 564-4088
Region 7     Margaret Du Pont   (202) 564-0056
Region 8     Art Horowitz       (202) 564-2612
Region 9     Margaret Du Pont   (202) 564-0056
Region 10    Margaret Du Pont   (202) 564-0056
-------
APPENDIX 3
DISTRIBUTION
Headquarters
Office of Inspector General - Headquarters (2421)
Agency Audit Followup Coordinator (3304)
Agency Audit Followup Official (3101)
Audit Followup Coordinator, Office of Enforcement
and Compliance Assurance (2201A)
Assistant Administrator for Air and Radiation (6101)
Associate Administrator for Regional Operations and State/Local Relations (1501)
Associate Administrator for Congressional and Legislative Affairs (1301)
Associate Administrator for Communications, Education, and Public Affairs (1701)
Headquarters Library (3404)
Regional Offices
Regional Administrators
Regional Air Enforcement Directors
Regional Audit Followup Coordinators
Regional Directors of Public Affairs
Regional Libraries
General Accounting Office