OFFICE OF INSPECTOR GENERAL
REPORT OF AUDIT

Validation of Air Enforcement Data
Reported to EPA by Massachusetts

E1KAD7-01-0017-7100305
September 29, 1997
Inspector General Division(s) Conducting the Audit:  Eastern Audit Division, Boston, Massachusetts
Region(s) covered:  Region 1
Program Office(s) Involved:  Office of Environmental Stewardship
MEMORANDUM
SUBJECT: Report of Validation of Air Enforcement Data Reported to
EPA by Massachusetts
Audit Report No. E1KAD7-01-0017-7100305
FROM: Paul D. McKechnie
Divisional Inspector General for Audit
Eastern Audit Division
TO: John P. DeVillars
Regional Administrator
Region 1
Attached is our audit report on Validation of Air Enforcement Data Reported to EPA by
Massachusetts. The overall objectives of this audit were to determine whether the Massachusetts
Department of Environmental Protection (MADEP): (1) identified significant violators in
accordance with EPA's Timely and Appropriate Enforcement Policy; (2) reported significant
violators to EPA; and (3) performed inspections that were sufficient to determine if a facility
violated the Clean Air Act (CAA). This report contains findings and recommendations that are
important to both EPA and MADEP.
This audit report contains findings that describe problems the Office of Inspector General
(OIG) has identified and corrective actions the OIG recommends. This audit report represents
the opinion of the OIG. Final determinations on matters in this audit report will be made by EPA
managers in accordance with established EPA audit resolution procedures. Accordingly, the
findings contained in this audit report do not necessarily represent the final EPA position, and are
not binding upon EPA in any enforcement proceeding brought by EPA or the Department of
Justice.
ACTION REQUIRED
In accordance with EPA Order 2750, you, as the action official, are required to provide this office
with a written response to the audit report within 90 days. Your response should address all
recommendations and include milestone dates for corrective actions planned but not completed.
We have no objection to the release of this report to the public.
Should you or your staff have any questions about this report, please contact Linda Fuller, Team
Leader, at (617) 565-3160.
Attachment
cc: G. Mollineaux, R-1, Audit Coordinator
EXECUTIVE SUMMARY
PURPOSE
While states and local governments have primary responsibility for compliance and enforcement
actions within delegated or approved states, EPA retains responsibility for ensuring fair and
effective enforcement of federal requirements, and a credible national deterrence to non-
compliance.[1] In order for EPA to ensure that states are effectively carrying out federal
enforcement requirements, the Agency needs compliance and enforcement data, especially
related to significant violators (SVs). In its February 14, 1997 audit report, "Validation of Air
Enforcement Data Reported to EPA by Pennsylvania," OIG's Mid-Atlantic Division (MAD)
reported that the State of Pennsylvania did not report all significant violators to EPA. Because of
the reporting weaknesses identified in the MAD audit, OIG initiated this review to determine if
other states across the nation were remiss in reporting data to EPA. Our audit objectives were to
determine whether the Massachusetts Department of Environmental Protection (MADEP):
• identified significant violators in accordance with EPA's Timely and Appropriate
Enforcement Policy;
• reported significant violators to EPA; and
• performed inspections that were sufficient to determine if a facility violated the Clean Air
Act (CAA).
[1] June 26, 1984, Memorandum from EPA Deputy Administrator entitled, "Implementing
the State/Federal Partnership in Enforcement: State/Federal Enforcement Agreements."
RESULTS IN BRIEF
Region 1, also known as EPA New England, awarded a pilot demonstration grant to MADEP
covering Fiscal Years (FYs) 1995 and 1996. The pilot program allowed MADEP to test a
multimedia approach to environmental protection and greater flexibility in using resources. As a
result, Region 1 allowed deviations from standard air program procedures. For example, routine
regional/state meetings to discuss SVs were no longer required in the grant agreement; MADEP
was allowed to test an inspection targeting plan which reduced the number of air major sources
to be inspected; and MADEP was allowed to test a new inspection protocol rather than use EPA
Level 2 inspection requirements. Our review showed that Region 1 and MADEP have much to
learn from the pilot and need to make adjustments to future grant agreements to assure that
EPA's program expectations are met.
When planning future grant activity, Region 1 should refer to the Assistant Administrator's
February 21, 1996 memorandum, "Core EPA Enforcement and Compliance Assurance
Functions." This memorandum incorporated lessons learned from some of the Performance
Partnership discussions which had taken place. It was prepared with the intention to guide EPA
regional offices in their discussions with states regarding EPA's essential responsibilities for
ensuring compliance with environmental standards through the use of enforcement and
compliance assistance tools. By carrying out the core functions outlined, regions secure the
protection of public health and the environment and ensure that regulated entities that violate
environmental requirements do not gain a competitive advantage over those that comply with
environmental laws. In addition to reexamining the success of innovative approaches, Region 1
needs to assure that MADEP carries out its grant agreement commitments.
The following findings present areas in which Region 1 and MADEP need to continue working
together to assure that EPA receives all the information needed to carry out national objectives.
MADEP Needs to Identify Significant Violators
MADEP did not identify and report any SVs in FY 1996 even though several SVs existed.
Region 1's elimination of routine SV discussions with the state and MADEP's misinterpretation
of EPA's Timely and Appropriate policy contributed to MADEP not identifying SVs. As part of
a demonstration grant, Region 1 also allowed MADEP to reduce its inspections of air major
sources, which reduced the universe of potential SV sources. As a result, Region 1 was not
advised of serious violations and was unable to assure that appropriate enforcement action was
taken. Additionally, MADEP did not document why penalties were not assessed against the
facilities we believed were significant violators.
Region 1 and MADEP Need to Resolve Database Discrepancies
MADEP under reported the number of enforcement actions in EPA's Air Facility Subsystem
(AFS) database for FY 1996 and did not enter any compliance inspection and enforcement data.
In addition, we found database discrepancies between the MADEP databases and Region 1's
AFS. MADEP did not comply with its special grant condition to update AFS on a quarterly
basis. The untimeliness of MADEP data input and the incompatibility of MADEP and EPA
systems contributed to the problem. As a result, EPA did not have a clear picture of state
accomplishments and was forced to use additional resources to correct the problem.
Region 1 Needs to Evaluate FIRST Inspection Protocol
Region 1 needs to conduct its own evaluation of MADEP's use of the Facility-wide Inspection to
Reduce the Source of Toxics (FIRST) protocol to assure that such inspections were adequately
performed to determine a facility's compliance with state and federal regulations. Region 1 and
MADEP developed the FIRST protocol as procedural guidance for use during multimedia
inspections. While MADEP provided an evaluation of its use of the FIRST protocol, this
evaluation's conclusions were not definitive. In its evaluation, MADEP characterized its data as
85 percent accurate and stated that information obtained from staff interviews must be qualified
because some staff were ambivalent towards using the multimedia approach. Also, MADEP
claimed in its report that a single inspector was performing the inspections when in fact staffing
varied. FIRST protocol inspections did not include all minimum requirements of a Level 2
inspection even though Region 1's State Compliance and Enforcement Coordinator said the
inspections were to be equivalent to a Level 2 inspection. Additionally, Region 1 needs to
encourage MADEP to develop a structured training program for this new inspection approach.
MADEP did not have training criteria, individual inspector training records, or a data tracking
system. We believe a more structured training program would assist the state to effectively and
consistently perform multimedia inspections.
RECOMMENDATIONS
We have made a number of recommendations to the Regional Administrator to improve the
operation of the program, some of which follow. (For more details, refer to the
Recommendations at the end of Chapters 2, 3 and 4.)
To improve MADEP's identification of SVs, we recommend that the Regional Administrator
include in MADEP grants a special condition requiring regional/state monthly meetings to
discuss SVs and other enforcement actions. We also recommend that Region 1 staff provide
training and guidance to MADEP staff on identifying and reporting SVs. Region 1 should also
negotiate with MADEP an increase in the number of air major sources to be inspected.
To improve database reconciliation between Region 1 and MADEP, we recommend that the
Regional Administrator require MADEP to comply with the grant's reporting requirements, such
as entering compliance, enforcement, and penalty data into AFS on a quarterly basis.
We also recommend that the Regional Administrator consider adjusting MADEP's grant award
for noncompliance with grant conditions related to data reporting.
To assure that MADEP effectively used the FIRST protocol, we recommend that the Regional
Administrator instruct Region 1 staff to conduct its own evaluation and encourage MADEP to
adopt a structured training program to ensure all inspectors are adequately trained to perform
multimedia inspections.
REGION 1's COMMENTS
Overall, Region 1 agreed with most of our recommendations but disagreed with the presentation
of certain conclusions. Its response, along with the state's response, has been summarized at the
end of each finding. The complete Regional and state responses have been included as
Appendices 2 and 3 respectively. An exit conference was held with representatives from Region
1 and MADEP on September 24, 1997.
TABLE OF CONTENTS

EXECUTIVE SUMMARY

CHAPTER 1 - INTRODUCTION
    PURPOSE
    BACKGROUND
    SCOPE AND METHODOLOGY
    PRIOR AUDIT COVERAGE

CHAPTER 2 - MADEP NEEDS TO IDENTIFY SIGNIFICANT VIOLATORS
    SIGNIFICANT VIOLATORS NOT IDENTIFIED
    SIGNIFICANT VIOLATOR COORDINATION MISSING
    PENALTIES NOT ASSESSED
    INSPECTIONS OF MAJOR AIR SOURCES REDUCED
    CONCLUSION
    REGIONAL RESPONSE
    OIG EVALUATION
    MADEP RESPONSE
    OIG EVALUATION
    RECOMMENDATIONS

CHAPTER 3 - REGION 1 AND MADEP NEED TO RESOLVE DATABASE DISCREPANCIES
    C&E DATA UNDER REPORTED AND UNTIMELY
    UNDER REPORTING A CHRONIC PROBLEM
    DATABASE DISCREPANCIES
    CONCLUSION
    REGIONAL RESPONSE
    OIG EVALUATION
    MADEP RESPONSE
    OIG EVALUATION
    RECOMMENDATIONS

CHAPTER 4 - REGION 1 NEEDS TO EVALUATE FIRST PROTOCOL
    MADEP EVALUATION NOT CONCLUSIVE
    INSPECTIONS NOT LEVEL 2 EQUIVALENT
    FIRST PROTOCOL PILOT IDENTIFIED PROBLEMS
    INSPECTION STAFFING
    MADEP NEEDS TO ESTABLISH A STRUCTURED TRAINING PROGRAM
    CONCLUSION
    REGIONAL RESPONSE
    OIG EVALUATION
    MADEP RESPONSE
    OIG EVALUATION
    RECOMMENDATIONS

APPENDIX 1: GLOSSARY OF ACRONYMS
APPENDIX 2: REGIONAL RESPONSE
APPENDIX 3: MADEP RESPONSE
APPENDIX 4: DISTRIBUTION
CHAPTER 1
Introduction
PURPOSE

While states and local governments have primary
responsibility for compliance and enforcement actions within
delegated or approved states, EPA retains responsibility for
ensuring fair and effective enforcement of federal
requirements, and a credible national deterrence to non-
compliance.[2] In order for EPA to ensure that states are
effectively carrying out federal enforcement requirements,
the Agency needs compliance and enforcement data,
especially related to SVs. In its February 14, 1997 audit
report, "Validation of Air Enforcement Data Reported to EPA
by Pennsylvania," OIG's Mid-Atlantic Division (MAD)
reported that the State of Pennsylvania did not report all
significant violators to EPA. Because of the reporting
weaknesses identified in the MAD audit, OIG initiated this
review to determine if other states across the nation were
remiss in reporting data to EPA. Our audit objectives were
to determine whether the Massachusetts Department of
Environmental Protection (MADEP):
• identified significant violators in accordance with
EPA's Timely and Appropriate Enforcement Policy;

• reported significant violators to EPA; and

• performed inspections that were sufficient to
determine if a facility violated the Clean Air Act (CAA).

[2] June 26, 1984, Memorandum from EPA Deputy Administrator entitled, "Implementing
the State/Federal Partnership in Enforcement: State/Federal Enforcement Agreements."
BACKGROUND

The CAA of 1990 lists 188 toxic air pollutants that must be
reduced. EPA estimated that more than 2.7 billion pounds
of toxic air pollutants are emitted annually in the United
States. The list of air toxics touched every major industry,
from the mining of base metals to the manufacture of high-
tech electronics. EPA studies showed that exposure to
these air toxics may result in up to 3,000 cancer deaths
each year. Other adverse health effects of air toxics
included respiratory illness, lung damage, premature aging
of lung tissue, and retardation and brain damage,
especially in children.
The CAA separately regulates six of the more serious air
pollutants - ground level ozone, particulate matter, carbon
monoxide, sulfur dioxide, lead, and nitrogen dioxide. These
six criteria pollutants are emitted in large quantities by a
variety of sources. EPA sets national ambient air quality
standards for each of these criteria pollutants and the states
must take action to assure attainment with these national
standards.
Section 105 of the CAA provided the initial authority for
federal grants to help states and local agencies administer
their air programs. Before EPA awarded each grant, it
negotiated a work program with the state. The work
program contained specific work commitments the state
agreed to perform. The work program encompassed
activities such as inspections, monitoring, permitting and
enforcement, which included identifying and reporting
significant violators.
The MADEP conducted inspections of major facilities to
ensure they met federal and state regulations. To assess
compliance during an inspection, the inspector would need
to refer to the facility's permit. The permit translated
requirements of laws such as the CAA into individualized
enforceable requirements.
According to EPA policy, states can perform five different
levels of inspections at air pollution facilities. To adequately
evaluate a facility's compliance with the CAA, EPA
considered a Level 2 inspection the most appropriate. The
Level 2 inspection included reviewing facility records to
determine compliance with applicable regulations, taking
and analyzing samples when appropriate, recording process
rates and control equipment parameters, and performing
visual observations of emissions.
EPA provided general guidance for conducting Level 2
inspections. However, MADEP and Region 1 developed an
inspection protocol different from the EPA Level 2
inspection. The inspection protocol, known as the FIRST
protocol, was intended as a reference or guidance outlining
the minimum elements of an annual compliance evaluation
inspection for approximately 1,000 industrial major and
minor facilities.
Traditionally, the guidance that accompanied federal grant
funds required MADEP to target most of its industrial
inspections at the largest sources of pollution or "major"
sources. In recent years, through grant negotiations,
MADEP has increasingly used what it considered innovative
schemes for targeting inspections. This practice gave less
consideration to the targeting of strictly major facilities.
MADEP policy required the issuance of a Notice of
Noncompliance (NON) when an inspector identified a
violation. An NON specified the type of violation and the
regulation the facility violated. It might also require the facility
to show the actions to be taken to achieve compliance. If
the violation met EPA's definition of a significant violator,
EPA policy required the state to report the facility for
placement on EPA's significant violator list. Before FY 1995,
Section 105 grants required MADEP to identify and report
significant violators to EPA. However, starting in FY 1995,
air program compliance and enforcement activities were
transferred to the Massachusetts Compliance Assurance
Demonstration Grant. While criteria for reporting significant
violators were not defined in the grant, EPA required the state
to follow EPA's February 7, 1992 "Issuance of Guidance on
the Timely and Appropriate Enforcement Response to
Significant Air Pollution Violators."
According to EPA's Timely and Appropriate Enforcement
Policy, a significant violator is any major stationary source of
air pollution that violates a federally enforceable
regulation. This policy required states to report significant
violators to EPA within one month of the violation, and to
maintain the facility on EPA's list until it achieved
compliance. After the violation was reported, the state and
EPA were to monitor the source until it achieved compliance.
This included determining an appropriate time schedule for
achieving compliance and assessing a penalty, if necessary.
During Federal Fiscal Years (FFYs) 1995 and 1996,
MADEP Air program compliance and enforcement activities
were funded under the Compliance Assurance
Demonstration Grant. This grant served as the forerunner of
the performance partnership grant and agreement currently
in place. The Demonstration Grant was for a two year
period, October 1, 1994 through September 30, 1996. The
total amount of grant funds provided as of September 30,
1996 was $2,823,743; federal share $2,112,300 and state
share $711,443.
The Compliance Assurance Demonstration Grant tested
several aspects of the state-EPA relationship, and innovative
ways of providing environmental protection. Some of the
primary activities tested were multimedia inspections,
flexible targeting of industrial sources and building an
innovative electronic data system to improve multimedia
facility compliance. While the grant did not contain
language regarding the reporting of significant violators,
separate correspondence from MADEP stated
communication between EPA and MADEP regarding
violators would continue under the 1996 Demonstration
Grant.
Each month EPA and the states are responsible for updating
the air enforcement data on the Agency's database known
as the Aerometric Information and Retrieval System (AIRS).
The AIRS Facility Subsystem (AFS) is part of the AIRS
database, containing compliance and enforcement data on
sources. New violations are to be reported to EPA via
telephone and AIRS. EPA is to use this communication to
promote a greater degree of teamwork between itself
and the states. However, if EPA is dissatisfied with a state's
enforcement action, EPA has the authority to override the
state and assume the lead in resolving the violation.
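As a minimal illustration (ours, not from the audit or any EPA system) of the kind of cross-check this reporting arrangement makes possible, the following Python sketch compares enforcement action counts in a state database with those in an AFS extract and flags mismatches; the facility identifiers and counts are invented.

    # Hypothetical sketch: flag facilities whose enforcement action counts differ
    # between a state database extract and an AFS extract.
    state_actions = {"MA-0001": 3, "MA-0002": 1, "MA-0003": 2}  # state database counts
    afs_actions = {"MA-0001": 3, "MA-0002": 0}                  # AFS counts

    for facility in sorted(set(state_actions) | set(afs_actions)):
        state_count = state_actions.get(facility, 0)
        afs_count = afs_actions.get(facility, 0)
        if state_count != afs_count:
            print(f"{facility}: state shows {state_count} action(s), "
                  f"AFS shows {afs_count} -- needs reconciliation")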
MADEP is an agency within the Massachusetts Office of
Environmental Affairs and is divided into several functional
offices and bureaus. Organizationally, MADEP consists of
five major offices: a Boston headquarters and four regional
offices. Boston is generally responsible for writing
regulations and guidance; the regional offices are
responsible for operations. Staff in the regional offices were
reorganized from a structure following the traditional
program areas to one following multimedia functional areas.
During 1995 and 1996, a significant number of
Massachusetts air major sources converted to minor status.
However, EPA and state databases did not always reflect
these changes. MADEP staff gave us information to present
a more accurate picture of the universe of air major sources
and activities performed by MADEP staff. After deleting
facilities which converted to minor status, we determined
that for FFY 1996, there were now 250 air major sources.
MADEP conducted 39 inspections at these major sources
and issued 32 NONs as a result.
SCOPE AND METHODOLOGY

We performed this audit according to the Government
Auditing Standards (1994 Revision) issued by the
Comptroller General of the United States as they apply to
performance audits. Our review included tests of the
program records and other auditing procedures we
considered necessary.
To accomplish our objectives we performed reviews and
conducted interviews at MADEP regional offices and its
central office in Boston. We visited three of MADEP's four
regional offices. While at the MADEP regional offices, we
interviewed the Bureau of Waste Prevention Deputy
Directors, the Compliance and Enforcement Chiefs, Permit
Chiefs, and Senior Regional Counsel. We interviewed
Central Office staff as well.
At EPA Region 1, we interviewed staff from the Office of
Environmental Stewardship specifically in the
Air/Pesticides/Toxics section.
We reviewed the CAA, EPA's February 7, 1992 "Issuance of
Guidance on the Timely and Appropriate Enforcement
Response to Significant Air Pollution Violators," the June
14,1994 "Clarification Package for Guidance...." EPA's
"Compliance/Enforcement Guidance Manual (Revised
1987)," Chapters, MADEP's April 1997, "Evaluation of
Compliance Assurance Demonstration Grant" and the
September 15,1986 "Comprehensive Enforcement Policies
and Guidance." We also reviewed the FFY 1996 Section
105 grants and the Compliance Assurance Demonstration
grant awarded to the state. During this audit we used
various printouts from EPA's AIRS to obtain information on
the number and names of major sources for FFY 1996 and
what inspections were performed. We also used MADEP
generated reports to determine the inspections performed
and enforcement actions undertaken.
To evaluate MADEP's enforcement of the CAA
requirements, we reviewed the air quality files maintained at
MADEP offices. These files contained items such as
inspection reports, NONs, consent decrees, permits, test
results, emissions data and correspondence. Due to the
complexity of air enforcement files, we received technical
assistance from the OIG's Mid-Atlantic technical staff and
Region 1's Air/Pesticides/Toxics' staff.
We performed two analyses to accomplish our objectives.
First, we examined AIRS and MADEP data for major facility
inspections performed in FFY 1996. We were aware that
FFY 1996 was a transition year for the permitting of facilities.
Specifically, a significant number of major source facilities
submitted applications to MADEP which would affect their
emissions status and result in the conversion of the facility
from a major to a minor source. We obtained information
showing which major sources submitted applications to
convert their emission status from a major to a minor source.
We judgmentally selected all major facilities with 1996
inspections but excluded those with a 1996 approval for a
permit revision to a restricted or synthetic minor operating
status. Our universe for review included those facilities that
had inspections in 1996 and an NON issued by MADEP. In
some instances we also selected a facility that did not have
an NON issued.
To evaluate the adequacy of inspection reports we obtained
technical assistance reviews from Regions 1 and 3 staff
associated with the Air program. To evaluate the adequacy
of inspector training, we interviewed MADEP and Region 1
staff.
Our audit disclosed several areas needing improvement that
are discussed in Chapters Two to Four. Our
recommendations address the need to report significant
violators and improve the quality of inspections performed.
Action must be taken to ensure that MADEP reconciles its
database with Region 1 and provides complete reporting
data as required in its grant agreement. Additionally, Region
1 needs to conduct its own evaluation of MADEP's use of
the FIRST protocol and encourage MADEP to establish a
structured training program for FIRST inspections.
We reviewed management controls and procedures
specifically related to our objectives. However, we did not
review the internal controls associated with the input and
processing of information into AIRS or other automated
records system.
As part of this audit we also reviewed the Region 1 FFY
1996 Assurance Letter prepared to comply with the Federal
Managers' Financial Integrity Act (FMFIA). We found that
none of the weaknesses cited during our audit were
disclosed in Region 1's annual report.
Our survey began on January 24, 1997. As a result of the
survey, we performed additional audit work from June 9,
1997 to July 31, 1997.
We issued a draft report on August 14, 1997. Region 1
submitted its response to us on September 22, 1997
(Appendix 2). We have also included MADEP's
response to the Region (Appendix 3). We have made
revisions to the report, where appropriate, to reflect the
information provided us in the Region's response. We
provide a synopsis of the Regional and state comments and
our evaluation at the end of each finding chapter.
PRIOR AUDIT COVERAGE

This was EAD's first review of validation of air enforcement
data. Massachusetts was the only state in New England
selected for review. OIG's MAD reviewed the adequacy of
Pennsylvania's identification and reporting of SVs and
issued its February 14, 1997 audit report, "Validation of Air
Enforcement Data Reported to EPA by Pennsylvania"
(E1KAF6-03-0082-7100115).
CHAPTER 2
MADEP Needs to Identify Significant Violators
MADEP did not identify and report any "Significant Violators"
(SVs) in FY 1996 even though several SVs existed. The
three sources listed in EPA's database as Massachusetts
SVs for that year were identified by the EPA Region 1 staff,
not MADEP staff. We identified an additional three SVs
from a sample of seven sources. Region 1's elimination of
routine SV discussions with the state and MADEP's
misinterpretation of EPA's Timely and Appropriate policy
contributed to MADEP not identifying SVs. As part of a
demonstration grant, Region 1 allowed MADEP to reduce its
inspections of air major sources which in turn reduced the
universe of potential SV sources. As a result, Region 1 was
not advised of serious violations and unable to assure that
appropriate enforcement action was taken. For the three
SVs we identified, penalties were not assessed for two
cases; a penalty was assessed for the third case but only
after the facility repeatedly violated its opacity limits.
(Opacity is the degree to which emissions reduce the
transmission of light and obscure the view of an object in the
background.)
The number of SVs identified by MADEP staff has steadily
declined since 1994. We believe the decline was due in part
to Region 1's elimination of routine SV discussions and
MADEP's decision to stop entering data into EPA's AFS
(See Chapter 3). According to the Region 1 Air Coordinator,
MADEP identified SVs as shown:
Fiscal Year    SVs Identified by MADEP
1994           14
1995            2
1996            0
Identification of SVs to EPA is important to assure that
regional personnel are aware of violations and concur with
state enforcement action taken since the region can take
action independently of the state. EPA's February 7, 1992,
"Issuance of Guidance on the Timely and Appropriate
Enforcement Response to Significant Air Pollution Violators"
(hereafter referred to as the Timely and Appropriate policy)
provided that:
The Clean Air Act vests responsibility for enforcement
of the law in EPA. Therefore, EPA may move
independently with respect to designation of a violator
as a "Significant Violator" and EPA shall assume the
lead in cases when it becomes apparent that the
State is unable or unwilling to act in accordance with
this guidance to resolve a violation in a timely and
appropriate manner.
Additionally, the SV data must be accurately maintained to
ensure that this data, which is shared by other enforcement
offices within EPA and the states, correctly reflects the SV
status for all sources subject to EPA's policy. This data field
is becoming increasingly important as the Agency
shifts further toward multimedia, geographic and industry
specific enforcement.
Significant Violators Not Identified

MADEP did not identify any SVs during FY 1996. The three
SVs in EPA's 1996 database were entered by EPA Region
1's State Compliance and Enforcement (C&E) Coordinator
after he reviewed a MADEP enforcement report. Our review
of seven NONs found that three included violations which
should have been reported as SVs to Region 1. Our seven
cases represented half the NONs issued in 1996 which had
potential to include significant violations. Only 14 of the 32
NONs issued to air major sources potentially included
significant violations. The other 18 were for not submitting
operating permit applications (which was not considered a
significant violation by EPA for that period).
Controls were established to assist MADEP in identifying
SVs. MADEP created a checklist to aid the inspector in
identifying an SV. Also, the Region 1 State C&E
Coordinator was available to discuss whether violations
warranted an SV designation. However, we found no
evidence of the checklist in the files and MADEP staff did
not request Region 1's technical assistance. Based on the
above, we concluded that MADEP did not use the controls in
place to identify and report SVs to Region 1.
Even though the MADEP Commissioner stated that MADEP
staff have "definitional" differences with EPA's SV policy,
MADEP staff used EPA's Timely and Appropriate policy
when determining a violator's SV status.
We reviewed 13 case files administered by three of the four
MADEP regional offices:
Regional Office    Number of Files Reviewed
Northeast          5
Southeast          6
Central            2
The 13 sources were classified in either the Air Facility
Subsystem (AFS) or state databases as A1-Majors.
However, five of the sources had applied and were approved
as synthetic minors, and one source ceased operations and
closed the facility. Of the seven remaining major sources,
three had violations which warranted elevation to SV status.
Based on our file reviews and discussions with MADEP
regional staff, there was no evidence available to
demonstrate whether the sources were ever considered as
candidates for the SV list. The MADEP Southeast Regional
Office (SERO) provided a copy of a MADEP-designed
checklist for identifying SVs; however, we found no evidence
in any of the files that the checklist was used. MADEP
Central Regional Office (CERO) staff stated that there were
several discussions concerning one of the sources we
identified as an SV, but they had never discussed the case
with Region 1 technical staff. This breakdown in
communication and lack of documentation not only affected
MADEP's ability to identify and record sources as SVs, but
also demonstrated MADEP's noncompliance with EPA's
Timely and Appropriate policy.
A synopsis of the three sources we determined were SVs
follows:
Major Source #1
Major Source #1 chronically violated its opacity limits which
were part of the Massachusetts State Implementation Plan
(SIP). According to EPA's Timely and Appropriate policy,
a major source that violates a SIP emission requirement is an SV.
During a March 12, 1996 inspection, a MADEP inspector
observed opacity of greater than 20 percent. The facility's
August 3, 1993 Plan Approval required zero percent opacity.
The inspector noted that MADEP had given interim approval
on February 23, 1996 to install two thermal oxidizer units
(opacity controls) by mid-May. As a result, no NON was
issued.
MADEP's Chief of Compliance and Enforcement (C&E)
concurred with the inspector's decision. On May 23, 1996,
MADEP conducted a follow-up inspection and found that the
units were installed according to the interim approval and
were working correctly (zero percent opacity). Also, the
plant's neighbors were happy that the odors were gone.
However, on July 21, 1996, neighbors complained to their
Police Department that the plant was emitting smoke and
obnoxious fumes. In a July 25, 1996 letter to the City's
Mayor, the City's Conservation Department summarized that
there had been many complaints received regarding the
plant. In September and October more complaints were
received from the neighborhood. On October 11, 1996,
MADEP staff observed blue smoke during a visible emission
observation. As a result, on October 22, 1996, MADEP
issued an NON for smoke opacity equal to 30 percent.
Complaints continued to be received by MADEP as of March
1997. This violation was not considered an SV according to
the C&E Chief because the October 1996 opacity violation
was considered a one-time occurrence since the installation
of the oxidizer units. He also stated the company showed a
good faith effort by working with MADEP to correct the
October violation.
In our opinion, this violation should have been identified as
an SV because of the plant's chronic opacity problem.
While we can accept MADEP's decision not to issue an
NON in March 1996, a violation did occur then, and the
numerous citizens' complaints over a number of months
further led us to conclude that the violation cited by MADEP
in October 1996 was not a first-time occurrence. Also, a
plant's willingness to work at correcting a violation does not
preclude it from being reported as an SV.
EPA's Office of Enforcement and Compliance Assurance
(OECA) provided further clarification on whether one opacity
violation constituted a significant violation as follows:
Opacity is a surrogate for particulate matter emissions
and is considered an emissions violation.
Accordingly, even a single opacity violation must be
identified. How it is then ranked and tracked is to be
determined by agreement of EPA and the State/local
authority.
Additionally:
The policy establishes a procedure and a mechanism
for differentiating between these two examples in
terms of the priority for action, but requires an
identification of the violation in each instance so the
agencies have an opportunity to review the relevant
facts and agree as to the priority for resolving the
violation.
By not notifying Region 1 of this violation, MADEP prevented
an opportunity for it and Region 1 to review the case and
come to an agreement on resolving the violation.
Major Source #2
We believe Major Source #2 was an SV because it
exceeded its particulate matter standards, a
violation of the New Source Performance Standards
(NSPS). MADEP staff reviewed results of stack tests
performed on October 4 and 5, 1995. The test results
showed particulate emissions during the three compliance
runs were 0.054, 0.040 and 0.033 grains per dry standard
cubic foot of flue gas. In addition to exceeding the NSPS
standard, the readings exceeded MADEP's permitted worst
case emission standard for particulate matter of 0.030 grains
per dry standard cubic foot as contained in the Plan
Approval of April 11, 1994. Based on this non-compliance,
MADEP staff requested the source to be re-tested for
particulate matter emissions. The November 30, 1995
results were 0.153, 0.166 and 0.176 grains per dry standard
cubic foot of flue gas, which still exceeded the maximum
allowable emission rate.
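As a minimal illustration (ours, not part of the audit) of the comparison described above, the following Python sketch checks the reported test runs against the two limits cited in this finding; the variable names are invented and the figures are taken from the report.

    # Limits cited in the finding, in grains per dry standard cubic foot (gr/dscf).
    PERMIT_LIMIT = 0.030   # worst-case limit in the April 11, 1994 Plan Approval
    NSPS_LIMIT = 0.040     # NSPS limit cited in the December 14, 1995 NON

    test_runs = {
        "October 1995 test": [0.054, 0.040, 0.033],
        "November 1995 retest": [0.153, 0.166, 0.176],
    }

    for test, runs in test_runs.items():
        worst = max(runs)
        print(f"{test}: worst run {worst:.3f} gr/dscf; "
              f"above permit limit: {worst > PERMIT_LIMIT}; "
              f"above NSPS limit: {worst > NSPS_LIMIT}")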
On December 14, 1995, MADEP issued an NON based on
the second test readings. In the NON, MADEP noted the
following:
The compliance stack testing results revealed that
particulate emissions were greater than 0.040 grains
per dry standard cubic foot of flue gas, which is in
violation of the New Source Performance Standards
(NSPS) regulations for Hot Mix Asphalt Facilities
(Federal Register 40 CFR, Part 60, Subpart I).
The MADEP staff found the source to be in compliance with
applicable emission limitations on February 9, 1996, after
the third compliance emission test report.
Until the source reached compliance in February, it should
have been listed as an SV because the violation issued by
MADEP staff met the following definition of a Significant
Violator:
1. A "Major" source and it violates any one or more of
the following:
b. NSPS emission, monitoring or substantial
procedural requirements.
The SERO Deputy Regional Director concurred that the
violation should have been reported as an SV but was not
because the stack tester did not relay the results to the
appropriate staff who identify SVs. The SERO Deputy
Regional Director assured us that corrective action had been
taken to prevent this from recurring.
Major Source #3
Major Source #3 chronically violated its opacity limits, a
violation not only of the SIP, but also of a MADEP
negotiated Consent Order. MADEP issued an NON on June
26, 1996 which reported a history of visible emission
violations that dated back to August 12, 1991. MADEP staff
made observations over a four-month period and recorded
opacity readings of 35 percent or greater on four of the six
days on which readings were conducted. On the other two
days, the opacity readings were greater than 20 percent and
30 percent. At the time this NON was issued, the source was
under a June 16, 1995 Administrative Consent Order for
visual emissions in excess of 20 percent opacity and odor
violations.
In the June 1996 NON, MADEP wrote that the source was in
violation of its consent order. The MADEP files for this
source contained evidence of numerous complaints. In a
December 17, 1996 letter, MADEP referred this case to the
state's Attorney General's Office for higher level
enforcement. However, MADEP did not report this case as
an SV to Region 1. MADEP's Central Regional Office
(CERO) Deputy Director stated that the visual emission
readings were not conducted using EPA's Method 9. She
said in order to get the appropriate visual position, readings
would have to be conducted from a highway which would not
be safe for the inspector. Because Method 9 was not used,
MADEP believed the readings would be challenged and
would not be legally defensible at the federal level.
In regard to this question, EPA's OECA provided the
following:
Our response is that a violation should be listed
where the agency has a reasonable basis to believe
that a violation has occurred. Listing the violation is
not an adjudication and does not require proof
"beyond a reasonable doubt" or "by a preponderance
of the evidence." Indeed, one of the appropriate
responses to a listing as a SV is to conduct further
investigation.
OECA suggested that a state could make arrangements with
the state police to provide an adequate measure of safety on
the highway during inspections.
We concluded that the source should have been placed on
the SV list since the violations met the Timely and
Appropriate policy SV definition:
Agencies shall deem a source to be an SV if it is a
major source and it violates the following:
SIP emission, monitoring or substantial
procedural requirements, regardless of
pollutant designation status.
SIP, NSPS or NESHAP emission,
procedural or monitoring requirements
violated repeatedly or chronically.
Any substantive provision of a State
Judicial Order or a State Administrative
Order which was issued for an
underlying SIP violation.
Significant Violator Coordination Missing

In our opinion, the one overriding cause of MADEP's failure
to identify SVs was a breakdown in communication between
MADEP's regional compliance and enforcement staff, and
Region 1 air technical staff. Region 1's State C&E
Coordinator stated that in the past he directly communicated
on a regular basis with MADEP air compliance and
enforcement staff. This communication ceased under the
Mass Demonstration Grant because there was no longer a
requirement for MADEP and Region 1 to discuss SVs on a
monthly basis.
Prior CAA Section 105 grants provided that the state and
EPA would conduct monthly Compliance/Enforcement
discussions about current/potential SVs. The FY 1996 Mass
Demonstration Grant deleted this reference. According to a
draft "MA-DEP FFY96 Compliance Assurance
Demonstration Grant Response to 11/2/95 EPA Comments",
EPA commented in item #14, "Significant Violator
coordination is missing." MADEP responded, "As in FY95,
DEP will verbally communicate significant violator
information monthly to EPA." However, we found that
MADEP was not communicating to Region 1 on a monthly
basis. Region 1's State C&E Coordinator advised that
MADEP did not call on a monthly basis but rather on an as
needed basis. MADEP's Associate Commissioner stated
that MADEP staff surveyed its actions monthly to identify
SVs but only called EPA when needed to add/delete
information. If no action occurred, MADEP did not call EPA.
This lack of communication between EPA and MADEP
contributed to MADEP's failure to identify SVs. MADEP's
regional staff commented that additional guidance or
clarification on identifying SVs was needed. EPA's State
C&E Coordinator stated that he could provide this guidance.
Additionally, since inspectors conducting multimedia
inspections have diverse technical backgrounds, EPA
technical staff can provide the air expertise which some
state inspectors may not possess. Direct and open
communication will resolve questions on guidance and
provide multimedia inspectors with a source for technical
expertise.
Penalties Not Assessed

MADEP did not assess penalties against Major Sources #1
and #2. MADEP did not believe these violations
represented a significant potential harm to public health,
safety or the environment. Major Source #3 had violated
opacity limits since 1993 before MADEP assessed a penalty.
Consent Orders and NONs had been issued from 1993 to
1996. MADEP considered its actions to be a long term
enforcement strategy and believed the strategy achieved a
sustainable remedy for environmental problems at the plant.
For Major Sources #1 and #2, files showed no evidence that
MADEP considered whether a penalty should have been
sought or if there was any economic benefit to the facilities
for their noncompliance. While a penalty was finally
assessed against Major Source #3, the time involved was
excessive in our opinion.
We bring this information to the Region's attention because
in its January 27, 1994 Multi-Media Overview Report of
MADEP, the Region was concerned that MADEP was
relying excessively on the use of NONs. The report further
stated:
Yet, NONs are not appropriate as the only
enforcement response to substantial violations.
Indeed, some types of violations deserve judicial or
administrative action, with penalties, as the first
enforcement intervention. Such stronger actions
send a more powerful deterrence message to the
violator and the regulated community at large.
Moreover, collecting penalties that recoup the
economic benefit to the violator from its violations
ensures that all members of the regulated community
are treated equally under the law. In other words,
entities that violate the law are prevented from gaining an
economic advantage over their law-abiding competitors.
In future discussions with MADEP, the Region should assure
that MADEP has considered stronger actions, such as
penalties, and has properly documented the basis for why it
did or did not assess a penalty.
Inspections of Major Air Sources Reduced

Region 1 agreed to let MADEP try an innovative inspection
targeting plan under the FY 1996 Demonstration Grant.
However, this inspection plan reduced the number of
inspections of air major sources to a level which may affect
the state's ability to maintain the enforcement presence it
planned, according to MADEP's January 29, 1996 letter to
Region 1. This reduction also limited the number of
potential SV cases which could be identified and reported.
In its January 29, 1996 grant application, the MADEP
Assistant Commissioner for the Bureau of Waste Prevention
wrote to Region 1, "DEP will increase targeting of smaller
sources who were less-frequently inspected under traditional
targeting approaches, while maintaining a deterrent
inspection and enforcement presence at larger, high
potential risk facilities." Region 1 agreed to this innovative
inspection targeting approach under the FY 1996
Demonstration Grant by not requiring MADEP to inspect a
specific number of air major sources.
According to EPA's March 29, 1991 "Revised Compliance
Monitoring Strategy" (CMS), "The goal of CMS is to develop
the most environmentally effective inspection program for
each State. To accomplish this goal, more open and
frequent planning and discussion between the State and
EPA is required, which will build a stronger State-Federal
partnership." Both Region 1 and MADEP believed the
targeting approach included in the 1996 Demonstration
Grant was the most environmentally effective for
Massachusetts.
However, the CMS also states that the final Inspection Plan
"must explicitly include": 1) a list of sources to be inspected;
2) how the list of sources was determined; and 3) an
estimated resource allocation. The Region 1
Air/Pesticides/Toxics (APT) Chief said that Region 1
departed from the CMS process under the Performance
Partnership Grant (PPG) philosophy. Region 1 did not
require MADEP to provide as grant deliverables the three
elements described as part of an Inspection Plan. The 1996
grant included a description of the basis for selecting
sources for inspections. However, a list of sources to be
inspected, as well as a resource allocation plan, was not
submitted.
The APT Chief said that he would like to see all major
sources "touched" at least once during a four year period.
Depending on the database used, MADEP conducted from
39 to 54 inspections at air major sources. In either case,
MADEP will not meet the APT Chiefs suggested inspection
schedule. At the current rate, it is also doubtful that MADEP
could inspect all its air majors at least once every five years
as stipulated in EPA's March 1980 "Inspection Frequency
Guidance". In our opinion, this shortage in coverage will not
maintain the deterrent presence MADEP claimed it would
continue with its new inspection targeting plan.
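The coverage arithmetic behind this conclusion can be sketched as follows (our illustration, not part of the audit), using the figures cited in this finding: roughly 250 air major sources and 39 to 54 major source inspections performed in FFY 1996.

    MAJOR_SOURCES = 250  # approximate universe of air major sources for FFY 1996

    # Inspections per year needed to touch every major source within each target cycle.
    for cycle_years in (4, 5):
        needed = MAJOR_SOURCES / cycle_years
        print(f"A {cycle_years}-year cycle requires about {needed:.0f} inspections per year")

    # Years of full coverage implied by the FFY 1996 inspection counts.
    for per_year in (39, 54):
        years = MAJOR_SOURCES / per_year
        print(f"At {per_year} inspections per year, full coverage takes about {years:.1f} years")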
Additionally, MADEP and Region 1 need to decide if less
frequent major source inspections are justified. MADEP's
Deputy Assistant Commissioner stated that the inspections
of smaller sources had not detected the serious violations
anticipated. The Deputy Assistant Commissioner also said
that the state was discussing the results and whether or not
to increase its inspections of major sources. EPA's CMS
provides, "An analysis of each State's Inspection Plan
results will be conducted at the end of each year by the
Regional Office." Region 1's APT Environmental Protection
Specialist stated that although targeting of major inspections
had not increased for FY 1997, it would be discussed and
negotiated for FY 1998. Together, Region 1 and MADEP
should evaluate and discuss results of past targeting plans
and make adjustments accordingly to the FY 1998
inspection targeting plan.
CONCLUSION
Region 1 and MADEP need to enhance their partnership
with greater communication. SV identification is a joint
decision making process. In our opinion, by deleting the
requirement for routine SV discussions from the grant
agreement, the Region did not clarify its expectation that
such discussions were to continue. As a result, MADEP was
unable to identify any SVs on its own. Discussions on SV
identification should also assist Region 1 and MADEP to
determine the adequacy of the enforcement action planned.
MADEP's limited use of penalties may have allowed
polluting companies to retain economic benefits and may
have provided an inadequate deterrent to prolonged pollution.
Region 1 and MADEP also need to further discuss the
results of their inspection targeting plans to assure that
adequate opportunities exist to identify significant violations.
Open and continuous dialogue will benefit both partners and
ensure the ultimate success of the partnership: a safe
environment.
REGIONAL RESPONSE
Region 1 generally agreed to implement our
recommendations related to SV identification. The Region
described a number of actions it was taking, such as
developing SV coordination procedures with MADEP and
providing SV identification training to MADEP staff.
The Region also noted that the demonstration grant
attempted to balance EPA's interests in media specific
expectations while providing MADEP enough flexibility to
pilot its approach. However, different expectations evolved:
The IG audit has pointed to the identification and
communication of CAA Significant Violators as an
area where EPA and MADEP diverged on media
specific expectations. This was further exacerbated
by the absence of a clear understanding of how core
program expectations should be addressed in an
experimental demonstration grant.
The Region further responded:
The audit report implies that the region allowed
improper CAA procedural departures. For example,
we no longer required SV meetings as a grant
requirement. First, because we no longer required
SV meetings in the grant does not mean we
abandoned the notion that these were an expected
practice. In the spirit of performance partnership, the
region has attempted to move beyond "dollars for
widgits" as the basis of its relationship with the state
and do not include all of our expectations as grant
requirements. Secondly, during this time period, EPA
and Massachusetts air personnel were meeting
regularly with representatives from the Attorney
General's Office along with EPA legal staff to discuss
significant CAA violators. In fact the level of
participation at these meetings far exceeded what
occurs in most other state enforcement discussions.
Where this process fell short was not capturing the
universe of CAA violations for discussions. In
addition, it was difficult to capture that universe of
CAA violations, because MADEP had invested in
FIRST related coordination/communication in lieu of
media specific coordination/communication.
Communication efforts were further hampered by the
existence of four distinct DEP regions with no clear
mechanism for collecting media specific information.
Concerning MADEP's penalty assessment practices, the
Region stated:
The IG audit raised concerns over MADEP's
enforcement approach with regard to economic
benefits. EPA New England has a fundamental
expectation that state enforcement programs will
neutralize economic benefit from noncompliance. It is
not clear from the case files cited that MADEP has an
endemic problem with economic benefits although we
agree with the audit finding that EPA's response
would likely have been different (at least with regard
to gravity based penalty) for cases reviewed. We
believe that closer communications between MADEP
and EPA on all media specific enforcement matters
will address the enforcement issues discussed in the
audit.
During our exit conference, the Region stated that it was not
clear that the facilities cited in our report received an
economic benefit from their violations. However, the Region
agreed that MADEP should document the basis not only for
instances when a penalty was assessed but also for when one
is not assessed.
Finally on the issue of reduced inspections of air major
sources, the Region wrote:
The partial disinvestment in targeting CAA majors
cited in the audit was not an improper departure from
CAA procedures but rather a strategic decision fully
allowed for under the CAA Compliance Monitoring
Strategy (CMS). At the time, there was general
consensus at EPA New England and its state air
programs that we were overinvesting resources in
major source compliance monitoring. In the spirit of
the Demonstration Grant, we agreed to depart from
the traditional inspection expectation (not
requirement) for major sources. As implementation
progressed, we advised MADEP that there was too
large a disinvestment in major sources (even though
we supported the strategy of investing inspection
resources more broadly than the major expectation).
In fact there is currently programmatic
acknowledgment within EPA that all media CMSs
may now be outdated and require revisitation. This
merely underscores the validity of entertaining
forward looking alternatives to historical compliance
monitoring expectations. There is no current,
universally held expectation on what the right "mix" of
sources to be inspected is.
OIG EVALUATION

The Region has taken positive actions to carry out our SV
identification and reporting recommendations. However, we
do continue to recommend that SV discussions be held
within a specific period of time, such as monthly.
We disagree that our report "implies" the Region allowed
"improper" CAA procedural departures. We stated that the
elimination of SV meetings was one of the reasons
MADEP did not identify SVs. We never cited this activity as
a CAA requirement. We did point out, however, that prior air
media specific grants provided for monthly SV discussions.
We also pointed out that the decline in MADEP's
identification of SVs coincided with the elimination
of this special condition of prior grants. We believed it was
an effective procedure and thus recommended its
reinstatement.
Additionally, we reported that communication
on SVs was missing between MADEP's regional compliance
and enforcement staff and Region 1's air technical staff.
While Region 1 staff may have had several meetings with
the Attorney General's Office and other state staff, obviously
these meetings were not about the identification of SVs;
otherwise, the state would have reported SVs.
Based upon further discussion of the penalty assessment
issue during our exit conference, we revised this section of
the report. We agreed that the state needs to document its
penalty decisions, even when penalties are not assessed.
Finally, we concur with Region 1's plan to increase the
number of major sources to be inspected. We recognize
that in the Demonstration Grant, Region 1 agreed to let
MADEP try an innovative inspection targeting plan because
both parties believed the approach was the most
environmentally effective for Massachusetts in accordance
with the CMS. We did not report Region 1's approval of
MADEP's targeting plan as an "improper departure from
CAA procedures," but rather reported on this change
because it affected the opportunities MADEP had to identify
air major sources which were SVs.
MADEP RESPONSE MADEP agreed that it could improve its reporting of SVs to
EPA and that this effort was underway. Additionally,
MADEP commented that many states and OECA have
debated EPA's definition of the air program's significant
violator. MADEP believed the OIG misrepresented the SV
identification problem. It stated that 32 files were examined
and 29 did not contain SV reporting problems. MADEP also
believed that the OIG misunderstood its Administrative
Penalty Statute in concluding that violations were not a first
time occurrence and penalties should have been issued
against Major Source #1.
MADEP stated, "Now that the demonstration period is
concluded, DEP plans to inspect at least 1/5 of the universe
of air 'majors' during FY98."
OIG EVALUATION We are pleased that MADEP agreed to improve its reporting
of SVs to EPA. While debate may be ongoing regarding the
appropriate definition of significant violators, MADEP and
other states have a responsibility to carry out the program
activities as currently defined. MADEP was aware of this
fact, as evidenced by its earlier statement that even though it
did not agree with EPA's SV definition, it did use it as
a criterion.
MADEP was not correct in stating that the OIG examined 32
files and 29 did not contain SV reporting problems. As
previously explained (See page 11), OIG reviewed seven
files from a universe of 14 which had potential to contain an
SV.
As already described in our finding, we continue to conclude
that more than one violation occurred at Major Source #1.
Regarding the appropriateness of a penalty assessment, we
reported that MADEP's files did not document whether a
penalty should have been sought or if there was any
economic benefit to the facility for its noncompliance.
We concur with MADEP's response to inspect at least 1/5 of
the air major facilities. This will expand MADEP's
opportunities to discover significant violators.
RECOMMENDATIONS We recommend that the Regional Administrator require his
staff to:
1. Amend the current grant and include in future grants,
a special condition for monthly regional/state
meetings to discuss potential SVs and other
enforcement actions.
2. Instruct MADEP to provide Region 1 staff with copies of
issued NONs to assist in discussions to determine
when a violating source should be placed on the
"Significant Violator" list.
3. Provide MADEP staff with guidance and training on
how to designate and report violating sources as
"Significant Violators" under the air program.
4. Instruct MADEP to document in the files of major
sources that consideration was given to elevating a
violation to significant status, along with the reason
for the decision.
5. Negotiate with MADEP an increase in the number of
air major sources to be inspected.
6. Instruct MADEP to document the basis for its decisions
to assess, or not assess, penalties against SVs.
CHAPTER 3
Region 1 and MADEP Need to Resolve
Database Discrepancies
MADEP under reported the number of enforcement actions
and did not enter any compliance inspection and
enforcement (C&E) data in the AFS in FY 1996. In addition,
we found database discrepancies between the MADEP
databases and Region 1's AFS. The untimeliness of
MADEP data input and the incompatibility of MADEP and
EPA systems contributed to this problem. As a result, EPA
did not have a clear picture of state accomplishments and
was forced to use additional resources to correct the
problem.
The FY 1996 Demonstration Grant stated:
DEP will report compliance and enforcement data
semiannually from FMF incorporating MOS, and will
perform data entry quarterly into SSEIS for upload to
the federal AFS . . . (emphasis added)
C&E Data Under Reported and Untimely
Approximately one-half (40) of the NONs issued to major
sources were not entered into AFS until six months after the
fiscal year end. On October 12, 1996, MADEP provided a
disk to Region 1 containing a half year's worth of C&E data
from its Stationary Source Emission Inventory System
(SSEIS) database. MADEP did not update the C&E data in
AFS until March 29, 1997. MADEP maintained two air
program databases. The SSEIS database was the original
database used by MADEP for AIRS (C&E) data. The
MADEP also maintained multimedia C&E data in its Facility
Master File (FMF) database. MADEP provided semi-annual
paper reports containing C&E data to EPA from the FMF
database. However, these paper reports were not a
substitute for entering data into AIRS. The SSEIS database
was compatible with the AFS; however, the FMF was not
compatible with either SSEIS or AFS. Data could be
transferred from SSEIS to update AFS.
In late spring 1995, a MADEP Assistant Commissioner
instructed the MADEP inspectors to stop entering data into
SSEIS and instead to enter it only into FMF. MADEP staff
wanted to avoid redundant data entry into two state
databases. In order to update AFS, MADEP staff in the
Boston Office manually entered data from FMF into SSEIS.
MADEP provided Region 1 an SSEIS disk on October 12,
1996, which was supposed to contain FFY 1996 C&E data.
Region 1 staff found that the diskette contained only half of
the FY 1996 C&E data and notified MADEP of this under
reporting. In late December 1996 or early January 1997,
Region 1 staff contacted MADEP advising that the National
AFS Database needed to be updated since OECA would be
extracting 1996 C&E data by the middle of January. In order
to update AFS, Region 1 staff requested and MADEP
provided a diskette in ASCII format containing C&E data
from its FMF database. Since the FMF and AFS systems
were not compatible, Region 1 staff, with assistance from
one of its contractors, converted the FMF inspection and
enforcement data (but not the NONs) and entered it into the AFS
database. Region 1 staff decided not to convert and enter
the NON data into AFS because OECA would not draw
down this data and Region 1 believed that MADEP would
soon be updating AFS.
Additionally, MADEP did not enter any monetary penalty
data into AFS even though state reports showed penalties
were assessed against major sources. (Penalty data was
maintained in a third separate database.) Neither FMF nor
SSEIS contained monetary penalty data fields. Since AFS
was updated electronically from SSEIS, no penalty data was
available to update AFS. MADEP did not receive any
special waivers from Region 1 exempting it from entering
C&E and penalty data into AFS on a quarterly basis.
Regardless of MADEP's concerns about maintaining separate
media databases, MADEP was still responsible for updating
AFS in a timely manner under the grant agreement's special
conditions. The decision to stop entering data in SSEIS was
made with no contingency plan in place to comply with the
EPA grant requirement for timely entry of data into AFS. Also,
Region 1 was forced to use additional resources to perform
a task for which it had already awarded grant funds to the
state.
Under Reporting a Chronic Problem
MADEP's under reporting of enforcement data into
AFS was identified by Region 1 as a problem based on its
1993/1994 monitoring visit. The 1994 monitoring report
stated that under reporting was not a new problem. It
indicated that work groups were established to deal with this
serious problem but efforts to resolve it needed to be
redoubled. The report also contained a section with areas
that warranted special attention. One of these areas
recommended that MADEP and Region 1 make a concerted
effort to solve the enforcement data reporting problems as
rapidly as possible. The section concluded that under
reporting would necessarily result in Massachusetts being
portrayed as conducting less enforcement than it actually
did.
The under reporting problem has not improved. The
workgroup did not complete its objectives. MADEP's
inadequate data reporting represents a serious and chronic
problem which requires that Region 1's management play a
more aggressive role to ensure that the under reporting
problem is resolved.
Database Discrepancies
In addition to under reporting, database discrepancies
between the MADEP database and Region 1's AFS were
noted during our review. In our effort to select a universe of
major sources to review, we found the data contained in
MADEP's database varied from that in Region 1's AFS.
Some examples noted were as follows:
Enforcement Actions
MADEP's database contained information on 80
enforcement actions (NONs) having been issued in
FY 1996, while the AFS database showed 49
enforcement actions (NONs) issued.
Inspections Conducted
MADEP's database showed that 108 inspections of
major sources were conducted in FY 1996, while the
AFS database contained information on 123 major
sources inspected.
Synthetic Minors
In FYs 1995 and 1996, 411 major sources applied for
Restricted Emission Status (RES) or "Synthetic Minor"
classification. There were 279 sources approved as
RES in FY 1995. However, the FMF and AFS
continued to list these sources as majors.
True Majors
After identifying and subtracting the major sources
that were approved as RES or synthetic minors, we
found the MADEP database showed 39 "true majors"
as having been inspected in FY 1996, while the AFS
database showed 54.
The following discrepancies were found between the AFS
and FMF databases, specifically for MADEP's Northeast
(NERO) and Southeast (SERO) Offices.
NERO Enforcement
There were 14 NONs listed on the AFS but none on
the MADEP listing. Additionally, there were 14 other
NONs on the MADEP listing that were not in AFS.
SERO Enforcement
There were three sources with NONs listed on the
AFS database that were not on the MADEP
database. There were 22 NONs issued to major
sources on the MADEP's database that were not on
the AFS database.
NERO Inspections
The AFS database contained 24 major sources
having been inspected in FFY 1996 which were not in
the MADEP database. There were nine major
inspections in the MADEP's database that were not in
AFS.
SERO Inspections
The AFS showed 12 inspections of major sources
which were not on the MADEP database. MADEP's
database contained seven major source inspections
which were not in the AFS database.
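Reconciling discrepancies of this kind is largely a record-matching exercise. The following is a minimal, purely illustrative sketch in Python of how NON records from two database exports could be compared; the file names and column headings are assumptions for the sketch and do not reflect the actual FMF or AFS export formats.

    # Illustrative only: compare NON records in two hypothetical CSV exports,
    # one from MADEP's FMF and one from EPA's AFS. File names and column
    # headings ("facility_id", "non_date") are assumptions, not actual layouts.
    import csv

    def load_nons(path):
        """Read a CSV export and return the set of (facility_id, non_date) keys."""
        with open(path, newline="") as f:
            return {(row["facility_id"], row["non_date"]) for row in csv.DictReader(f)}

    fmf = load_nons("fmf_nons_fy96.csv")   # state (FMF) export
    afs = load_nons("afs_nons_fy96.csv")   # federal (AFS) export

    # NONs recorded by the state but never uploaded to AFS
    missing_from_afs = sorted(fmf - afs)
    # NONs in AFS with no matching state record
    missing_from_fmf = sorted(afs - fmf)

    print(f"{len(missing_from_afs)} NONs in FMF but not AFS")
    print(f"{len(missing_from_fmf)} NONs in AFS but not FMF")
    for key in missing_from_afs + missing_from_fmf:
        print(key)

Records appearing in only one of the two lists would then be researched and corrected in the appropriate system.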
CONCLUSION
In December 1996, Region 1 and MADEP started another
work group, in conjunction with the Performance Partnership
Agreement, to address questions about the reliability of the
two databases. One of the work group's initial tasks was to
identify the differences between the databases. Also, in
March 1997, the EPA Region 1 Administrator sent each New
England state a matrix of its C&E data contained in AFS to
determine its accuracy. The Region 1 and MADEP workgroup
was aware of the database differences but had not begun to
reconcile the databases at the conclusion of our field visit.
The workgroup's agenda contained a milestone to reconcile
the databases on a quarterly basis from January through
October 1997.
Considering the length of time that data reconciliation has
remained a problem, Region 1 and MADEP need to more
aggressively resolve this issue. Region 1 also needs to hold
MADEP accountable for carrying out the grant conditions it
agreed to when it accepted program funding. Inaccurate
data will adversely affect EPA's decision making.
REGIONAL RESPONSE
The Region agreed that improvement needed to be made:
Another shortcoming of the region's program
relationship with Massachusetts was our inability to
make SEIS (sic), FMF and AFS communicate
effectively. We concur with your finding that EPA
New England and MADEP need to resolve database
discrepancies. We appreciate that states such as
Massachusetts may have data needs unaddressable
through EPA data systems and that these EPA
systems may not be optimally designed for the
challenges of the next millennium. However, this
does not eliminate the obligation of a state to
adequately support EPA's data needs. We are
working with MADEP to reconcile the scope and
mechanics of data reporting for our respective
systems.
Because the Region and MADEP were making efforts to
improve the data system, the Region did not intend to
withhold FY 1997 grant funds for nonperformance of data
maintenance.
OIG EVALUATION
We encourage the Region to continue its effort on this issue.
However, we continue to recommend that the Region
withhold grant funds for nonperformance. MADEP agreed to
support EPA's data needs as part of its grant agreement.
Because MADEP did not carry out that responsibility, the
Region had to expend other resources to meet its data
needs.
MADEP RESPONSE
While MADEP agreed that data management could be
improved, it disagreed with the OIG's presentation of the
problems:
The OIG misrepresents DEP's compliance and
enforcement reporting to EPA-New England. All
references by the OIG to DEP's failure to report any
compliance and enforcement data for FFY96 are
incorrect statements of the facts, ignoring semi-
annual paper reports, and the electronic data
provided, however late. DEP explained to the OIG
that, although internal problems with reporting of
significant violators and with electronic data reporting
exist, semi-annual paper reports of all compliance
and enforcement activities were provided to EPA in a
specially-prepared format to accommodate EPA's
segregated regulatory programs. DEP offered to
reconcile the specific data differences between AFS
and FMF, but EPA-New England failed to provide the
requested AFS data necessary for reconciliation.
That data was recently provided to DEP. Also,
references to the electronic data provided to EPA in
FMF-format which required additional EPA resources
to process are misleading. This data was provided to
EPA on diskette in ASCII format, the universal data
format, at EPA's request. Additional work remains to
be done, largely by DEP, to improve electronic data
reporting.
OIG EVALUATION Our report was amended to show that MADEP submitted
semi-annual C&E paper reports to EPA. However, these
reports were not substitutes for the grant requirement to
update AIRS quarterly. MADEP acknowledged this
requirement since it had staff enter data from FMF into
SSEIS in order to update AIRS. MADEP entered no data
into AIRS during FFY 1996; incomplete data was entered in
October 1996, and complete data for FFY 1996 was not
entered until March 1997. The lateness in updating AIRS
and the incompleteness of data resulted in enforcement
data being under reported in FFY 1996. In order for data to
be useful it must be current and complete. At the time of our
field work we were advised that no data for FFY 1997 had
been entered into AIRS. This demonstrates a chronic
problem which needs to be addressed by Region 1.
Regarding reconciliations, MADEP acknowledged that the
necessary AFS report was received, and Region 1 stated
that reconciliations are taking place. We trust that the
Region/MADEP partnership will prevent a recurrence of the
problems MADEP claimed it had in obtaining data from Region 1.
MADEP's submission of data in ASCII format rather than
FMF format is irrelevant. The grant agreement required
MADEP to update AIRS. Region 1 in fact updated AIRS
because it had to convert MADEP's ASCII formatted data in
order to get it into AIRS. This was not Region 1's
responsibility.
We concur that MADEP needs to do additional work to
improve data reporting.
RECOMMENDATIONS We recommend that the Regional Administrator:
1. Require MADEP to comply with the grant's reporting
requirements such as entering C&E along with
penalty data into AFS on a quarterly basis.
2. Require MADEP to provide, for Region 1's review,
quarterly reconciliations between AFS and FMF
databases.
3. Consider adjusting the amount of the current MADEP
grant award for not complying with the 1996 grant
requirements. Also, adjust future grant awards if
MADEP's reporting does not improve or begins to
regress.
CHAPTER 4
Region 1 Needs to Evaluate FIRST Protocol
While MADEP provided an evaluation of its use of the
FIRST protocol, this evaluation's conclusions were not
definitive. Also, MADEP claimed in its report that a single
inspector was performing the inspections when in fact
staffing varied. FIRST protocol inspections did not include
all minimum requirements for a Level 2 inspection even
though Region 1's State C&E Coordinator said the
inspections were to be equivalent to a Level 2 inspection.
Additionally, MADEP had not established a structured
training program for FIRST inspections and did not maintain
a control system to track training needs. Region 1 allowed
MADEP to conduct its own evaluation and did not plan to
review MADEP's use of the FIRST protocol until 1999. In
our opinion, Region 1 placed too much reliance upon the
state to evaluate itself. Without conducting its own
evaluation, Region 1 cannot be assured that FIRST protocol
inspections met federal inspection standards or adequately
identified significant environmental problems and supported
enforcement actions.
Region 1 and MADEP developed the FIRST protocol, issued
October 31, 1994, for use during multimedia inspections.
Region 1 allowed MADEP to use the FIRST protocol with the
understanding that an evaluation would be conducted. In a
March 29, 1996 letter from MADEP's Assistant
Commissioner, Bureau of Waste Prevention, to Region 1,
MADEP agreed to perform an evaluation of the FIRST
protocol to determine if its use resulted in:
- the discovery of significant environmental problems, and
- sufficient support for enforcement cases.
MADEP included its evaluation of the FIRST protocol in its
April 1997 report, "An Evaluation of the Massachusetts
Compliance Assurance Demonstration Grant." MADEP's
Deputy Director for the Business Compliance Division stated
that Chapter 2, which showed increased enforcement
effectiveness, supported the conclusion that multimedia
inspections were effective. In Chapter 2, "Inspection and
Enforcement Trends and Analysis," MADEP reported that
since 1993 its rate of overall enforcement actions increased
and, except for 1996, its higher level enforcement actions
increased. Chapter 4 also included an evaluation of the
FIRST protocol based upon interviews with MADEP
inspectors.
MADEP Evaluation Not Conclusive
In our opinion, conclusive support was missing from
MADEP's April 1997 report to substantiate whether the
FIRST protocol was effective in identifying significant
environmental problems. In the report's Executive
Summary, MADEP stated that significant reporting and
reconciliation difficulties were encountered due to the
installation of a new multimedia data system. As a result,
MADEP claimed an 85 percent accuracy rate for the data
provided. In our opinion, this left too much room for error in
evaluating trends.
Additionally, MADEP did not compare its results to a period
when traditional inspection approaches were used.
MADEP's trends and analysis reports started with 1993,
when multimedia inspections were implemented on a full
scale basis. Therefore, we do not know the results of
different inspection approaches.
MADEP did not define significant violations. We do not
know what types of violations made up the overall
enforcement rate. A schedule entitled "Higher Level
Enforcement Rate (as % of inspections)" was also
presented. While MADEP indicated that such actions
represented more serious environmental problems, we do
not know if EPA-defined significant violations would be
included in this category. This schedule (exhibit 2-3)
showed increases from 1993 but a decrease in 1996.
MADEP suggested various reasons for changes in
enforcement rates, from revised targeting to providing
enforcement training for inspectors. It also wrote, "Note that
the enforcement rate numbers suggest the efficacy of the
FIRST Protocol at finding environmental problems."
However, MADEP did not explain how it reached this
conclusion. No direct correlation was made showing how a
particular inspection approach resulted in identification of
violations.
MADEP's evaluation was also based upon interviews with
inspectors. However, MADEP qualified its evaluation with:
Any conclusions and findings must be qualified with
an awareness that a great deal of ambivalence
remains in DEP regional offices and among some
DEP staff as to whether any types of multimedia
inspection should have been attempted, and for what
types of regulated entities it is or would be most
effective.
Since MADEP staff may be ambivalent about this new
approach, Region 1 staff may be able to add some
objectivity to the evaluation.
Because of the above weaknesses in MADEP's evaluation
of its use of the FIRST protocol, we believe Region 1 needs
to conduct its own evaluation, including a determination of
how well FIRST inspections are meeting federal requirements.
Inspections Not Level 2 Equivalent
For example, Region 1 staff should determine if FIRST
inspections were equivalent to EPA's Level 2 inspections as
envisioned. For six inspection reports reviewed, we determined
that one inspection was clearly equivalent to a Level 2 inspection;
one inspection was not equivalent; and the remaining four
inspections did not include adequate documentation to support a
Level 2 designation. In general, those four inspections did not
adequately address control equipment.
Region 1's State C&E Coordinator stated that inspections
using the FIRST protocol were equivalent to Level 2
inspections. EPA's "CAA Compliance/Enforcement
Guidance Manual," Chapter Three, states that Level 2
inspections are considered a compliance determining
inspection.
Additionally, only Level 2 inspections are counted for
reporting purposes in EPA's database. EPA's March 29,
1991, "Revised Compliance Monitoring Strategy" provided
that: "For an on-site visit to a stationary source to be
countered as an inspection, it must meet the minimum
requirements of a Level 2 inspection as determined in "The
Clean Air Act Compliance/Enforcement Guidance Manual"
(Revised 1987),..."
For the six inspections we reviewed: the one inspection
considered Level 2 equivalent was reported in AFS; the one
inspection not Level 2 equivalent was reported in AFS; and
two of the remaining four were reported in AFS. Thus,
MADEP received the same credit as other states for conducting
Level 2 inspections when in fact not all the minimum
requirements were met.
FIRST Protocol Pilot Identified Problems
Region 1's APT Chief said Region 1 would review the FIRST
protocol during its state multimedia enforcement review in
FY 1999. In our opinion, Region 1 allowed too much time to
pass before making its own assessment of a new, innovative
approach. We noted that a 1994 joint EPA/state evaluation
of the FIRST protocol pilot resulted in identifying serious
problems. An August 30, 1994 memorandum from the EPA
Pilot Evaluation Team to MADEP stated:
In general, it appears that the pilot showed that the
protocol failed to meet some, if not all, of its intended
objectives. As a result of field testing, field staff and
their managers felt, to varying degrees, that the
protocol did not save time, did not necessarily help
them to recognize significant violations, and did not
help them more readily recognize pollution reduction
opportunities. Furthermore, pilot staff who felt weak
in a particular program seemed to require more
guidance, not less, and documentation of inspections
with an abbreviated format was found to be
insufficient to support follow-up and impractical
logistically.
MADEP's April 1997 evaluation continued to report similar
concerns. Chapter 4 listed the following concerns
regarding use of FIRST inspections:
- they are not as likely to provide as much depth as
an inspection for a single waste medium;
- they are more time-consuming (takes longer to do
an inspection);
- inspectors are expected to know details of every
program and it is difficult to remain current on so
many regulations;
- they are difficult to perform effectively for large
facilities;
- fewer air and hazardous waste inspections occur
statewide (that is, fewer targeted inspections of EPA
priority sources or "majors" in these programs).
Considering that problems identified at the pilot stage continued
into the implementation phase, we believe Region 1 should
become active in reviewing the protocol now.
Inspection Staffing
We also found that even though MADEP stated in its April
1997 report that multimedia inspections were to be
performed by a single inspector rather than a team of single-
media inspectors, staffing varied by state regional office.
Interviews with three of the four MADEP Deputy Regional
Directors disclosed that each held a different view on
staffing of multimedia inspections. The NERO Deputy
Regional Director stated that, regardless of facility size, only
one inspector was sent 90 percent of the time. She
believed MADEP intended for a single inspector, not a team
of single-media inspectors, to perform the multimedia
compliance inspections.
The SERO Deputy Regional Director emphatically stated
that multimedia inspections did not mean one inspector per
facility, but rather one facility would be completely inspected
at one time, even if it took more than one inspector. He said
it would be highly unlikely, and only in extremely rare
instances, that one inspector could effectively inspect a
major facility. He also stated that a small facility, with only
one process, could be inspected by a single inspector.
The CERO Deputy Regional Director stated that a major
concern of many inspectors was the expectation that they
will have to be the sole, expert inspector for each multimedia
area. She said inspectors very often went in pairs or sought
the advice of other inspectors to ensure all multimedia
inspections were adequately performed. She stated two
inspectors were used to inspect a large major facility.
However, one inspector might be sent to inspect a facility
considered to be a minor source of pollution.
By relying solely on MADEP's April 1997 report, Region 1
did not receive the entire picture of how the FIRST
inspections were implemented. MADEP's report gave the
impression that only one inspector was conducting
multimedia inspections.
Region 1 needs more detailed information to make a better
decision regarding the future use of the FIRST protocol.
Region 1 also needs to evaluate whether MADEP's use of
the FIRST protocol significantly deviated from national
standards. Part of Region 1's responsibilities is to ensure
that national standards are implemented, monitored and
enforced consistently in all the states (EPA Assistant
Administrator's February 21, 1996 memorandum, "Core EPA
Enforcement and Compliance Assurance Functions"). Once
Region 1 has finalized its review, it should seek EPA
Headquarters concurrence for permanent use and the
possibility of encouraging other states to use this new
approach.
MADEP Needs to Establish a Structured Training Program
As part of Region 1's evaluation of MADEP's
implementation of the FIRST protocol, we recommend
that Region 1 review MADEP's training of multimedia
inspectors. MADEP did not have training criteria for
multimedia inspectors and did not maintain training records
or a tracking system to determine inspector training needs.
We believe a more structured training program would help
the state perform multimedia inspections effectively and
consistently.
The June 29, 1988, EPA Order 3500.1 provides:
Because State and local personnel perform more
than 80% of all environmental compliance inspections
nationally under delegated or approved programs, it is
essential for EPA to work with the State and local
agencies to help assure that their personnel too
receive ample training and development. Although
this program does not require State/local agencies to
train compliance inspector/field investigators, it does
encourage these agencies to adopt structured
approaches to train their personnel, recognizing
State-specific concerns and the value of alternate
instructional methods, and to use EPA-developed
training materials where appropriate.
The Order further states that: "EPA's training program
recognizes the importance of this mutual relationship in
design and implementation of inspector training."
MADEP's Training Coordinator, Bureau of Waste Prevention
(BWP), stated there was no written criteria or established
policy for multimedia inspector training. The one exception
was the requirement for multimedia inspectors to attend
health and safety training. He also stated that there was no
state-wide certification program for MADEP multimedia
inspectors. Additionally, MADEP did not control training by
maintaining individual training records or a database tracking
system. Without a structured training program and a
database to track individual training, there was no assurance
that inspectors had been adequately trained to
perform multimedia inspections. Effective inspector
multimedia training is essential to the consistent, thorough
completion of multimedia environmental compliance
inspections.
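A tracking system of the kind discussed here need not be elaborate. The following is a minimal, purely illustrative sketch in Python of the sort of record keeping that would let a supervisor see which courses an inspector has completed or still needs; the table layout and sample course names are hypothetical and are not drawn from MADEP or EPA training requirements.

    # Illustrative only: a minimal inspector-training tracking table using
    # SQLite. The schema and the sample course names are hypothetical; they
    # are not MADEP's or EPA's actual training requirements.
    import sqlite3

    conn = sqlite3.connect("inspector_training.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS training (
            inspector   TEXT NOT NULL,
            course      TEXT NOT NULL,
            completed   DATE,            -- NULL until the course is taken
            PRIMARY KEY (inspector, course)
        )
    """)

    # Record one completed course and one required course not yet taken.
    conn.executemany(
        "INSERT OR REPLACE INTO training VALUES (?, ?, ?)",
        [
            ("J. Smith", "Health and Safety", "1996-06-15"),
            ("J. Smith", "Multimedia Inspection Basics", None),
        ],
    )
    conn.commit()

    # List outstanding training needs by inspector.
    for inspector, course in conn.execute(
        "SELECT inspector, course FROM training WHERE completed IS NULL ORDER BY inspector"
    ):
        print(f"{inspector}: needs {course}")

Even a simple table of this sort would supply the individual training records and tracking capability that were missing.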
Multimedia classroom training was conducted initially in
June 1994 and again in November 1994. However, since
November 1994, only written training materials and guidance
documents were provided; no general classroom instruction
was conducted. Classroom training since November 1994
was offered only in media specific topics. Inspectors hired
since November 1994 have been trained almost entirely
through informal, field "mentoring" approaches conducted on
an as-needed basis by each state regional office.
We also learned that inspector performance was informally
discussed between the supervisor and the inspector without any
formal, written performance evaluations being prepared. Two of
the three Deputy Regional Directors stated that supervisors
should observe inspector performance and prepare formal
written evaluations of the inspector's performance on an
annual basis. Both Deputy Regional Directors believed this
was an excellent way to identify areas where inspectors may
need additional multimedia training. The third Deputy
Regional Director believed the current system was adequate
and additional formal written evaluations were not
necessary.
According to Chapter 4 of MADEP's April 1997 evaluation
report, inspectors offered the following criticisms:
- not enough training at the time;
- lack of hands-on training at sites in the field;
- lack of training for inspectors who have joined BWP
since the protocol was delivered;
- absence of on-going training;
- lack of follow-up training subsequent to training
sessions;
- training should be geared to different levels of
inspectors;
- training should be interactive;
- inspectors would have liked longer and more
comprehensive training, particularly in the media with
which they were unfamiliar.
Additionally, the MADEP Deputy Director for Business
Compliance Division, Enforcement and Audit Unit; MADEP's
Training Coordinator, BWP; and three MADEP Deputy
Regional Directors expressed the following concerns about
multimedia training:
- there were no written criteria for multimedia training;
- lack of a central/regional database tracking system
to ensure all training is documented; there was no
central database tracking system for accessing an
inspector's training file to determine what courses an
inspector had completed or needed to complete;
- there were no training records;
- lack of training funds;
- Region 1 had not been responsive to MADEP
training needs;
- needed more in-depth multimedia specific training;
- needed to provide people more skills on how to work
together as a multimedia team;
- needed a larger, better selection of training courses;
and
- difficult to get training in a central training classroom.
CONCLUSION Region 1 needs to conduct its own evaluation of MADEP's
use of the FIRST protocol. MADEP's evaluation did not
provide conclusive evidence that the protocol was more
effective in identifying significant violators, especially as
defined by EPA. Data system weaknesses and MADEP
staff's resistance to accepting the FIRST protocol may have
clouded results. Also, MADEP's evaluation reported that
problems which were identified in the pilot phase continued
into the implementation phase. MADEP's presentation of
how the FIRST inspections were staffed was unclear. An
EPA review should also focus on how well the FIRST
inspections met federal standards. Our limited
review showed that FIRST inspections may not be
equivalent to Level 2 inspections. This information is
important to determine program success.
Adequate training of inspectors is another important aspect
in determining how well the FIRST protocol was working.
MADEP did not have training standards, a structured training
program, or controls in place to assure that inspectors
acquired the technical expertise to perform multimedia
inspections effectively. Region 1 and MADEP should work
together to develop a structured multimedia training
program. Such a program would help to ensure the successful
implementation of MADEP's innovative inspection
approach.
With an independent evaluation of the FIRST protocol and a
structured training program, Region 1 can assist not only
MADEP but also other states that may wish to learn from this
demonstration program.
REGIONAL RESPONSE In response to our recommendations, the Region stated that
it will:
1. commit to its own analysis of the FIRST
protocol in FY 1998;
2. share the results of its review with OECA and
solicit input from appropriate OECA staff and
management;
3. work with MADEP to ensure that there is a
clearly articulated training program in place at
MADEP for inspectors applying the FIRST
protocol; and
4. encourage MADEP to document the training
profiles of its inspectors.
Region 1 also stated that it typically includes a discussion of
state assistance needs in the PPA/PPG and will be sure to
include training support in the FY 1998 PPA discussions.
OIG EVALUATION
We are pleased with the Region's positive actions to
address our recommendations.
MADEP RESPONSE
MADEP believed that the OIG did not properly interpret
segments of its April 1997 report, An Evaluation of the
Massachusetts Compliance Assurance Demonstration
Grant. Its statement regarding an 85 percent accuracy
rate for data was meant only to "characterize" the data.
Additionally, MADEP's statement that conclusions and
findings must be qualified with an awareness that some
MADEP staff were ambivalent toward the multimedia
approach was not meant to qualify the entire report.
MADEP did not believe it was appropriate to compare an air
quality Level 2 inspection to a multimedia FIRST inspection.
The FIRST protocol was never intended to be equivalent to
a single-medium inspection according to MADEP. It wrote,
"We remain confident that the environmentally significant
issues of regulatory compliance were not overlooked, and
that any potential losses in depth in individual regulatory
programs are outweighed by the benefits of breadth across
all regulatory programs." MADEP also stated that the OIG
did not find that the FIRST protocol was ineffective in
identifying significant violations.
Finally, MADEP did not agree that a structured multimedia
inspector training program was warranted. MADEP believed
its initial multimedia training, along with various media
specific and on-the-job training, was sufficient.
OIG EVALUATION We have not changed our conclusion that conclusive
support was missing from MADEP's April 1997 report to
substantiate the effectiveness of the FIRST protocol.
MADEP's conclusions were based upon staff interviews
(which it qualified) and its data (85 percent accurate). Our
finding was not intended to conclude that the FIRST protocol
was or was not effective, only to show that sufficient
information was not available to make a conclusion. This is
one reason why we recommended that Region 1 conduct its
own evaluation. The other reason was to assure that
inspections using the protocol address federal expectations.
One such expectation is whether the inspections are
equivalent to an EPA Level 2 inspection.
Region 1 staff stated that FIRST inspections were to be
"equivalent" to Level 2 inspections. The FIRST inspection
manual provided for a comparable review. Our review
showed that such comparability may not be occurring.
Therefore, we believe the Region should evaluate this issue.
We continue to recommend that MADEP establish a
structured training program. Without training standards,
inspector training records, or a data tracking system,
MADEP had no controls to assure the adequacy of
inspector training. MADEP undertook a significant change in
how inspections were conducted. MADEP's April 1997
report indicated that some personnel had still not bought into
this concept and others believed they needed more training.
In our opinion, a structured training program should help
address these issues.
RECOMMENDATIONS We recommend that the Regional Administrator require his
staff to:
1. Conduct an independent evaluation of MADEP's use
of the FIRST inspection protocol prior to FY 1999. The
Region should consider efficiency and cost.
2. Discuss the results of the review with EPA
Headquarters and, if applicable, seek Headquarters
concurrence for permanent or expanded use of the
FIRST protocol.
3. Encourage MADEP to adopt a structured training
program to ensure all inspectors are adequately
trained to perform multimedia inspections.
4. Work with MADEP staff to develop training criteria for
multimedia inspector training.
5. Assist MADEP to develop a database tracking system
to monitor inspector training.
6. Assess MADEP's training needs annually through
state/EPA work group participation. This will help
improve communications between EPA and MADEP.
APPENDIX 1
GLOSSARY OF ACRONYMS
AFS      Air Facility Subsystem
AIRS     Aerometric Information and Retrieval System
APT      Air, Pesticides, Toxics
BWP      Bureau of Waste Prevention
CAA      Clean Air Act
C&E      Compliance and Enforcement
CERO     Central Regional Office (of MADEP)
CMS      Compliance Monitoring Strategy
EAD      Eastern Audit Division
FFY      Federal Fiscal Year
FIRST    Facility-wide Inspection to Reduce the Source of Toxics
FMF      Facility Master File
FMFIA    Federal Manager's Financial Integrity Act
FY       Fiscal Year
MADEP    Massachusetts Department of Environmental Protection
NERO     Northeast Regional Office
NESHAP   National Emissions Standards for Hazardous Air Pollutants
NON      Notice of Noncompliance
NSPS     New Source Performance Standards
OECA     Office of Enforcement and Compliance Assurance
PPG      Performance Partnership Grant
SERO     Southeast Regional Office
SIP      State Implementation Plan
SSEIS    Stationary Source Emission Inventory System
SV       Significant Violator
VOC      Volatile Organic Compound
APPENDIX 4
DISTRIBUTION
Headquarters
Office of Inspector General - Headquarters (2421)
Agency Audit Follow up Coordinator (3304)
Agency Audit Follow up Official (3101)
Assistant Administrator for Enforcement and Compliance Assurance (2201 A)
Assistant Administrator for Air & Radiation (6101)
Associate Administrator for Congressional & Legislative Affairs (1301)
Associate Administrator for Communications, Education & Public Affairs (1701)
Associate Administrator for Regional Operations & State/Local Relations (1501)
EPA Library (3403)
EPA Region 1
Regional Administrator
Director, Office of Environmental Stewardship
Enforcement Managers (OES)
Chief, Air, Pesticides and Toxics Office
Audit Coordinator
Other
Office of Inspector General - Divisional Offices
General Accounting Office