STATE REVIEW FRAMEWORK
New Mexico
Clean Air Act
Implementation in Federal Fiscal Year 2017
U.S. Environmental Protection Agency
Region 6
Final Report
July 11, 2019
I. Introduction
A.	Overview of the State Review Framework
The State Review Framework (SRF) is a key mechanism for EPA oversight, providing a
nationally consistent process for reviewing the performance of state delegated compliance and
enforcement programs under three core federal statutes: Clean Air Act, Clean Water Act, and
Resource Conservation and Recovery Act. Through SRF, EPA periodically reviews such
programs using a standardized set of metrics to evaluate their performance against performance
standards laid out in federal statute, EPA regulations, policy, and guidance. When states do not
achieve standards, the EPA will work with them to improve performance.
Established in 2004, the review was developed jointly by EPA and Environmental Council of the
States (ECOS) in response to calls both inside and outside the agency for improved, more
consistent oversight of state delegated programs. The goals of the review that were agreed upon
at its formation remain relevant and unchanged today:
1.	Ensure delegated and EPA-run programs meet federal policy and baseline performance
standards
2.	Promote fair and consistent enforcement necessary to protect human health and the
environment
3.	Promote equitable treatment and level interstate playing field for business
4.	Provide transparency with publicly available data and reports
B.	The Review Process
The review is conducted on a rolling five-year cycle such that all programs are reviewed
approximately once every five years. The EPA evaluates programs on a one-year period of
performance, typically the year prior to the review, using a standard set of metrics to make
findings on performance in five areas (elements) around which the report is organized: data,
inspections, violations, enforcement, and penalties. Wherever program performance is found to
deviate significantly from federal policy or standards, the EPA will issue recommendations for
corrective action which are monitored by EPA until completed and program performance
improves.
The SRF is currently in its 4th Round (FY2018-2022) of reviews, preceded by Round 3
(FY2012-2017), Round 2 (FY2008-2011), and Round 1 (FY2004-2007). Additional information
and final reports can be found at the EPA website under State Review Framework.
II. Navigating the Report
The final report contains the results and relevant information from the review including EPA and
program contact information, metric values, performance findings and explanations, program
responses, and EPA recommendations for corrective action where any significant deficiencies in
performance were found.
A.	Metrics
There are two general types of metrics used to assess program performance. The first are data
metrics, which reflect verified inspection and enforcement data from the national data systems
of each media, or statute. The second, and generally more significant, are file metrics, which are
derived from the review of individual facility files in order to determine whether the program is performing its compliance and enforcement responsibilities adequately.
Other information considered by EPA to make performance findings in addition to the metrics
includes results from previous SRF reviews, data metrics from the years between reviews, and multi-year metric trends.
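For orientation, the "State %" values in the Relevant metrics tables later in this report are simply the metric numerator over its denominator, compared against the national goal. The short sketch below is illustrative only; the function and variable names are assumptions for illustration, not EPA terminology, and the example figures are those reported under metric 2b for NMED later in this report.

```python
def metric_percentage(state_n: int, state_d: int) -> float | None:
    """State % as shown in the SRF metric tables: numerator over denominator,
    expressed as a percent. Returns None when the denominator is zero (no
    activity to measure, shown as 0 in the tables)."""
    if state_d == 0:
        return None
    return 100.0 * state_n / state_d

# Example using the figures reported under metric 2b for NMED (11 of 27 files accurate):
print(f"{metric_percentage(11, 27):.1f}%")  # 40.7%, against a national goal of 100%
```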
B.	Performance Findings
The EPA makes findings on performance in five program areas:
•	Data - completeness, accuracy, and timeliness of data entry into national data systems
•	Inspections - meeting inspection and coverage commitments, inspection report quality,
and report timeliness
•	Violations - identification of violations, accuracy of compliance determinations, and
determination of significant noncompliance (SNC) or high priority violators (HPV)
•	Enforcement - timeliness and appropriateness of enforcement, returning facilities to
compliance
•	Penalties - calculation including gravity and economic benefit components, assessment,
and collection
Though performance generally varies across a spectrum, for the purposes of conducting a
standardized review, SRF categorizes performance into three findings levels:
Meets or Exceeds: No issues are found. Base standards of performance are met or exceeded.
Area for Attention: Minor issues are found. One or more metrics indicates performance
issues related to quality, process, or policy. The implementing agency is considered able to
correct the issue without additional EPA oversight.
Area for Improvement: Significant issues are found. One or more metrics indicates routine
and/or widespread performance issues related to quality, process, or policy. A
recommendation for corrective action is issued which contains specific actions and schedule
for completion. The EPA monitors implementation until completion.
C.	Recommendations for Corrective Action
Whenever the EPA makes a finding on performance of Area for Improvement, the EPA will
include a recommendation for corrective action, or recommendation, in the report. The purpose
of recommendations is to address significant performance issues and bring program
performance back in line with federal policy and standards. All recommendations should include
specific actions and a schedule for completion, and their implementation is monitored by the
EPA until completion.
III. Review Process Information
Clean Air Act (CAA)
Initial file selection sent to State July 6, 2018; revised list conveyed July 13, 2018. File review conducted onsite July 30 - August 1, 2018, by Toni Allen (retired), James Haynes (214-665-8546), and Lisa Schaub (214-665-8583).
EPA Contacts: Steve Thompson, Branch Chief (214-665-2769), and Margaret Osbourne, Section Chief (214-665-6508).
NMED Contacts: Elizabeth Bisbee-Kuehn, Bureau Chief (505-476-4305); Ralph Gruebel (505-476-4373); and Tom Fitzgerald, Data Steward (505-476-4370).
Executive Summary
Introduction
Clean Air Act (CAA)
The Round 4 SRF Review of the New Mexico Environment Department (NMED) CAA files revealed that the agency has continued to enter enforcement minimum data requirements (MDRs) timely, while improving its timely reporting of HPV determinations and compliance monitoring MDRs, as indicated in the table below comparing findings from Rounds 3 and 4. Accuracy of the facility and permit
MDRs continues to require improvement, as does the coverage of compliance evaluations,
review of Title V annual compliance certifications, and the expediency with which high priority
violations are addressed. Since the Round 3 review, NMED's stack test program appears to have
been neglected, with reporting of results no longer occurring.
Areas of Strong Performance
The following are aspects of the program that, according to the review, are being implemented at
a high level:
Clean Air Act (CAA)
•	Reporting of HPV Determinations, compliance monitoring MDRs, and enforcement
MDRs has been timely.
•	The penalty aspect of the Air Quality Bureau's (AQB) enforcement program is generally meeting expectations.
•	The AQB has performed well at arriving at accurate determinations of violation types,
continuing to meet expectations despite staffing challenges.
Priority Issues to Address
The following are aspects of the program that, according to the review, are not meeting federal
standards and should be prioritized for management attention:
Clean Air Act (CAA)
• During FY2017, inspection coverage at air facilities fell well below NMED's commitment levels, due in part to low staffing levels. Similarly, review of Title V annual compliance certifications (ACCs) dwindled.
•	There were issues identified with accurate reporting of MDRs, and data on stack tests
have not been uploaded to ICIS-Air.
•	Improvements in the timely identification and reporting of HPVs are needed, along with development of a case development and resolution timeline (CD&RT) when warranted.
Metric | Round 3 Finding Level (FY 2013) | Round 4 Finding Level (FY 2018)
2b Timely and accurate reporting of MDRs | Area for State Improvement | Area for State Improvement
3a2 Timely reporting of HPV determinations | Area for State Improvement | Meets or Exceeds Expectation
3b1 Timely reporting of compliance monitoring MDRs | Area for State Improvement | Meets or Exceeds Expectation
3b2 Timely reporting of stack test dates and results | Meets Expectations | Area for State Improvement
3b3 Timely reporting of enforcement MDRs | Meets Expectations | Meets Expectations
5a FCE coverage: majors and mega-sites | Area for State Improvement | Area for State Improvement
5b FCE coverage: SM-80s | Area for State Improvement | Area for State Improvement
5e Review of Title V annual compliance certifications | Area for State Improvement | Area for State Improvement
10a Timely action taken to address HPVs | Area for State Improvement | Area for State Improvement
Clean Air Act Findings
CAA Element 1 - Data
Finding 1-1
Area for Improvement
Summary:
There were issues identified with accurate reporting of MDRs, and data on stack tests have not
been uploaded to Integrated Compliance Information System for Air (ICIS-Air).
Explanation:
(2b) Discrepancies exist between the facility files reviewed and the data recorded in ICIS-Air. For
the 27 facilities examined, five had Regulatory Subparts listed in their permits which were not
recorded in ICIS. Additionally, Subparts which appear to be applicable but are not reflected in the
Title V permit were identified for five facilities. CAA data issues with missing Subparts were cited
in the previous two SRFs conducted of the NMED. Ongoing staff-retention difficulties likely
contribute to the State's continuing data issues. Recent efforts to cross-train staff to assist with data
management should result in improvements to data quality. (3b2) It appears that stack test data
have not been transmitted to ICIS, only to its predecessor, AIRS Facility Subsystem (AFS), which
was replaced by ICIS-Air in 2014. During the March 2018 Monthly Status call, NMED indicated
that staff would be attending training on Compliance Testing, yet uploading of stack test data to
EPA's database of record has not resumed. In March 2019, Allan Morris, a long-time NMED Air Quality Bureau employee, assumed the position of Chief of the Enforcement and Compliance Section, his predecessor Ralph Gruebel having resigned in December 2018. When EPA inquired about the absence of stack test data in ICIS, Mr. Morris chronicled the history of the State's stack test program in detail. Over the past four years, growing numbers of submitted reports and required inspections, particularly at the ever-expanding number of synthetic minors (SM-80s) permitted in New Mexico, have made it harder to stay abreast of required compliance monitoring activities, a challenge compounded by difficulty filling vacant staff positions. Authorization to increase the pay grades of some positions has improved the ability to attract qualified candidates and should improve retention, yet the volume of work appears to exceed staff capacity. Investing in previously explored software solutions to triage electronically submitted stack tests, so personnel can focus on those that appear problematic, might help make the best use of available staff resources.
State Response:
The New Mexico Environment Department (NMED) continues to experience challenges in two
functional areas related to issues identified in Finding 1-1. These are: 1) apparent flaws in the
electronic data transfer (EDT) application used to update regulatory subpart data recorded in ICIS-
Air from NMED's Idea/Tempo database, and 2) inconsistent update of applicable regulatory
subparts in the NMED Idea/Tempo database. To mitigate these challenges NMED is implementing
the following: 1) A contract for enhancements to the NMED EDT application to facilitate
identification and correction of data transfer defects is included in the State's 2020 FY budget. A
statement of work for the contract is under development and EDT enhancements should be
completed by the end of the fiscal year. In the immediate future, all NMED - Air Quality Bureau
compliance staff will be directed to evaluate and confirm regulatory subpart applicability as part
of the on site / off site inspection and compliance report review processes. In addition, staff
assigned monthly ICIS-Air reporting responsibilities will complete manual corrections as
identified during facility compliance reviews. This requirement will be added to internal SOPs by
August 30, 2019. Effective immediately, whenever regulatory subpart applicability corrections are
implemented in ICIS-Air, Lisa Schaub at EPA R6 will be emailed notifications. 2) In collaboration
with the NMED - Air Quality Bureau Permitting Section, a process / SOP for improvement of
applicable regulatory subpart accuracy will be completed and implemented by August 30, 2019.
As described in the Finding 1-1 "Explanation," NMED management has recently become aware
of a significant deficiency in ICIS-Air reporting of compliance test document review/processing
data. In fact, for several years prior to November 2018, only incidental, cursory compliance
evaluation of air quality test document submittals was undertaken by NMED staff. In March 2019,
NMED management completed a review of current practices and confirmed that no standard
procedure existed for reporting of compliance test document review results to ICIS-Air. No feature
had been included in the existing NMED EDT application to facilitate test document review data
transfer, although one NMED position has been dedicated to routine review of test documents
since late fall, 2018. As a long-term solution to this data transfer issue, NMED will include
components to implement test document review data transfer in the FY 2020 EDT enhancement
contract. These components are expected to be functional by the end of the 2020 fiscal year. In the
short term, immediate action will be initiated to manually enter test document review data in ICIS-Air to meet monthly reporting requirements. A plan detailing these two actions will be
prepared and submitted to EPA R6 by August 30, 2019. By December 31, 2019, NMED will
submit to EPA R6 an ICIS report demonstrating that test document review data has been entered
in ICIS-Air for the previous six-month period.
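As a rough illustration of the subpart reconciliation NMED describes above (a minimal sketch under stated assumptions: the function, data structures, and example subparts are hypothetical and are not part of NMED's EDT application or of ICIS-Air), the check amounts to a comparison of two sets:

```python
def reconcile_subparts(permit_subparts: set[str], icis_subparts: set[str]) -> dict[str, set[str]]:
    """Flag regulatory subparts cited in a facility's permit but missing from ICIS-Air,
    and subparts recorded in ICIS-Air that the permit does not cite."""
    return {
        "missing_from_icis": permit_subparts - icis_subparts,
        "not_in_permit": icis_subparts - permit_subparts,
    }

# Hypothetical facility: the permit cites two NSPS subparts, ICIS-Air lists only one.
print(reconcile_subparts({"NSPS OOOO", "NSPS JJJJ"}, {"NSPS OOOO"}))
# {'missing_from_icis': {'NSPS JJJJ'}, 'not_in_permit': set()}
```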
Recommendation:

Rec # | Due Date | Recommendation
1 | 08/30/2019 | EPA suggests that NMED develop a Standard Operating Procedure (SOP) to compare the various permits with the permit data recorded in ICIS when reviewing a facility's compliance status, followed by completion of any indicated corrections. Making review of these data part of routine compliance reviews may also ensure updates have been accurately made. The SOP for checking the permit requirements against those recorded in ICIS should be submitted to and approved by EPA by the due date.
2 | 10/31/2019 | Documentation verifying the air regulation subparts recorded in ICIS-Air against the facility's permit(s) (such as emailed screen shots of the applicable subparts in ICIS along with the permit summaries) should be provided each month for two months after the approval of the SOP.
3 | 08/30/2019 | The transmission of stack test results to EPA's current database of record, ICIS-Air, should resume as soon as possible. A plan to resume this function should be developed upon finalization of this report and submitted to EPA for review.
4 | 12/31/2019 | The plan to provide stack test data in ICIS-Air should be fully implemented by the end of the calendar year, as confirmed by NMED running an ICIS Stack Test Report showing that data from the latter half of the calendar year are present and submitting the report to EPA.
Relevant metrics:
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
2b Files reviewed where data are accurately reflected in the national data system [GOAL] | 100% | -- | 11 | 27 | 40.7%
3b2 Timely reporting of stack test dates and results [GOAL] | 100% | 67.1% | 0 | 0 | 0
CAA Element 1 - Data
Finding 1-2
Meets or Exceeds Expectations
Summary:
Reporting of HPV Determinations, compliance monitoring MDRs, and enforcement MDRs has
been timely.
Explanation:
The reporting of air compliance and enforcement data in ICIS-Air has been promptly executed by
the State since the last SRF. EPA commends the improvements in timely data reporting.
State Response:
NMED appreciates the timely and expert guidance provided by EPA R6 staff to assist in accurate
HPV and MDR data reporting. Full implementation of the NMED electronic data transfer (EDT)
application has also enabled the Air Quality Bureau to realize a substantial improvement in this
element.
Relevant metrics:
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
3a2 Timely reporting of HPV determinations [GOAL] | 100% | 40.5% | 1 | 1 | 100%
3b1 Timely reporting of compliance monitoring MDRs [GOAL] | 100% | 82.3% | 57 | 60 | 95%
3b3 Timely reporting of enforcement MDRs [GOAL] | 100% | 77.6% | 44 | 44 | 100%
CAA Element 2 - Inspections
Finding 2-1
Area for Improvement
Summary:
During FY2017, inspection coverage at air facilities fell well below NMED's commitment levels,
due in part to low staffing levels. Similarly, review of ACCs dwindled.
Explanation:
(5a/5b) The State has not been completing the required number of full compliance evaluations (FCEs) for either majors/mega-majors or SM-80s. In Fiscal Year 2017, only 47.8% of majors received FCEs, while coverage of SM-80s fell to 11.1%. (5e) ACC reviews dropped to 14.5%, with only 19 of 131 ACCs being
reported as reviewed. During on-site discussions, it became apparent that those ACCs reviewed as
part of an FCE have not been reported as reviewed in ICIS, further exacerbating the low coverage
likely resulting from limited staffing. (6a/6b) Some inspection reports did not discuss all pertinent
aspects of the semi-annual reports while others did not evaluate all applicable Subparts. In this
limited file review, EPA found one instance where no report had been written, and two others
where reports were not finalized. These occurrences are symptomatic of the retention problems the AQB has been experiencing. The Air Quality Bureau has successfully reclassified several of its positions, improving the associated pay grades and attracting a more qualified applicant pool. Improved compensation may also increase retention, ultimately benefiting the efficiency and quality of the AQB's enforcement and compliance work.
State Response:
Despite noble efforts on the part of NMED - Air Quality Bureau Compliance and Enforcement
(C&E) Section management, the section experienced continuing staff turnover in FFY 2017 with
a corresponding loss of the knowledge and experience base that facilitates a successful compliance
monitoring program. With several new compliance specialists and a new compliance supervisor
coming on board during the FFY, the section struggled to meet CMS Plan and ACC review
commitments. Unintended inconsistency in new staff training and implementation of standard
procedures between three compliance managers led to substantial variance in work quality and
productivity expectations for section compliance staff. An apparent misunderstanding on the part
of one or more managers within the section led to a realignment of responsibilities of compliance
staff and lower emphasis on commitments for ACC review to meet the September 30, 2017,
deadlines. In the 1.5 years since the end of FFY 2017, the NMED - Air Quality Bureau Compliance
and Enforcement Section has again experienced substantial staffing changes, including placement
of a new section chief and three new staff managers. Several capable new line compliance staff
members have been added to the section. A new compliance inspections staff manager is closely
monitoring inspector CMS Plan achievements, providing consistent process training and guidance
to direct reports and implementing an improved task completion tracking system. This tracking
system incorporates an existing timeline for inspection report completion and area of concern
referral. The compliance inspections staff manager is also collaborating with the C&E section chief
to implement a process for completion of the 42 off site FCE inspections approved with the State's
FFY 2019 CMS Plan. The process includes temporary integration of several compliance reports
staff into the inspections team to complete off site CMS FCEs of natural gas compressor stations on the FFY 2019 Plan. In the course of FCE completion, all involved personnel will ensure that associated ACCs and other current compliance reporting documents are fully reviewed as part of
the FCE process. If C&E Section staffing continues at anticipated levels for the rest of the 2019
FFY, a CMS inspection achievement rate near 75% should be realized and subsequently reported
in the ADMA in early 2020. As of April 1, 2019, all compliance reports staff were directed by
their staff manager to refocus their primary work activities to ACC review. In addition, all
compliance personnel have been asked to enter data for all compliance report reviews into the
State's Idea/Tempo database upon completion of these tasks. Report review and deviation data are currently transferred to ICIS-Air by the NMED EDT application. A new compliance reports staff
manager, who is optimistically expected to be on board at NMED - AQB by June 30, 2019, will
be charged with development of a new instrument for tracking of compliance report reviews,
including those completed by members of the compliance inspections group. This instrument will
be available for review by Lisa Schaub of EPA R6 by August 30, 2019, along with a current FCE
inspection / inspection report completion document. Additional actions under consideration by the
NMED - Air Quality Bureau to ensure long-term achievement of metrics for CAA Element 2
include the following:
•	Utilization of professional contracted services for compliance report review and selected
components of the compliance inspections program.
•	Depending on final results of the joint April 2019 EPA-NMED upstream oil and gas
facility inspections project, development and submittal of an Alternative CMS Plan by the
AQB C&E Section for review and approval action by EPA R6.
Recommendation:
Rec # | Due Date | Recommendation
1 | 08/30/2019 | A procedure should be put in place to ensure that staff managers track the progress of reports, facilitating report completion in the event of staff departure.
2 | 03/31/2020 | Completion of 71% of the annual commitment of FCEs for FFY2019, as reflected in the Annual Data Metric Analysis (ADMA) after the data freeze for FY2019, is requested by the due date.
3 | 03/31/2020 | To address the low percentage of ACCs showing as having been reviewed, it is recommended that all ACC reviews, both those done independently and those done as a component of an inspection, be documented in ICIS. The percentage, as determined by the ADMA, should rebound to 71% by the due date.
Relevant metrics:
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
5a FCE coverage: majors and mega-sites [GOAL] | 100% | 88.7% | 22 | 46 | 47.8%
5b FCE coverage: SM-80s [GOAL] | 100% | 93.7% | 3 | 27 | 11.1%
5c FCE coverage: minors and synthetic minors (non-SM 80s) that are part of CMS plan or alternative CMS Plan [GOAL] | 100% | 85.8% | 0 | 0 | 0
5e Reviews of Title V annual compliance certifications completed [GOAL] | 100% | 76.7% | 19 | 131 | 14.5%
6a Documentation of FCE elements [GOAL] | 100% | -- | 8 | 13 | 61.54%
CAA Element 3 - Violations
Finding 3-1
Area for Improvement
Summary:
Accurate reporting of case information in ICIS remains a challenge.
Explanation:
(7a) Only about 68% of the cases reviewed in this Round had accurate and complete information recorded: incorrect dates were entered in ICIS-Air or improper compliance determinations made for several, while there was one FCE for which no report was written and two other instances where the reports were not finalized. These problems are likely associated with the high turnover and low staffing levels experienced by the Department over the last several years. The tracking by managers recommended under Finding 2-1 should minimize the occurrence of reports that are not finalized (one third of the incidents in this metric). In contrast, the 2014 SRF Report showed 100% accuracy for this measure. The SOPs recommended under Finding 1-1 to ensure agreement among the Subparts in the permit, those recorded in ICIS-Air, and those reviewed during the evaluation of compliance should also improve the Department's ability to reach accurate compliance determinations.
State Response:
NMED - Air Quality Bureau is undertaking an evaluation of the bureau's current enforcement
program, including case data reporting in ICIS-Air. Inaccurate case data reporting in FFY 2017
may have been partially the result of inadequate training of enforcement personnel. NMED - Air
Quality Bureau management also agrees that development of a detailed SOP for ICIS reporting of
enforcement data is essential to ensure consistency going forward. The enforcement staff manager
will lead development of the new SOP and create a spreadsheet as described in Finding 3-1,
Recommendation #1 of CAA Element 3 -Violations.
Recommendation:
Rec # | Due Date | Recommendation
1 | 08/30/2019 | A table of the terms used in TEMPO, the corresponding fields in ICIS, and a description of each milestone (date) should be developed for use by staff completing the evaluations as well as those responsible for data entry and quality assurance. Review and approval of the SOPs by EPA are needed.
Relevant metrics:
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
7a Accurate compliance determinations [GOAL] | 100% | -- | 19 | 28 | 67.9%
CAA Element 3 - Violations
Finding 3-2
Meets or Exceeds Expectations
Summary:
The AQB has performed well at arriving at accurate determinations of violation types, continuing
to meet expectations despite staffing challenges.
Explanation:
(8c, 13) From the files reviewed, the accuracy rate of HPV determinations has been calculated at
86.7% based on a pool of 15 (multiple years reviewed, and includes both FRVs and HPVs), and
the timeliness of reporting determinations to ICIS-Air at 100%, although only one HPV was
reported during fiscal year 2017. The attention to accuracy in these high-priority violations despite
other obstacles is appreciated. Note that cases from 2016 were included in the file review to have
a large enough sampling with violations. (7a1, 8a) The low discovery rates of HPVs and FRVs during the reporting period might seem to indicate good facility compliance in New Mexico.
However, because these are both calculated as a percentage of the total number of facilities rather
than a percentage of inspections or reports completed, these low rates are likely due in part to the
limited number of inspection reports completed in FY2017. The HPV discovery rate was higher
in 2014 (6.5%) when a greater number of inspections were completed. Note that discovery rates
are support metrics and as such do not have associated goals.
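To illustrate the denominator point made above with hypothetical counts (not New Mexico's figures), a discovery rate computed against the full facility universe shrinks when few inspections are completed, while a rate per completed inspection does not:

```python
# Hypothetical counts, used only to show the effect of the denominator choice.
facilities = 200        # all facilities in the universe for the metric
inspections = 60        # inspections actually completed during the year
violations_found = 4    # violations discovered through those inspections

rate_per_facility = violations_found / facilities      # framing used by the support metric
rate_per_inspection = violations_found / inspections   # alternative framing

print(f"per facility:   {rate_per_facility:.1%}")    # 2.0%
print(f"per inspection: {rate_per_inspection:.1%}")  # 6.7%
```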
State Response:
Air Quality Bureau will evaluate current HPV and FRV identification procedures to ensure that
accuracy in reporting is maintained.
Relevant metrics:
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
13 Timeliness of HPV Identification [GOAL] | 100% | 87.7% | 1 | 1 | 100%
7a1 FRV "discovery rate" based on inspections at active CMS sources | -- | 6.2% | 13 | 362 | 3.6%
8a HPV discovery rate at majors | -- | 2.3% | 0 | 175 | 0%
8c Accuracy of HPV determinations [GOAL] | 100% | -- | 13 | 15 | 86.67%
CAA Element 4 - Enforcement
Finding 4-1
Area for Improvement
Summary:
Improvements in the timely identification and reporting of HPVs are needed, along with
developing a CD&RT when warranted.
Explanation:
(10a) Of the 7 HPV cases reviewed, only two were either timely addressed or a Case Development
and Resolution Timeline (CD&RT) was in place. Contributing to this lack of timely handling of
HPVs is the need to ensure that the Day Zero and Discovery Dates follow EPA's HPV Policy. If
the Day Zero is set later than it should be according to the Policy, staff may believe they are not
yet in need of a CD&RT when one is in fact required. The 2014 HPV Policy states, "Day Zero will be
deemed to have occurred on the earlier of either (1) the date the agency has sufficient information
to determine that a violation occurred that appears to meet at least one HPV criterion or (2) 90
days after the compliance monitoring activity that first provides information reasonably indicating
a violation of a federally enforceable requirement." For the Gissler and Jackson Tank Batteries,
the Discovery Date should have been the date the permit staff determined the NSR permit was
complete, and the notification of the permit violation occurred in December 2014, meaning a Day
Zero of no later than March 2015. However, the Discovery Date was noted as May 15, 2015, and
the Day Zero August 13, 2015. Violations at three of these facilities were addressed in a single
settlement, with the owner's disputing some of the violations, which likely increased the time it
took for the violations to be addressed. Additionally, self-reported violations need to be evaluated more promptly. For example, Excess Emissions Reporting submittals have been reviewed only about once every four months. Because violations may have occurred well before they were reported, the Date of Discovery (when the information was reported to the agency's website) can precede the manual review by several months, and the time remaining to put an enforcement action in place is short given New Mexico's one-year statute of limitations after the violation. (14) The five HPVs that were not addressed timely needed to have a CD&RT in
place; this transpired in only one instance. In the case of Jal No. 3 Gas Plant, the violation was
incorrectly identified as an FRV rather than an HPV. Therefore, while a penalty was assessed and
collected, it was not tracked using the HPV schedule. Note that for the Gissler and Jackson Tank
Battery cases, the addressing action was in place timely when using the Day Zero in August 2015,
which was identified by NMED, yet a CD&RT should have been developed since the Day Zero
should have been set 5 months earlier than it was.
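To make the quoted timing rule concrete, the sketch below (hypothetical dates and function names, not drawn from NMED's files) applies the earlier-of Day Zero test and the 180-day check that determines when a CD&RT is expected:

```python
from datetime import date, timedelta

def day_zero(sufficient_info_date: date, monitoring_activity_date: date) -> date:
    """Earlier of (1) the date the agency had sufficient information to determine an
    apparent HPV and (2) 90 days after the compliance monitoring activity that first
    indicated a violation, per the 2014 HPV Policy language quoted above."""
    return min(sufficient_info_date, monitoring_activity_date + timedelta(days=90))

def cdrt_expected(day0: date, addressed_on: date | None, as_of: date) -> bool:
    """A CD&RT is expected once an HPV remains unaddressed 180 days past Day Zero."""
    deadline = day0 + timedelta(days=180)
    if addressed_on is not None and addressed_on <= deadline:
        return False          # addressed timely; no CD&RT required
    return as_of > deadline   # past the 180-day mark without timely resolution

# Hypothetical case: monitoring activity on 2016-11-01, sufficient information on 2017-03-01.
d0 = day_zero(date(2017, 3, 1), date(2016, 11, 1))       # 2017-01-30 (90 days after 11/01)
print(d0, cdrt_expected(d0, None, date(2017, 9, 15)))    # True: >180 days and unaddressed
```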
State Response:
An apparent misinterpretation of HPV identification policy and procedures in 2016 - 2018 may
have caused the failure of the NMED - Air Quality Bureau enforcement group to appropriately
and consistently identify and pursue HPVs in compliance with EPA criteria and procedures for
HPV identification. In addition, limited experience levels of critical, decision-making staff in the
enforcement group likely caused increased errors in management of HPVs, including the incidents
described in the "Explanation" above. By July 1, 2019, training on determination of violation
discovery dates will be provided to all Compliance & Enforcement Section personnel. As of April
1, 2019, the Staff Manager, Enforcement at NMED - Air Quality Bureau has been instructed to
immediately evaluate all referred cases for potential classification as HPV's and follow EPA's
guidance and timelines for HPV task completion. Formal training in HPV identification and
management that meets EPA standards will be provided to all enforcement group staff by October
31, 2019. NMED - AQB will submit a memo documenting completion of HPV training to Lisa
Schaub or other appropriate official at the EPA R6 office by October 31, 2019.
Recommendation:
Rec # | Due Date | Recommendation
1 | 10/31/2019 | The appropriate managers should, as early as practicable, provide additional training to enforcement and compliance staff on the table of HPV timeline dates developed for Finding 3-1 and on when a CD&RT is needed, and should document the training in a memo to EPA indicating that it has been administered.
Relevant metrics:
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
10a Timeliness of addressing HPVs or alternatively having a case development and resolution timeline in place | 100% | -- | 2 | 7 | 28.57%
10a1 Rate of Addressing HPVs within 180 days | -- | 63.7% | 1 | 1 | 100%
14 HPV case development and resolution timeline in place when required that contains required policy elements [GOAL] | 100% | -- | 1 | 5 | 20%
CAA Element 4 - Enforcement
Finding 4-2
Area for Attention
Summary:
In a few of the enforcement actions analyzed, instances of companies applying for relaxed permits
rather than improving their processes or controls were identified.
Explanation:
Most of the enforcement cases reviewed did appear to result in the subject facilities coming into
compliance. However, in 3 of the 15 enforcement actions examined, the AQB's typical approach
of requiring the regulated entity to propose how they would come into compliance seemed to spur
their application for a modified permit to allow them to continue their emissions rather than
seeking a remedy which would reduce their emission risk. For example, the Targa - Monument
Gas Plant indicated in their NOV response that they were submitting a permit application to
authorize "malfunction emissions." It is EPA's understanding that such permit modifications have
been granted. The Bureau may wish to have an internal dialogue as to whether this is a burgeoning issue, as no such observations were made during the previous SRF.
State Response:
NMED - Air Quality Bureau management, including the Compliance & Enforcement Section
Chief, will initiate an internal review to determine whether relaxation of permit requirements is a
pervasive issue that results from improper settlement of NOVs. If this is the case, remedial action
will be immediately initiated to avoid recurrence in on-going and future NOV settlements. The
review and remediation process, as required, will be completed by July 31, 2019.
Relevant metrics:
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
10b1 Rate of managing HPVs without formal enforcement action | -- | 12.9% | 0 | 1 | 0%
9a Formal enforcement responses that include required corrective action that will return the facility to compliance in a specified time frame or the facility fixed the problem without a compliance schedule [GOAL] | 100% | -- | 12 | 15 | 80%
CAA Element 5 - Penalties
Finding 5-1
Meets or Exceeds Expectations
Summary:
The penalty aspect of the AQB's enforcement program is generally meeting expectations.
Explanation:
(12a and 12b) The examined files demonstrated that assessed penalties are being collected and
differences between initial and final penalty are appropriately documented. Note that there was
one instance (Burnett Oil) where the penalty was calculated and later withdrawn by the AQB.
State Response:
The NMED - Air Quality Bureau is conducting an evaluation of all enforcement procedures,
including assessment and collection of fair and appropriate penalties as part of the NOV program.
A thorough review of the current (2016 revision) Civil Penalty Policy (CPP) is underway. The
CPP will be revised if one or more components are found to be inadequate to meet current
programmatic standards.
Relevant metrics:
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
12a Documentation of rationale for difference between initial penalty calculation and final penalty [GOAL] | 100% | -- | 4 | 4 | 100%
12b Penalties collected [GOAL] | 100% | -- | 11 | 11 | 100%
CAA Element 5 - Penalties
Finding 5-2
Area for Attention
Summary:
The file review found a disparity between HPV and FRV cases in the inclusion of all appropriate penalty calculations.
Explanation:
(11a) Review of the selected penalty orders revealed that the State followed the HPV Policy and considered the gravity of the violation as well as economic benefit in instances of HPVs. Gravity and economic benefit calculations are less consistently incorporated for Federally Reportable Violations and were found in only half of the FRV files reviewed, even though the State's penalty policy requires their application for all assessed penalties.
State Response:
As part of the enforcement program evaluation described in the State's response to CAA Element
5 - Penalties, Finding 5-1, the NMED - Air Quality Bureau is completing an analysis of the validity
of the C&E Section's current FRV identification procedures, including consistency of application
of gravity and economic benefit components of the State's CPP. A trend toward exclusive
utilization of the "Alternate Penalty Calculation" ("3-2-1") method in the 2016 CPP is causing
further management concern about proper adherence to Policy provisions. As patterns of non-
adherence are identified in the current program review process, remedial actions will be
implemented by the Enforcement Staff Manager and/or revision of the CPP completed, as
necessary to achieve full compliance with EPA civil penalty standards.
Relevant metrics:
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
11a Penalty calculations reviewed that document gravity and economic benefit [GOAL] | 100% | -- | 8 | 11 | 72.7%
STATE REVIEW FRAMEWORK
City of Albuquerque, New Mexico
Clean Air Act
Implementation in Federal Fiscal Year 2017
U.S. Environmental Protection Agency
Region 6
Final Report
July 17, 2019
I. Introduction
A.	Overview of the State Review Framework
The State Review Framework (SRF) is a key mechanism for EPA oversight, providing a
nationally consistent process for reviewing the performance of state delegated compliance and
enforcement programs under three core federal statutes: Clean Air Act, Clean Water Act, and
Resource Conservation and Recovery Act. Through SRF, EPA periodically reviews such
programs using a standardized set of metrics to evaluate their performance against performance
standards laid out in federal statute, EPA regulations, policy, and guidance. When states do not
achieve standards, the EPA will work with them to improve performance.
Established in 2004, the review was developed jointly by EPA and Environmental Council of the
States (ECOS) in response to calls both inside and outside the agency for improved, more
consistent oversight of state delegated programs. The goals of the review that were agreed upon
at its formation remain relevant and unchanged today:
1.	Ensure delegated and EPA-run programs meet federal policy and baseline performance
standards
2.	Promote fair and consistent enforcement necessary to protect human health and the
environment
3.	Promote equitable treatment and level interstate playing field for business
4.	Provide transparency with publicly available data and reports
B.	The Review Process
The review is conducted on a rolling five-year cycle such that all programs are reviewed
approximately once every five years. The EPA evaluates programs on a one-year period of
performance, typically the year prior to the review, using a standard set of metrics to make
findings on performance in five areas (elements) around which the report is organized: data,
inspections, violations, enforcement, and penalties. Wherever program performance is found to
deviate significantly from federal policy or standards, the EPA will issue recommendations for
corrective action which are monitored by EPA until completed and program performance
improves.
The SRF is currently in its 4th Round (FY2018-2022) of reviews, preceded by Round 3
(FY2012-2017), Round 2 (FY2008-2011), and Round 1 (FY2004-2007). Additional information
and final reports can be found at the EPA website under State Review Framework.
II. Navigating the Report
The final report contains the results and relevant information from the review including EPA and
program contact information, metric values, performance findings and explanations, program
responses, and EPA recommendations for corrective action where any significant deficiencies in
performance were found.
A.	Metrics
There are two general types of metrics used to assess program performance. The first are data
metrics, which reflect verified inspection and enforcement data from the national data systems
of each media, or statute. The second, and generally more significant, are file metrics, which are
derived from the review of individual facility files in order to determine whether the program is performing its compliance and enforcement responsibilities adequately.
Other information considered by EPA to make performance findings in addition to the metrics
includes results from previous SRF reviews, data metrics from the years between reviews, and multi-year metric trends.
B.	Performance Findings
The EPA makes findings on performance in five program areas:
•	Data - completeness, accuracy, and timeliness of data entry into national data systems
•	Inspections - meeting inspection and coverage commitments, inspection report quality,
and report timeliness
•	Violations - identification of violations, accuracy of compliance determinations, and
determination of significant noncompliance (SNC) or high priority violators (HPV)
•	Enforcement - timeliness and appropriateness of enforcement, returning facilities to
compliance
•	Penalties - calculation including gravity and economic benefit components, assessment,
and collection
Though performance generally varies across a spectrum, for the purposes of conducting a
standardized review, SRF categorizes performance into three findings levels:
Meets or Exceeds: No issues are found. Base standards of performance are met or exceeded.
Area for Attention: Minor issues are found. One or more metrics indicates performance
issues related to quality, process, or policy. The implementing agency is considered able to
correct the issue without additional EPA oversight.
Area for Improvement: Significant issues are found. One or more metrics indicates routine
and/or widespread performance issues related to quality, process, or policy. A
recommendation for corrective action is issued which contains specific actions and schedule
for completion. The EPA monitors implementation until completion.
C.	Recommendations for Corrective Action
Whenever the EPA makes a finding on performance of Area for Improvement, the EPA will
include a recommendation for corrective action, or recommendation, in the report. The purpose
of recommendations is to address significant performance issues and bring program
performance back in line with federal policy and standards. All recommendations should include
specific actions and a schedule for completion, and their implementation is monitored by the
EPA until completion.
Executive Summary
Introduction
Clean Air Act (CAA)
Review by the Environmental Protection Agency (EPA) of a selected subset of the enforcement and compliance records of the City of Albuquerque's (COA) Air Quality Program revealed improvement in the City's timely reporting of compliance monitoring minimum data requirements (MDRs) and stack test dates and results to the Integrated Compliance Information System (ICIS), as well as in the documentation of compliance evaluation elements. Opportunities for continued improvement were found in the recording of regulatory subparts applicable to the facilities under the program's purview, as well as in the timely reporting of enforcement MDRs.
Areas of Strong Performance
The following are aspects of the program that, according to the review, are being implemented at
a high level:
Clean Air Act (CAA)
•	The COA exhibits continued excellence in arriving at appropriate compliance
determinations.
•	The City fulfilled its compliance monitoring strategy (CMS) plan in FY2017.
Priority Issues to Address
The following are aspects of the program that, according to the review, are not meeting federal
standards and should be prioritized for management attention:
Clean Air Act (CAA)
• The City of Albuquerque (COA) continues to have some shortcomings in the areas of
timely and accurate reporting of Minimum Data Requirements (MDRs).
Metric | Round 3 Finding Level (FY 2013) | Round 4 Finding Level (FY 2018)
2b Files reviewed where data are accurately reflected in the national data system | Area for Improvement | Area for Improvement
3b1 Timely reporting of compliance monitoring MDRs | Area for Improvement | Meets or Exceeds Expectation
3b2 Timely reporting of stack test dates and results | Area for Improvement | Meets or Exceeds Expectation
3b3 Timely reporting of enforcement MDRs | Area for Improvement | Area for Improvement
5e Reviews of Title V annual compliance certifications completed | Area for Improvement | Area for Attention
6a Documentation of FCE elements | Area for Improvement | Meets or Exceeds Expectation
7b1 Violations reported per informal actions | Area for Improvement | n/a
Clean Air Act Findings
CAA Element 1 - Data
Finding 1-1
Meets or Exceeds Expectations
Summary:
The City met expectations for prompt data entry with respect to High Priority Violation (HPV)
identification, compliance monitoring MDRs, and stack test information.
Explanation:
(3a2) One HPV was identified during the fiscal year and was entered into ICIS timely. (3b1 and 3b2) EPA appreciates COA's progress in entering compliance monitoring MDRs and stack tests
plus their results in a timely fashion. While the prior SRF found only 43.8% of these MDR data
were entered timely, the FY2017 numbers indicate 87.5% were input to ICIS within the requested
time frame. Similarly, the stack test date entry statistics improved from 58.8% to 88.9%.
State Response:
Although we have had challenges with turnover among our data steward, inspectors, and permit writers, all of whom play a role in MDRs, we continue to make efforts to document and clearly
define our data entry process through our ICIS QAPP and associated ICIS standard operating
procedures.
Relevant metrics:
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
3a2 Timely reporting of HPV determinations [GOAL] | 100% | 40.5% | 1 | 1 | 100%
3b1 Timely reporting of compliance monitoring MDRs [GOAL] | 100% | 82.3% | 12 | 14 | 85.71%
3b2 Timely reporting of stack test dates and results [GOAL] | 100% | 67.1% | 8 | 9 | 88.89%
CAA Element 1 - Data
Finding 1-2
Area for Improvement
Summary:
The City of Albuquerque (COA) continues to have some shortcomings in the area of timely and
accurate reporting of Minimum Data Requirements (MDRs).
Explanation:
(2b) In the files reviewed from FY2017, some applicable Subparts were omitted from ICIS, EPA's database of record, for half of the facilities. Previous SRFs found similar discrepancies between the air program and/or Subpart data and the information reported in AFS, the previous database of record. (3b3) Historically, timely entry of enforcement data has been somewhat problematic for the City. In the last SRF, two-thirds of the data were entered timely, while in both FY2016 and FY2017 none of the data were entered into ICIS timely. During FY2015, there was only one enforcement action reported, and it was entered in the system timely.
State Response:
(2b)- In 2018, the City created the ICIS Entry Tracking report for the purpose of assisting with
timely and accurate entries into ICIS. We believe the continued use and supervisory oversight of
the report will increase our entry timeliness and accuracy. The City agrees to Recommendation 1
below, and believes this will assist in meeting this goal. (3b3) - As previously stated, the ICIS
Entry Tracking report has been created, and an ICIS Entry Summary report will be created. The summary report will show where in the process, from ICIS entry request to ICIS entry, any bottlenecks may be occurring that could be impacting timeliness. The City believes that continued familiarity with the tracking report and the subsequent summary report by the data steward, inspectors, and supervisors will help to increase the timeliness of data entry. The City agrees to the goal set out in Recommendation 2 below.
Recommendation:
Rec # | Due Date | Recommendation
1 | 10/31/2019 | To address discrepancies between the air program and subparts recorded in ICIS and those in the facility's permit (metric 2b), it is suggested that the City's new ICIS data steward begin providing inspectors with the list of programs and their subparts as recorded in ICIS so they can check it for accuracy while reviewing the file for each inspection. EPA will work with the data steward to determine a procedure for providing this information. After the end of FY2019, the City is requested to provide documentation that the MDR data in ICIS for facilities inspected in the last quarter of the federal fiscal year agree with the file information, such as PDFs of the permit subparts as they appear in ICIS along with documentation of the applicable subparts from the permit.
2 | 10/31/2019 | (3b3) EPA recognizes that the COA has recently experienced turnover in the position of ICIS-Air data steward. To improve performance on this metric, it is recommended that the standard operating procedure (SOP) for inspectors' transmittal of data to the steward for entry be reviewed for opportunities to increase efficiency. Progress toward the 100% goal will be discussed during the monthly EPA/COA status calls, with monitoring of the Data Metric Analysis (DMA) available in the SRF section of Enforcement and Compliance History Online (ECHO). Meeting the 85% timely mark by the due date (10/31/2019) is requested.
Relevant metrics:
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
2b Files reviewed where data are accurately reflected in the national data system [GOAL] | 100% | -- | 6 | 12 | 50%
3b3 Timely reporting of enforcement MDRs [GOAL] | 100% | 77.6% | 0 | 6 | 0%
CAA Element 2 - Inspections
Finding 2-1
Meets or Exceeds Expectations
Summary:
The City fulfilled its compliance monitoring strategy (CMS) plan in FY2017.
Explanation:
(5a and 5b) EPA congratulates the COA on its continued successful completion of all FCEs - for majors, mega-majors, and synthetic minors - required to meet its CMS plan. In addition, all FCE elements (metric 6a) were documented, and the reviewers found that the vast majority of the compliance monitoring reports (CMRs) or facility files contained sufficient documentation to support the compliance determinations made (86.7%, metric 6b).
State Response:
(6b)- In an effort to improve on our compliance determinations, the City requests to know what
was lacking/needed in the two CMRs that didn't have sufficient documentation.
Relevant metrics:
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
5a FCE coverage: majors and mega-sites [GOAL] | 100% | 88.7% | 6 | 6 | 100%
5b FCE coverage: SM-80s [GOAL] | 100% | 93.7% | 6 | 6 | 100%
6a Documentation of FCE elements [GOAL] | 100% | -- | 13 | 13 | 100%
6b Compliance monitoring reports (CMRs) or facility files reviewed that provide sufficient documentation to determine compliance of the facility [GOAL] | 100% | -- | 13 | 15 | 86.67%
CAA Element 2 - Inspections
Finding 2-2
Area for Attention
Summary:
Based on the data metric alone, completion and reporting of ACC reviews appear to be a weakness in the City's compliance program, but further review reveals no issue.
Explanation:
EPA Response: (5e) Review of Title V annual compliance certifications (ACCs) appears, at times over the past several years, to be problematic for the City when considering metric 5e alone. In FY2015, 88.9% of reviews were completed per the metric, while in both the previous SRF, at 33.3%, and the present SRF, at 25%, the number of reported reviews as documented by the metric fell below expectations. EPA Response to City's Comment: Upon more in-depth scrutiny of the ACCs marked as Not Reviewed in the Data Metric Analysis for FY2017, it appears that three of the six were reviewed in October, just after the close of the Federal Fiscal Year and two to three months after receipt of the ACC, similar to PNM Reeves, which was received in July and reviewed in December. Another, the Southside Water Reclamation Facility, was reviewed timely but was entered into ICIS late due to pending litigation. However, the University of New Mexico appears to have been reviewed on its regular cycle late in the year in both 2016 and 2017. Therefore, EPA finds that the required reviews have been completed. More rapid turnaround on review and reporting would improve the metric percentage.
State Response:
(5e) - Following the receipt of this draft SRF report, the City reviewed ICIS to view which and
how many ACCs had been entered as reviewed in ICIS. For ACCs due in 2017, the City found that six (6) of the eight (8) ACCs had been entered into ICIS as "Reviewed" with the date of review
included. One (1) of the ACCs, ABCWUA, had been reviewed as part of an ongoing enforcement
action, but had not been entered into ICIS. This ACC will be entered into ICIS as reviewed with
its review date. The last ACC, Bimbo Bakeries USA Inc., became a synthetic minor source in
2018. In ICIS, it appears that all the Title V records have been associated with its synthetic minor
source records. Under the source's synthetic minor source records, ICIS shows that the 2017 ACC
was reviewed and included the review date. The City would be interested in reviewing these entries
with EPA to verify our findings.
Relevant metrics:
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
5e Reviews of Title V annual compliance certifications completed [GOAL] | 100% | 76.7% | 2 | 8 | 25%
CAA Element 3 - Violations
Finding 3-1
Meets or Exceeds Expectations
Summary:
The COA exhibits continued excellence in arriving at appropriate compliance determinations.
Explanation:
(7a) From the files reviewed, it appears that COA has made appropriate compliance determinations, with the possible exception of the GCC Rio Grande-Tijeras Plant. In this instance, one subpart was not evaluated (NSPS F), so an area of noncompliance could potentially have been overlooked. The DMA numbers show discovery rates for FRVs and HPVs slightly below average, at approximately 4.2% and zero, respectively. Because EPA's review found few instances of missed violations, these low discovery rates may be indicative of a good record of compliance among the facilities in the Albuquerque area.
State Response:
(7a) - The City attributes the missed subpart to an inspector pool with significantly less air quality experience than we have enjoyed in the past, and to the fact that the Title V Permit applicability table was the only area that cited the source as applicable to NSPS Subpart F. That is, the remaining
sections of the permit did not have associated permit conditions for this subpart. The City will
discuss this oversight with the team of inspectors and add a CFR source applicability review to our
Inspection SOP. (7al) - The City believes that its history of exceeding minimum CMS inspection
frequency of its synthetic minor sources has continually resulted in a lower occurrence of HPVs
and FRVs at regulated facilities.
Relevant metrics:
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
7a Accurate compliance determinations [GOAL] | 100% | -- | 14 | 15 | 93.33%
7a1 FRV "discovery rate" based on inspections at active CMS sources | -- | 6.2% | 1 | 24 | 4.17%
8a HPV discovery rate at majors | -- | 2.3% | 0 | 8 | 0%
CAA Element 3 - Violations
Finding 3-2
Area for Attention
Summary:
The COA exhibits continued excellence in classifying violations; however, its accuracy rate for HPV determinations is 83.3%, which is below the national goal.
Explanation:
Accurate determinations were generally reached in discriminating between HPVs and Federally Reportable Violations (FRVs), with one of the six determinations reviewed found to be misclassified (metric 8c).
State Response:
(8c) In an effort to improve our HPV determinations, the City is requesting to know which facility
the City did not classify as an HPV and what was the specific violation that was classified
incorrectly.
Relevant metrics:
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
8c Accuracy of HPV determinations [GOAL] | 100% | -- | 5 | 6 | 83.3%
CAA Element 4 - Enforcement
Finding 4-1
Meets or Exceeds Expectations
Summary:
EPA commends the City on its handling of the identified HPV.
Explanation:
(10a1, 10a, 10b1, 10b, and 14) The single HPV identified in FY2017 was addressed within EPA's
2014 HPV Policy's stipulated 180-day time frame with an appropriate enforcement response. As a
result, no cases required the implementation of a case development and resolution timeline (10a
and 14).
State Response:
Relevant metrics:
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
10a Timeliness of addressing HPVs or alternatively having a case development and resolution timeline in place | 100% | - | 0 | 0 | 0
10a1 Rate of addressing HPVs within 180 days | - | 63.7% | 1 | 1 | 100%
10b Percent of HPVs that have been addressed or removed consistent with the HPV Policy [GOAL] | 100% | - | 1 | 1 | 100%
10b1 Rate of managing HPVs without formal enforcement action | - | 12.9% | 0 | 1 | 0%
14 HPV case development and resolution timeline in place when required that contains required policy elements [GOAL] | 100% | - | 0 | 0 | 0
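The 180-day test referenced in metrics 10a1 and 10b reduces to simple date arithmetic. The sketch below is illustrative only; both dates are hypothetical and are not taken from the City's HPV file.

    # Illustrative only: the 180-day HPV timeliness test as date arithmetic.
    # Both dates are hypothetical, not taken from the City's file.
    from datetime import date, timedelta

    day_zero = date(2016, 11, 1)              # hypothetical HPV "day zero"
    addressed_on = date(2017, 3, 15)          # hypothetical date the HPV was addressed
    deadline = day_zero + timedelta(days=180)

    days_taken = (addressed_on - day_zero).days
    print(f"Addressed in {days_taken} days; within 180 days: {addressed_on <= deadline}")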
CAA Element 4 - Enforcement
Finding 4-2
Area for Attention
Summary:
One of the five formal enforcement actions reviewed did not specify the corrective action required
to return the facility to compliance.
Explanation:
(9a) The Settlement Agreement for GCC Rio Grande-Tijeras Plant references a prior agreement
rather than specifically stating the requirements that would return the facility to compliance.
Because this is one of only five formal enforcement actions for FY2017, the metric falls to 80%. In
light of Albuquerque's record of consistently requiring corrective action with specified time frames
(100% in the previous SRF), it is suggested that the City review its series of draft settlement
agreements for GCC, but no formal recommendation or deadline is set. EPA response to
Albuquerque's comment: Initial discussions have been conducted and the template agreement will
be reviewed. The enforcement record, including the formal action, did not reference the corrective
action or the terms and conditions for returning to compliance. Other records in the file, but not
included in the Order, may provide such documentation.
State Response:
(9a) - To ensure the City completely and accurately understands EPA's concern and makes the
appropriate corrections to the City's template agreements, the City requests to discuss the findings
regarding GCC Rio Grande-Tijeras Plant's compliance agreement.
Relevant metrics:
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
9a Formal enforcement responses that include required corrective action that will return the facility to compliance in a specified time frame or the facility fixed the problem without a compliance schedule [GOAL] | 100% | - | 4 | 5 | 80%
CAA Element 5 - Penalties
Finding 5-1
Meets or Exceeds Expectations
Summary:
In all instances where penalties were assessed, they were found to be appropriately determined and
documented.
Explanation:
Both penalty calculations reviewed (2) had associated evaluations of gravity and economic benefit
noted in the files, and the resulting penalties were collected (metrics 11a and 12b). In the one
instance where the final penalty differed from the initial penalty calculation, the rationale for the
change in the penalty amount was documented (metric 12a).
State Response:
Relevant metrics:
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
11a Penalty calculations reviewed that document gravity and economic benefit [GOAL] | 100% | - | 2 | 2 | 100%
12a Documentation of rationale for difference between initial penalty calculation and final penalty [GOAL] | 100% | - | 1 | 1 | 100%
12b Penalties collected [GOAL] | 100% | - | 2 | 2 | 100%
STATE REVIEW FRAMEWORK
New Mexico
Resource Conservation and Recovery Act
Implementation in Federal Fiscal Year 2018
U.S. Environmental Protection Agency
Region 6
Final Report
July 23, 2019
I. Introduction
A.	Overview of the State Review Framework
The State Review Framework (SRF) is a key mechanism for EPA oversight, providing a
nationally consistent process for reviewing the performance of state delegated compliance and
enforcement programs under three core federal statutes: Clean Air Act, Clean Water Act, and
Resource Conservation and Recovery Act. Through SRF, EPA periodically reviews such
programs using a standardized set of metrics to evaluate their performance against performance
standards laid out in federal statute, EPA regulations, policy, and guidance. When states do not
achieve standards, the EPA will work with them to improve performance.
Established in 2004, the review was developed jointly by EPA and Environmental Council of the
States (ECOS) in response to calls both inside and outside the agency for improved, more
consistent oversight of state delegated programs. The goals of the review that were agreed upon
at its formation remain relevant and unchanged today:
1.	Ensure delegated and EPA-run programs meet federal policy and baseline performance
standards
2.	Promote fair and consistent enforcement necessary to protect human health and the
environment
3.	Promote equitable treatment and level interstate playing field for business
4.	Provide transparency with publicly available data and reports
B.	The Review Process
The review is conducted on a rolling five-year cycle such that all programs are reviewed
approximately once every five years. The EPA evaluates programs on a one-year period of
performance, typically the one-year prior to review, using a standard set of metrics to make
findings on performance in five areas (elements) around which the report is organized: data,
inspections, violations, enforcement, and penalties. Wherever program performance is found to
deviate significantly from federal policy or standards, the EPA will issue recommendations for
corrective action which are monitored by EPA until completed and program performance
improves.
The SRF is currently in its 4th Round (FY2018-2022) of reviews, preceded by Round 3
(FY2012-2017), Round 2 (2008-2011), and Round 1 (FY2004-2007). Additional information
and final reports can be found at the EPA website under State Review Framework.
II. Navigating the Report
The final report contains the results and relevant information from the review including EPA and
program contact information, metric values, performance findings and explanations, program
responses, and EPA recommendations for corrective action where any significant deficiencies in
performance were found.
A.	Metrics
There are two general types of metrics used to assess program performance. The first are data
metrics, which reflect verified inspection and enforcement data from the national data systems
of each media, or statute. The second, and generally more significant, are file metrics, which are
derived from the review of individual facility files in order to determine if the program is
performing their compliance and enforcement responsibilities adequately.
Other information considered by EPA to make performance findings in addition to the metrics
includes results from previous SRF reviews, data metrics from the years in-between reviews,
multi-year metric trends.
B.	Performance Findings
The EPA makes findings on performance in five program areas:
•	Data - completeness, accuracy, and timeliness of data entry into national data systems
•	Inspections - meeting inspection and coverage commitments, inspection report quality,
and report timeliness
•	Violations - identification of violations, accuracy of compliance determinations, and
determination of significant noncompliance (SNC) or high priority violators (HPV)
•	Enforcement - timeliness and appropriateness of enforcement, returning facilities to
compliance
•	Penalties - calculation including gravity and economic benefit components, assessment,
and collection
Though performance generally varies across a spectrum, for the purposes of conducting a
standardized review, SRF categorizes performance into three findings levels:
Meets or Exceeds: No issues are found. Base standards of performance are met or exceeded.
Area for Attention: Minor issues are found. One or more metrics indicates performance
issues related to quality, process, or policy. The implementing agency is considered able to
correct the issue without additional EPA oversight.
Area for Improvement: Significant issues are found. One or more metrics indicates routine
and/or widespread performance issues related to quality, process, or policy. A
recommendation for corrective action is issued which contains specific actions and schedule
for completion. The EPA monitors implementation until completion.
C.	Recommendations for Corrective Action
Whenever the EPA makes a finding on performance of Area for Improvement, the EPA will
include a recommendation for corrective action, or recommendation, in the report. The purpose
of recommendations are to address significant performance issues and bring program
performance back in line with federal policy and standards. All recommendations should include
specific actions and a schedule for completion, and their implementation is monitored by the
EPA until completion.
III. Review Process Information
Resource Conservation and Recovery Act (RCRA)
Review Period: State FY18 (7/1/17 - 6/30/18)
Key Dates:
•	Kick off Letter/Meeting - 4/11/18
•	File Selection List sent: 8/24/18
•	DMA sent: 10/9/18
•	On-Site File Review conducted: 10/15-19/18
EPA contacts:
•	Lou Roberts, 214-665-7579, roberts.lou@epa.gov
•	Troy Stuckey, 214-665-6432, stuckey.troy@epa.gov
•	Mark Potts, 214-665-2723, potts.mark@epa.gov
NMED contacts:
•	John Kieling, 505-476-6035
•	Janine Kraemer, 505-476-4372
Executive Summary
Introduction
Resource Conservation and Recovery Act (RCRA)
NMED continues to meet or exceed the goals and objectives of the authorized RCRA
compliance and enforcement program. Further, the Hazardous Waste Bureau is commended for
its participation in a pilot project for this SRF review that demonstrates how some inherent issues
with the national SRF process can be addressed. Working with EPA's Office of Compliance,
NMED and the Region piloted techniques for meeting national goals for consistency while
allowing for nuances among states. Additionally, the pilot demonstrates how the SRF review can
be integrated with the annual grant review for a timely and comprehensive review of NMED's
RCRA program.
The pilot was deemed a success by both EPA Region 6 and the NMED HWB. The review was
based on current data that was already being evaluated as part of the grant end-of-year evaluation,
which allowed immediate feedback on the State's data.
Doing this SRF review in conjunction with the grant end-of-year review allowed a focus on a
couple of areas that probably would not have been identified in the normal SRF review done on
year-old data (i.e., FY17):
-	The file for the Long-Standing Secondary Violator might not have been reviewed in a totally
random file selection process; because the SRF review was done in collaboration with the FY18
EOY review, the facility was targeted for on-site review.
-	The same is true for the FRR issue, which might not have been discovered in a random file
selection process.
Areas of Strong Performance
The following are aspects of the program that, according to the review, are being implemented at
a high level:
Resource Conservation and Recovery Act (RCRA)
NMED has six inspectors, three located in Santa Fe and three in Albuquerque; the inspectors also
serve as enforcement officers. The NMED Hazardous Waste Bureau
(HWB) has developed Standard Operating Procedures (SOPs) for conducting inspections:
Compliance Evaluation Inspection (CEI) Procedure; Inspection Documentation File Procedure;
Professional Conduct During Inspections; and Compliance Assistance Visits (CAV). NMED has
developed and implemented the use of a standardized inspection report and checklists for various
universes: Large Quantity Generator (LQG); Small Quantity Generator (SQG); Conditionally
Exempt Small Quantity Generator (CESQG); CESQG - Used Oil; and Transporter. This
inspection report ensures the required information is always included and includes carbon copies
so it can be provided to the facility at the time of the inspection. In addition, this inspection
report includes other useful information such as the entry and exit conference dates/times.
NMED continues every year to meet or exceed the inspection program goals identified in the
RCRA Compliance Monitoring Strategy: inspecting 100% of its Federal Treatment, Storage, and
Disposal (TSD) facilities every year; 100% of its operating TSD universe every two years; and 20%
of its LQG universe every year. In addition, NMED responds to all hazardous waste complaints
received, usually with an on-site investigation/inspection, which identifies a facility as a SQG,
CESQG, or Not Any Universe. NMED also continues to target facilities that are in universes
(e.g., SQG, CESQG) for which EPA has not established program goals concerning the type, or
minimum number, of inspections.
NMED has also developed and uses enforcement template letters: CEI in compliance; Notice of
Violation (NOV); NOV RTC (Return to Compliance); NOV with penalty; RTC and no further
action; Penalty; CAV with findings; and CAV without findings. NMED maintains
documentation to support findings of violations, penalty calculations, and settlement
negotiations. NMED continues to pursue enforcement actions that result in significant protection
of human health and the environment, even when they involve complex negotiations.
The NMED RCRA hazardous waste program is championed by a strong cadre of HWB
managers who are very experienced in targeting, inspection, and enforcement processes.
NMED HWB managers and EPA Region 6 have an excellent working relationship. NMED and
EPA exchange feedback on issues and priorities of particular concern and work cooperatively to
address them.
Priority Issues to Address
The following are aspects of the program that, according to the review, are not meeting federal
standards and should be prioritized for management attention:
Resource Conservation and Recovery Act (RCRA)
NMED depends upon EPA contractor support for Financial Record Reviews (FRR). NMED has
a regulatory requirement for permitted facilities to submit annual financial assurance
information, and all CEIs at non-Federal operating TSD facilities are to include an FRR. The FRR
portion of the TSDF CEI is to be entered in RCRAInfo.
NMED's operating non-Federal TSDF universe is three facilities. One of the two TSDF CEIs done
in FY17 is still awaiting an FRR (the FRR was entered in RCRAInfo on 4/21/17 as undetermined).
The two TSDF CEIs done in FY18 did not have, at the time of this review, an FRR entered in
RCRAInfo. Unfortunately, an EPA contract for reviewing financial records was not available to
NMED for its FRRs; the contract became available for NMED to use in November 2018.
Resource Conservation and Recovery Act Findings
RCRA Element 1 - Data
Finding 1-1
Meets or Exceeds Expectations
Summary:
NMED HWB personnel take RCRAInfo data entry seriously and make every effort to ensure data
is entered and is correct. NMED has a written process for inspection and enforcement data to be
entered into RCRAInfo. NMED has a dedicated position for RCRAInfo data entry within the
Compliance and Technical Assistance Program. This position was vacant for a period of time
during the State's FY18. NMED has a Word document that is completed by inspectors and
enforcement officers and routed electronically to the RCRAInfo data entry person. The
responsibility for the data entry of penalty payments received is with the financial staff who are
not in the Compliance and Technical Assistance Program. There was discussion during the on-site
review including during the Exit Conference that perhaps this data entry should be with the
RCRAInfo data entry person who is in the Compliance and Technical Assistance Program. NMED
is not consistent in using the RCRAInfo penalty fields such as proposed penalty and final penalty
collected. EPA compliments NMED on its use of the Violation Notes field of RCRAInfo: NMED
enters the violation type (e.g., 265.D), the regulatory citation (e.g., 265.54(d)), and a description of
the violation (e.g., failure to amend the Contingency Plan with the current ER Coordinator) into the
Violation Notes field.
Explanation:
There were four facilities for which information was either missing or inaccurate. This information
for all four facilities was addressed during the on-site review and data was entered and/or corrected.
One facility had an incorrect date for when the informal enforcement action was issued. One
facility was not identified as having a violation and the informal enforcement action that was issued
was not in RCRAInfo. Two facilities did not have the final penalty amount collected.
State Response:
NMED is receiving emails from financial staff indicating payments have been entered into
RCRAInfo.
Relevant metrics:
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
2b Accurate entry of mandatory data [GOAL] | 100% | - | 27 | 31 | 87.1%
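To illustrate the kind of completeness check underlying metric 2b, the sketch below uses hypothetical stand-in field names (they are not actual RCRAInfo data element names); only the 27-of-31 result and the example violation type, citation, and notes text come from this finding.

    # Hypothetical record layout and check illustrating metric 2b; the field names
    # are stand-ins, not actual RCRAInfo data element names.
    MANDATORY_FIELDS = ["violation_type", "citation", "violation_notes",
                        "informal_action_date", "final_penalty_collected"]

    def record_complete(record: dict) -> bool:
        # A file counts toward metric 2b only if every mandatory entry is present.
        return all(record.get(field) not in (None, "") for field in MANDATORY_FIELDS)

    example = {
        "violation_type": "265.D",
        "citation": "265.54(d)",
        "violation_notes": "Failure to amend Contingency Plan with current ER Coordinator",
        "informal_action_date": "2018-03-05",        # hypothetical date
        "final_penalty_collected": None,             # missing, as in two files reviewed
    }
    accurate, reviewed = 27, 31                      # metric 2b values from the table above
    print(record_complete(example), f"{100 * accurate / reviewed:.1f}%")   # False 87.1%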
RCRA Element 2 - Inspections
Finding 2-1
Meets or Exceeds Expectations
Summary:
Twenty-five facilities were identified for this SRF review. A total of 28 inspection reports were
reviewed. Three Federal TSDFs had two CEIs each. The 28 inspection reports were for compliance
evaluation inspections (CEIs). NMED has developed templates and checklists for various
inspection types and universes. NMED has also developed inspection report templates and
checklists for individual TSDFs. NMED inspection reports include a detailed facility description
that may include facility size, number of employees, and waste streams generated. The inspection
report narrative also includes any permitted units and discussion regarding storage areas. The
inspection report identifies whether the facility had been inspected previously and, if so, the date.
The inspector identifies the types of documents reviewed and the areas observed. Each inspection report includes
the inspector's observation of violations documented with photos and identifies if any compliance
assistance was provided and any discussion regarding Best Management Practices. The inspection
report includes the appropriate checklist for the universe inspected. TSDFs reviewed had their own
unique checklist created for the year of inspection. Inspection reports included the date and time
of arrival along with entry conference sign-in sheet of those in attendance. Inspection reports
included the date and time of the exit conference along with the sign-in sheet of those in attendance.
Several of the inspections involved conducting the exit conference at a later time and possibly by
phone; these inspection reports documented this, including when the inspection report was sent to
the facility via email prior to the exit conference. The inspection reports reviewed were well written
and detailed and provide sufficient documentation to determine compliance. EPA's reviewer
suggested that at a minimum the initials of the person taking the photo be identified. The average
time taken to prepare the 28 inspection reports reviewed was 12 days. The longest was 38 days and
the shortest was 0 days (i.e., the inspection report was completed and provided to the facility during
the exit conference).
Explanation:
NMED annually conducts a CEI at each facility in its seven-facility operating Federal TSDF
universe and at 50% of its operating non-Federal TSDF universe. NMED consistently conducts a
CEI at 20% of its LQG universe, identified by the latest National Biennial Reporting System at the
beginning of its FY, and usually achieves a higher percentage of around 30% or more. NMED also
continues to target facilities that are in universes (e.g., SQG, CESQG) for which EPA has not
established requirements concerning the type, or minimum number, of inspections. In addition,
NMED responds to all hazardous waste complaints received, usually with an on-site
investigation/inspection, which identifies a facility as a SQG, CESQG, or Not Any Universe.
State Response:
Relevant metrics:
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
5a Two-year inspection coverage of operating TSDFs [GOAL] | 100% | 88.1% | 10 | 10 | 100%
5b Annual inspection of LQGs using BR universe [GOAL] | 20% | 16.1% | 13 | 41 | 31.7%
6a Inspection reports complete and sufficient to determine compliance [GOAL] | 100% | - | 28 | 28 | 100%
6b Timeliness of inspection report completion [GOAL] | 100% | - | 28 | 28 | 100%
RCRA Element 2 - Inspections
Finding 2-2
Area for Attention
Summary:
NMED depends upon EPA contractor support for Financial Record Reviews (FRR). NMED has a
regulatory requirement for permitted facilities to submit annual financial assurance information,
and CEIs at non-Federal operating TSD facilities are to include an FRR. The FRR portion of the
TSDF CEI is to be entered in RCRAInfo.
Explanation:
NMED's operating non-Federal TSDF universe is three facilities. One of the two TSDF CEIs done
in FY17 is still awaiting an FRR (the FRR was entered in RCRAInfo on 4/21/17 as undetermined).
The two TSDF CEIs done in FY18 did not have, at the time of this review, an FRR entered in
RCRAInfo. Unfortunately, an EPA contract for reviewing financial records was not available to
NMED for its FRRs; the contract became available for NMED to use in November 2018. EPA
Region 6 intends to continue discussions with NMED so that, if the EPA Headquarters contract
lapses, there will be an alternative for completing these FRRs.
State Response:
As provided above, EPA was without a contractor to support financial record reviews; therefore,
continued support from EPA and its contractor is vital to maintaining current reviews and to
allowing NMED to complete entries into RCRAInfo.
RCRA Element 3 - Violations
Finding 3-1
Meets or Exceeds Expectations
Summary:
Compliance determinations are based on the inspection report, which identifies violations (if any
exist). The inspection report includes information found during administrative review
(pre-inspection, on-site, post-inspection) along with observations made during the on-site visit.
EPA's review of the twenty-eight inspection reports and the two Non-Financial Record Reviews
(NRRs) indicated that the appropriate determination was made in all twenty-five facility files.
Explanation:
EPA requested to review files for 25 facilities. A total of 28 inspection reports were reviewed.
These 28 inspection reports were for CEIs. Three of the Federal TSDFs had two CEIs each. One
of the Federal TSDFs also had an NRR, and an LQG with a post-closure permit had an NRR. Of
these 28 CEIs, 6 facilities did not have any violations identified; 14 facilities had an informal
enforcement action issued; and 8 facilities had a formal enforcement action issued. The Federal
TSDF facility with an NRR had a formal enforcement action issued, and the LQG had an informal
enforcement action issued. The LQG facility with the NRR is a Long-Standing Secondary Violator,
as the informal enforcement action has not resulted in a return to compliance for the six violations.
In addition, the informal enforcement action for this NRR was issued more than 240 days ago and
NMED has not identified the facility as a SNC.
State Response:
Relevant metrics:
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
2a Long-standing secondary violators | - | - | 2 | - | -
7a Accurate compliance determinations [GOAL] | 100% | - | 30 | 30 | 100%
7b Violations found during CEI and FCI inspections | - | 34.9% | 71 | 131 | 54.2%
8a SNC identification rate at sites with CEI and FCI | - | 1.5% | 4 | 246 | 1.6%
8c Appropriate SNC determinations [GOAL] | 100% | - | 20 | 21 | 95.24%
RCRA Element 3 - Violations
Finding 3-2
Area for Attention
Summary:
Timeliness of SNC determinations is below the national goal and the national average.
Explanation:
The timeliness of SNC determinations (metric 8b, 75%) is below the National Goal (100%) and the
National Average (84.9%). The State did not submit a request for an Alternate Schedule as provided
for in the RCRA ERP.
State Response:
A SNY was entered into RCRAInfo on day 163. Going forward, NMED will ensure that a SNY is
entered into RCRAInfo for all facilities before day 150.
Relevant metrics:
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
8b Timeliness of SNC determinations [GOAL] | 100% | 84.9% | 3 | 4 | 75%
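The timeliness test behind metric 8b can be sketched as below. Only the 163-day entry and the 150-day target come from the State response; the other day counts are invented solely to reproduce the 3-of-4 (75%) result.

    # Illustrative only: metric 8b timeliness of SNC (SNY) determinations.
    # The 163-day value and the 150-day target come from the State response;
    # the other day counts are invented to reproduce the 3-of-4 (75%) result.
    days_to_sny_entry = [98, 120, 143, 163]
    timely = sum(1 for d in days_to_sny_entry if d < 150)
    print(f"{timely} of {len(days_to_sny_entry)} timely "
          f"({100 * timely / len(days_to_sny_entry):.0f}%)")    # 3 of 4 timely (75%)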
RCRA Element 4 - Enforcement
Finding 4-1
Meets or Exceeds Expectations
Summary:
Enforcement files are well organized to include inspection reports and correspondence.
Enforcement actions are issued in a timely manner and based upon thorough and timely
investigative work. All enforcement actions are reviewed by one or more NMED HWB managers.
NMED continues to identify and address violations in a timely and appropriate manner. NMED
requires corrective measures in its informal and formal enforcement actions to return facilities to
compliance immediately or within thirty days, and follows up through required submittals and/or
on-site visits. No-further-action closure letters are then sent; staff recommendations for closure
letters are reviewed by one or more NMED HWB managers.
Explanation:
Enforcement files contained documentation identifying that the facility had achieved compliance or
was on a compliance schedule, except for one. Of the 21 enforcement actions reviewed, only one
had not resulted in a return to compliance. The informal enforcement action was appropriate; the
facility responded that the alleged violations were not valid because the material was not hazardous
waste. NMED has not closed the informal enforcement action. Since it has been open for more than
240 days without the facility being identified as a SNC, the facility is identified as a Long-Standing
Secondary Violator.
State Response:
Relevant metrics:
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
10b Appropriate enforcement taken to address violations [GOAL] | 100% | - | 21 | 21 | 100%
9a Enforcement that returns sites to compliance [GOAL] | 100% | - | 20 | 21 | 95.24%
RCRA Element 5 - Penalties
Finding 5-1
Meets or Exceeds Expectations
Summary:
Eleven penalty enforcement action files were reviewed. NMED issues a RCRAInfo Code 125,
Written Informal Enforcement Action, NOV that includes a penalty. Penalty Calculation Sheets
include Economic Benefit (EB) discussion for each violation. Documentation of the penalty
calculations, adjustments, settlement, and compliance measures taken were maintained in the files.
NMED will negotiate proposed penalties to expedite the settlement process. During the negotiating
process, NMED takes into consideration the types of violations, the amount of time the facility
took to come into compliance, and history of non-compliance. If a facility claims inability to pay,
NMED will use EPA's ABEL software to review the facility's financial status.
Explanation:
NMED includes both economic benefit and gravity components in their penalty calculations and
documents adjustment of the initial penalty to the settled amount. Files reviewed had
documentation of all considerations for the initial proposed penalty. The EB discussion on many
of the Penalty Calculation Sheets was as follows:
•	EB could not be determined for violation
•	EB could not be determined because an unknown amount of waste was generated
•	EB not considered because of statutory penalty maximum of $10,000 per violation
Files reviewed had documentation of all considerations that resulted in the final penalty, SEP,
ability to pay issues, payment schedule, and adjustments for such items as willingness to comply
or history of non-compliance. NMED documents the collection of penalties to include date and
check number or voucher number if paying electronically. Files documented collection of all final
penalties, including those on a payment schedule. Copies of penalty payments received during this
SRF review were seen and noted by the EPA reviewer.
State Response:
Relevant metrics:
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
11a Gravity and economic benefit [GOAL] | 100% | - | 11 | 11 | 100%
12a Documentation of rationale for difference between initial penalty calculation and final penalty [GOAL] | 100% | - | 8 | 8 | 100%
12b Penalty collection [GOAL] | 100% | - | 8 | 8 | 100%
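The penalty structure described in this finding (gravity plus economic benefit, subject to the $10,000 statutory maximum per violation, with a documented adjustment to a final settled amount) can be sketched roughly as follows; all dollar figures are hypothetical and only the statutory maximum comes from the explanation above.

    # Hypothetical penalty worksheet illustrating the structure described above:
    # gravity plus economic benefit (EB), capped at the $10,000 statutory maximum
    # per violation, with the adjustment to the settled amount documented.
    STATUTORY_MAX_PER_VIOLATION = 10_000

    def initial_penalty(gravity: float, economic_benefit: float) -> float:
        return min(gravity + economic_benefit, STATUTORY_MAX_PER_VIOLATION)

    proposed = initial_penalty(gravity=8_500, economic_benefit=3_000)   # capped at 10,000
    settled = 7_500                                                     # hypothetical final amount
    rationale = "time taken to return to compliance and history of non-compliance"
    print(proposed, settled, rationale)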
New Mexico
Clean Water Act
Implementation in Federal Fiscal Year 2017
U.S. Environmental Protection Agency,
Headquarters, Washington, DC
Final Report
July 23, 2019
I. Introduction
A.	Overview of the State Review Framework
The State Review Framework (SRF) is a key mechanism for EPA oversight, providing a
nationally consistent process for reviewing the performance of state delegated compliance and
enforcement programs under three core federal statutes: Clean Air Act, Clean Water Act, and
Resource Conservation and Recovery Act. Through SRF, EPA periodically reviews such
programs using a standardized set of metrics to evaluate their performance against performance
standards laid out in federal statute, EPA regulations, policy, and guidance. When states do not
achieve standards, the EPA will work with them to improve performance.
Established in 2004, the review was developed jointly by EPA and Environmental Council of the
States (ECOS) in response to calls both inside and outside the agency for improved, more
consistent oversight of state delegated programs. The goals of the review that were agreed upon
at its formation remain relevant and unchanged today:
1.	Ensure delegated and EPA-run programs meet federal policy and baseline performance
standards
2.	Promote fair and consistent enforcement necessary to protect human health and the
environment
3.	Promote equitable treatment and level interstate playing field for business
4.	Provide transparency with publicly available data and reports
B.	The Review Process
The review is conducted on a rolling five-year cycle such that all programs are reviewed
approximately once every five years. The EPA evaluates programs on a one-year period of
performance, typically the one-year prior to review, using a standard set of metrics to make
findings on performance in five areas (elements) around which the report is organized: data,
inspections, violations, enforcement, and penalties. Wherever program performance is found to
deviate significantly from federal policy or standards, the EPA will issue recommendations for
corrective action which are monitored by EPA until completed and program performance
improves.
The SRF is currently in its 4th Round (FY2018-2022) of reviews, preceded by Round 3
(FY2012-2017), Round 2 (2008-2011), and Round 1 (FY2004-2007). Additional information
and final reports can be found at the EPA website under State Review Framework.
II. Navigating the Report
The final report contains the results and relevant information from the review including EPA and
program contact information, metric values, performance findings and explanations, program
responses, and EPA recommendations for corrective action where any significant deficiencies in
performance were found.
A.	Metrics
There are two general types of metrics used to assess program performance. The first are data
metrics, which reflect verified inspection and enforcement data from the national data systems
of each media, or statute. The second, and generally more significant, are file metrics, which are
derived from the review of individual facility files in order to determine if the program is
performing their compliance and enforcement responsibilities adequately.
Other information considered by EPA to make performance findings in addition to the metrics
includes results from previous SRF reviews, data metrics from the years in-between reviews,
multi-year metric trends.
B.	Performance Findings
The EPA makes findings on performance in five program areas:
•	Data - completeness, accuracy, and timeliness of data entry into national data systems
•	Inspections - meeting inspection and coverage commitments, inspection report quality,
and report timeliness
•	Violations - identification of violations, accuracy of compliance determinations, and
determination of significant noncompliance (SNC) or high priority violators (HPV)
•	Enforcement - timeliness and appropriateness of enforcement, returning facilities to
compliance
•	Penalties - calculation including gravity and economic benefit components, assessment,
and collection
Though performance generally varies across a spectrum, for the purposes of conducting a
standardized review, SRF categorizes performance into three findings levels:
Meets or Exceeds: No issues are found. Base standards of performance are met or exceeded.
Area for Attention: Minor issues are found. One or more metrics indicates performance
issues related to quality, process, or policy. The implementing agency is considered able to
correct the issue without additional EPA oversight.
Area for Improvement: Significant issues are found. One or more metrics indicates routine
and/or widespread performance issues related to quality, process, or policy. A
recommendation for corrective action is issued which contains specific actions and schedule
for completion. The EPA monitors implementation until completion.
C.	Recommendations for Corrective Action
Whenever the EPA makes a finding on performance of Area for Improvement, the EPA will
include a recommendation for corrective action, or recommendation, in the report. The purpose
of recommendations are to address significant performance issues and bring program
performance back in line with federal policy and standards. All recommendations should include
specific actions and a schedule for completion, and their implementation is monitored by the
EPA until completion.
Clean Water Act: Executive Summary
Areas of Strong Performance
•	Entry of data on permit limits and discharge monitoring report results is excellent and exceeds the national goal
•	Compliance determinations are clear and well documented in inspection and enforcement files
•	Enforcement actions promote return to compliance
•	Enforcement actions are generally appropriate to the severity of the violations
•	Penalty collection is well documented
Priority Areas to Address
•	Few single event violations detected during EPA and state inspections are reported in the database of record
•	Non-major inspection coverage did not meet annual or long-term inspection coverage goals for individual and general permit facilities
•	Inspection commitments for significant industrial users, municipal separate storm sewer systems (MS4s), concentrated animal feeding operations (CAFOs), biosolids, and stormwater industrial facilities did not meet the coverage goals in the NPDES CMS policy
•	Inspection report timeliness is well below the national goal
•	Timely enforcement in response to discharge monitoring report and single event violations is a continuing challenge since Round 3
•	Penalties lack justification for economic benefit values
•	The rationale for changes to initially calculated penalties is not well documented
Metric | Round 2 Finding Level (FY 2009) | Round 3 Finding Level (FY 2013) | Round 4 Finding Level (FY 2018)
2b: Files reviewed where data are accurately reflected in the national data system | Meets or Exceeds Expectations | Area for Improvement | Area for Improvement
4a2: Significant industrial user (SIU) inspections for SIUs discharging to non-authorized POTWs | N/A | Area for Improvement | Area for Improvement
4a7: Number of Phase I and II MS4 audits or inspections | N/A | Area for Improvement | Area for Improvement
4a8: Number of industrial stormwater inspections | N/A | Area for Improvement | Area for Improvement
4a10: Number of inspections of comprehensive large and medium NPDES-permitted CAFOs | N/A | Area for Improvement | Area for Improvement
4a11: Number of sludge/biosolids inspections at each major POTW | N/A | Area for Improvement | Area for Improvement
5b1: Inspection coverage of NPDES non-majors with individual permits | Meets or Exceeds Expectations | Meets or Exceeds Expectations | Area for Improvement
5b2: Inspection coverage of NPDES non-majors with general permits | Meets or Exceeds Expectations | Meets or Exceeds Expectations | Area for Improvement
6a: Inspection reports complete and sufficient to determine compliance at the facility | Meets or Exceeds Expectations | Meets or Exceeds Expectations | Area for Attention
6b: Timeliness of inspection report completion | Meets or Exceeds Expectations | Area for Attention | Area for Improvement
7e: Accuracy of compliance determinations | Meets or Exceeds Expectations | Area for Improvement | Meets or Exceeds Expectations
8b: Single event violations accurately identified as SNC or non-SNC | Meets or Exceeds Expectations | Area for Improvement | N/A*
8c: Percentage of SEVs identified as SNC reported timely at major facilities | Area for Improvement | Area for Improvement | N/A*
9a: Enforcement responses that returned, or will return, sources in violation to compliance | Meets or Exceeds Expectations | Area for Improvement | Meets or Exceeds Expectations
10a1: Percentage of major NPDES facilities with formal enforcement action taken in a timely manner in response to SNC violations | Area for Improvement | Area for Improvement | N/A
10b: Percentage of enforcement responses reviewed that address SNC that are taken in a timely manner | Area for Attention | N/A | N/A
10b: Enforcement responses reviewed that address violations in an appropriate manner | Meets or Exceeds Expectations | Area for Improvement | Area for Improvement
10c: Percentage of enforcement responses reviewed that address SNC that are appropriate to the violations | Meets or Exceeds Expectations | N/A | N/A
10d: Percentage of enforcement responses reviewed that appropriately address non-SNC violations | Meets or Exceeds Expectations | N/A | N/A
10e: Percentage of responses for non-SNC violations where a response was taken in a timely manner | Area for Attention | N/A | N/A
11a: Penalty calculations that document and include gravity and economic benefit | Area for Attention | Area for Improvement | Area for Improvement
12a: Documentation of the rationale for the difference between the initial penalty calculation and the final penalty | Area for Attention | Area for Improvement | Area for Improvement
* Analysis of SEV data entry is evaluated under Round 4 metric 2b
III. Review Process Information
Clean Water Act (CWA)
File Review: November 2018
Draft Submitted for Comment: March 2019
Final Report: July 26, 2019
Clean Water Act: Findings (CWA)
CWA Element 1 - Data
Finding: Finding 1-1
Finding Level: Meets or Exceeds Expectations
Summary: Data Completeness and Accuracy for Permit Limits and DMRs: Data completeness for
water permit limits and discharge monitoring reports (DMRs) exceeds the national goal of >95%.
Explanation: Permit limits are the maximum amounts of a pollutant that the facility may release
according to its permit, and DMRs report the actual pollutant amounts released. These two pieces
of information are minimum data requirements for both major and non-major facilities.
Exceedance of a permit limit indicates that a violation occurred on a discharge monitoring report.
EPA enters permit limits on behalf of the state in this directly implemented program, while most
regulated facilities transmit DMR data using the electronic discharge monitoring report data
system, NetDMR. EPA entered 108 of 109 required permit limits into the Integrated Compliance
Information System (ICIS). Of the 4,405 discharge monitoring reports required, facilities submitted
4,270 DMRs. Seventy-four percent (118) of the missing 135 DMRs are for non-major facilities.
Regional Response: The Region recognizes that there are permittees that are not using NetDMR
due to various factors (e.g., knowledge and/or access limitations). The Region is planning on-site
compliance assistance efforts during the summer of 2019 to help operators comply with DMR
requirements and, as part of that, to help ensure they are registered.
Repeat Recommendation: No
# of Recommendations:
Relevant metrics:
Metric ID Number and Description | Natl Goal | Natl Avg | EPA N | EPA D | EPA %
1b5: Permit limit data entry rate for major and non-major facilities | >95% | 98.8% | 108 | 109 | 99.10%
1b6: DMR data entry rate for major and non-major facilities | >95% | 96.3% | 4,270 | 4,405 | 96.94%
Finding: Finding 1-2
Finding Level: Area for Improvement
Summary: Data Completeness for SEV Violations: Single event violations found during EPA and
state inspections at major and non-major facilities are missing in the ICIS database. Ten of 27 files
reviewed have missing or inaccurately entered minimum data requirements.
Explanation: Ten of the 27 files reviewed (37.04%) had missing single event violations, one
unreported inspection, or inaccurately entered dates for inspections, inspection report finalization,
and enforcement actions, yielding the 62.96% result for metric 2b. Six of the 10 files with
incomplete or inaccurate information had missing single event violations. The review also found
isolated, infrequent missing minimum data requirements: an unreported inspection, an inaccurately
entered enforcement action date, an inaccurately entered inspection report date, and an inaccurately
entered inspection report finalization date. The 1 unreported state inspection at one major facility
and 20 non-major facilities is based upon regional review of the data metric analysis and inspection
coverage table. This finding on single event violation data entry is a recurring finding from past
SRF reviews.
Regional Response: The Region recognizes that there has been vast improvement since SRF
Round 3 due to revised data entry procedures.
Repeat Recommendation: Yes
# of Recommendations: 1
Recommendation:
Rec # | Due Date | Recommendation
1 | 3/30/2020 | EPA HQ will review single event violations listed in 6 inspection reports (3 state, 3 EPA) and compare this information to the ICIS SEV data entered. This recommendation will be considered implemented when >71% of inspection reports reviewed have SEVs entered in ICIS for major and non-major facilities. Progress will be monitored in 2020 and beyond if FY 2019 inspection report data is incomplete.
Relevant metrics:
Metric ID Number and Description | Natl Goal | Natl Avg | EPA N | EPA D | EPA %
2b: Files reviewed where data are accurately reflected in the national data system | 100% | - | 17 | 27 | 62.96%
CWA Element 2 - Inspections
Finding: Finding 2-1
Finding Level: Meets or Exceeds Expectations
Summary: Inspection Coverage: Inspection coverage in New Mexico meets national inspection
coverage policy requirements for major facilities over a two-year period. The region did not meet
annual FY 2017 inspection coverage due to resource constraints.
Explanation: The National Pollutant Discharge Elimination System Compliance Monitoring
Strategy (NPDES CMS) coverage goal for major facilities is 100% inspection coverage over a
two-year period. The region and state inspected 30 of the 31 major facilities (96.77%) in New
Mexico over the two-year period FY 2016-2017. The region and state conducted 13 inspections in
FY 2017, resulting in 41.94% inspection coverage at major facilities. These results factor in FY
2016-2017 frozen data, End of Year reports on inspection coverage, information provided by the
region, and inspections found during the on-site file review. There are separate NPDES CMS
coverage goals for pretreatment, SSOs, and Phase I and II stormwater construction. Combined
regional and state inspection coverage met, or is within ten percentage points of, the specific
coverage goals for each of these inspection activities, as indicated in the relevant metrics table
below. No combined sewer systems exist in New Mexico, which is why CSO performance is shown
as not applicable in the results below.
Regional Response:
Repeat Recommendation: No
Completion Verification:
# of Recommendations:
Metric ID Number and Description | Natl Goal | Natl Avg | EPA N | EPA D | EPA %
4a1: Pretreatment compliance inspections and audits at approved local pretreatment programs | 100% CMS | - | 6 (1 audit; 5 pretreatment inspections) | 6 | 100%
4a4: Number of CSO inspections | 100% CMS | - | n/a | n/a | n/a
4a5: Number of SSO inspections | 100% CMS | - | 3 | 14 | 21.4%
4a9: Number of Phase I and Phase II construction stormwater inspections | 100% CMS | - | 3 | 5 | 60%
5a1: Inspection coverage of NPDES majors [GOAL] | 100% CMS | 40.6% | 13 | 31 | 41.94%
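A rough sketch of how the two-year CMS coverage figure is assembled follows; the facility identifiers are invented, and only the 31-facility universe and the 13-of-31 and 30-of-31 results mirror the finding above.

    # Illustrative only: two-year CMS coverage for major facilities.
    # Facility IDs are invented; the 31-facility universe and the 13/31 and 30/31
    # results mirror the finding above.
    universe_size = 31
    inspected_fy2016 = {f"NM{n:04d}" for n in range(1, 18)}     # 17 hypothetical facilities
    inspected_fy2017 = {f"NM{n:04d}" for n in range(18, 31)}    # 13 hypothetical facilities

    two_year = inspected_fy2016 | inspected_fy2017              # unique facilities, FY 2016-2017
    print(f"FY 2017 coverage: {100 * len(inspected_fy2017) / universe_size:.2f}%")   # 41.94%
    print(f"Two-year coverage: {100 * len(two_year) / universe_size:.2f}%")          # 96.77%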
Finding: Finding 2-2
Finding Level: Area for Attention
Summary: Inspection Reports: Inspection report quality is below the national goal of 100%.
Explanation: Four of the fourteen inspection reports reviewed were not complete and sufficient to
determine compliance at the facility. These four reports were primarily checklist based and did not
provide a strong narrative to document the inspector's observations or evidence of deficiencies
found.
Regional Response:
Repeat Recommendation:
Completion Verification:
# of Recommendations:
Metric ID Number and Description | Natl Goal | Natl Avg | EPA N | EPA D | EPA %
6a: Inspection reports complete and sufficient to determine compliance at the facility [GOAL] | 100% | - | 10 | 14 | 71.43%
Finding: Finding 2-3
Finding Level: Area for Improvement
Summary: Inspection Coverage at Non-Majors and Inspection Report Timeliness: Non-major
inspection coverage did not meet annual or long-term inspection coverage goals for individual and
general permit facilities. Inspection commitments for significant industrial users, municipal
separate storm sewer systems (MS4s), concentrated animal feeding operations (CAFOs),
stormwater industrial, and biosolids facilities did not meet the coverage goals in the NPDES CMS
policy. Inspection report timeliness is below the national goal.
Explanation: The NPDES CMS goal for traditional non-major facilities is 100% coverage over a
5-year period; roughly 20% coverage is anticipated each year to achieve this goal. The region and
state inspected 2% of the 2,048 non-major individual and general permit facilities in FY 2017.
Over the last five years, the region and state's combined inspection coverage is 121/2,048 = 5.9%,
which is 14.1 percentage points below even the roughly 20% annual coverage needed to stay on
pace toward the 5-year goal.
There are separate NPDES CMS coverage goals for significant
industrial users (SIUs), municipal separate storm sewer systems,
CAFOs, biosolids, and stormwater industrial facilities that were
not met in FY 2017. The coverage goal for SIUs in the Clean
Water Act is for one sampling inspection at each user annually.
The region and state did not meet this commitment as no SIU
inspections were conducted. The MS4 coverage goal is one on-
site audit, on-site inspection, or off-site desk audit of each Phase
I MS4 every five years and one inspection or on-site audit of
each Phase II MS4 every seven years thereafter. There is one
large/medium MS4 and 8 small MS4s in New Mexico with no
inspections reported in FY 2017. Prior year MS4 results in End
of Year reports indicate that there was one Phase I MS4
inspection reported in FY 2015 and no Phase II inspections
reported for 11% coverage, which is 89% under the national
coverage goal. The stormwater industrial inspection coverage
goal is 10% of the universe each year; the region and state
completed 3 inspections in a 979 facility universe for 0.3%
coverage. The CAFO coverage goal is one comprehensive
inspection of each large and medium NPDES permitted CAFO
every five years. No CAFO inspections are reported in FY 2017.
Past end of year reports indicate that the Region and State
conducted 11 CAFO inspections in FY 2012-2015 and none in
FY 2016. Long-term CAFO coverage is 11/68 = 16.18%, which
is 83.82% under the five-year comprehensive inspection
coverage goal. The biosolids inspection coverage goal is one
inspection every 5 years. The annual coverage for biosolids
inspections is 0% as there are no biosolids inspections reported
in FY 2017. In prior years, one biosolids inspection occurred in
2014, with none reported in end of year reports in 2015-2016 for
2.5% coverage toward the national coverage goal of 100%
coverage within a 5 year time period.
Inspection report timeliness results are below the national goal of
100%. The standard for inspection report timeliness in the
National Pollutant Discharge Elimination System Enforcement
Management System (NPDES EMS) is 30 days for non-
sampling inspections and 45 days for sampling inspections. The
Region finalized nine of the 14 inspection reports 89-570 days
after the inspection. One state inspection report was finalized in
37 days. The average number of days to finalize an inspection
report is 137 days.
The EPA Region 6 office and the state share inspection coverage
responsibilities, and the results include both state and EPA
conducted inspections. Given the recent hiring of two federal
CWA inspectors in the middle of 2017, the regional office
anticipates that coverage of the traditional non-major universe
will improve to at least 20% in FY 2018 and beyond.
This finding on inspection coverage is a recurring finding from
past SRF reviews.
Regional Response: Coverage: The Region is conducting an analysis to determine the true universe
of minors. The majority of the minors universe is from stormwater general permits, which may not
be active and/or where the permittees are construction sites that operate for a short amount of time.
The Region would like the FY19 alternative CMS plan to be approved, which justifies a deviation
from the policy's inspection coverage for minors (e.g., permittee not located close to waters of the
U.S.). The Region would also like the SRF program to reconsider the 30-day timeline goal, as it is
not in line with the current Bowling Chart performance measure of 60 days for inspection report
completeness. The Region has implemented various project management tools to help ensure
timely inspection reports. They include: 1. an e-management system that includes milestones and
due dates for various tasks leading up to the final report, which helps management electronically
track progress; and 2. huddle rooms, which also aim to track progress on inspection reports
assigned to staff. On a weekly basis, the team discusses the status of the reports and any challenges
encountered, and collectively identifies solutions to overcome any delays.
Repeat Recommendation: Yes
Completion Verification:
# of Recommendations: 2
Recommendations:
Rec # | Due Date | Recommendation
1 | 09/30/2020 | Review six randomly selected FY 2020 state and regional inspection reports to assess inspection report quality and timeliness. EPA HQ will share findings from the review of inspection reports on inspection report quality and timeliness. This recommendation will meet or exceed expectations when >71% of the inspection reports reviewed meet NPDES inspection manual standards for quality and the inspection report timeliness standards of 30-45 days required by the NPDES EMS.
2 | 04/30/2020 | Conduct an annual data metric analysis using FY 2019 frozen data to examine inspection coverage for metrics 5b1 and 5b2, and request FY 2018 inspection results for SIUs, CAFOs, stormwater industrial, biosolids, and MS4 facilities. This recommendation will be closed out when the region meets either 1) NPDES CMS coverage goals, or 2) region-specific alternative compliance monitoring strategy plan commitments for FY 2019. Progress will be monitored on an annual basis using annual data metric analyses if FY 2019 inspection coverage does not meet national CMS policy or alternative region-specific compliance monitoring strategy plan commitments.
Metric ID Number and Description | Natl Goal | Natl Avg | EPA N | EPA D | EPA %
4a2: Significant industrial user (SIU) inspections for SIUs discharging to non-authorized POTWs | 100% CMS | - | 0 | 1 | 0%
4a7: Number of Phase I and II MS4 audits or inspections | 100% CMS | - | 0 | 9 | 0%
4a8: Number of industrial stormwater inspections | 100% CMS | - | 3 | 979 | 0.31%
4a10: Number of inspections of comprehensive large and medium NPDES-permitted CAFOs | 100% CMS | - | 0 | 68 | 0%
4a11: Number of sludge/biosolids inspections at each major POTW | 100% CMS | - | 0 | 39 | 0%
5b1: Inspection coverage of NPDES non-majors with individual permits [GOAL] | 100% CMS | 48.9% | 17 | 91 | 18.68%
5b2: Inspection coverage of NPDES non-majors with general permits [GOAL] | 100% CMS | 3.7% | 24 | 1,957 | 1.23%
6b: Timeliness of inspection report completion [GOAL] | 100% | - | 5 | 14 | 35.71%
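The NPDES EMS report-timeliness test applied in metric 6b can be sketched as below; the per-report day counts are hypothetical and were chosen only to reproduce the 5-of-14 (35.71%) result, with the 30-day and 45-day standards taken from the finding.

    # Illustrative only: the NPDES EMS report-timeliness standard of 30 days for
    # non-sampling and 45 days for sampling inspections. Day counts are invented
    # to reproduce the 5-of-14 (35.71%) result for metric 6b.
    reports = [  # (days to finalize the report, sampling inspection?)
        (21, False), (28, False), (37, True), (40, True), (44, True),
        (89, False), (120, True), (150, False), (200, False), (240, True),
        (300, False), (350, True), (450, False), (570, False),
    ]

    def timely(days: int, sampling: bool) -> bool:
        return days <= (45 if sampling else 30)

    n_timely = sum(timely(d, s) for d, s in reports)
    print(f"{n_timely} of {len(reports)} timely ({100 * n_timely / len(reports):.2f}%)")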
CWA Element 3 - Violations
Finding: Finding 3-1
Finding Level: Meets or Exceeds Expectations
Summary: Accuracy of Compliance Determinations: The majority of files reviewed show clear and
accurate compliance determinations. Compliance rates for all types of violations are within 5
percentage points of the national average. Single event violation reporting continues to increase in
number, indicating better data quality for violation reporting since the last SRF review.
Explanation: EPA reviewed 14 inspection files and found accurate compliance determinations in
13 of the 14 files reviewed (92.86%). There were 27 single event violations reported from 54
inspections at major and non-major facilities. The on-site file review identified several files with
unreported single event violations at both major and non-major facilities; these findings and
recommendations appear under Element 1. There are two compliance rate metrics, one for overall
noncompliance and one that focuses on the most serious significant noncompliance (SNC) and
Category I violations. Violations reported under CWA metric 7k1 on the percentage of major and
non-major facilities in noncompliance include effluent, single event, compliance schedule, and
permit schedule violations. Three hundred sixty-five of the 2,087 facilities in New Mexico (17.49%)
have one or more violations reported in FY 2017. The majority of these violations (91%) are at
smaller, non-major facilities. There are 281 of 2,074 facilities in significant or Category I
noncompliance (13.55%). Ninety-six percent of the 281 facilities in significant or Category I
noncompliance are non-major facilities.
Regional Response:
Repeat Recommendation: No
Completion Verification:
# of Recommendations:
Relevant metrics:
Metric ID Number and Description | Natl Goal | Natl Avg | EPA N | EPA D | EPA %
7e Accuracy of compliance determinations [GOAL] | 100% | - | 13 | 14 | 92.86%
7j1 Number of major and non-major facilities with single-event violations reported in the review year [INDICATOR] | - | - | - | - | 27
7k1 Major and non-major facilities in noncompliance [INDICATOR] | - | 18.1% | 365 | 2,087 | 17.49%
8a3 Percentage of major facilities in SNC and non-major facilities in Category I noncompliance during the reporting year [INDICATOR] | - | 11.2% | 281 | 2,074 | 13.55%
CWA Element 4 - Enforcement
Finding: Finding 4-1
Finding Level: Meets or Exceeds Expectations
Summary: Enforcement Achieves Return to Compliance: Most of the enforcement actions reviewed
returned, or will return, facilities to compliance.
Explanation: EPA reviewed 22 enforcement actions and found that 18 of the actions returned or
will return facilities to compliance. Return to compliance is achieved through compliance schedules
in formal enforcement, or documentation of facilities taking complying actions in response to
formal or informal enforcement. Four of the reviewed actions, for 2 major and 2 non-major
facilities, did not promote return to compliance due to: lack of a compliance schedule; an informal
enforcement response to chronic, recurring violations over a number of years with no return to
compliance; and failure to meet compliance schedule deadlines for several quarters, with ongoing
violations for 8 years and no enforcement escalation to promote return to compliance.
Regional Response:
Repeat Recommendation: No
Completion Verification:
# of Recommendations: 0
Metric ID Number and Description | Natl Goal | Natl Avg | EPA N | EPA D | EPA %
9a: Enforcement responses that returned, or will return, sources in violation to compliance | 100% | - | 18 | 22 | 81.82%
Finding: Finding 4-2
Finding Level: Area for Attention
Summary: Appropriate Enforcement Action: The appropriateness of enforcement responses
improved since the last on-site file review, from 65% of FY 2012 actions to 73% of actions
reviewed in FY 2017.
Explanation: Twenty-two of the 30 actions reviewed (73.33%) had appropriate enforcement taken
based on NPDES EMS violation response action criteria. Five files had a pattern of chronic
violations with no enforcement in the review year. Three files reviewed had enforcement for some,
but not all, violations, or informal enforcement with no escalation for ongoing violations.
Regional Response:
Repeat Recommendation: No
Completion Verification:
# of Recommendations:
Relevant metrics:
Metric ID Number and Description | Natl Goal | Natl Avg | EPA N | EPA D | EPA %
10b1: Enforcement responses reviewed that address violations in an appropriate manner | 100% | - | 22 | 30 | 73.33%
Finding: Finding 4-3
Finding Level: Area for Improvement
Summary: Timely Enforcement Response: Nearly half of all files reviewed show enforcement
occurring beyond the enforcement response guidelines set forth in the NPDES EMS.
Explanation: One of the 9 major facilities with significant noncompliance violations received
formal enforcement in FY 2017, as indicated by the results for CWA metric 10a1. For those
facilities with no formal enforcement, the violations are primarily discharge monitoring reporting
violations or failures to submit discharge monitoring reports, which account for 20 of the 26
quarterly violations reported for eight facilities. The remaining violations are effluent violations of
monthly and non-monthly effluent limits. Past SRF reviews of FY 2012 enforcement and the
current review of FY 2017 enforcement actions identified the timeliness of enforcement as a
significant issue. Sixteen of the 30 files reviewed (53.33%) in FY 2017 have a timely enforcement
response to violations. Lack of timely enforcement is the primary reason for the 53% result for
CWA metric 10b2 in 14 actions reviewed, while 5 files had a pattern of chronic violations with no
enforcement in the review year. Three files reviewed had enforcement for some, but not all,
violations.
Regional Response:
Repeat Recommendation: Yes
Completion Verification:
# of Recommendations:
Rec
#
Due Date
Recommendation
1
06/01/2020
HQ will send a list of 4 randomly selected enforcement actions to
Region 6 that will be reviewed for timely enforcement based on FY
2019 frozen file selection tool information on formal actions taken. HQ
will send the results of the timely and appropriate analysis of FY 2019
formal enforcement actions to Region 6. This recommendation will be
completed when >71% actions taken meet NPDES EMS violation
response action criteria for timeliness which are within at least 12
months of violation discovery. Progress will continue to be monitored
on an annual basis if the 71% result is not achieved in 2020.
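As an illustrative aside, the completion criterion above reduces to a simple proportion test: count the reviewed actions with an enforcement response within 12 months of violation discovery and compare that share to the 71% target. The sketch below assumes a hypothetical list of (discovery date, action date) pairs; it is not drawn from the file selection tool or any EPA system, and the month arithmetic is approximate.

    from datetime import date

    # Hypothetical review records: (violation discovered, enforcement action taken).
    # Real data would come from the frozen file selection tool pull described above.
    reviewed_actions = [
        (date(2019, 1, 15), date(2019, 6, 3)),
        (date(2018, 11, 2), date(2020, 2, 20)),
        (date(2019, 4, 1), date(2019, 9, 30)),
        (date(2019, 2, 10), date(2020, 5, 1)),
    ]

    def is_timely(discovered: date, action_taken: date, months: int = 12) -> bool:
        """Treat a response as timely if taken within `months` of violation discovery
        (months approximated as 30.44 days)."""
        return (action_taken - discovered).days <= months * 30.44

    timely = sum(is_timely(d, a) for d, a in reviewed_actions)
    pct = 100 * timely / len(reviewed_actions)
    print(f"{timely}/{len(reviewed_actions)} timely ({pct:.2f}%)")
    print("Completion criterion met" if pct > 71 else "Monitor again next year")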
Relevant metrics:
Metric 10a1 - Percentage of major NPDES facilities with formal enforcement action taken in a timely manner in response to SNC violations [INDICATOR]: National Average 2.1%; EPA 1/9 (11.11%)
Metric 10b2 - Enforcement responses reviewed that address violations in a timely manner: National Goal 100%; EPA 16/30 (53.33%)
CWA Element 5 - Penalties

Finding 5-1
Finding Level: Meets or Exceeds Expectations
Summary: Penalty Collection Documented: The Region documented penalty collection for all enforcement files reviewed.
Explanation: EPA reviewed all three penalty calculations completed in FY 2017, along with two prior-year penalty calculations from FY 2016 and FY 2015. The regional office provided copies of financial documentation demonstrating full payment of the penalties assessed.
Regional Response:
Repeat Recommendation: No
Completion Verification:
# of Recommendation:
Relevant metrics:
Metric 12b - Penalties collected [GOAL]: National Goal 100%; EPA 5/5 (100%)
Finding 5-2
Finding Level: Area for Improvement
Summary: Gravity and Economic Benefit Penalty Calculation Documented: Several penalty calculations lacked documentation of the economic benefit calculation and of the rationale for changes to penalty amounts during the settlement process.
Explanation: EPA reviewed five economic benefit and gravity penalty calculations. Four of the five files reviewed did not document the $0 value assessed for economic benefit. One file also had no economic benefit documentation, but it involved only one violation, a failure to prepare discharge monitoring reports, which has a very low cost, if any, to the facility, as the only cost involved is the operator's time to prepare the spreadsheet. The regional office explained that the region's common practice is to assess $0 for economic benefit for violations involving the preparation of paperwork.
EPA also reviewed five penalties for the rationale behind changes to the initial penalty and found that three files lacked documentation for the changes. Much of the documentation is in email correspondence, which is difficult to track over time. Standardizing the way the region documents changes to penalties and economic benefit would be beneficial, and the existing penalty calculation template provides a structure for this information. These findings on penalty calculation and changes to penalties are recurring findings from past SRF reviews.
Regional Response: The Region has a penalty checklist that includes an economic benefit component and will ensure that it is properly filled out, with justification as to why economic benefit was not collected.
Repeat Recommendation: Yes
Completion Verification:
# of Recommendation: 1
Recommendation:
Rec # 1; Due Date: 04/30/2021
Review 5 randomly selected penalty files for the economic benefit calculation and for changes to initial penalties, based on FY 2020 frozen file selection tool data. Supporting data, including email traffic, BEN data, and any other information that substantiates the penalty calculation and changes to penalties, will be requested. If fewer than 5 penalties exist, HQ will review all penalties present in the file selection tool. HQ will send the results of the analysis to the Region, and the recommendation will be closed when >71% of the files reviewed document economic benefit and the rationale for changes to penalties.
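For illustration only, the closure test above can be expressed as a check of two documentation items per reviewed penalty file, mirroring metrics 11a and 12a. The file identifiers and flags below are hypothetical and do not represent actual review results.

    # Hypothetical per-file review results; True means the item was documented.
    # Tuples: (file id, economic benefit documented, rationale for penalty change documented)
    penalty_files = [
        ("FILE-1", True, True),
        ("FILE-2", False, True),
        ("FILE-3", True, False),
        ("FILE-4", True, True),
        ("FILE-5", True, True),
    ]

    def pct(flags):
        """Share of files (as a percentage) with the item documented."""
        return 100 * sum(flags) / len(flags)

    econ_benefit = pct([eb for _, eb, _ in penalty_files])          # analogous to metric 11a
    rationale = pct([ra for _, _, ra in penalty_files])             # analogous to metric 12a
    both_documented = pct([eb and ra for _, eb, ra in penalty_files])

    print(f"Economic benefit documented: {econ_benefit:.0f}%")
    print(f"Rationale for penalty changes documented: {rationale:.0f}%")
    # Closure criterion from the recommendation: >71% of files document both items.
    print("Recommendation closed" if both_documented > 71 else "Recommendation remains open")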
Relevant metrics:
Metric 11a - Penalty calculations that document and include gravity and economic benefit: National Goal 100%; EPA 1/5 (20%)
Metric 12a - Documentation of the rationale for the difference between the initial penalty calculation and the final penalty: National Goal 100%; EPA 2/5 (40%)