State Review Framework

Arizona

Clean Water Act, Clean Air Act, and
Resource Conservation and Recovery Act
Implementation in Federal Fiscal Year 2013

U.S. Environmental Protection Agency
Region 9, San Francisco

Final Report
July 29, 2015


Executive Summary

Introduction

EPA Region 9 enforcement staff conducted a State Review Framework (SRF) enforcement
program oversight review of the Arizona Department of Environmental Quality's Clean Water
Act NPDES program, Clean Air Act Stationary Source program, and RCRA Hazardous Waste
program.

EPA bases SRF findings on data and file review metrics, and conversations with program
management and staff. EPA will track recommended actions from the review in the SRF Tracker
and publish reports and recommendations on EPA's ECHO web site.

Areas of Strong Performance

•	Clean Water Act inspection coverage at major and minor facilities, as well as in other
program areas, meets or exceeds commitments in the state-specific CMS plan.

•	Clean Water Act inspection reports meet or exceed EPA's expectations for report quality,
accuracy of compliance determinations, and timeliness of completion.

•	Water penalty calculation and collection is well documented.

•	ADEQ evaluates air CMS sources more frequently than the minimum evaluation
frequencies recommended in the CMS Policy.

•	ADEQ air inspection reports that contained more detailed narrative were well written.

•	The RCRA Field Inspection Report process is effective in supporting ADEQ's goal to
complete and issue all inspection reports within 30 days of the inspection, and facilitates
achievement of ADEQ's return-to-compliance objectives.

Priority Issues to Address

The following are the top-priority issues affecting the state program's performance:

•	Completeness and accuracy of CWA NPDES data reported in ICIS

•	Some CWA informal enforcement actions did not return facilities to compliance.

•	Timely and appropriate CWA enforcement

•	Air data reported into AFS is missing or inaccurate.

•	Air High Priority Violations (HPVs) are not being identified, and therefore are not
reported in AFS or addressed with timely and appropriate enforcement.


Most Significant CWA-NPDES Program Issues1

•	Data on permit limits, discharges, inspections, violations, and enforcement actions
reported in ICIS is incomplete and unreliable. (CWA Finding 1-1)

•	Single event violations (SEVs) for major facilities are not reported or entered into ICIS as
required by EPA. (CWA Finding 3-1)

•	Significant non-compliance at major facilities is above the national average. (CWA
Finding 3-3)

•	20% of reviewed enforcement actions did not return facilities to compliance. (CWA
Finding 4-1)

•	Timely and appropriate enforcement is low at major facilities and non-major facilities as
reported to EPA and in actions reviewed on-site. (CWA Finding 4-2)

Most Significant CAA Stationary Source Program Issues

•	Lack of penalty actions resulting from informal enforcement actions (Notices of
Violation or Compliance).

•	Non-adherence to EPA's 1998 HPV policy regarding identifying, reporting, and acting
on high priority violations.

•	The accuracy of compliance and enforcement data entered into AFS (soon to be ICIS-
Air) needs improvement. Data discrepancies were identified in all of the files reviewed.
EPA recommends ADEQ document efforts to identify and address the causes of
inaccurate Minimum Data Requirement (MDR) reporting. EPA will monitor progress
through the annual Data Metrics Analysis (DMA) and other periodic data reviews.

Most Significant RCRA Subtitle C Program Issues

• All ADEQ formal enforcement actions are managed through the State Attorney
General's Office. To address the inability to issue administrative orders, ADEQ has
developed innovative compliance assistance and enforcement programs that achieve a
high level of compliance among the regulated community. The ADEQ RCRA program
consistently achieved timely and appropriate enforcement actions that returned violating
facilities to compliance.

1 EPA's "National Strategy for Improving Oversight of State Enforcement Performance" identifies the following as
significant recurrent issues: "Widespread and persistent data inaccuracy and incompleteness, which make it hard to
identify when serious problems exist or to track state actions; routine failure of states to identify and report
significant noncompliance; routine failure of states to take timely or appropriate enforcement actions to return
violating facilities to compliance, potentially allowing pollution to continue unabated; failure of states to take
appropriate penalty actions, which results in ineffective deterrence for noncompliance and an unlevel playing field
for companies that do comply; use of enforcement orders to circumvent standards or to extend permits without
appropriate notice and comment; and failure to inspect and enforce in some regulated sectors."

TABLE OF CONTENTS

Clean Water Act Report
Clean Air Act Report
RCRA Report

State Review Framework

Arizona

Clean Water Act
Implementation in Federal Fiscal Year 2013

U.S. Environmental Protection Agency
Region 9, San Francisco

Final Report
July 29, 2015


Executive Summary

Introduction

EPA Region 9 enforcement staff conducted a State Review Framework (SRF) enforcement
program oversight review of the Arizona Department of Environmental Quality in 2014.

EPA bases SRF findings on data and file review metrics, and conversations with program
management and staff. EPA will track recommended actions from the review in the SRF Tracker
and publish reports and recommendations on EPA's ECHO web site.

Areas of Strong Performance

•	Inspection coverage at major and minor facilities, as well as in other program areas,
meets or exceeds commitments in the state-specific CMS plan. (CWA Finding 2-1)

•	Inspection reports meet or exceed EPA's expectations for report quality, accuracy of
compliance determinations, and timeliness of completion. (CWA Findings 2-2 & 3-2)

•	Penalty calculation and collection is well documented. (CWA Finding 5-1)

Priority Issues to Address

The following are the top-priority issues affecting the state program's performance:

•	Completeness and accuracy of CWA NPDES data reported in ICIS

•	Some CWA informal enforcement actions did not return facilities to compliance.

•	Timely and appropriate CWA enforcement


Most Significant CWA-NPDES Program Issues2

•	Data on permit limits, discharges, inspections, violations, and enforcement actions
reported in ICIS is incomplete and unreliable. (CWA Finding 1-1)

•	Single event violations (SEVs) for major facilities are not reported or entered into ICIS as
required by EPA. (CWA Finding 3-1)

•	Significant non-compliance at major facilities is above the national average. (CWA
Finding 3-3)

•	20% of reviewed enforcement actions did not return facilities to compliance. (CWA
Finding 4-1)

•	Timely and appropriate enforcement is low at major facilities and non-major facilities as
reported to EPA and in actions reviewed on-site. (CWA Finding 4-2)

2 EPA's "National Strategy for Improving Oversight of State Enforcement Performance" identifies the following as
significant recurrent issues: "Widespread and persistent data inaccuracy and incompleteness, which make it hard to
identify when serious problems exist or to track state actions; routine failure of states to identify and report
significant noncompliance; routine failure of states to take timely or appropriate enforcement actions to return
violating facilities to compliance, potentially allowing pollution to continue unabated; failure of states to take
appropriate penalty actions, which results in ineffective deterrence for noncompliance and an unlevel playing field
for companies that do comply; use of enforcement orders to circumvent standards or to extend permits without
appropriate notice and comment; and failure to inspect and enforce in some regulated sectors."


I. Background on the State Review Framework

The State Review Framework (SRF) is designed to ensure that EPA conducts nationally
consistent oversight. It reviews the following local, state, and EPA compliance and enforcement
programs:

•	Clean Water Act National Pollutant Discharge Elimination System

•	Clean Air Act Stationary Sources (Title V)

•	Resource Conservation and Recovery Act Subtitle C

Reviews cover:

•	Data — completeness, accuracy, and timeliness of data entry into national data systems

•	Inspections — meeting inspection and coverage commitments, inspection report quality,
and report timeliness

•	Violations — identification of violations, determination of significant noncompliance
(SNC) for the CWA and RCRA programs and high priority violators (HPV) for the CAA
program, and accuracy of compliance determinations

•	Enforcement — timeliness and appropriateness, returning facilities to compliance

•	Penalties — calculation including gravity and economic benefit components, assessment,
and collection

EPA conducts SRF reviews in three phases:

•	Analyzing information from the national data systems in the form of data metrics

•	Reviewing facility files and compiling file metrics

•	Developing findings and recommendations

EPA builds consultation into the SRF to ensure that EPA and the state understand the causes of
issues and agree, to the degree possible, on actions needed to address them. SRF reports capture
the agreements developed during the review process in order to facilitate program improvements.
EPA also uses the information in the reports to develop a better understanding of enforcement
and compliance nationwide, and to identify issues that require a national response.

Reports provide factual information. They do not include determinations of overall program
adequacy, nor are they used to compare or rank state programs.

Each state's programs are reviewed once every five years. The first round of SRF reviews began
in FY 2004. The third round of reviews began in FY 2013 and will continue through FY 2017.


II. SRF Review Process

Review period: FY 2013
Key dates:

CWA: On-Site File Review conducted July 8-11, 2014

State and EPA Key Contacts for Review:

CWA EPA Contacts: Ken Greenberg, Susanne Perkins, Liliana Christophe
CWA State Contact: Mindi Cross


III. SRF Findings

Findings represent EPA's conclusions regarding state performance. They are based on the data
and/or file reviews and may also be informed by:

•	Annual data metric reviews conducted since the state's last SRF review

•	Follow-up conversations with state agency personnel

•	Review of previous SRF reports, Memoranda of Agreement, or other data sources

•	Additional information collected to determine an issue's severity and root causes

There are three categories of findings:

Meets or Exceeds Expectations: The SRF was established to define a base level or floor for
enforcement program performance. This rating describes a situation where the base level is met
and no performance deficiency is identified, or a state performs above national program
expectations.

Area for State Attention: An activity, process, or policy that one or more SRF metrics show as
a minor problem. Where appropriate, the state should correct the issue without additional EPA
oversight. EPA may make recommendations to improve performance, but it will not monitor
these recommendations for completion between SRF reviews. These areas are not highlighted as
significant in an executive summary.

Area for State Improvement: An activity, process, or policy that one or more SRF metrics
show as a significant problem that the agency is required to address. Recommendations should
address root causes. These recommendations must have well-defined timelines and milestones
for completion, and EPA will monitor them for completion between SRF reviews in the SRF
Tracker.

Whenever a metric indicates a major performance issue, EPA will write up a finding of Area for
State Improvement, regardless of other metric values pertaining to a particular element.

The relevant SRF metrics are listed within each finding. The following information is provided
for each metric:

•	Metric ID Number and Description: The metric's SRF identification number and a
description of what the metric measures.

•	Natl Goal: The national goal, if applicable, of the metric, or the CMS commitment that
the state has made.

•	Natl Avg: The national average across all states, territories, and the District of Columbia.

•	State N: For metrics expressed as percentages, the numerator.

•	State D: The denominator.

•	State % or #: The percentage, or if the metric is expressed as a whole number, the count.
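
For reference, each percentage metric in the findings that follow is simply State N divided by
State D; as an illustrative worked example using the metric 2b figures reported under Finding 1-1:

\[ \text{State \%} = 100 \times \frac{\text{State N}}{\text{State D}} = 100 \times \frac{11}{21} \approx 52.4\% \]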


Clean Water Act Findings

CWA Element 1 — Data

Metrics 1b and 2b: Completeness and accuracy of permit limit and discharge data and
inspections and enforcement action data in EPA's national database.

Finding 1-1	Area for State Improvement

Summary	Throughout the review year, FY2013, ADEQ failed to input any NPDES
compliance and enforcement data to EPA's Integrated Compliance
Information System (ICIS), the agency's national compliance tracking
database. As a result, the state did not meet EPA's expectations for
completeness and accuracy of compliance and enforcement data in EPA's
national database. In addition, Arizona NPDES data available to the public
on EPA's ECHO database was incomplete and inaccurate.

By the time of this SRF review, ADEQ had begun entering some NPDES
compliance and enforcement data in ICIS. For purposes of this review,
EPA evaluated the completeness and accuracy of data that ADEQ had
input to ICIS as of June 16, 2014. Nevertheless, ADEQ still fell short of
EPA's expectations for coding major facility permit limits and entering
Discharge Monitoring Report (DMR) data in ICIS. In addition, EPA found
only 52.4% of files reviewed had compliance and enforcement information
accurately reported to EPA's ICIS database. Data accuracy in files
reviewed is well below the national goal of 100%.

Arizona's longstanding issues with data flow into ICIS have affected the
rating of this finding. Data entry into the appropriate EPA database is a
recurring issue from previous reviews of Arizona's NPDES program.

Explanation	Metrics 1b1 and 1b2 measure the state's rate of entering permit limits and
DMR data into ICIS.

Arizona entered 89.5% of permit limits into ICIS for major facilities,
falling below both EPA's national goal of >95% and the national average
of 98.4%.

Arizona entered 89.2% of DMR data into ICIS, falling below both EPA's
national goal of >95% and the national average of 97.2%.

Under Metric 2b, EPA compared inspection reports and enforcement
actions found in selected files to determine if the inspections, inspection
findings, and enforcement actions were accurately entered into ICIS. The
analysis was limited to data elements mandated in EPA's ICIS data

management policies. States are not required to enter inspections or
enforcement actions for certain classes of facilities.

EPA found 11 of the 21 files reviewed (52.4%) had all required
information (facility location, inspection, violation, and enforcement action
information) accurately entered into ICIS. Missing DMRs and unreported
enforcement actions were the most frequently cited data accuracy issues.
Arizona's accuracy rate of 52.4% is well below the national goal of 100%.

The results for Metrics 1b1, 1b2, and 2b are skewed by Arizona's
longstanding NPDES data flow issues into ICIS. Arizona's NPDES data
stopped flowing into ICIS in November 2012. Arizona began work on
resolving the data flow problems and committed to a June 30, 2013 project
completion date. By September 30, 2013, the end of federal FY13, NPDES
data was still not flowing, nor was it flowing by the February 19,
2014 data freeze deadline for this review. Data finally began flowing in
the spring of 2014. EPA manually froze the FY13 data in ICIS on June 16,
2014 in order to prepare for the site review in early July 2014. Despite
Arizona's assurance that it had loaded 99.5% of the missing data to ICIS,
EPA found, and ADEQ confirmed, that the data in ICIS still had many
errors. As of September 30, 2014, the data is still not flowing reliably at
100% into ICIS. DMRs, some permit limit sets, and a few general bugs are
causing most of the problems. Although the results for Metrics 1b1 and
1b2 appear to be nearly acceptable, if EPA had used the February 2014
frozen FY13 data, the results for both metrics would have been 0%.
Although Metric 2b is already unacceptable at 52.4%, if EPA had used the
February 2014 frozen FY13 data, the results for this metric would have
been 0% as well.
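
As a worked arithmetic check (illustrative only, using the June 16, 2014 frozen data shown in
the table below), the metric 1b1 and 1b2 rates follow directly from the counts:

\[ \frac{68}{76} \approx 0.895 = 89.5\%, \qquad \frac{2153}{2414} \approx 0.892 = 89.2\% \]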

Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
1b1 Permit limit rate for major facilities | >95% | 98.4% | 68 | 76 | 89.5%
1b2 DMR entry rate for major facilities | >95% | 97.2% | 2153 | 2414 | 89.2%
2b Files reviewed where data are accurately reflected in the national data system | 100% | N/A | 11 | 21 | 52.4%

State response	During the review year, FY2013, ADEQ acknowledges the issues with
data flow into the ICIS database. ADEQ has dedicated staff and
resources to correct these issues and appreciates EPA's technical
assistance to our staff and funding additional assistance from Windsor.
ADEQ has made significant progress in flowing data into ICIS.

As of January 16, 2015, ADEQ has submitted approximately 93% of
discharge monitoring reports (DMRs) to ICIS for major and minor

facilities. Historically, ADEQ did not send DMRs for minor facilities
during the PCS era, so data gaps are to be expected in submissions for
minors in the early part of our analysis. Additionally, ADEQ is working
to address data errors that are causing DMRs and permit data to be
rejected by ICIS.

ADEQ had been flowing informal and formal enforcement actions into
ICIS. However, due to the EPA's recent update to the ICIS node, the
enforcement action data has stopped flowing. While ADEQ is currently
working to update our node, this data is currently collected in a
temporary data table. All the saved data will be submitted to ICIS once
the update is completed.

Recommendation • By August 15, 2015, ADEQ will ensure all relevant NPDES permit,
compliance and enforcement information, including inspections,
enforcement actions, and violations, is entered and regularly
flowing into ICIS in accordance with EPA's data entry
requirements.

• EPA and ADEQ will include this as a standing agenda topic during
regular meetings to track progress and ensure data is being entered
and ADEQ is meeting its CWA section 106 grant workplan
commitments for ICIS-NPDES data management.



CWA Element 2

Inspections

Metrics 4a, 5a, and 5b: Inspection coverage compared to state workplan commitments.

Finding 2-1	Meets or Exceeds Expectations

Summary	Arizona met or exceeded inspection commitments in its Clean Water Act
Section 106 grant workplan.

Explanation	Metrics 4a, 5a, and 5b measure the number of inspections completed by the
state in the State Fiscal Year 2013 compared to the commitments in
Arizona's Clean Water Act Section 106 grant workplan. EPA Region 9
established workplan inspection commitments for Arizona consistent with
the inspection frequency goals established in EPA's 2007 Compliance
Monitoring Strategy (CMS). Arizona inspected 35 major facilities and 18

minor facilities during the review year, meeting the CMS-based workplan
commitments of 35 major and 18 minor inspections.

Arizona met all of its CMS-based workplan commitments for other
inspections, completing 3 pretreatment compliance inspections; 1
pretreatment compliance audit; 1 pretreatment significant industrial user
inspection; 78 industrial and 104 construction stormwater inspections; 2
municipal stormwater program inspections; and 9 concentrated animal
feeding operation inspections.

For metric 4a10, the CMS-based workplan includes both permitted and
unpermitted CAFOs in its commitments. Arizona inspects its two
permitted CAFOs on a five-year cycle as required. ADEQ has inspected all
of its CAFOs (permitted and unpermitted) over the last five years.

Relevant metrics

Metric ID Number and Description | State CMS | Natl Avg | State N | State D | State % or #
4a1 Pretreatment compliance inspections and audits | 100% | N/A | 4 | 4 | 100%
4a2 Significant Industrial User inspections for SIUs discharging to non-authorized POTWs | 100% | N/A | 1 | 1 | 100%
4a7 Phase I & II MS4 audits or inspections | 100% | N/A | 2 | 2 | 100%
4a8 Industrial stormwater inspections | 100% | N/A | 78 | 60 | 130%
4a9 Phase I and II stormwater construction inspections | 100% | N/A | 104 | 60 | 173%
4a10 Medium and large NPDES CAFO inspections | 100% | N/A | 9 | 4 | 225%
5a1 Inspection coverage of NPDES majors | 100% | 54.1% | 35 | 35 | 100%
5b1 Inspection coverage of NPDES non-majors with individual permits | 100% | 25.9% | 18 | 18 | 100%

State response

Recommendation

CWA Element 2 — Inspections

Metrics 6a and 6b: Quality and timeliness of inspection reports.

Finding 2-2	Meets or Exceeds Expectations


Summary

Arizona's inspection reports meet or exceed EPA's expectations for report
quality and timeliness of completion.

Explanation

Metric 6a assesses the quality of inspection reports, in particular, whether
the inspection reports provide sufficient documentation to determine the
compliance status of inspected facilities. EPA reviewed 26 inspection
reports; 25 were found complete and sufficient to determine compliance in
accordance with the 2004 NPDES Compliance Inspection Manual
guidelines.

Metric 6b measures the state's timeliness in completing inspection reports
within the state's recommended deadline of 30 working days for
compliance evaluation inspection reports. EPA reviewed 25 inspection
reports; 24 were found to be completed within the state's guidelines. One
inspection report counted under Metric 6a was for an MS4 audit, which
does not have a recommended deadline, so that report was not considered
in Metric 6b. ADEQ is considering establishing a deadline for completion
of its MS4 inspection reports.





Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
6a Inspection reports complete and sufficient to determine compliance at the facility | 100% | N/A | 25 | 26 | 96.2%
6b Inspection reports completed within prescribed timeframe | 100% | N/A | 24 | 25 | 96%





State response

Recommendation



CWA Element 3 — Violations

Metrics 7al, 8b and 8c: Tracking of single event violations.

Finding 3-1	Area for State Improvement

Summary	Arizona is not entering single event violations (SEVs) in EPA's ICIS
database as required for major facilities. This is a recurring issue from
previous reviews of Arizona's NPDES program.

Explanation	Metric 7a1 assesses whether single-event violations (SEVs) are reported
and tracked in ICIS-NPDES. SEVs are violations that are determined by
means other than automated review of discharge monitoring reports and
include violations such as spills and violations observed during field
inspections. Arizona does not report single event violations in ICIS as
required under EPA's data management policy. Single event violations are
a required data entry for major facilities as indicated in the December 28,
2007 EPA memorandum, ICIS Addendum to the Appendix of the 1985
Permit Compliance System Statement (p.9).

Although Arizona does not enter SEVs in EPA's ICIS database, they have
a robust system for tracking SEVs in the Inspection, Compliance and
Enforcement (ICE) module of the state's AZURITE database.

Metric 8b requires SEVs at major facilities to be accurately identified as
significant noncompliance (SNC) or non-SNC. Arizona does not record
SEVs in ICIS NPDES and, therefore does not flag SEVs as SNC in ICIS.
EPA has established automated and discretionary criteria for flagging
discharger violations as SNC. Arizona relies on the automated DMR-
based criteria to flag effluent limits and reporting violations as SNC, but
does not normally make discretionary labeling of SEV violations as SNC.

Metric 8c requires timely reporting of SEVs identified as SNC at major
facilities. Since Arizona does not record SEVs in ICIS NPDES, the state
cannot meet the requirements of this metric.

For Metrics 8b and 8c, EPA reviewed 12 major facility files. None of the
files had any violations noted as SEV in which to evaluate metrics 8b and
8c.

A similar finding was made in Round 2 of the SRF: ADEQ was not
entering SEVs into PCS. As it does currently, ADEQ was using its
AZURITE database to identify and track violations.

Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
7a1 Number of major facilities with single event violations | N/A | N/A | N/A | N/A | 0
8b Single-event violations accurately identified as SNC or non-SNC | 100% | N/A | 0 | 0 | 0%
8c Percentage of SEVs identified as SNC reported timely at major facilities | 100% | N/A | 0 | 0 | 0%

State response

ADEQ acknowledges that SEVs are not being flowed into ICIS. ADEQ does track
SEVs in our Azurite database.


Recommendation

EPA and ADEQ agree to meet within one year to discuss options for the
transfer of SEV data from the state's ICE database to ICIS, including
possible funding for additional IT resources.



CWA Element 3 — Violations

Metric 7e: Accuracy of compliance determinations

Finding 3-2	Meets or Exceeds Expectations

Summary

Inspection reports generally provide sufficient information to ascertain
compliance determinations on violations found during inspections.

Explanation

Metric 7e measures the percent of inspection reports that have accurate
compliance determinations. EPA reviewed 26 inspection reports and found
that 23 of the reports (88.5%) led to accurate compliance determinations,
which is within the acceptable range of the national goal of 100%.
Generally, ADEQ makes compliance determinations in its inspection
reports. (Some states make compliance determinations in a separate
document or memo to the file.) The reviewers also found that ADEQ's
inspection report compliance determinations were carried over as a
violation record in its ICE database and were often found reflected in
ADEQ enforcement actions such as a Notice of Violation.





Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
7e Inspection reports reviewed that led to an accurate compliance determination | 100% | N/A | 23 | 26 | 88.5%





State response

Recommendation



CWA Element 3 — Violations

Metrics 7d1 and 8a2: Major facilities in significant non-compliance

Finding 3-3	Area for State Attention

Summary

The rate of significant noncompliance at major facilities in Arizona is
higher than the national average.


Explanation	Metric 7d1 measures the percent of major facilities in non-compliance
reported in ICIS. Based on data in ICIS, noncompliance at major facilities
in Arizona was 36.61% during the review year. Arizona's rate of
noncompliance is lower than the national average noncompliance rate of
62.6%. Note that, because of Arizona's data management problems, the
accuracy of ICIS data used for this metric is uncertain.

Metric 8a2 measures the percentage of major facilities in significant
noncompliance. Twenty-one of the 71 major facilities in Arizona were in
significant noncompliance for one or more quarters during the review year.
The rate of significant noncompliance in Arizona (29.57%) is higher than
the national average of 24.3%. Because Arizona's ICIS data was
incomplete and inaccurate, EPA and ADEQ made the SNC determinations
for this metric based on a combination of ICIS data (where reliable) and
discharge data in ADEQ's AZURITE database.
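
The reported rates follow directly from the facility counts (a worked check, not an additional
metric):

\[ \frac{26}{71} \approx 36.6\% \text{ in noncompliance}, \qquad \frac{21}{71} \approx 29.6\% \text{ in SNC} \]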

Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
7d1 Major facilities in noncompliance | N/A | 62.6% | 26 | 71 | 36.61%
8a2 Percentage of major facilities in SNC | N/A | 24.3% | 21 | 71 | 29.57%

State response ADEQ was unable to complete the migration of our data into ICIS prior to
PCS being taken out of production on November 29, 2012. During the
same time period, ADEQ's state database was not calculating violations
properly. Without electronic data management capabilities, ADEQ was
reviewing monitoring and reporting data on a case-by-case basis as part of
our inspection process.

Currently, ADEQ has resolved the majority of issues associated with our
state database. With this information and the information that is in ICIS,
ADEQ has developed and implemented a Monitoring and Reporting
Standard Operating Procedure (SOP) to conduct routine compliance
reviews of monitoring and reporting violations and follow up with the
appropriate enforcement actions.

Recommendation ADEQ should be able to reduce the incidence of SNC by taking timely
formal enforcement as SNC violations arise. See recommendation for
Finding 4-2.


CWA Element 4 — Enforcement

Background Information

Summary	This finding highlights the number and type of NPDES enforcement
actions taken by Arizona DEQ during the review year. The finding is for
information and not subject to a rating under EPA's SRF protocols.

Explanation	During State fiscal year 2013 (July 1, 2012 to June 30, 2013), Arizona
DEQ issued the following enforcement actions in response to NPDES
violations:

70 Informal Actions (Notices of Opportunity to Correct (NOC) or Notices
of Violation (NOV))

4 Compliance Orders
1 Penalty Action

ADEQ's NOC and NOV are informal administrative enforcement actions
typically used by ADEQ as its initial response to a violation. NOCs and
NOVs do not create independently enforceable obligations on respondents.
Compliance orders are formal administrative enforcement actions that
impose independently enforceable obligations on the respondent to take
actions to return to compliance. In accordance with its Compliance and
Enforcement Handbook, ADEQ normally will attempt to negotiate an order
on consent with respondents, but has authority to issue unilateral
compliance orders if needed. ADEQ does not have authority to issue
administrative penalties but can take judicial actions to impose penalties
and injunctive relief obligations.

As can be seen from the FY13 data, ADEQ relies primarily on informal
enforcement actions to address NPDES violations. Findings 4-1, 4-2 and
5-1 evaluate ADEQ's use of these enforcement tools against EPA's SRF
review criteria.


CWA Element 4 — Enforcement

Metric 9a: Enforcement actions promoting return to compliance

Finding 4-1	Area for State Improvement

Summary	Although most enforcement actions reviewed promote return to
compliance, about 20% of the reviewed enforcement actions did not result
in a return to compliance.

Explanation

Metric 9a measures the percent of enforcement responses that return or will
return the source to compliance. EPA found 21 of 26 enforcement actions
reviewed promote return to compliance compared to the national goal of
100%. The 26 enforcement actions reviewed in selected ADEQ files
included 17 informal actions (NOC or NOV), 7 compliance orders and 2
judicial actions.

To evaluate the informal actions, EPA determined if the file had a record of
the discharger timely returning to compliance in response to ADEQ's NOC
or NOV. For compliance orders or judicial actions, EPA assumed that the
action promoted a return to compliance if the enforcement action imposed
enforceable injunctive relief obligations or if the file noted an actual return
to compliance.

In four cases (1 NOC and 3 NOVs), ADEQ closed the informal
enforcement action prior to the discharger returning to full compliance.
ADEQ had issued these four informal actions to address reporting
violations at three different facilities. (One facility received two NOVs.)
In a fifth case, ADEQ issued an informal enforcement action (NOV)
against a facility with SNC level violations. The facility was in SNC for
all four quarters of FY13 and the SNC continued after ADEQ issued the
NOV. As of the date of the SRF review, ADEQ had not escalated its
enforcement beyond an NOV.

Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
9a Percentage of enforcement responses that return or will return source in violation to compliance | 100% | N/A | 21 | 26 | 81%

State response Compliance is a core mission of ADEQ and there are two performance

measures related to facility compliance in ADEQ's Strategic Plan. Our key
compliance goals are to reduce the amount of time that a facility is out of
compliance by 50% over five years; and to increase the number of facilities
that are in compliance at the time of inspection by 50% over five years. It

should be noted that ADEQ does not consider issuance of a formal action
to mean compliance for the purposes of these measures. Our focus is for
facilities to be in actual compliance with their regulatory requirements.
While ADEQ is pleased that over 80% of our enforcement actions resulted
in compliance, we are committed to continuous improvement.

ADEQ acknowledges that informal enforcement actions were not always
escalated in a timely manner when compliance deadlines were missed. To
address these concerns, ADEQ has made changes to our compliance and
enforcement processes:

•	As of January 2013, ADEQ streamlined our escalated enforcement
approach so that issuance of a consent order is pre-approved by
management if an entity fails to comply with a NOV.

•	The Water Quality Compliance Section has developed and
implemented a Monitoring and Reporting SOP to conduct routine
review of monitoring and reporting data. By identifying and
responding to violations in a timely manner, ADEQ will continue to
reduce the time that a facility remains out of compliance and
therefore reduce the number of facilities in SNC.

Recommendation

EPA acknowledges ADEQ is unable to commit to adopting and implementing
revisions to its enforcement response procedures to provide for increased
automatic formal enforcement against facilities in SNC. With that
acknowledgement and by July 31, 2015,

•	ADEQ will commit to follow its revised Compliance and Enforcement
Procedures and Monitoring and Reporting procedures using a
combination of formal and informal actions.

•	ADEQ will escalate NOVs to a formal enforcement action following the
timeframes outlined in its revised Compliance and Enforcement
Procedures.

•	EPA will be prepared to take enforcement against facilities in SNC
or with other violations if ADEQ is not able to take timely and
appropriate formal enforcement, or if ADEQ requests assistance,
and in other circumstances EPA deems appropriate. The exact
form and amount of EPA's assistance will be determined as EPA
monitors ADEQ progress in meeting its yearly workplan goals.


CWA Element 4 — Enforcement

Metrics 10a and 10b: Timely and appropriate enforcement actions

Finding 4-2	Area for State Improvement

Summary	Enforcement actions taken at major and non-major facilities are not timely
or appropriate. This is a recurring issue from previous reviews of
Arizona's NPDES program.

For this finding, EPA used two metrics (metrics 10a and 10b) to evaluate
whether ADEQ is addressing violations with appropriate enforcement
actions and whether ADEQ's enforcement responses were taken in a timely
manner.

Metric 10a was used to assess ADEQ's response to SNC level violations at
major facilities. EPA examined ADEQ's enforcement response to each of
the 21 major facilities that had SNC level violations during federal
FY2013. EPA policy dictates that SNC level violations must be addressed
with a formal enforcement action (administrative compliance order or
judicial action) issued within 5½ months of the end of the quarter when
the SNC level violations initially occurred.

Metric 10b was used to assess ADEQ's enforcement response to any type
of violation (SNC or lower level violations) at any type of facility (major,
minor or general permit discharger). EPA's evaluation of metric 10b was
based on review of 27 files selected to represent a cross section of facilities
operating in Arizona. EPA expectations for enforcement response are
provided in its Enforcement Management System which includes the strict
expectations cited above for enforcement response to major facility SNC
violations as well as the somewhat more subjective guidelines for
responses to non-SNC violations.

For metric 10a, EPA and ADEQ reviewed ICIS data (where reliable) and
discharge data in ADEQ's AZURITE database to determine that 21 major
facilities had SNC level violations in federal FY2013. ADEQ reported that
they took no enforcement against 8 of these facilities and used informal
enforcement actions (NOC or NOV) to address the violations at 9 of the
facilities. ADEQ issued formal enforcement actions (administrative
compliance orders on consent) against 4 of the SNC facilities. However, 3
of these consent orders were not timely as they were issued more than 5½
months following the onset of SNC violations. (ADEQ has noted the
difficulty of reaching agreement on a consent order within EPA's
timeliness deadline.) In summary, ADEQ issued a timely and appropriate
enforcement action against 1 of the 21 facilities with SNC level violations
in federal FY2013.


EPA policy states that no more than 2% of the total majors in the state
should be in SNC without an appropriate enforcement action. It appears
that Arizona had 28% of its major dischargers (20 of 71) in SNC during
FY2013 without a timely and appropriate enforcement response.
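
For perspective (a rough comparison, not an SRF metric), Arizona's rate of majors in SNC
without a timely and appropriate response can be set against EPA's benchmark:

\[ \frac{20}{71} \approx 28\%, \quad \text{roughly } 14 \times \text{ the } 2\% \text{ policy threshold} \]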

For metric 10b, EPA reviewed 27 files that included documentation that a
violation had occurred at the facility. These files included a mix of major,
minor and general permitted facilities. Several of the files were major
facilities with SNC violations that were also considered under metric 10a.
EPA found 15 instances where ADEQ's enforcement response was judged
to be appropriate for the nature of the violation. ADEQ's enforcement
actions included 1 warning letter, 2 NOCs, 8 NOVs, 2 compliance orders
and 2 judicial actions. On the other hand, EPA found 11 instances where
ADEQ's enforcement response was not timely and appropriate for the
nature of the violation. These included 2 NOVs and 5 compliance orders
where EPA found the action to be appropriate, but late. In addition, EPA
found 4 instances where ADEQ either took no enforcement or an informal
action where EPA thought a formal action was warranted. In summary,
EPA found that ADEQ took appropriate action in 15 of the 27 files
reviewed (55.6%).

This same finding was identified in Rounds 1 and 2 of the SRF. ADEQ
did not implement EPA's Round 1 and Round 2 recommendations to issue
formal enforcement against facilities with SNC level violations. ADEQ's
Compliance and Enforcement Handbook calls for informal enforcement
actions (NOC or NOV) as the initial response to most violations. As a
result, ADEQ issues few formal enforcement actions.

Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
10a1 Major facilities with timely action as appropriate | — | — | 1 | 21 | 4.8%
10b Enforcement responses reviewed that address violations in an appropriate manner | 100% | N/A | 15 | 27 | 55.6%

State response As discussed in Finding 4-1, facility compliance is a key to ADEQ's
success and we will continue to work on improving our processes.
However, ADEQ is unable to commit to adopting and implementing
revisions to its enforcement response procedures to provide for increased
automatic formal enforcement against facilities in SNC. ADEQ will
commit to taking more timely enforcement actions using a combination of
formal and informal enforcement actions following our Compliance and
Enforcement Procedures.


Recommendation EPA acknowledges ADEQ is unable to commit to adopting and implementing
revisions to its enforcement response procedures to provide for increased
automatic formal enforcement against facilities in SNC. With that
acknowledgement and by July 31, 2015,

•	ADEQ will commit to follow its revised Compliance and Enforcement
Procedures and Monitoring and Reporting procedures using a
combination of formal and informal actions.

•	ADEQ will escalate NOVs to a formal enforcement action following the
timeframes outlined in the revised Compliance and Enforcement
Procedures.

•	EPA will be prepared to take enforcement against facilities in SNC
or with other violations if ADEQ is not able to take timely and
appropriate formal enforcement, or if ADEQ requests assistance,
and in other circumstances EPA deems appropriate. The exact
form and amount of EPA's assistance will be determined as EPA
monitors ADEQ progress in meeting its yearly workplan goals.

CWA Element 5 — Penalties

Metrics 11a, 12a, and 12b: Penalty calculation and collection

Finding 5-1	Meets or Exceeds Expectations

Summary

ADEQ properly considered economic benefit and gravity in its penalty
calculation and documented collection of the penalty payment.

Explanation

Metric 11a assesses the state's method for calculating penalties and whether
it properly documents the economic benefit and gravity components in its
penalty calculations. Metric 12a assesses whether the state documents the
rationale for changing penalty amounts when the final value is less than the
initial calculated value. Metric 12b assesses whether the state documents
collection of penalty payments.

EPA's findings for metrics 11a, 12a and 12b are based on review of the
single penalty action taken by ADEQ during the review year. In the file
for its penalty action, ADEQ properly documented consideration of
economic benefit and gravity in its penalty calculation (metric 11a) and
had a copy of the electronic funds transfer documenting receipt of the
penalty payment (metric 12b). Metric 12a does not apply for this action as
the penalty payment was not less than ADEQ's initial penalty calculation.





Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
11a Penalty calculations reviewed that consider and include gravity and economic benefit | 100% | N/A | 1 | 1 | 100%
12a Documentation of the difference between initial and final penalty and rationale | 100% | N/A | — | — | N/A
12b Penalties collected | 100% | N/A | 1 | 1 | 100%

State response

Recommendation


State Review Framework

Arizona Department of Environmental Quality

Clean Air Act
Implementation in Federal Fiscal Year 2013

U.S. Environmental Protection Agency

Region 9

Final Report
July 29, 2015


Executive Summary

Introduction

The U.S. Environmental Protection Agency (EPA) Region 9 Air & TRI Enforcement Office
conducted a State Review Framework (SRF) enforcement program oversight review of the
Arizona Department of Environmental Quality (ADEQ) in 2014.

EPA bases SRF findings on data and file review metrics, and conversations with program
management and staff. EPA will track recommended actions from the review in the SRF
Tracker and publish reports and recommendations on the EPA ECHO web site.

Areas of Strong Performance

•	ADEQ evaluates CMS sources more frequently than the minimum evaluation
frequencies recommended in the CMS Policy.

•	ADEQ inspection reports that contained more detailed narrative were well written.

Priority Issues to Address

•	Data reported into AFS is missing or inaccurate.

•	High Priority Violations (HPVs) are not being identified, and therefore are not reported in
AFS or addressed with timely and appropriate enforcement.

Most Significant CAA Stationary Source Program Issues3

•	Lack of penalty actions resulting from informal enforcement actions (Notices of
Violation or Compliance).

•	Non-adherence to EPA's 1998 HPV policy regarding identifying, reporting, and acting
on high priority violations.

•	The accuracy of compliance and enforcement data entered into AFS (soon to be ICIS-
Air) needs improvement. Data discrepancies were identified in all of the files reviewed.
EPA recommends ADEQ document efforts to identify and address the causes of
inaccurate Minimum Data Requirement (MDR) reporting. EPA will monitor progress
through the annual Data Metrics Analysis (DMA) and other periodic data reviews.

3 EPA's "National Strategy for Improving Oversight of State Enforcement Performance" identifies the following as
significant recurrent issues: "Widespread and persistent data inaccuracy and incompleteness, which make it hard to
identify when serious problems exist or to track state actions; routine failure of states to identify and report
significant noncompliance; routine failure of states to take timely or appropriate enforcement actions to return
violating facilities to compliance, potentially allowing pollution to continue unabated; failure of states to take
appropriate penalty actions, which results in ineffective deterrence for noncompliance and an unlevel playing field
for companies that do comply; use of enforcement orders to circumvent standards or to extend permits without
appropriate notice and comment; and failure to inspect and enforce in some regulated sectors."



I. Background on the State Review Framework

The State Review Framework (SRF) is designed to ensure that EPA conducts nationally
consistent oversight. It reviews the following local, state, and EPA compliance and enforcement
programs:

•	Clean Water Act National Pollutant Discharge Elimination System

•	Clean Air Act Stationary Sources (Title V)

•	Resource Conservation and Recovery Act Subtitle C

Reviews cover:

•	Data — completeness, accuracy, and timeliness of data entry into national data systems

•	Inspections/Evaluations — meeting inspection/evaluation and coverage commitments,
inspection (compliance monitoring) report quality, and report timeliness

•	Violations — identification of violations, determination of significant noncompliance
(SNC) for the CWA and RCRA programs and high priority violators (HPV) for the CAA
program, and accuracy of compliance determinations

•	Enforcement — timeliness and appropriateness, returning facilities to compliance

•	Penalties — calculation including gravity and economic benefit components, assessment,
and collection

EPA conducts SRF reviews in three phases:

•	Analyzing information from the national data systems in the form of data metrics

•	Reviewing facility files and compiling file metrics

•	Developing findings and recommendations

EPA builds consultation into the SRF to ensure that EPA and the state/local understand the
causes of issues and agree, to the degree possible, on actions needed to address them. SRF
reports capture the agreements developed during the review process in order to facilitate program
improvements. EPA also uses the information in the reports to develop a better understanding of
enforcement and compliance nationwide, and to identify issues that require a national response.

Reports provide factual information. They do not include determinations of overall program
adequacy, nor are they used to compare or rank state/local programs.

Each state/local program is reviewed once every four years. The first round of SRF reviews
began in FY 2004. The third round of reviews began in FY 2013 and will continue through FY
2016.



II. SRF Review Process

Review period: FY 2013
Key dates:

•	Kickoff letter sent to ADEQ: April 16, 2014

•	Kickoff meeting conducted: June 9, 2014

•	CAA data metric analysis and file selection list sent to ADEQ: May 8, 2014

•	On-site CAA file review: June 9, 2014 - June 11, 2014

•	Draft report sent to ADEQ: January 5, 2015

•	Report finalized: July 29, 2015

State and EPA key contacts for review:

ADEQ

•	Timothy Franquist, Manager, Air Quality Compliance Section, at the time of the review

•	Marina Mejia, Air Quality Supervisor

•	Pam Nicola, Air Quality Supervisor at the time of the review

EPA Region 9

•	Matt Salazar, Manager, Air & TRI Office, Enforcement Division

•	Andrew Chew, Case Developer/ Inspector, Air & TRI Office, Enforcement Division

•	Debbie Lowe-Liang, Case Developer/ Inspector, Air & TRI Office, Enforcement
Division

•	Jennifer Sui, AFS Coordinator, Information Management Section, Enforcement Division

•	Robert Lischinsky, Office of Compliance, Office of Enforcement and Compliance
Assurance



III. SRF Findings

Findings represent EPA's conclusions regarding state/local performance. They are based on the
data and/or file reviews and may also be informed by:

•	Annual data metric reviews conducted since the previous state/local SRF review

•	Follow-up conversations with state/local agency personnel

•	Review of previous SRF reports, Memoranda of Agreement, or other data sources

•	Additional information collected to determine an issue's severity and root causes

There are three categories of findings:

Meets or Exceeds Expectations: The SRF was established to define a base level or floor for
enforcement program performance. This rating describes a situation where the base level is met
and no performance deficiency is identified, or a state/local performs above national program
expectations.

Area for State/Local Attention: An activity, process, or policy that one or more SRF metrics
show as a minor problem. Where appropriate, the state/local should correct the issue without
additional EPA oversight. EPA may make recommendations to improve performance, but it will
not monitor these recommendations for completion between SRF reviews. These areas are not
highlighted as significant in an executive summary.

Area for State/Local Improvement: An activity, process, or policy that one or more SRF
metrics show as a significant problem that the agency is required to address. Recommendations
should address root causes. These recommendations must have well-defined timelines and
milestones for completion, and EPA will monitor them for completion between SRF reviews in
the SRF Tracker.

Whenever a metric indicates a major performance issue, EPA will write up a finding of Area for
State/Local Improvement, regardless of other metric values pertaining to a particular element.

The relevant SRF metrics are listed within each finding. The following information is provided
for each metric:

•	Metric ID Number and Description: The metric's SRF identification number and a
description of what the metric measures.

•	Natl. Goal: The national goal, if applicable, of the metric, or the CMS commitment that
the state/local has made.

•	Natl. Avg: The national average across all states, territories, and the District of
Columbia.

•	State N: For metrics expressed as percentages, the numerator.

•	State D: The denominator.

•	State % or #: The percentage, or if the metric is expressed as a whole number, the count.



Clean Air Act Findings

Element 1 — Data

Finding 1-1	Area for State Improvement

Summary	The File Review indicated that information reported into AFS was not consistent
with the information found in the files reviewed.

Explanation	Review Metric 2b evaluates the completeness and accuracy of reported
MDRs in AFS. Timeliness is measured using the date the activity is achieved
and the date it is reported to AFS. While the national goal for accurately reported
data in AFS is 100%, only 14.3% of reviewed data in the files was accurately
reported. Inaccuracies were related to facility information (incorrect names,
addresses, contact phone numbers, CMS information, pollutants, operating
status, etc.) and missing or inaccurate activity data (e.g., incorrect FCE dates
entered; stack test not reported to AFS). Incorrect data in ICIS-Air (AFS)
potentially hinders targeting efforts and results in inaccurate information being
released to the public.

Metric 3a2 measures whether HPV determinations are entered into AFS in a
timely manner (within 60 days) in accordance with the AFS Information
Collection Request (AFS ICR) in place during FY 2013. The metric indicates
that no HPV determination was reported timely as no HPVs were entered. EPA
policy requires all HPV determinations to be reported to AFS within 60 days.

Metric 3b1 measures the timeliness for reporting compliance-related MDRs
(FCEs and Reviews of Title V Annual Compliance Certifications). Out of 153
individual actions, 130 were reported within 60 days (85%). This is below the
goal of 100%.

Metric 3b2 evaluates whether stack test dates and results are reported within 120
days of the stack test. The national goal for reporting results of stack tests is to
report 100% of all stack tests within 120 days. Out of 66 stack tests, only 34
were reported within 120 days (51.5%), below the national average and the
national goal.

Metric 3b3 measures timeliness for reporting enforcement-related MDRs within
60 days of the action. The actions reported by ADEQ were Notices of Violation
and Administrative Orders. Of the 14 enforcement-related MDRs reported, only 8 were
reported within 60 days (57.1%).
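
The timeliness percentages cited above follow from the counts of timely reported actions (a
worked check):

\[ \frac{130}{153} \approx 85.0\%, \qquad \frac{34}{66} \approx 51.5\%, \qquad \frac{8}{14} \approx 57.1\% \]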



Metrics 7b1, 7b2, and 7b3 use indicators of an alleged violation to measure the
rate at which violations are accurately reported into AFS. Violations are reported
by changing the compliance status of the relevant air program pollutant in AFS.
Metrics 7b1 and 7b3 are "goal" indicators with a goal of 100% of violations
reported.

Metric 7b1 indicates that for all 7 NOVs issued, ADEQ did not
change the compliance status to either "in violation" or "meeting
schedule."

Similarly, for HPVs, Metric 7b3 indicates that for all HPVs identified at major
sources in FY2011, ADEQ did not change the compliance status to either "in
violation" or "meeting schedule." ADEQ did not adhere to the 1998 HPV Policy
with regard to identifying HPVs (see Finding 3-1); because no HPVs were
identified, none were reported in AFS. Meeting the recommendation under
Finding 3-1 should rectify this concern.

Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
2b Accurate MDR Data in AFS | 100% | — | 4 | 28 | 14.3%
3a2 Untimely Entry of HPVs | — | — | — | — | 0
3b1 Timely Reporting of Compliance Monitoring MDRs | 100% | — | 130 | 153 | 85.0%
3b2 Timely Reporting of Stack Test Dates and Results | 100% | 75.4% | 34 | 66 | 51.5%
3b3 Timely Reporting of Enforcement MDRs | 100% | 68.7% | 8 | 14 | 57.1%
7b1 Violations Reported Per Informal Actions | 100% | 59.5% | 0 | 7 | 0%
7b3 Violations Reported Per HPV Identified | 100% | 57.5% | 0 | 0 | N/A



State response	ADEQ understands that inaccurate data appears to have been reported to AFS and
agrees that inaccurate data is undesirable and does not provide for the greatest level
of transparency. EPA's report indicates that only 14.3% of reviewed data was
accurately reported. ADEQ is committed to correcting any inaccuracies. To assist in
the corrections, ADEQ requests that EPA provide the AFS facility list that it
reviewed. In addition, ADEQ requests that EPA provide the list of reviewed data and
any inaccuracies that were identified to assist in the timeliness of the required
updates.

ADEQ agrees that HPVs were not reported timely as no HPVs had been entered at
the time of the SRF field work. During the exit debrief on June 11, 2014, EPA
brought this concern to ADEQ's attention. Immediately after the issue was brought to
ADEQ's attention, a concerted effort was made to provide EPA with a reconciliation
document that identified past HPVs for the review period. This spreadsheet was sent
by e-mail to Mr. Matt Salazar with EPA Region 9 by Mr. Tim Franquist of ADEQ on
June 16, 2014. EPA acknowledged receipt of the e-mail and ADEQ has yet to hear
whether the information reported meets EPA's expectations. Moving forward, ADEQ
intends to ensure that HPVs are appropriately identified by instituting a new training
course for all Air Quality Division compliance staff. A copy of the final training
material will be provided to EPA at the time it is completed on or before March 30,
2015. Although all Air Quality Division staff has been provided with a copy of the
1998 HPV policy, given the update to the policy in September 2014 and the need to
implement the training program, ADEQ anticipates the need for another
reconciliation that will be provided on March 30, 2015.

ADEQ agrees that timely reporting is important. Since the exit debrief on June 11,
2014, ADEQ has assigned a staff member to enter data directly into EPA's ICIS-Air.
ADEQ understands that as of September 2014, the timeliness of reporting to ICIS-Air
increased to 99%. Additionally, ADEQ continues to make progress on the HPV
training course. With training and direct entry of data, ADEQ expects that all of the
issues related to the timeliness portion of this finding have been resolved.

Recommendation	•	By August 31, 2015, EPA will provide ADEQ with the AFS facility list and identified
data inaccuracies. By October 15, 2015, ADEQ should provide EPA with
corrections to both the AFS facility list and all data inaccuracies.

•	By August 31, 2015, ADEQ will provide EPA with a final HPV identification
training course for all air quality compliance staff. By December 31, 2015, ADEQ will
provide EPA with documentation demonstrating that the training course has been
implemented, the number of compliance staff trained, and data regarding the
number of HPVs identified after the training course is complete.

•	By August 31, 2015, ADEQ will provide EPA with a HPV reconciliation document
that ensures that HPVs between June 12, 2014 and August 15, 2015 have been
properly identified.

•	By December 31, 2015, ADEQ will provide EPA with a HPV reconciliation
document that ensures that HPVs between August 31, 2015 and December 31, 2015
have been properly identified.



Element 2 — Inspections/Evaluations

Finding 2-1	Meets Expectations

Summary	ADEQ met the negotiated frequency for compliance evaluations of CMS sources.

Explanation	This Element evaluates whether the negotiated frequency for compliance
evaluations is being met for each source. ADEQ met the national goal for the
relevant metrics.

ADEQ met the negotiated frequency for conducting FCEs of major and
SM80s. ADEQ ensured each major source was evaluated with an FCE
once every 2 years and each SM80 once every 5 years.

Note: The 100% achievement rate noted in the table below differs from
what would be derived using the "frozen data set", because upon review
of the reported frozen data we found the state had reported a higher,
inaccurate universe of facilities than actually existed. The FCEs do not
match all of the Title V and SM80 facilities identified in the 2010 ADEQ
CMS policy (likely due to facility closures, openings, and facilities that
changed names). Our review confirmed a universe of 56 majors (and
one SM80), versus 93 reported in the frozen data set. ADEQ did 57 FCE
inspections in FYs 12 and 13. ADEQ should revisit the CMS plan on a
regular basis and update for accuracy.

EPA commends ADEQ for completing full compliance evaluations at major
facilities, an impressive accomplishment given the distances to and complexity
of the sources it regulates. ADEQ goes beyond the minimum frequencies and
inspects sources more often than EPA's CMS policy requires. If ADEQ believes
its resources could be put to better use, EPA can approve an alternative CMS
plan that departs from the recommended evaluation frequencies, allowing state
and local agencies to shift resources to other sources of concern.

Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
5a - FCE Coverage Majors | 100% | 88.5 | 29 | 42 | 69.0%
5b - FCE Coverage SM80s | 100% | 93.3 | 0 | 1 | 0%
5c - FCE Coverage CMS non-SM80s | N/A | | | | N/A
5d - FCE Coverage CMS Minors | N/A | | | | N/A

State Response

Recommendation None required.

Element 2 — Inspections/Evaluations

Finding 2-2	Meets Expectations

Summary	ADEQ nearly fully completed the required review for each Title V Annual
Compliance Certification (ACC).

Explanation	This Element evaluates whether the delegated agency has completed the
required review for Title V Annual Compliance Certifications. While ADEQ has
exceeded the national average, the goal for annual review of Title V
certifications is 100%. The data indicates that 1 certification was not reviewed
in a timely manner in FY 2012.

Arizona has opted to require semi-annual certifications, rather than one
annual certification. In lieu of submitting one annual Title V compliance
certification, it is acceptable to submit two semi-annual certifications
with each certification covering a 6 month period (i.e., January 1-June
30, and July 1-December 31), as long as the aggregation of the two
reports adequately and accurately covers the annual compliance period.
While EPA recommends the second semi-annual certification
incorporate by reference the first semi-annual certification in order to
formally satisfy the annual compliance obligation, such incorporation is
not an absolute requirement if, again, the aggregation of the two reports
provides complete annual coverage.

EPA commends ADEQ for being significantly above the national
average for reviewing Title V Annual Compliance Certifications. It
would be ideal to report all of the certifications in ICIS-Air.

Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
5e - Review of TV ACCs | 100% | 81.3% | 45 | 46 | 97.8%



State Response

Recommendation None required.

Element 2 — Inspections/Evaluations

Finding 2-3	Area for State Attention

Summary	Overall, the ADEQ compliance monitoring reports (CMRs) provided were
adequate, but small additions of relevant information may make them more
useful to inspectors.

Explanation

EPA appreciates the "Lean" Transformation Process undertaken by
ADEQ and the overhaul of state processes to obtain improvements and
increase effectiveness. In addition, ADEQ has been able to overcome
past financial issues and refill staff vacancies, as needed. Developing an
updated ADEQ Handbook with an SOP is a positive outcome. EPA also
appreciates the effort to promote efficiency by updating the field
inspection reports.

Twenty-eight ADEQ compliance monitoring reports (also known as Air Quality
Field Inspection Reports) were reviewed under this Element. For the majority of
the reports, it is unclear whether all 7 CMR elements discussed in the CMS
policy were addressed. Reports should include sufficient numerical detail to
ensure the 7 CMR elements are adequately addressed. For example, including a
facility's production rate would enable a reviewer to determine whether a
previous or future source test was conducted at the appropriate production rate;
including a significant control device parameter (e.g., incinerator temperature)
would also be helpful. Reviewers found 14 of 28 inspections were fully
documented. In a few of those 14, when deficiencies were noted during
inspections, those deficiencies were well documented.

Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
6a - Documentation of FCE Elements | 100% | | 14 | 28 | 50.0%
6b - CMRs/Sufficient Documentation to Determine Compliance | 100% | | 14 | 28 | 50.0%



State Response	ADEQ agrees that numeric information associated with specific permit
conditions should be added to the standardized inspection reports. While these
records will only provide a "snapshot" of the actual operating conditions of the
facility at the time of inspection, this will ensure that the field observations and
inspection meet both quality and defensibility standards.

•	By August 31, 2015, ADEQ will send EPA a list of all general types of
standardized inspection reports that have been completed for CMS facilities.

•	By December 31, 2015, as appropriate, ADEQ will include additional
numeric detail in all general types of standardized permit inspection reports that
were listed as complete on August 31, 2015.

Recommendation None required.

Element 3 — Violations

Finding 3-1	Area for State/Local Improvement

Summary	In general, compliance determinations are accurately made and promptly
reported into AFS based on the CMRs reviewed and other compliance
monitoring information. ADEQ falls below the national average for HPV
discovery rate.

Explanation	Metric 7a is designed to evaluate the overall accuracy of compliance
determinations, and Metric 8c focuses on the accurate identification of
violations that are determined to be HPVs.

Reviewed files identified circumstances where ADEQ should have
reported violations as either FRVs or HPVs into AFS and pursued
enforcement, which ADEQ did not do. For active major sources, ADEQ
is not identifying HPVs.

For 7a, there was simply not enough information in the short inspection
checklists to determine for 50% of the files reviewed whether the
inspector did enough to verify compliance. In the more detailed
inspection reports, the inspectors appeared to have strong technical skills
and made appropriate compliance determinations.



ADEQ did not adhere to the 1998 HPV Policy and inspectors did not
recognize when violations met the HPV criteria and should have been
identified/reported as HPVs (as reflected and confirmed in the internal
HPV audit list).

EPA reviewed NOVs and NOCs during the file review that did not have adequate
follow-up. NOVs for failure to follow dust control requirements, failure to file
multiple reports, and violations of other significant permit requirements had no
penalty actions associated with them.

The NOV/NOC decision matrix ("Air Quality Division NOV Assessment
Matrix") raises concern and indicates a lack of adequate responsiveness and
seriousness toward both reporting violations and emission violations that exceed
permit limits. EPA acknowledges that Arizona lacks administrative penalty
authority, which constrains its ability to assess penalties in many medium and
smaller cases. Lack of administrative authority, however, does not relieve the
state of its obligation to pursue timely and appropriate enforcement actions.

Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
Metric 7a - Accurate Compliance Determinations | 100% | | 14 | 28 | 50.0%
Metric 8a - HPV Discovery Rate at Majors | | 4% | 0 | 56 | 0%
Metric 8c - Accuracy of HPV Determinations | 100% | | 0 | 4 | 0%

State Response	ADEQ has already responded to the first two key issues in Findings 1-1 and 2-3,
and incorporates those responses by reference here. ADEQ also believes that
implementing the proposed recommendations for both of those Findings will
resolve some of the issues identified by EPA in this area.

ADEQ agrees that addressing non-compliance with permit and rule conditions,
especially non-compliance that results in a discharge to the environment or
provides credible evidence of a potential discharge, is critical to the
accomplishment of ADEQ's mission, which is "to protect and enhance public
health and the environment".

In the finding, EPA states that there was "...not enough information in the short
inspection checklist to determine for 50% of the files reviewed whether the
inspector did enough to verify compliance." In the relevant metrics, EPA lists
this same 50% as "Accurate Compliance Determinations." ADEQ agrees that
additional information can be added to the inspection checklist and has
committed to making appropriate changes to require that numerical values be
included when available. However, a limited lack of specificity that impacted
EPA's ability to audit the inspection reports as desired does not mean that 50%
of the inspections were inaccurate.

During a face-to-face meeting at ADEQ's offices on January 26, 2015, EPA
provided some specific examples for this finding. In the discussion, EPA
identified three specific cases it thought warranted penalties for the violations
that were identified by ADEQ, the last two of which have received Clean Air
Act Section 114 letters from EPA:

1.	Needle Mountain for failure to provide six semi-annual compliance
certifications;

2.	Novo Biopower for emissions violations; and

3.	Drake Cement Company for emissions violations, missing monitoring, and
other issues.

Since the meeting, ADEQ has reviewed the record for Needle Mountain and
found all six semi-annual compliance certifications in its files. Copies of these
compliance certifications are attached to complete EPA's file review. ADEQ is
investigating how these compliance certifications were not included in the files
that EPA reviewed for this facility.

With respect to Novo Biopower, after the emissions violations occurred the
facility was sold to a new owner who has been working closely with ADEQ to
ensure that the facility is properly repaired and can operate in compliance with
the permit that has been issued to the facility. Seeking a major penalty against a
new owner who has agreed to purchase the facility to bring it into compliance
despite its past history of noncompliance is counterproductive to ADEQ's
mission. Were ADEQ to seek a penalty against the new owner, it would create a
deterrent to behavior that should be encouraged: protecting the environment
from additional violations by changing to more responsible corporate
ownership.

The Drake Cement NOVs remain under ADEQ review at this time as we
attempt to better understand the facts related to this potential case. ADEQ will
follow-up with EPA once it has completed its root cause analysis.

ADEQ disagrees that the NOV/NOC decision matrix is responsible for any of
the concerns that EPA has identified. This tool was developed in an effort to
help staff understand when a potential deficiency needs to be reviewed with the
Division Director. The tool is not intended to inhibit the issuance of NOVs. Instead,
ADEQ wants those that receive an NOV from ADEQ to react in a fashion similar to
when they receive an EPA Finding of Violation. By agreeing that an issue
deserves an NOV, the Division Director is also providing staff with implicit
authority to pursue escalated enforcement, including but not limited to
abatement orders. ADEQ also understands that the facilities in the examples
provided by EPA received NOVs when warranted.



Recommendation EPA and ADEQ will have a conference call by 9/1/2015 to discuss the details
supporting EPA determinations. A recommendation will then be redrafted for
incorporation in the final version of this SRF.

Element 4 — Enforcement

Finding 4-1	Area for State/Local Improvement

Summary	The one enforcement action available for review in this period required
corrective action that returned the facility to compliance within a specified
timeframe. EPA believes additional formal enforcement would be appropriate
based on review of other facility files. ADEQ does not report HPVs.

Explanation	During fiscal year 2013, Arizona DEQ issued the following enforcement
actions in response to CAA violations:

•	7 facilities with Informal Actions (Notices of Opportunity to Correct or Notices of Violation)

•	1 Compliance Order

•	1 Penalty Action

EPA was only able to review one formal enforcement action, for Mineral Park.
ADEQ does not have a large source universe; however, EPA's file review found
other facilities for which EPA believes formal enforcement and penalties would
be appropriate. For example, there were two facilities with significant and
lengthy violations and NOCs but no penalty actions. EPA welcomes the
opportunity to discuss these facilities with ADEQ in greater detail.

ADEQ's NOC and NOV are informal administrative enforcement
actions typically used by ADEQ as its initial response to a violation.
NOCs and NOVs do not create independently enforceable obligations on
respondents. Compliance orders are formal administrative enforcement
actions that impose independently enforceable obligations on the
respondent to take actions to return to compliance. In accordance with
its Compliance and Enforcement Handbook, ADEQ normally will
attempt to negotiate an order on consent with respondents, but has
authority to issue unilateral compliance orders if needed. ADEQ does not have
authority to issue administrative penalties, but can take judicial actions to
impose penalties and injunctive relief obligations.

EPA acknowledges that Arizona's lack of administrative penalty authority may
constrain its ability to obtain penalties in many medium and smaller cases. If
there are instances where ADEQ's authority limits its desired enforcement
approach, EPA would be happy to discuss whether EPA action in these cases is
appropriate and feasible, as EPA
does have administrative penalty authority. Penalties have been shown to
level the playing field and ensure that companies that comply are not at
an economic disadvantage when their competitors do not comply and
receive no penalty for the non-compliance.

Metric 10a is designed to evaluate the extent to which the agency takes
timely action to address HPVs. ADEQ did not typically code violations
as HPVs, though file review indicated instances where an HPV
designation would have been appropriate. ADEQ did not adhere to the
1998 HPV Policy, and inspectors did not recognize when violations met
the HPV criteria and should have been identified and reported as HPVs (as
reflected and confirmed in the internal HPV audit list).

Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
9a - Formal Enforcement Returns Facilities to Compliance | 100% | | 1 | 1 | 100%
10a - Timely Action Taken to Address HPVs | | 67.5% | 0 | 0 | N/A
10b - Appropriate Enforcement Responses for HPVs | 100% | | 0 | 0 | N/A

State Response ADEQ welcomes the opportunity to continue working with EPA
regarding its compliance and enforcement strategies. ADEQ also
incorporates its responses to Findings 1-1 and 2-3 by reference.

Recommendation EPA acknowledges ADEQ is unable to commit to adopting and
implementing revisions to its enforcement response procedures to
provide for increased automatic formal enforcement against violating
facilities at this time. With that acknowledgement and by August 31,
2015,

• ADEQ will commit to follow its revised Compliance and
Enforcement Procedures and Monitoring and Reporting
procedures using a combination of formal and informal actions.



•	ADEQ will escalate NOVs to a formal enforcement action
following the timeframes outlined in its revised Compliance and
Enforcement Procedures.

•	EPA will be prepared to take enforcement against facilities in
violation if ADEQ is not able to take timely and appropriate
formal enforcement, or if ADEQ requests assistance, and in other
circumstances EPA deems appropriate. The exact form and
amount of EPA's assistance will be determined as EPA monitors
ADEQ progress in meeting its yearly workplan goals.

In addition:

•	EPA and ADEQ now conduct routine conference calls, and have
discussed instances where EPA's file review found facilities for which
EPA believes penalty actions or formal enforcement would be
appropriate, and where HPV designation may be appropriate. By August
31, 2015, EPA will confer again with ADEQ to clarify any outstanding
issues in this regard.

•	By October 31, 2015, ADEQ will report to EPA regarding any changes
made to its enforcement policies based upon subsequent discussions
EPA and ADEQ have (as referenced above).

•	Incorporate or reference the recommendations in Findings 1-1 and 2-3.

Element 5 — Penalties

Finding 5-1	Area for State Attention

Summary	ADEQ obtained what appears to be a reasonable penalty for the one case
available for review, but the file did not contain a description of how ADEQ
arrived at the $1.3 million penalty.

Explanation	The file review indicated that there was not enough information in the file to
determine whether ADEQ has sufficient procedures in place to appropriately
document both gravity and economic benefit in penalty calculations, or whether
penalty payments and any difference between the initial and final penalty are
being sufficiently documented. However, state penalties appear to include the
penalty amount recommended under EPA's stationary source penalty policy,
and ADEQ stated that it used the EPA penalty and included both an economic
benefit and a gravity portion. EPA commends ADEQ for obtaining a penalty
over $1,000,000 for a source that had egregious CAA violations.

Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
11a - Penalty Calculations Reviewed that Document Gravity and Economic Benefit | 100% | | 0 | 1 | 0%
12a - Documentation of Rationale for Difference Between Initial and Final Penalty | 100% | | 0 | 1 | 0%
12b - Penalties Collected | 100% | | 1 | 1 | 100%

State Response ADEQ generally follows EPA's Stationary Source Penalty Policy when
calculating civil penalties. The primary driver in ADEQ's calculations is
the economic benefit of non-compliance. While these cases are rare for
Arizona, ADEQ has required sources to reconstruct affected facilities at
a significant cost if a preconstruction permit would have required more
significant controls. ADEQ is considering whether a state-specific air
quality penalty policy is more appropriate to use.

ADEQ Recommendation:

By September 30, 2015, ADEQ will report to EPA whether a state-
specific air quality penalty policy is required, or if a guidance
memorandum describing the expectation of general adherence to EPA's
Stationary Source Penalty Policy is most appropriate.

Recommendation None required.






State Review Framework

Arizona

Resource Conservation and Recovery Act
Implementation in Federal Fiscal Year 2013

U.S. Environmental Protection Agency
Region 9, San Francisco

Final Report
July 29, 2015



Executive Summary

Introduction

EPA Region 9 enforcement staff conducted a State Review Framework (SRF) enforcement
program oversight review of the Arizona Department of Environmental Quality (ADEQ) in
2014.

EPA bases SRF findings on data and file review metrics, and conversations with program
management and staff. EPA will track recommended actions from the review in the SRF Tracker
and publish reports and recommendations on EPA's ECHO web site.

Areas of Strong Performance

•	ADEQ's goal is to complete and issue all inspection reports within 30 days of the
inspection. The goal is being achieved through the issuance of a Field Inspection Report.
If no significant RCRA violations are observed during an inspection, a field inspection
report is issued at the conclusion of the inspection. For inspections with violations
warranting a Notice of Violation, the field inspection report is transmitted from the office
via a Notice of Violation. Additionally, the Field Inspection Report contains all the
elements required to document observed violations, including process description(s),
field observations, photographs, and a photograph log if a Notice of Violation is
issued. The process greatly improves achievement of the agency's return-to-compliance
objectives (e.g., reducing return-to-compliance time from 120 days to 60 days). ADEQ
documents in RCRAInfo each Return to Compliance action completed by the facility.
This includes any photographs, correspondence (including e-mails), training
certifications, and other documentation the facility submitted to ADEQ to demonstrate return to compliance with
the identified violation(s).

Priority Issues to Address

The following are the top-priority issues affecting the state program's performance:

•	No RCRA top-priority issues were identified.

Most Significant RCRA Subtitle C Program Issues

•	All ADEQ formal enforcement actions are managed through the State Attorney General's
Office. To address its inability to issue administrative orders, ADEQ has developed
innovative compliance assistance and enforcement programs that achieve a high level of
compliance in the regulated community. The ADEQ RCRA program consistently
achieved timely and appropriate enforcement actions that returned violating facilities to
compliance.



I. Background on the State Review Framework

The State Review Framework (SRF) is designed to ensure that EPA conducts nationally
consistent oversight. It reviews the following local, state, and EPA compliance and enforcement
programs:

•	Clean Water Act National Pollutant Discharge Elimination System

•	Clean Air Act Stationary Sources (Title V)

•	Resource Conservation and Recovery Act Subtitle C

Reviews cover:

•	Data — completeness, accuracy, and timeliness of data entry into national data systems

•	Inspections — meeting inspection and coverage commitments, inspection report quality,
and report timeliness

•	Violations — identification of violations, determination of significant noncompliance
(SNC) for the CWA and RCRA programs and high priority violators (HPV) for the CAA
program, and accuracy of compliance determinations

•	Enforcement — timeliness and appropriateness, returning facilities to compliance

•	Penalties — calculation including gravity and economic benefit components, assessment,
and collection

EPA conducts SRF reviews in three phases:

•	Analyzing information from the national data systems in the form of data metrics

•	Reviewing facility files and compiling file metrics

•	Developing findings and recommendations

EPA builds consultation into the SRF to ensure that EPA and the state understand the causes of
issues and agree, to the degree possible, on actions needed to address them. SRF reports capture
the agreements developed during the review process in order to facilitate program improvements.
EPA also uses the information in the reports to develop a better understanding of enforcement
and compliance nationwide, and to identify issues that require a national response.

Reports provide factual information. They do not include determinations of overall program
adequacy, nor are they used to compare or rank state programs.

Each state's programs are reviewed once every five years. The first round of SRF reviews began
in FY 2004. The third round of reviews began in FY 2013 and will continue through FY 2017.


II. SRF Review Process

Review period: Federal Fiscal Year 2013

Key dates: The review was conducted at ADEQ June 2-5, 2014.

State and EPA key contacts for review: EPA's primary point of contact for the RCRA review
is John Brock, (415) 972-3999. Other members of the EPA review team were John Schofield
and Elizabeth Janes. The primary point of contact for ADEQ is Randall Matas.


III. SRF Findings

Findings represent EPA's conclusions regarding state performance. They are based on
observations made during the data and/or file reviews and may also be informed by:

•	Annual data metric reviews conducted since the state's last SRF review

•	Follow-up conversations with state agency personnel

•	Review of previous SRF reports, Memoranda of Agreement, or other data sources

•	Additional information collected to determine an issue's severity and root causes

There are three categories of findings:

Meets or Exceeds Expectations: The SRF was established to define a base level or floor for
enforcement program performance. This rating describes a situation where the base level is met
and no performance deficiency is identified, or a state performs above national program
expectations.

Area for State Attention: An activity, process, or policy that one or more SRF metrics show as
a minor problem. Where appropriate, the state should correct the issue without additional EPA
oversight. EPA may make recommendations to improve performance, but it will not monitor
these recommendations for completion between SRF reviews. These areas are not highlighted as
significant in an executive summary.

Area for State Improvement: An activity, process, or policy that one or more SRF metrics
show as a significant problem that the agency is required to address. Recommendations should
address root causes. These recommendations must have well-defined timelines and milestones
for completion, and EPA will monitor them for completion between SRF reviews in the SRF
Tracker.

Whenever a metric indicates a major performance issue, EPA will write up a finding of Area for
State Improvement, regardless of other metric values pertaining to a particular element.

The relevant SRF metrics are listed within each finding. The following information is provided
for each metric (an illustrative calculation follows the list):

•	Metric ID Number and Description: The metric's SRF identification number and a
description of what the metric measures.

•	Natl Goal: The national goal, if applicable, of the metric, or the CMS commitment that
the state has made.

•	Natl Avg: The national average across all states, territories, and the District of Columbia.

•	State N: For metrics expressed as percentages, the numerator.

•	State D: The denominator.

•	State % or #: The percentage, or if the metric is expressed as a whole number, the count.
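For illustration only, the minimal sketch below (not part of the SRF methodology; the function name and structure are assumptions made for this example) shows how one metrics-table row can be assembled from these fields, using the Metric 5e values reported elsewhere in this review (45 of 46 Title V ACCs reviewed against a 100% goal):

```python
def metric_row(metric_id, description, state_n, state_d, natl_goal=None, natl_avg=None):
    """Assemble one metrics-table row; 'State % or #' is N/D expressed as a percentage."""
    pct = 100.0 * state_n / state_d if state_d else None
    return {
        "Metric ID": metric_id,
        "Description": description,
        "Natl Goal": natl_goal,
        "Natl Avg": natl_avg,
        "State N": state_n,
        "State D": state_d,
        "State % or #": f"{pct:.1f}%" if pct is not None else "N/A",
        # Simple goal check; some metrics (e.g., counts with a goal of 0) need different logic.
        "Meets goal": pct is not None and natl_goal is not None and pct >= natl_goal,
    }

row = metric_row("5e", "Review of TV ACCs", state_n=45, state_d=46, natl_goal=100, natl_avg=81.3)
print(row["State % or #"], "- meets goal:", row["Meets goal"])  # 97.8% - meets goal: False
```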


Resource Conservation and Recovery Act Findings

RCRA Element 1 — Data

Finding 1-1	Meets or Exceeds Expectations

Summary

EPA's review of ADEQ inspection and enforcement files found that
most of the minimum data requirements are being entered completely
and accurately into the national data system. For return to compliance
documentation, ADEQ has a well-developed process to ensure that
accurate return to compliance information is entered into RCRAInfo.

Explanation

Only one data error was observed (Clean Harbors). All other data entries were
observed to be accurate. For that one entry, the inspection report completion
date and the inspection report transmittal date were not entered into RCRAInfo.
Because this was the only exception among the 29 files reviewed, it does not
represent an area of concern.





Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
2b Complete and accurate entry of mandatory data | 100% | N/A | 28 | 29 | 96.6%









State response

Recommendation	No further action is recommended.



RCRA Element 2 — Inspections

Finding 2-1	Meets or Exceeds Expectations

Summary

ADEQ completed core coverage for TSDs (two-year coverage) and LQGs
(one-year coverage). ADEQ requested and received approval to implement an
alternative Compliance Management Strategy for generators: substituting SQG
inspections for LQG inspections. This affects the LQG inspection numbers for
ADEQ during the 5-year cycle covered under this review. ADEQ is meeting its
alternative CMS commitment.

Explanation	This finding is supported by Metrics 5a, 5b, and 5c. The OECA National
Program Managers (NPM) Guidance outlines the core program inspection
coverage for TSDs and LQGs. ADEQ met the 2-year TSD inspection
requirement (Metric 5a). RCRAInfo identifies 8 operating TSD facilities within
the State of Arizona; however, 1 of those facilities is located on Tribal land not
under Arizona's jurisdiction. The correct number of operating TSD facilities
inspected by ADEQ is therefore 7, not 8 as listed in RCRAInfo. ADEQ inspected
all 7 TSD facilities during the two-year period.
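As a minimal illustrative sketch (facility names and dates are hypothetical, not drawn from RCRAInfo), coverage requirements like the two-year TSD requirement can be checked by testing whether every facility in the universe has at least one inspection date falling in the review window:

```python
from datetime import date

# Hypothetical TSD universe and inspection dates, for illustration only.
tsd_universe = {"TSD-A", "TSD-B", "TSD-C"}
inspections = {
    "TSD-A": [date(2012, 3, 14)],
    "TSD-B": [date(2013, 7, 2)],
    "TSD-C": [],                    # never inspected in the window
}

window_start, window_end = date(2011, 10, 1), date(2013, 9, 30)  # FY2012-FY2013

covered = {f for f in tsd_universe
           if any(window_start <= d <= window_end for d in inspections.get(f, []))}
coverage_pct = 100.0 * len(covered) / len(tsd_universe)
print(f"{len(covered)} of {len(tsd_universe)} TSDs inspected ({coverage_pct:.1f}%)")
```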

Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
5a Two-year inspection coverage of operating TSDFs | 100% | 87.6% | 7 | 7 | 100%
5b Annual inspection coverage of LQGs | 20% | 21% | 43 | 214 | 20.1%
5c Five-year inspection coverage of LQGs | 100% | 66.6% | 142 | 214 | 66.4%
5d Five-year inspection coverage of active SQGs | NA | 11% | 80 | 1174 | 6.8%

State response

Recommendation No further action is recommended.

RCRA Element 2 — Inspections

Finding 2-2	Meets or Exceeds Expectations

Summary

ADEQ inspection reports were all complete with adequate supporting
documentation (e.g., photographs, photograph logs). A majority of inspection
reports were completed and entered into RCRAInfo in a timely manner.

Explanation

All inspection reports are prepared in a standardized format that includes, but is
not limited to, the following report elements: facility name, date of inspection,
inspection participants, facility/process
description, observations and files reviewed. At the conclusion of the
facility inspection, Arizona provides the facility with a summary of the
areas of concern, potential areas of non-compliance, and information
required to be submitted to ADEQ to demonstrate that the facility has
adequately addressed either the areas of concern or potential areas of
non-compliance. The inspection summary provided to the facility is a
component of the inspection/enforcement file. Once the inspection
report is completed, report and report transmittal information is entered
into RCRAInfo.

A general guideline of 45 days to complete an inspection report after the
inspection was used for the purposes of this review. Arizona's goal is
to complete the inspection report within 30 days. The report completion
average for the period reviewed is 31 days. During the review period,
ADEQ completed 82.8% of its inspection reports within 45 days of the
inspection.

ADEQ has developed and implemented a field inspection report for
each type of generator (i.e., LQG, SQG, CESQG). The field inspection
report was rolled out for use in late FY2013. For this reason, only one of
the field inspection reports was reviewed during this SRF. The field
inspection report contains most of the elements of the standardized
report described above. If there are no significant violations identified
during the inspection, the field inspection report is completed and
provided to the facility at the end of the inspection. If the facility wants
copies of the photographs taken by ADEQ to document potential
violations identified during the inspection, the facility must request a
copy of the photographs at the conclusion of the inspection. When
significant violations are identified during the inspection that warrant the
issuance of a Notice of Violation, the field inspection
report is issued from the office via a Notice of Violation and includes a
photograph log. One of the files reviewed contained a field inspection
report that was issued to the facility on the day of the inspection. The
field inspection report program implementation has improved the
timeliness of inspection reporting, so no state attention or improvement
is necessary.

Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
6a Inspection reports complete and sufficient to determine compliance | 100% | | 29 | 29 | 100%
6b Timeliness of inspection report completion | 100% | | 24 | 29 | 82.8%

State response


Recommendation	No further action is recommended.



RCRA Element 3 — Violations

Finding 3-1	Meets or Exceeds Expectations

Summary

ADEQ makes accurate compliance determinations in the RCRA
inspections reviewed.

Explanation

File Review Metric 7a assesses whether accurate compliance determinations
were made based on the inspections. All 29 of the inspection report files
reviewed had accurate compliance determinations.

Metric 7b is a review indicator that evaluates the violation identification rate for
inspections conducted during the year of review. In the data metric analysis,
ADEQ's violation identification rate for FY2013 was 77.3%, above the national
average of 34.8%.





Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
7a Accurate compliance determinations | 100% | | 29 | 29 | 100%
7b Violations found during inspections | | 34.8% | 58 | 75 | 77.3%

State response

Recommendation	No further action is recommended.



RCRA Element 3 — Violations

Finding 3-2	Area for State Attention

Summary

Based on the files reviewed, accurate SNC determinations were made by
ADEQ.

Explanation

Only one of the selected files reviewed contained any violations that
warranted a SNC determination. The SNC determination was made
during the prior fiscal year (PAS Technologies).


Metric 8a identifies the percent of facilities that receive a SNC
designation in FY2013. ADEQ's SNC identification rate for FY2013 is
0%. This is well below the national average of 1.7%. ADEQ has
developed and successfully implemented a generator compliance
assistance program. EPA believes the low SNC identification rate is
attributable to this program.

There were no issues of concern identified in ADEQ's SNC
determination policy or procedure. No significant SNC determination
issues were identified in either the Round 1 or Round 2 SRFs.

SNC identification is an important part of an effective inspection and
enforcement program. This information is used by the public to identify
problematic facilities within their community. For this reason, EPA is
identifying SNC determination as an area to which ADEQ should pay particular
attention to ensure that appropriate and timely SNC determinations are made by
the agency and entered into RCRAInfo.

Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
8a SNC identification rate | 100% | | 0 | 75 | 0%
8c Appropriate SNC determinations | 100% | | 1 | 1 | 100%

State response

Recommendation No further action is recommended.

RCRA Element 4 — Enforcement

Finding 4-1	Meets or Exceeds Expectations

Summary	ADEQ takes timely and appropriate enforcement actions.

Explanation	Metric 9a measures the enforcement responses that have returned or will
return facilities with SNC or SV violations to compliance. All files reviewed
(29 of 29) contained well-documented return-to-compliance information. Each
return-to-compliance submission by the facility is entered into RCRAInfo by
ADEQ.

Metric 10b assesses the appropriateness of enforcement actions for SVs
and SNCs. In the files reviewed, 100% of the facilities with violations
(29 of 29) had an appropriate enforcement response.


Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
9a Enforcement that returns violators to compliance | 100% | | 29 | 29 | 100%
10b Appropriate enforcement taken to address violations | 100% | | 29 | 29 | 100%

State response

Recommendation No further action is recommended.

RCRA Element 5 — Penalties

Finding 5-1	Meets or Exceeds Expectations

Summary

ADEQ's penalties consider and include a gravity component and economic
benefit as part of the penalty calculation.

Explanation	Only 1 penalty case file was reviewed (PAS Technologies) as part of this SRF.
A well-detailed penalty calculation and justification memorandum is contained
in the confidential enforcement file. The penalty calculation process includes a
gravity component, an economic benefit component, and any adjustments (e.g.,
history of non-compliance). The file also includes documentation supporting
that the penalty has been collected (i.e., a copy of the check).

Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
11a Penalty calculations include gravity and economic benefit | 100% | N/A | 1 | 1 | 100%
12a Documentation on difference between initial and final penalty | 100% | N/A | 1 | 1 | 100%
12b Penalties collected | 100% | N/A | 1 | 1 | 100%

State response

Recommendation No further action is recommended.


Appendix

ADEQ should ensure it maintains its FTE commitment in order to continue achieving its
inspection numbers.

Allowing ADEQ to substitute SQG inspections for LQG inspections in accordance with the
RCRA LQG Flexibility Project allows it to redirect resources toward inspections at facilities
that potentially pose a serious risk to human health and the environment.


