State Review Framework

Nashville/Davidson County, Tennessee

Clean Air Act
Implementation in Federal Fiscal Year 2012

U.S. Environmental Protection Agency
Region 4, Atlanta

Final Report
April 16, 2015


-------
Executive Summary

Introduction

EPA Region 4 enforcement staff conducted a State Review Framework (SRF) enforcement
program oversight review of the Nashville/Davidson County Metro Public Health Department
(MPHD).

EPA bases SRF findings on data and file review metrics, and conversations with program
management and staff. EPA will track recommended actions from the review in the SRF Tracker
and publish reports and recommendations on EPA's ECHO web site.

Areas of Strong Performance

•	Enforcement actions bring sources back into compliance within a specified timeframe.

•	MPHD considers gravity and economic benefit when calculating penalties, documenting
the collection of penalties and any differences between initial and final penalty
assessments.

Priority Issues to Address

The following are the top-priority issues affecting the local program's performance:

•	MPHD needs to improve the accuracy of data reported into EPA's national data system
(formerly the Air Facility Subsystem (AFS), now ICIS-Air). Data discrepancies were
identified in all of the files reviewed.

•	The review of most Title V Annual Compliance Certifications (ACCs) was not recorded
in AFS, and Full Compliance Evaluations (FCEs) and Compliance Monitoring Reports
(CMRs) did not always include all required elements.

Most Significant CAA Stationary Source Program Issues

•	The accuracy of enforcement and compliance data entered by MPHD in AFS needs
improvement. The recommendation for improvement is for MPHD to document efforts to
identify and address the causes of inaccurate Minimum Data Requirements (MDR)
reporting and make corrections to existing data to address discrepancies identified by
EPA. EPA will monitor progress through the annual Data Metrics Analysis (DMA) and
other periodic data reviews.

•	MPHD needs to ensure that FCEs and CMRs include all required elements and that ACC
reviews are documented in ICIS-Air. The recommendation for improvement is for
MPHD to submit and implement revised procedures which ensure that ACC reviews are
recorded in ICIS-Air and FCEs and CMRs include all required elements. EPA will
review sample CMRs provided by MPHD for 6 months to determine the adequacy of the
revised procedures.


-------
Table of Contents

I.	Background on the State Review Framework	4

II.	SRF Review Process	5

III.	SRF Findings	6

Clean Air Act Findings	7


-------
I. Background on the State Review Framework

The State Review Framework (SRF) is designed to ensure that EPA conducts nationally
consistent oversight. It reviews the following local, state, and EPA compliance and enforcement
programs:

•	Clean Water Act National Pollutant Discharge Elimination System

•	Clean Air Act Stationary Sources (Title V)

•	Resource Conservation and Recovery Act Subtitle C

Reviews cover:

•	Data — completeness, accuracy, and timeliness of data entry into national data systems

•	Inspections — meeting inspection and coverage commitments, inspection report quality,
and report timeliness

•	Violations — identification of violations, determination of significant noncompliance
(SNC) for the CWA and RCRA programs and high priority violators (HPV) for the CAA
program, and accuracy of compliance determinations

•	Enforcement — timeliness and appropriateness, returning facilities to compliance

•	Penalties — calculation including gravity and economic benefit components, assessment,
and collection

EPA conducts SRF reviews in three phases:

•	Analyzing information from the national data systems in the form of data metrics

•	Reviewing facility files and compiling file metrics

•	Developing findings and recommendations

EPA builds consultation into the SRF to ensure that EPA and the state or local program
understand the causes of issues and agree, to the degree possible, on actions needed to address
them. SRF reports capture the agreements developed during the review process in order to
facilitate program improvements. EPA also uses the information in the reports to develop a better
understanding of enforcement and compliance nationwide, and to identify issues that require a
national response. Reports provide factual information. They do not include determinations of
overall program adequacy, nor are they used to compare or rank state and local programs.

Each state's programs are reviewed once every five years. Local programs are reviewed less
frequently, at the discretion of the EPA Regional office. The first round of SRF reviews began in
FY 2004, and the second round began in FY 2009. The third round of reviews began in FY 2013
and will continue through 2017.



-------
II. SRF Review Process

Review period: 2012

Key dates: November 15, 2013: letter sent to the local program kicking off the Round 3 review
December 3-5, 2013: on-site file review for CAA

Local Program and EPA key contacts for review:



SRF Coordinator
  Nashville MPHD: John Finke
  EPA Region 4: Kelly Sisario, OEA Branch Chief

CAA
  Nashville MPHD: John Finke
  EPA Region 4: Mark Fite, OEA Technical Authority



-------
III. SRF Findings

Findings represent EPA's conclusions regarding state or local program performance and are
based on observations made during the data and/or file reviews and may also be informed by:

•	Annual data metric reviews conducted since the program's last SRF review

•	Follow-up conversations with agency personnel

•	Review of previous SRF reports, Memoranda of Agreement, or other data sources

•	Additional information collected to determine an issue's severity and root causes

There are three categories of findings:

Meets or Exceeds Expectations: The SRF was established to define a base level or floor for
enforcement program performance. This rating describes a situation where the base level is met
and no performance deficiency is identified, or a state or local performs above national program
expectations.

Area for State1 Attention: An activity, process, or policy that one or more SRF metrics show as
a minor problem. Where appropriate, the state or local should correct the issue without additional
EPA oversight. EPA may make recommendations to improve performance, but it will not
monitor these recommendations for completion between SRF reviews. These areas are not
highlighted as significant in an executive summary.

Area for State Improvement: An activity, process, or policy that one or more SRF metrics
show as a significant problem that the agency is required to address. Recommendations should
address root causes. These recommendations must have well-defined timelines and milestones
for completion, and EPA will monitor them for completion between SRF reviews in the SRF
Tracker.

Whenever a metric indicates a major performance issue, EPA will write up a finding of Area for
State Improvement, regardless of other metric values pertaining to a particular element.

The relevant SRF metrics are listed within each finding. The following information is provided
for each metric:

•	Metric ID Number and Description: The metric's SRF identification number and a
description of what the metric measures.

•	Natl Goal: The national goal, if applicable, of the metric, or the CMS commitment that
the state or local has made.

•	Natl Avg: The national average across all states, territories, and the District of Columbia.

•	State N: For metrics expressed as percentages, the numerator.

•	State D: The denominator.

•	State % or #: The percentage, or if the metric is expressed as a whole number, the count.

1 Note that EPA uses a national template for producing consistent reports throughout the country. References to
"State" performance or responses throughout the template should be interpreted to apply to the Local Program.



-------
Clean Air Act Findings

CAA Element 1 — Data





Finding 1-1

Meets or Exceeds Expectations

Summary

MDRs were entered timely into AFS, EPA's national data system for air
enforcement and compliance information.

Explanation

Data Metrics 3a2 and 3b2 indicated that MPHD entered MDR data for
high priority violations (HPVs) and stack tests into AFS within the
specified timeframe.

Data Metric 3b1 indicated that 61.2% of compliance monitoring MDRs
(71 of 116) were reported timely into AFS. However, of the 45 late
entries, 38 were non-federally reportable minor sources (dry cleaners). If
these dry cleaners are excluded from the metric calculation, the revised
metric is 91% (71 of 78), which exceeds the national average and
approaches the national goal.

Data Metric 3b3 indicated that 2 of 3 (66.7%) enforcement-related
MDRs were entered into AFS within 60 days. The one late entry is
considered an isolated incident, so EPA considers that the timeliness of
MPHD's data entry meets expectations.
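
For illustration, the adjusted Metric 3b1 value can be recomputed from the counts cited above. This is a minimal sketch of the arithmetic only, not an EPA tool, and the variable names are illustrative assumptions.

    # Illustrative recalculation of Data Metric 3b1 (timely compliance monitoring MDRs)
    # using the counts cited in Finding 1-1.

    timely = 71              # MDRs entered into AFS within the required timeframe
    total = 116              # all compliance monitoring MDRs in the review period
    late_dry_cleaners = 38   # late entries that were non-federally reportable minor sources

    reported_pct = 100 * timely / total                        # metric as reported
    adjusted_pct = 100 * timely / (total - late_dry_cleaners)  # metric excluding dry cleaners

    print(f"Reported: {reported_pct:.1f}%")   # 61.2%
    print(f"Adjusted: {adjusted_pct:.1f}%")   # 91.0%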





Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
3a2 Untimely entry of HPV determinations | 0 | | | | 0
3b1 Timely reporting of compliance monitoring MDRs | 100% | 80% | 71 | 116 | 61.2%
3b2 Timely reporting of stack test dates and results | 100% | 73.1% | 2 | 2 | 100%
3b3 Timely reporting of enforcement MDRs | 100% | 73.7% | 2 | 3 | 66.7%

State response Entry of data into ICIS-Air will be standardized to occur on the first of
each month, if not sooner, to ensure timely entry of data. All inspection
and enforcement data is now being entered on or prior to the first of the
month following the inspections.

Recommendation



-------
CAA Element 1 — Data

Finding 1-2

Area for State Improvement

Summary

The accuracy of MDR data reported by MPHD into AFS needs
improvement. At least one discrepancy between the files and AFS was
identified in each of the files reviewed.

Explanation

Metric 2b indicated that each of the 15 files reviewed had one or more
discrepancies between information in the files and data entered into
AFS. The majority of inaccuracies related to facility information
(NAICS, name, address, CMS info, pollutants, etc.) and missing or
inaccurate activity data (e.g., ACCs, NOVs, FCEs, penalties, etc.).

Several files also revealed missing or inaccurate air programs or subparts
for applicable Maximum Achievable Control Technology (MACT) or
New Source Performance Standards (NSPS) regulations in AFS. Finally,
two sources had an inaccurate compliance status code. This incorrect
data in AFS could potentially hinder EPA's oversight and targeting
efforts and/or result in inaccurate information being released to the
public.

Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
2b Accurate MDR data in AFS | 100% | | 0 | 15 | 0%

State response

The discrepancies identified by EPA have been or will be corrected in
ICIS-Air.

Recommendation

By April 30, 2015, MPHD should provide documentation to EPA
concerning efforts to identify and address the causes of inaccurate MDR
reporting. MPHD should also make corrections to existing data to
address the discrepancies EPA identified and ensure that in the future,
MDRs are accurately entered into ICIS-Air. If by June 30, 2015, EPA's
review determines that MPHD's efforts appear to be adequate to meet
the national goal, the recommendation will be considered complete.



-------
CAA Element 2 — Inspections

Finding 2-1	Meets or Exceeds Expectations

Summary	MPHD met the negotiated frequency for inspection of major and
Synthetic Minor 80% (SM-80) sources.

Explanation	MPHD ensured that each major source was inspected at least once every
2 years, and each SM-80 source was inspected at least once every 5
years. Although Metric 5a indicates that only half of the major sources (6 of
12) slated for inspection in FY2012 were inspected, all but one of the
sources not inspected are permanently closed. The remaining source had
an FCE in FY2011 (3/8/11), so it would not have been due for an FCE
until FY2013, and the corrected percentage for major sources inspected
is 100%. Similarly, Metric 5b indicates that 83.6% of SM-80 sources (46
of 55) slated for inspection in FY2012 were inspected. However, all of
the sources that were not inspected are coded as permanently closed in
AFS, so the corrected percentage of SM-80 sources inspected is 100%.
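
As an illustration of the corrected coverage figures above, the following minimal sketch recomputes Metrics 5a and 5b after removing the closed and not-yet-due sources from the universe. The counts are taken from the explanation; the helper function is an illustrative assumption, not an EPA calculation tool.

    # Illustrative recalculation of FCE coverage (Metrics 5a and 5b) after excluding
    # permanently closed sources and the one source not yet due for an FCE.

    def corrected_coverage(inspected, universe, excluded):
        """Coverage percentage after removing excluded sources from the universe."""
        return 100 * inspected / (universe - excluded)

    majors = corrected_coverage(inspected=6, universe=12, excluded=6)   # 5 closed + 1 not due
    sm80s = corrected_coverage(inspected=46, universe=55, excluded=9)   # 9 closed

    print(f"Majors: {majors:.0f}%")   # 100%
    print(f"SM-80s: {sm80s:.0f}%")    # 100%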

Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
5a FCE coverage: majors and mega-sites | 100% | 90.4% | 6 | 12 | 50%
5b FCE coverage: SM-80s | 100% | 93.4% | 46 | 55 | 83.6%

State response

Recommendation



-------
CAA Element 2 — Inspections

Finding 2-2

Area for State Improvement

Summary

The review of most Title V ACCs was not recorded in AFS, and FCEs
and CMRs did not always include all required elements.

Explanation

Metric 5e indicates that only 1 of 12 (8.3%) Title V ACCs was
reviewed by the local program. The program advises that these reviews
were conducted, but they were not recorded in AFS.

Metric 6a indicates that 11 of 14 (78.6%) FCEs reviewed included all
seven elements required by the Clean Air Act Stationary Source
Compliance Monitoring Strategy (CMS Guidance). The remaining three
FCEs were missing one of the following elements: assessment of process
parameters; visible emissions observations; or review of records &
reports.

Metric 6b indicates that 9 of 14 (64.3%) CMRs included all seven
elements required by the CMS Guidance. The remaining five CMRs
were missing one or more of the following required elements: facility
information; observations and recommendations; applicable
requirements; or a description of compliance monitoring activities
conducted by the inspector.





Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
5e Review of Title V annual compliance certifications | 100% | 81.8% | 1 | 12 | 8.3%
6a Documentation of FCE elements | 100% | | 11 | 14 | 78.6%
6b Compliance monitoring reports reviewed that provide sufficient documentation to determine facility compliance | 100% | | 9 | 14 | 64.3%

State response All ACCs were received and reviewed. Procedures will be developed to
ensure more timely entry of ACC reviews into ICIS-Air. All ACC data is
now entered into ICIS-Air as the ACCs are received. MPHD has
developed a spreadsheet to assist in tracking ACCs and Quarterly/Semi-
Annual Reports.

Coordinate with inspectors on procedures to completely fill out
inspection reports. Develop and implement procedures to review each
inspection report received for completeness before entering into AFS.
Revise inspection forms to eliminate extraneous or outdated entries and
ensure that all CMS required entries are present.

Recommendation By June 30, 2015, MPHD should submit to EPA and implement revised
procedures that ensure that ACC reviews are recorded in ICIS-Air and
that FCEs and CMRs include all elements required by the CMS
Guidance. Through December 31, 2015, MPHD should submit sample
CMRs to EPA for review. If, based on this review, EPA determines that
the revised procedures are adequate to meet the national goal, the
recommendation will be considered completed.



-------
CAA Element 3 — Violations

Finding 3-1

Meets or Exceeds Expectations

Summary

MPHD made accurate compliance determinations for both HPV and
non-HPV violations.

Explanation

Metric 7a indicated that MPHD made accurate compliance
determinations in 12 of 14 files reviewed (85.7%).

Metric 8a indicated that the HPV discovery rate for majors (0%) was
below the national average of 4.3%. A low HPV discovery rate is not
unusual for small local programs. Although there were no HPV
determinations during the review year, Metric 8c indicates that an HPV
designated in the prior year and addressed in FY2012 was evaluated
during the file review, and EPA confirmed the accuracy of that
determination.

Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
7a Accuracy of compliance determinations | 100% | | 12 | 14 | 85.7%
8a HPV discovery rate at majors | | 4.3% | 0 | 12 | 0%
8c Accuracy of HPV determinations | 100% | | 1 | 1 | 100%



State response

All ACCs were reviewed. Deviations and missing data were determined
to have been minor or had been adequately explained and addressed by
the sources. In the future, procedures will be put in place to ensure better
documentation of the review process and of any actions taken or
determinations made by this department.

All ACC data is now entered into ICIS-Air as the ACCs are received.
MPHD has developed a spreadsheet to assist in tracking ACCs and
Quarterly/Semi-Annual Reports.

Recommendation





-------
CAA Element 4 — Enforcement

Finding 4-1

Meets or Exceeds Expectations

Summary

Enforcement actions bring sources back into compliance within a
specified timeframe, and HPVs are addressed in a timely and appropriate
manner.

Explanation

Metric 9a indicated that all formal enforcement actions reviewed brought
sources back into compliance through corrective actions in the order, or
compliance was achieved prior to issuance of the order.

Metric 10a indicated that the one HPV concluded in the review year
(FY2012) was addressed in 297 days. While this slightly exceeds the
specified timeframe of 270 days, this is not considered a significant
exceedance. In addition, Metric 10b indicated that appropriate
enforcement action was taken to address all HPVs.

Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
9a Formal enforcement responses that include required corrective action that will return the facility to compliance in a specified timeframe | 100% | | 2 | 2 | 100%
10a Timely action taken to address HPVs | | 70.5% | 0 | 1 | 0%
10b Appropriate enforcement responses for HPVs | 100% | | 1 | 1 | 100%

State response

Recommendation



-------
CAA Element 5 — Penalties

Finding 5-1

Meets or Exceeds Expectations

Summary

MPHD considered gravity and economic benefit when calculating
penalties; the collection of penalties and any differences between initial
and final penalty assessments was also documented.

Explanation	Metric 11a indicated that MPHD considered gravity and economic
benefit in both penalty calculations reviewed (100%). Metric 12a also
indicated that both penalty calculations reviewed (100%) documented
any difference between the initial and the final penalty assessed. Finally,
Metric 12b confirmed that documentation of all penalty payments made
by sources was included in the file.

Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
11a Penalty calculations include gravity and economic benefit | 100% | | 2 | 2 | 100%
12a Documentation on difference between initial and final penalty | 100% | | 2 | 2 | 100%
12b Penalties collected | 100% | | 2 | 2 | 100%

State response

Recommendation



-------
State Review Framework

Tennessee

Clean Water Act, Clean Air Act, and
Resource Conservation and Recovery Act
Implementation in Federal Fiscal Year 2014

U.S. Environmental Protection Agency
Region 4, Atlanta

Final Report
September 29, 2016


-------
Executive Summary

Introduction

EPA Region 4 enforcement staff conducted a State Review Framework (SRF) enforcement
program oversight review of the Tennessee Department of Environment and Conservation
(TDEC).

EPA bases SRF findings on data and file review metrics, and conversations with program
management and staff. EPA will track recommended actions from the review in the SRF Tracker
and publish reports and recommendations on EPA's ECHO web site.

Areas of Strong Performance

•	CWA data entry into ICIS-NPDES was complete.

•	CAA, CWA and RCRA inspection reports were well written and complete.

•	TDEC accurately identified CAA and RCRA violations and appropriately addressed the
violations with enforcement actions that returned facilities to compliance. RCRA actions
were also timely.

•	TDEC documented the difference between initial and final penalty calculations in CAA
and RCRA. The RCRA program also documented the calculation of economic benefit in its
penalties.

Priority Issues to Address

The following are the top-priority issues affecting the state program's performance:

•	TDEC should document the calculation of economic benefit or the rationale for excluding
economic benefit of noncompliance in CAA and CWA penalty calculations. The CWA program
should also document the difference between initial and final penalty calculations.

•	TDEC should meet grant and inspection coverage commitments in the CWA program.

•	CWA enforcement responses should be timely, appropriate to the violation and promote a
return to compliance.



-------
Most Significant CWA-NPDES Program Issues1

•	CWA inspection report findings and cover letters were ambiguous about compliance
determinations made during the inspection. In addition, TDEC is not appropriately
reporting Significant Non-Compliance and Single Event Violations.

•	TDEC should ensure that enforcement responses promote a return to compliance and
escalate to formal actions when non-compliance continues.

Most Significant CAA Stationary Source Program Issues

•	The accuracy of data reporting into the database of record needs improvement along with
the timeliness of data entry.

Most Significant RCRA Subtitle C Program Issues

•	TDEC should make timely determinations of Significant Non-Compliance.

1 EPA's "National Strategy for Improving Oversight of State Enforcement Performance" identifies the following as
significant recurrent issues: "Widespread and persistent data inaccuracy and incompleteness, which make it hard to
identify when serious problems exist or to track state actions; routine failure of states to identify and report
significant noncompliance; routine failure of states to take timely or appropriate enforcement actions to return
violating facilities to compliance, potentially allowing pollution to continue unabated; failure of states to take
appropriate penalty actions, which results in ineffective deterrence for noncompliance and an unlevel playing field
for companies that do comply; use of enforcement orders to circumvent standards or to extend permits without
appropriate notice and comment; and failure to inspect and enforce in some regulated sectors."



-------
Table of Contents

I.	Background on the State Review Framework	2

II.	SRF Review Process	3

III.	SRF Findings	4

Clean Air Act Findings	5

Clean Water Act Findings	17

Resource Conservation and Recovery Act Findings	39


-------
I. Background on the State Review Framework

The State Review Framework (SRF) is designed to ensure that EPA conducts nationally
consistent oversight. It reviews the following local, state, and EPA compliance and enforcement
programs:

•	Clean Water Act National Pollutant Discharge Elimination System

•	Clean Air Act Stationary Sources (Title V)

•	Resource Conservation and Recovery Act Subtitle C

Reviews cover:

•	Data — completeness, accuracy, and timeliness of data entry into national data systems

•	Inspections — meeting inspection and coverage commitments, inspection report quality,
and report timeliness

•	Violations — identification of violations, determination of significant noncompliance
(SNC) for the CWA and RCRA programs and high priority violators (HPV) for the CAA
program, and accuracy of compliance determinations

•	Enforcement — timeliness and appropriateness, returning facilities to compliance

•	Penalties — calculation including gravity and economic benefit components, assessment,
and collection

EPA conducts SRF reviews in three phases:

•	Analyzing information from the national data systems in the form of data metrics

•	Reviewing facility files and compiling file metrics

•	Developing findings and recommendations

EPA builds consultation into the SRF to ensure that EPA and the state understand the causes of
issues and agree, to the degree possible, on actions needed to address them. SRF reports capture
the agreements developed during the review process in order to facilitate program improvements.
EPA also uses the information in the reports to develop a better understanding of enforcement
and compliance nationwide, and to identify issues that require a national response.

Reports provide factual information. They do not include determinations of overall program
adequacy, nor are they used to compare or rank state programs.



-------
Each state's programs are reviewed once every five years. The first round of SRF reviews began
in FY 2004. The third round of reviews began in FY 2013 and will continue through FY 2017.

II. SRF Review Process

Review period: FY 2014

Key dates: August 7, 2015: letter sent to the State kicking off the Round 3 review
November 2-6, 2015: onsite file reviews for CWA, RCRA and CAA programs

State and EPA key contacts for review:



SRF Coordinator
  Tennessee DEC: Chris Moran, Enforcement Coordinator
  EPA Region 4: Kelly Sisario, Enforcement Coordinator

CAA
  Tennessee DEC: Kevin McLain, Manager, Enforcement Program, Division of Air Pollution Control
  EPA Region 4: Mark Fite, Office of Enforcement Coordination; Chet Gala, Air, Pesticides & Toxics Management Division

CWA
  Tennessee DEC: Jessica Murphy, Manager, Compliance & Enforcement Program, Division of Water Resources
  EPA Region 4: Ronald Mikulak, Office of Enforcement Coordination; Laurie Ireland and Pamela Myers, Water Protection Division

RCRA
  Tennessee DEC: Chris Lagan, P.G., Manager, Regulatory Compliance and Enforcement Program, Division of Solid Waste Management
  EPA Region 4: Shannon Maher, Office of Enforcement Coordination; Alan Newman, Resource Conservation & Restoration Division



-------
III. SRF Findings

Findings represent EPA's conclusions regarding state performance and are based on observations
made during the data and/or file reviews and may also be informed by:

•	Annual data metric reviews conducted since the state's last SRF review

•	Follow-up conversations with state agency personnel

•	Review of previous SRF reports, Memoranda of Agreement, or other data sources

•	Additional information collected to determine an issue's severity and root causes

There are three categories of findings:

Meets or Exceeds Expectations: The SRF was established to define a base level or floor for
enforcement program performance. This rating describes a situation where the base level is met
and no performance deficiency is identified, or a state performs above national program
expectations.

Area for State Attention: An activity, process, or policy that one or more SRF metrics show as
a minor problem. Where appropriate, the state should correct the issue without additional EPA
oversight. EPA may make recommendations to improve performance, but it will not monitor
these recommendations for completion between SRF reviews. These areas are not highlighted as
significant in an executive summary.

Area for State Improvement: An activity, process, or policy that one or more SRF metrics
show as a significant problem that the agency is required to address. Recommendations should
address root causes. These recommendations must have well-defined timelines and milestones
for completion, and EPA will monitor them for completion between SRF reviews in the SRF
Tracker.

Whenever a metric indicates a major performance issue, EPA will write up a finding of Area for
State Improvement, regardless of other metric values pertaining to a particular element.

The relevant SRF metrics are listed within each finding. The following information is provided
for each metric:

•	Metric ID Number and Description: The metric's SRF identification number and a
description of what the metric measures.

•	Natl Goal: The national goal, if applicable, of the metric, or the CMS commitment that
the state has made.

•	Natl Avg: The national average across all states, territories, and the District of Columbia.

•	State N: For metrics expressed as percentages, the numerator.

•	State D: The denominator.

•	State % or #: The percentage, or if the metric is expressed as a whole number, the count.



-------
Clean Air Act Findings

CAA Element 1 — Data

Finding 1-1

Area for State Improvement

Summary

The accuracy of MDR data reported by TDEC into AFS needs
improvement. Discrepancies between the files and AFS were identified
in nearly half of the files reviewed.

Explanation	Metric 2b indicated that 58.8% (20 of 34) of the files reviewed reflected
accurate entry of all MDRs into AFS. The remaining 14 files had one or
more discrepancies between information in the files and data entered into
AFS. The majority of inaccuracies related to duplicate, inaccurate or
missing activity data such as FCEs, NOVs, stack tests, etc. Beginning in
FY2015, the Agency transitioned to a new national data system known
as ICIS-Air. Historical data from AFS was migrated to the new system.
In general, incorrect data in the Agency's data system could potentially
hinder EPA's oversight and targeting efforts and/or result in inaccurate
information being released to the public.

Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
2b Accurate MDR data in AFS | 100% | | 20 | 34 | 58.8%

State response To correct data and prevent future issues, the Division of Air Pollution
Control (APC) is taking the following steps:

•	Corrected all 14 errors in ICIS-Air identified during the Round 3
review by February 1, 2016. Will complete review of existing
data to address the discrepancies noted by March 31, 2017 as
recommended, with the exception of discrepancies due to
migration issues that occurred between AFS and ICIS-Air as
discussed separately below.

•	Quarterly QA review of the data by the Environmental
Consultant 3 in APC's data entry section will begin October 1,
2016.

•	Enhanced interactive reports for NOVs issued and HPV
determinations. Once an NOV or HPV determination is entered
into APC's data management system, it is immediately available
in the applicable interactive report. This will allow APC to see,
and quickly evaluate, an up-to-date summary of data needed to
be transferred to ICIS-Air.

•	Sought EPA guidance on ICIS-Air data entry by participating in
all webinars relating to ICIS-Air modules. APC also sought EPA
guidance when APC had questions regarding reporting processes.
APC has made a concerted effort to examine the reporting
processes for accuracy and learn and adapt to the new data
system, including working through data migration issues that
may have occurred (see example attached). APC is committed
to finding and correcting discrepancies caused by data migration
on an ongoing basis.

Recommendation By March 31, 2017, TDEC should make corrections to existing data to
address the discrepancies EPA identified and ensure that in the future,
MDRs are accurately entered into ICIS-Air. If by September 30, 2017,
EPA determines that TDEC's efforts appear to be adequate to meet the
national goal, the recommendation will be considered complete.



-------
CAA Element 1 — Data

Finding 1-2

Area for State Improvement

Summary

Whereas MDR data for stack tests were reported timely into AFS, MDR
data associated with other areas (HPVs, compliance monitoring, and
enforcement actions) were not always reported timely.

Explanation	Metric 3b2 (93.5%) indicated that TDEC met the national goal in
entering MDR data for stack tests into AFS within the specified
timeframe. However, Metrics 3a2 (16), 3b1 (81.4%), and 3b3 (77.6%)
indicated that HPVs, compliance monitoring activities, and enforcement
actions were often not entered into AFS within 60 days, as required by
the Information Collection Request (ICR).
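
The State % values in the metrics below follow directly from State N and State D; for illustration, a minimal sketch of that arithmetic (an illustrative check only, not an EPA tool):

    # Illustrative check that the State % values in Finding 1-2 equal State N / State D.
    # The dictionary simply restates the counts from the metrics table.

    metrics = {
        "3b1 Timely reporting of compliance monitoring MDRs": (977, 1200),
        "3b2 Timely reporting of stack test dates and results": (101, 108),
        "3b3 Timely reporting of enforcement MDRs": (249, 321),
    }

    for name, (state_n, state_d) in metrics.items():
        print(f"{name}: {100 * state_n / state_d:.1f}%")
    # 81.4%, 93.5%, 77.6%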

Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
3a2 Untimely entry of HPV determinations | 0 | | | | 16
3b1 Timely reporting of compliance monitoring MDRs | 100% | 83.3% | 977 | 1200 | 81.4%
3b2 Timely reporting of stack test dates and results | 100% | | 101 | 108 | 93.5%
3b3 Timely reporting of enforcement MDRs | 100% | | 249 | 321 | 77.6%

State response APC's data entry section created enhancements for the interactive

reports for NOVs issued and HPV determinations. Once an NOV or
HPV determination is entered into APC's data management system, it is
immediately available in the applicable interactive report. This will
allow APC to see, and quickly evaluate, an up-to-date summary of data
needed to be transferred to ICIS-Air.

APC's Enforcement group notifies the Environmental Specialist 5 and
the Environmental Consultant 3 in the data entry section via email of all
HPV determinations (starting approximately September 2015) and

orders issued (starting approximately October 2015). These notifications
help ensure timely data entry into ICIS-Air.

APC is in the process of evaluating the causes of untimely MDR
reporting as well as its SOPs and other process documents for revision, if
necessary, to address identified issues. APC will provide EPA further
documentation regarding these causes and any specific revisions to SOPs
or other process documents by March 31, 2017 as recommended.

Recommendation By March 31, 2017, TDEC should provide documentation to EPA
concerning efforts to identify and address the causes of untimely MDR
reporting. If by September 30, 2017, EPA determines that TDEC's
efforts appear to be adequate to meet the national goal, the
recommendation will be considered complete.



-------
CAA Element 2 — Inspections

Finding 2-1

Meets or Exceeds Expectations

Summary

TDEC met the negotiated frequency for inspection of sources, reviewed
Title V Annual Compliance Certifications, and included all required
elements in their Full Compliance Evaluations (FCEs) and Compliance
Monitoring Reports (CMRs).

Explanation	Metrics 5a and 5b indicated that TDEC provided adequate inspection
coverage for the major and SM-80 sources during FY14 by ensuring that
each major source was inspected at least every 2 years, and each SM-80
source was inspected at least every 5 years. In addition, Metric 5e
documented that TDEC reviewed Title V annual compliance
certifications submitted by major sources. Finally, Metrics 6a and 6b
confirmed that all elements of an FCE and a CMR required by the Clean
Air Act Stationary Source Compliance Monitoring Strategy (CMS
Guidance) were addressed in most facility files reviewed.

Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
5a FCE coverage: majors and mega-sites | 100% | | 174 | 178 | 97.8%
5b FCE coverage: SM-80s | 100% | 91.7% | 317 | 317 | 100%
5e Review of Title V annual compliance certifications | 100% | | 193 | 210 | 91.9%
6a Documentation of FCE elements | 100% | | 30 | 32 | 93.8%
6b Compliance monitoring reports reviewed that provide sufficient documentation to determine facility compliance | 100% | | 32 | 32 | 100%

State response

Recommendation



-------
CAA Element 3 — Violations

Finding 3-1

Meets or Exceeds Expectations

Summary

TDEC made accurate compliance determinations for both HPV and non-
HPV violations.

Explanation

Metric 7a indicated that TDEC made accurate compliance
determinations in 31 of 33 files reviewed (93.9%).

Metric 8a indicated that the HPV discovery rate for majors (5.2%) was
above the national average of 3.1%.

Metric 8c confirmed that TDEC's HPV determinations were accurate for
all 23 files reviewed (100%). In one instance, the file indicates that the
state classified a stack test failure as an HPV, but this was never entered
into AFS. This is being addressed under the recommendation for Finding
1-1.





Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
7a Accuracy of compliance determinations | 100% | | 31 | 33 | 93.9%
8a HPV discovery rate at majors | | 3.1% | 11 | 213 | 5.2%
8c Accuracy of HPV determinations | 100% | | 23 | 23 | 100%

State response

Recommendation



-------
CAA Element 4 — Enforcement

Finding 4-1

Meets or Exceeds Expectations

Summary

Enforcement actions bring sources back into compliance within a
specified timeframe, and HPVs are addressed in an appropriate manner.

Explanation	Metric 9a indicated that all formal enforcement actions (100%) reviewed
brought sources back into compliance through corrective actions in the
order, or compliance was achieved prior to issuance of the order.

Metric 10b indicated that appropriate enforcement action was taken to
address all HPVs (100%) evaluated during the file review.

Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
9a Formal enforcement responses that include required corrective action that will return the facility to compliance in a specified timeframe | 100% | | 21 | 21 | 100%
10b Appropriate enforcement responses for HPVs | 100% | | 14 | 14 | 100%

State response

Recommendation



-------
CAA Element 4 — Enforcement

Finding 4-2

Area for State Attention

Summary

About one-fourth of HPVs were not addressed in a timely manner.

Explanation

Metric 10a indicated that 77.1% of the HPVs (54 of 70) addressed in
FY14 were addressed within 270 days, which is above the national
average of 73.2%. The length of time taken to address untimely HPV
actions ranged from 278 to 1302 days. This is a continuing issue from
the Round 2 review, although the State has reduced both the number and
percentage of overdue actions. For future HPV cases, the state is
encouraged to follow the timelines established in the new HPV policy
dated August 25, 2014. If an addressing action cannot be achieved
within 180 days of day zero, the state should advise EPA Region 4 that it
has a "case-specific development and resolution timeline" as required by
the new policy and consult at least quarterly with the region until the
HPV is addressed.





Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
10a Timely action taken to address HPVs | 100% | 73.2% | 54 | 70 | 77.1%

State response Although this is a continuing issue from Round 2, APC's percentage of
timely actions has improved and is above the national average. APC has
increased its staff in the Enforcement section to help resolve delays in
addressing violations.

APC has implemented the HPV timelines outlined in the 2014 HPV
policy and has provided training to all staff members. Also, all
unaddressed and unresolved HPVs are discussed monthly with Region 4.

APC has changed its procedure and organizational responsibility for
making HPV determinations and processing these orders. APC is in the
process of revising the SOP to address the timely review of potential
violations, issuance of a notice of violation, and referral for enforcement
actions. The new SOP should be implemented in 2017 and additional
training for staff will be conducted.

Recommendation



-------
CAA Element 5 — Penalties

Finding 5-1

Meets or Exceeds Expectations

Summary

TDEC documented the differences between initial and final penalties and
the collection of penalties in its files and data system.

Explanation	Metric 12a indicated that 18 of 19 penalty calculations reviewed (94.7%)
provided documentation in the file showing the rationale for any
difference between the initial and final penalty.

Metric 12b confirmed that documentation of penalty payments made by
most sources (18 of 19) was included in the state's data system.

Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
12a Documentation on difference between initial and final penalty | 100% | | 18 | 19 | 94.7%
12b Penalties collected | 100% | | 18 | 19 | 94.7%

State response

Recommendation



-------
CAA Element 5 — Penalties

Finding 5-2

Area for State Improvement

Summary

TDEC's penalty files do not include adequate documentation of
economic benefit calculations.

Explanation

Metric 11a indicates that 11 of the 19 penalty actions reviewed (57.9%)
provided adequate documentation of the State's consideration of gravity
and economic benefit. In some instances, EPA reviewers thought the
violations cited probably resulted in some economic benefit, but the file
did not contain any economic benefit calculation. In other instances, the
state penalty calculation merely showed "$0" or "NA" for economic
benefit, without sufficient rationale for why no economic benefit would
have been gained.

EPA's expectation that state and local enforcement agencies document
the consideration and assessment of both gravity and economic benefit is
outlined in the 1993 Steve Herman memo entitled "Oversight of State
and Local Penalty Assessments: Revisions to the Policy Framework
from State/EPA Enforcement Agreements."





Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
11a Penalty calculations include gravity and economic benefit | 100% | | 11 | 19 | 57.9%

State response APC proposes that the explanation text be revised to state: Metric 11a
indicates that 11 of the 19 penalty actions reviewed (57.9%) provided
adequate documentation of the State's consideration of gravity and
economic benefit. In some instances, EPA reviewers thought the
violations cited probably resulted in some minimal economic benefit, but
the file did not contain any economic benefit calculation. In other
instances, the state penalty calculation merely showed "$0" or "NA" for
economic benefit, without referencing the economic benefit checklist.
The economic benefit checklist provides rationale for why no economic
benefit would have been gained. EPA requests future economic benefit
checklists provide more detail when no economic benefit is assessed.

-------
EPA's expectation that state and local enforcement agencies document
the consideration and assessment of both gravity and economic benefit is
outlined in the 1993 Steve Herman memo entitled "Oversight of State
and Local Penalty Assessments: Revisions to the Policy Framework
from State/EPA Enforcement Agreements."

APC used the same economic benefit checklist in SRF Round 2, and
EPA deemed it acceptable. Round 2 CAA Element 11 Finding stated in
pertinent part, "In general, TDEC's penalty documentation includes both
gravity and economic benefit calculations." The Explanation section
stated in pertinent part, "... and 15 of the 17 (88%) provided sufficient
documentation of economic benefit. ... Therefore, the two files that did
not address economic benefit appear to be infrequent instances that do
not constitute a pattern of deficiencies or a significant problem." Of the
19 files reviewed in Round 3, all but one file included the same type of
documentation for economic benefit used in Round 2. APC proposes
that EPA change this Finding from Area for State Improvement to Area
for State Attention as the practice of APC was predominately the same
between Round 2 and 3, yet EPA findings between Round 2 and Round
3 appear inconsistent.

As APC understands the 1993 Steve Herman memo and the 1984 policy
on civil penalties referenced in the memo, states have the discretion to
not seek de minimis economic benefit (less than $10,000).

While APC is concerned with what appears to be inconsistent standards
applied by EPA staff between two different SRF rounds, APC has
revised its economic benefit checklists, starting Federal FY16-17, where
no economic benefit is assessed to include an additional detailed
explanation of APC's rationale. Additionally, the related penalty memo
will address economic benefit by specifically referring to the economic
benefit checklist.

Recommendation By March 31, 2017, TDEC should implement procedures to ensure the
appropriate documentation of both gravity and economic benefit in
penalty calculations. For verification purposes, for one year following
issuance of the final SRF report, EPA will review selected TDEC
penalty calculations. If by March 31, 2018, these reviews indicate that
the revised procedures are working and the State is documenting the
consideration of economic benefit, the recommendation will be
considered completed.



-------
Clean Water Act Findings

CWA Element 1 — Data

Finding 1-1

Meets or Exceeds Expectations

Summary

TDEC exceeded National Goals for the entry of key data metrics for major
facilities.

Explanation

Metrics 1b1 and 1b2 measure TDEC's data entry of permit limits and
DMRs for NPDES major facilities into the Integrated Compliance Information
System (ICIS), EPA's national database. TDEC exceeded National Goals
and the National Averages for the entry of permit limit data (Metric 1b1)
and DMR data (Metric 1b2) for major facilities into ICIS. Issues with Data
Metrics (7a1) are discussed in Element 3.





Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
1b1 Permit limit rate for major facilities | >95% | 91.1% | 148 | 154 | 96.1%
1b2 DMR entry rate for major facilities | >95% | 96.6% | 4126 | 4126 | 100%

State Response

The State strives to maintain 100% correct data entry of permit limits and
DMRs for the major facility universe as well as the minor facilities and
concurs with this assessment.

Recommendation



-------
CWA Element 1 — Data

Finding 1-2

Area for State Attention

Summary

There were minor discrepancies between data in the files reviewed and
data reflected in the national data system.

Explanation

Metric 2b indicated that 75% of the files reviewed reflected accurate data
entry into ICIS. The few discrepancies observed between ICIS and the
State's files were relatively minor and were related to inspection dates,
Discharge Monitoring Report (DMR) parameters, and enforcement actions.
EPA understands that TDEC manually enters DMR and enforcement data
into ICIS. The discrepancies do not appear to reflect a systemic problem
and were promptly corrected once brought to the state's attention. Until the
flow of data from the state database into ICIS is automated, TDEC should
take steps to ensure accurate data entry into ICIS.





Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
2b Files reviewed where data are accurately reflected in the national data system | | | | | 75%

State Response The State does not currently have the electronic data flows in place that
would be necessary to ensure that all the data in ICIS matches the data in
the state system. As noted above, this data is currently manually entered
into two separate databases. The State is working to upgrade our Windsor
node and the coding in our database necessary to create electronic data
flows between the two systems. During the implementation phase, the
State has instituted interim procedures to streamline the data entry process
and ensure accuracy. The State has expanded the number of ICIS users
during the past two years to include field staff in each office so that the
inspector who conducts the inspection is also the person entering the
inspection. This should help ensure that the results of the inspection are
correctly entered into the system. An additional interim measure that has
been initiated to ensure data accuracy in both systems, is a monthly
inspection report generated from the central office and sent out to the EFO
staff listing any inspections that are not in both databases or appear to be
duplicated, so that any reporting errors can be quickly identified and
corrected. Once the flows are functional for the inspection module
duplicate entries will be eliminated. The State anticipates having the flows
for the inspection module operational by the December 21, 2016, deadline
set forth in the e-reporting rule.

Recommendation



-------
CWA Element 2 — Inspections

Finding 2-1

Area for State Improvement

Summary

TDEC did not meet two of its FY14 Compliance Monitoring Strategy
(CMS) Plan and CWA §106 Workplan inspection commitments: CSO
inspections and inspection coverage of non-majors with general permits.

Explanation	Element 2 includes metrics that measure planned inspections completed
(Metrics 4a1-4a10) and inspection coverage (Metrics 5a1, 5b1, and 5b2)
for majors and non-majors. The National Goal for this Element is for 100%
of state-specific CMS Plan commitments to be met.

The 4a metrics indicated that TDEC met seven out of eight FY14 inspection
commitments. For Metric 4a8 (Industrial Stormwater Inspections) and
Metric 4a9 (Phase I & II SW Construction Inspections), TDEC exceeded
the inspection commitments by completing 186 and 201 additional
inspections, respectively. The inspection commitment not met was Metric
4a4 (CSO Inspections). The CMS requires inspection of 100% of the TN
CSO universe (three facilities) every three years. TDEC did not conduct
any CSO inspections from FY12-FY14. In FY15, TDEC conducted one
CSO inspection.

The Metric 5 series indicated that TDEC met two out of three FY14 inspection
coverage commitments for NPDES majors/non-majors. The inspection
coverage commitment not met was Metric 5b2 (Inspection coverage of
non-majors with general permits). This NPDES non-majors with general
permits universe includes municipal and industrial wastewater facilities.

Meeting inspection commitments and inspection coverage was an Area
for State Attention in Round 2 of the SRF and, due to continued
commitment shortfalls, is an Area for State Improvement in Round 3.

Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
4a1 Pretreatment compliance inspections and audits | 100% of CMS | | PAI: 18 / PCI: 42 | PAI: 18 / PCI: 41 | PAI: 100% / PCI: 102%
4a2 SIU inspections for SIUs discharging to non-authorized POTWs | 100% of CMS | | 2 | 2 | 100%
4a4 Major CSO inspections | 100% of CMS | | 0 | 3 | 0%
4a5 SSO inspections | 100% of CMS | | 3 | 0 | -
4a7 Phase I & II MS4 audits or inspections | 100% of CMS | | 16 | 16 | 100%
4a8 Industrial stormwater inspections | 100% of CMS | | 414 | 228 | 180%
4a9 Phase I & II SW construction inspections | 100% of CMS | | 1,178 | 977 | 120%
4a10 Medium and large NPDES CAFO inspections | 100% of CMS | | 6 | 3 | 200%
5a1 Inspection coverage of NPDES majors | 100% of CMS | | 87 | 83 | 105%
5b1 Inspection coverage of NPDES non-majors with individual permits | 100% of CMS | | 160 | 135 | 118%
5b2 Inspection coverage of NPDES non-majors with general permits | 100% of CMS | | 58 | 77 | 75%

State Response The State disagrees that inspection coverage is an area in need of State

Improvement. The combined sewer overflow (CSO) inspections that were
not completed are a very small subset of the universe, as there are only 3
facilities in the entire state. Additionally, this item has already been
corrected, as the two field offices that have CSO facilities have been
instructed to complete at least one CSO inspection every time the facility is
inspected which will ensure that 100% of the CSO inspections are
completed according to the required time frames. Likewise, staff have been
notified and trained in the requirement to conduct SSO inspections. It is
very likely that sanitary sewer overflow inspections were conducted as part
of normal compliance evaluation inspections but not always identified as a
separate inspection. As with the CSO inspections, staff have been trained
in the need to perform SSO inspections at a minimum frequency of 10%
per year, and in meeting the non-major NPDES with general permits.
Neither CSO nor SSO inspections should be below the required frequency
going forward. The State believes that through staff training, this area of
deficiency has already been addressed. The remainder of the inspection
types all met or exceeded the minimum requirements set forth by EPA. The
State was obligated to conduct approximately 1,524 total inspections and
completed a total of approximately 1,924 inspections, or 126% of our
numerical goal. While the State did miss 22 inspections in 2 categories,
overall the State did meet its numerical inspection commitment. Therefore,
the State requests that this item be changed to area for attention versus area
for improvement.

Recommendation By March 31, 2017, TDEC should implement procedures to ensure that
CWA §106 Workplan annual inspection commitments and CMS-established
inspection frequencies are met and maintained. EPA will review the State's
procedures and monitor the State's implementation efforts through existing
oversight calls and other periodic data reviews. If by September 30, 2017,
these reviews indicate that the state is on target to meet its annual
commitments and the CMS inspection frequencies, the recommendation
will be considered completed.



-------
CWA Element 2 — Inspections

Finding 2-2

Meets or Exceeds Expectations

Summary

TDEC's inspection reports were well written, complete and provided
sufficient documentation to determine compliance.

Explanation	Metric 6a requires that inspection reports be complete and sufficient to
determine compliance at a facility. All (100%) of TDEC's inspection reports
and accompanying cover letters were well written, complete, and sufficient,
and included field observations noting compliance issues, where appropriate.

Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
6a Inspection reports complete and sufficient to determine compliance at the facility | 100% | - | 26 | 26 | 100%

State Response The State concurs with this assessment.

Recommendation



-------
CWA Element 2 — Inspections

Finding 2-3	Area for State Attention

Summary	TDEC inspection reports were not always completed in a timely manner.

Explanation	File Metric 6b indicated that 7 of the 27 (26%) of TDEC's inspection

reports were not completed in a timely manner. Because TDEC's EMS
does not prescribe timeframes for inspection report completion; EPA relied
on its NPDES EMS which allows for 30 days and 45 days to complete
non-sampling inspection reports and sampling inspection reports,
respectively. The average number of days to complete an inspection report
was 25 days, with a range of 4-61 days. The seven untimely inspection
reports were completed within 41-61 days, which is an improvement from
Round 2 of the SRF when the untimely inspection reports took an average
of 84 days for completion. TDEC should reassess their practices and
procedures to ensure the timely completion of inspection reports. TDEC
also has the ability to establish their own timeframes for inspection report
completion.

Because nearly three-fourths of the reports reviewed were completed in a
timely manner pursuant to the EPA's EMS and the decrease in the number
of days needed for completion from Round 2, this does not appear to
reflect a systemic problem.

Timeliness of inspection reports is a continuing issue from Round 2 of the
SRF and remains an Area for State Attention in Round 3.
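
For reference, the timeliness test applied in this finding reduces to comparing each report's completion time against the EMS threshold for its inspection type. The following minimal sketch illustrates that check; the records, facility names, and field names are hypothetical assumptions for illustration, not data from the file review.

    # Minimal sketch: classify inspection reports as timely or untimely under
    # the NPDES EMS thresholds (30 days for non-sampling inspections, 45 days
    # for sampling inspections). The records below are hypothetical.
    THRESHOLDS = {"non-sampling": 30, "sampling": 45}

    reports = [
        {"facility": "Facility A", "type": "non-sampling", "days_to_complete": 25},
        {"facility": "Facility B", "type": "sampling", "days_to_complete": 44},
        {"facility": "Facility C", "type": "non-sampling", "days_to_complete": 61},
    ]

    timely = [r for r in reports if r["days_to_complete"] <= THRESHOLDS[r["type"]]]
    pct = 100.0 * len(timely) / len(reports)
    print(f"{len(timely)} of {len(reports)} reports timely ({pct:.0f}%)")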

Relevant metrics

6b  Inspection reports completed within prescribed timeframe
    Natl Goal: 100%   Natl Avg: --   State N: 20   State D: 27   State % or #: 74%

State Response Completion of inspection reports within the required time frame is an item
that is currently included in job plans for inspectors and is also included in
our guidance documents for staff. Some of the inspection reports that
exceeded the time frame were particularly complex audits that required
additional time to complete. The State's 106 plan language also includes a
provision for the reports to be completed within 45 days. According to the
State's interpretation of that language, the State would have 85% timely
completion of inspections instead of 74%. Additional training will be given
to staff who conduct the inspections, to stress the importance of completing
the inspection reports on time. The State will also reassess the amount of
time prescribed in our guidance documents to ensure that the time-frames
are all in agreement and meet EPA's requirements.

Recommendation



-------
CWA Element 3 — Violations

Finding 3-1	Area for State Improvement

Summary	Inspection Report findings and cover letters were unclear about the

compliance determinations made during the inspection.

Explanation	Metric 7e indicated that 35% of the inspection report findings and cover

letters were ambiguous about the compliance determination made at each
facility. While the inspection reports reviewed detailed deficiencies, they
did not explicitly state that the findings were violations. For example, one
file noted that deficiencies of permit conditions were observed during the
inspection. Instead of clearly indicating the deficiencies were violations,
the report mentioned the facility should ensure compliance with their
NPDES permit. In instances where SSOs or DMR exceedances were
documented, a follow-up NOV was not issued in association with those
inspection findings.

Due to these observations, this is an Area for State Improvement.

Relevant metrics

7e  Inspection reports reviewed that led to an accurate compliance determination
    Natl Goal: 100%   Natl Avg: --   State N: 17   State D: 26   State % or #: 65.4%

State Response In many of the files reviewed, the inspection report is also the NOV. While
not all NOVs contain a heading stating that the inspection is a NOV, the
report does contain language identifying that a violation has occurred and
needs to be corrected. Going forward, the State will issue guidance to field
staff clarifying when to identify letters as NOVs. Previous discussions with
EPA in 2011 indicated that it would not be a problem if the title NOV was
not used in all circumstances, as long as the violation was identified in the
body of the letter. The State will send updated guidance documents to EPA
as they are developed. The State anticipates that this will be completed by
the end of 2016.

Recommendation	By March 31, 2017, TDEC should implement procedures to clarify the
compliance status of a facility following inspections. Notices of Violation
should be issued for facilities found to be non-compliant during the
inspection. EPA will review the State's procedures and monitor the State's
implementation efforts through existing oversight calls and other periodic
data reviews. If by September 30, 2017, these reviews indicate that
compliance determinations are clearly made and NOVs are issued when
appropriate, the recommendation will be considered completed.



-------
CWA Element 3 — Violations

Finding 3-2	Area for State Improvement

Summary	The State does not identify and properly report Single Event Violations
(SEVs) and Significant Noncompliance (SNC) at major facilities.

SEVs are one-time or long-term violations, including unauthorized
bypasses or discharges, discovered by the permitting authority, typically
during inspections rather than through automated reviews of Discharge
Monitoring Reports.

Explanation	Metric 7a1 indicated that TDEC entered only one SEV at a major facility
in FY14. One inspection report documented violations, but the file does not
indicate that the State entered them as SEVs in ICIS.

Metrics 8b1 and 8c indicated that the State did not properly code SEVs into
ICIS as required by the ICIS SEV Entry Guidance and did not identify any
SEVs as SNC in any of the files reviewed. In each of the nine files
reviewed, numerous bypasses and SSOs were documented but were not
identified in ICIS as SEVs or SNC, where appropriate. When SSOs were
reported on a DMR, TDEC entered them as a DMR parameter in ICIS.

In Spring 2016, EPA Region 4 gave ICIS training to TDEC and discussed
solutions for reporting bypasses and SSOs in accordance with the ICIS
SEV Entry Guidance.

Relevant metrics

7a1  Number of major facilities with single event violations
     Natl Goal: N/A   Natl Avg: --   State N: --   State D: --   State % or #: 1

8b1  Single-event violations accurately identified as SNC or non-SNC
     Natl Goal: 100%   Natl Avg: --   State N: 0   State D: 9   State % or #: 0%

8c   Percentage of SEVs identified as SNC reported timely at major facilities
     Natl Goal: 100%   Natl Avg: --   State N: 0   State D: 7   State % or #: 0%

State Response The State disagrees in part with this finding. All overflows and bypasses
are currently tracked on the DMR form and self-reported by the facility.
Additional entry of these overflow and bypass events would constitute an
unreasonable burden on the state and duplicate data already in the system.
Current language in the 106 commitment (Item 15) states: "Enter and
maintain data in ICIS-NPDES for all Single Event Violations, except those
automatically identified by the system (e.g., if DMR data entered, effluent
violations need not be identified as SEV)". Based upon TDEC's
interpretation of that language, the State was meeting this requirement for
both majors and minors in regards to capturing overflows and bypasses in
the ICIS system during this review period.

EPA provided the State training during April of 2016 on entry of SEVs.
Field office staff also attended this training as they will be doing the SEV
entry related to inspections going forward. Following the EPA training in
April, SEVs were again discussed at the quarterly May enforcement
roundtable with field office staff to ensure that everyone understands how
to enter these violations and when they should be entered. As with the
inspection report entry, a monthly report on SEVs will be sent to field
office managers until data flows are established to ensure that SEVs are
properly entered into both the state system and ICIS. Once the data flows
are in place to support e-reporting this will help field staff in reducing their
data entry burden by eliminating the need for double data entry.

Additionally, the division has held several meetings to update our overflow
and bypass language so that more information can be captured in ICIS as
well as update the coding as permits are reissued, so that self-reported
overflows and bypasses that reach waters of the state are automatically
identified as SNC in ICIS and in ECHO. The State believes that these
actions will correct any reporting deficiency that may exist.

Recommendation By March 31, 2017, TDEC should develop and implement procedures to
ensure that SEVs are identified and coded accurately into ICIS. EPA will
review the State's procedures and monitor the State's implementation
efforts through existing oversight calls and other periodic data reviews. If
by September 30, 2017, these reviews indicate that SEVs are being
identified and coded accurately, the recommendation will be considered
completed.



-------
CWA Element 4 — Enforcement

Finding 4-1	Area for State Improvement

Summary	The State's Enforcement Responses (ERs) were not always timely or
appropriate. Additionally, the State's ERs did not always achieve a Return
to Compliance (RTC).

Explanation	The State's ERs were not always timely or appropriate, and they did not
always achieve a RTC.

Metric 9a indicated that in 10 of 23 files reviewed (44%) the chosen ERs
did not return or were not expected to return a facility to compliance. In
several instances, an NOV was issued without a deadline for the facility to
respond with a corrective action plan and further noncompliance continued,
as documented by ICIS. In other files, despite the issuance of an NOV with
a deadline for a facility to respond with a corrective action plan, continued
noncompliance occurred and an apparent RTC was not achieved.

Metric 10a1 indicated that none of the State's 12 major facilities in SNC
had timely ERs.

Metric 10b documented that in 50% of the files reviewed, TDEC did not
consistently address violations in an appropriate manner. In those 12 files,
the ERs were not appropriate because numerous informal enforcement
actions were taken and noncompliance appeared to continue without ER
escalation to achieve compliance, or the State did not provide written
justification for why a formal action was not taken for facilities in SNC.

For example, one facility was issued five NOVs in seven months of SNC
violations without any ER escalation or documentation for why formal
enforcement action was not taken. Another file reviewed had ten months of
Category 1 effluent exceedances without a documented ER or justification
for why an enforcement action was not taken. Additionally, multiple files
documented informal actions with no apparent RTC or documentation for
why formal action was not taken. While issuance of an NOV may be the
appropriate initial response to promote compliance, ER escalation is
warranted when repeated violations occur to ensure a RTC.

Timely and appropriate enforcement responses are a continuing issue from
Rounds 1 and 2 of the SRF and remain an Area for State Improvement
in Round 3.

Relevant metrics

9a    Percentage of enforcement responses that return or will return source in violation to compliance
      Natl Goal: 100%   Natl Avg: --   State N: 10   State D: 23   State % or #: 44%

10a1  Major facilities with timely action as appropriate
      Natl Goal: >98%   Natl Avg: 9%   State N: 0   State D: 12   State % or #: 0%

10b   Enforcement responses reviewed that address violations in an appropriate manner
      Natl Goal: 100%   Natl Avg: --   State N: 12   State D: 24   State % or #: 50%



State Response The facilities that were used for the review are generally large wastewater
plants or industrial plants. These facilities may continue to have violations
while under an order that continue to place them in significant non-
compliance (SNC). Corrective actions at these facilities often take several
years to complete. Overall the state's SNC compliance rates among major
facilities have improved significantly as the SRF dashboard shows. The
number of major facilities listed as being in SNC was 54 in 2012 and 51 in
2013, while 2014 and 2015 show a dramatic improvement in SNC compliance
rates, with 29 facilities listed in 2014 and 26 listed in 2015. Therefore the
State disagrees with findings 9a and 10b. Further, the State believes that the improvement
shown in the metrics above is attributable to the enforcement actions taken
over a period of many years which have resulted in numerous facilities
upgrading, expanding, and in some cases building new treatment facilities.
Also of note, out of the 26 shown to be in SNC during 2015, 18 of those
have either had orders, or are currently under an order. The remainder are
in development, or are in negotiations to sign a consent order. Many of the
SNC violations occurred during the time that a facility was under an order
and required no additional enforcement, as was the case with the examples
mentioned above. The facility with numerous NOVs was under an order at
the time, and completed the actions in the approved CAP in June of 2016.
In other examples where an escalated enforcement response was
recommended, the State was already working on new orders which had not
been finalized at the time of the SRF inspection. The State will work to
further document in writing our rationale for enforcement responses going
forward. The 44% rating for enforcement does not reflect the improvement
shown in major SNC compliance rates over the past few years. The phrase
"responses that returned, or will return" a facility to compliance could
allow additional consideration for those facilities that are under an order or
have an order in development. In the past EPA has taken this into
consideration. Another consideration that should be taken into account is
the fact that the State does not change permit limits while a facility is under
an order as many other states routinely do. Therefore, the compliance rate
may appear worse when compared to other states that do allow interim
enforcement limits.

The State recognizes that the orders issued have not met the timeliness
definition as set forth by EPA. However, the State would like to point out
that the national average of timely enforcement was
only 4% in 2012, reaching a high of only 14% in 2015 (or 9% according to
the metrics above), as reported in the EPA ECHO database on the State
Review Framework.

The State is working to improve timeliness through ongoing training of
staff and streamlining of procedures to initiate enforcement actions within
the division. Some of the actions taken to date include a modified LEAN
event to identify areas where processes could be shortened and review
times decreased, as well as ongoing training initiatives to facilitate staff
understanding of enforcement procedures.

Recommendation	By March 31, 2017, TDEC should develop and implement procedures to
ensure that ERs are timely and appropriate, achieve a RTC, and are

documented in the file. Should TDEC update their EMS, EPA will review
and provide comments for consideration. EPA will review these procedures
and monitor the State's implementation efforts through existing oversight
calls, review of the Quarterly Non-Compliance Report, and other periodic
data reviews. If by September 30, 2017, these reviews indicate that the
revised procedures appear to result in timely and appropriate enforcement
responses that reflect a RTC, the recommendation will be considered
completed.



-------
CWA Element 5 — Penalties

Finding 5-1	Area for State Improvement

Summary	The State does not routinely include documentation in the file that

demonstrates the consideration of economic benefit (EB).

Explanation	Metric 11a indicated that two of eight (25%) files reviewed documented

the consideration of EB. In six files, TDEC did not mention EB or
document the rationale for why EB related to delayed or avoided costs
was not included in the penalty calculation worksheet. The State's
"Uniform Guidance for the Calculation of Civil Penalties" makes it clear
that to the extent practicable, the EB of noncompliance should be
calculated and recovered. TDEC's Uniform Guidance also states that "to
effectively achieve deterrence, any significant economic benefit resulting
from the failure to comply with the law should be recovered." TDEC also
developed an EB model based on EPA's BEN model.

In support of considering EB in penalty calculations, EPA guidance
(Oversight of State and Local Penalty Assessments: Revisions to the
Policy Framework for State/EPA Enforcement Agreements; 1993) notes
that to remove economic incentives for noncompliance and establish a
firm foundation for deterrence, EPA, the States, and local agencies shall
endeavor, through their civil penalty assessment practices, to recoup at
least the economic benefit the violator gained through noncompliance.

Documentation of economic benefit consideration in penalty calculations
is a continuing issue from Rounds 1 and 2 of the SRF and remains an
Area for State Improvement in Round 3.

Relevant metrics

11a  Penalty calculations reviewed that consider and include gravity and economic benefit
     Natl Goal: 100%   Natl Avg: -   State N: 2   State D: 8   State % or #: 25%

State Response The uniform penalty guidance that the State is using considers both

"gravity" and economic benefit in each case. This is documented in the
worksheet that the State is currently using. Some of the older worksheets
that were reviewed during the past SRF review period did not have this
documentation, as it was still in development at the time. The current
worksheet also includes an area to document economic benefit if it is
reasonable to do so, and a checklist describing areas where an economic
benefit could be assessed as well as a space to document why an
economic benefit was not assessed. This is now mandatory for all
enforcement staff to fill out. Additional training is being provided to staff
to ensure that the economic benefit checklist is filled out appropriately for
each case and enforcement staff received training on use of the TN BEN
model during a staff meeting in June of 2016. The State will be happy to
provide EPA with any orders and penalties that are requested.

Recommendation By March 31, 2017, TDEC should consistently implement procedures
which document the consideration of EB and gravity in their penalty
calculations. EPA will monitor the State's efforts through existing
oversight calls and other periodic file reviews. For verification purposes,
EPA will review finalized TDEC orders and penalty calculations, to
assess progress in implementation of these improvements. If by
September 30, 2017, these reviews indicate that the State is documenting
the consideration of gravity and EB, the recommendation will be
considered completed.



-------
CWA Element 5 — Penalties

Finding 5-2	Area for State Improvement

Summary	The difference and/or the rationale for any differences between initial and

final penalties assessed are not consistently documented by the State.

Explanation	Metric 12a indicated that six of nine (67%) files reviewed documented the

difference between the initial and final penalty and/or the rationale for the
difference.

As described in TN's "Uniform Guidance for the Calculation of Civil
Penalties," TDEC may assess a civil penalty that consists of an "upfront"
cash component and a contingent penalty component subject to injunctive
relief conditions outlined in an Order. The Uniform Guidance also states
that "the upfront civil penalty should remove any known economic
benefit" and should "encourage compliance by having non-compliance
cost more than compliance." Of the three files that did not document the
difference and/or the rationale for differences between initial and final
penalties assessed, the following observations were noted:

•	No documentation of rationale when the initial penalty calculation
worksheet amount does not match the final upfront/contingent
penalties documented in the Agreed Order. (3 files)

•	The final upfront penalty does not recover assessed economic
benefit. (2 files)

•	When an assessed penalty is 100% contingent, the cost of non-
compliance is not greater than compliance. (2 files)

Additionally, TDEC entered the total penalty (upfront and contingent)
assessed into ICIS, which does not accurately reflect the amount of
penalty collected. Subsequent to the file review, TDEC stated they
changed their procedures to only enter the upfront penalty amount into
ICIS.

Relevant metrics

12a  Documentation of the difference between initial and final penalty and rationale
     Natl Goal: 100%   Natl Avg: --   State N: 6   State D: 9   State % or #: 67%







State Response The example case where 100% of the penalty was contingent is unusual
and is no longer occurring. All NPDES cases that are currently issued by
the State have a percentage of the penalty as an up-front requirement,
which covers any economic benefit gained through non-compliance. In
cases where the Respondent is offered a SEP to offset a portion of the up-
front penalty, the Respondent must still pay any amount of economic
benefit in cash and our SEP policy requires the SEP to be at least twice
the cost of the offset penalty amount. Our state database also has a
separate area to track the amount of money required to be spent on a SEP
versus the total amount that was actually spent. This is currently being
documented for each order that contains a SEP.

The majority of orders that the State issues include minimum up-front
penalties of 15-25% of the total calculated penalty, and recover
substantially more than the economic benefit. This guidance will be
reiterated and clarified in ongoing training and in any updated guidance
documents.

The State was originally instructed to enter the entire amount of the
penalty into the PCS/ ICIS system many years ago. Upon learning that
EPA would prefer to have only the up-front amount of the penalty
entered, the State immediately changed entry procedures to comply with
EPA recommendations.

The total penalty amount is a reflection of the amount calculated for all of
the violations covered by the order. The up-front amount is the amount
that the State initially collects once the order is final. If the Respondent
complies with the order then the contingent penalties are not due.
However, if the Respondent fails to comply with the contingent
requirements, then the amounts automatically become due and payable to
the State without the need for additional orders or further legal action.

While this method differs from the way EPA assesses penalties, the State
has been advised by the Office of the Attorney General that we cannot
issue penalties for stipulated penalties except in an Order by Consent. The
current penalty allocation between up-front and contingent penalties
provides deterrence for failing to comply with order requirements while
still imposing a significant monetary penalty upon the Respondent.

The State will be happy to provide EPA with any orders and penalties that
are requested.

Recommendation By March 31, 2017, TDEC should implement procedures to ensure

consistency in the use of the Uniform Guidance and that economic benefit
is recovered. The state should also ensure that only upfront penalty
amounts assessed are entered into ICIS. EPA will review the State's
procedures and monitor the State's implementation efforts through
existing oversight calls and other periodic data reviews. For verification
purposes, EPA will review finalized TDEC orders and penalty
calculations to assess progress in implementation of these improvements.
If by September 30, 2017, these reviews indicate marked improvement in
these areas, the recommendation will be considered completed.



-------
CWA Element 5 — Penalties

Finding 5-3	Area for State Attention

Summary

The State does not consistently document the collection of penalties.

Explanation

Metric 12b indicated that seven of nine (78%) files reviewed documented
the collection of upfront penalties within the state database and that
contingent penalty milestones were met. Several files reviewed
contained Supplemental Environmental Projects (SEPs) that offset the
upfront penalties. Of the two files that did not document penalty
collection, one was referred to collections following the SRF and the
other had a 100% contingent penalty, but it was unclear whether the facility
met the contingency milestones. Because the majority of the files
documented penalty collection, this does not appear to reflect a systemic
problem.

In instances where SEPs are used to offset a portion of the penalty or
where contingent penalties are used, TDEC should implement procedures
to ensure proper documentation that SEPs have been completed and
contingent milestones have been met.





Relevant metrics

12b  Penalties collected
     Natl Goal: 100%   Natl Avg: --   State N: 7   State D: 9   State % or #: 78%

State Response The State currently uses our database to track collection of penalties. Of
the two cases in question, one is currently with collections and the other
one has been paid. The SEP documentation was in our database but not
necessarily in the paper file as the process is maintained in an electronic
format. In an effort to better document completion of items required in
orders, we have started including a final report requirement when SEPs
are used, and a final report documenting all completed requirements and
analyzing the effectiveness of those corrective actions. Going forward, this
will be a standard component of orders that require corrective actions
taking multiple years to complete. See also comments in
item 5-2 above.

Recommendation



-------
Resource Conservation and Recovery Act Findings

RCRA Element 1 — Data

Finding 1-1	Area for State Attention

Summary	During the SRF evaluation, the majority of the files reviewed included

accurate data.

Explanation	During the SRF file review, information in the facility files was checked

for accuracy with the information in the national RCRA database,
RCRAInfo. The data was found to be accurate in 26 of the 32 files
(81.3%). The data inaccuracies found were isolated, such as incorrect dates,
missing Notices of Violation, or incorrect violation citations. Unlike in
previous SRF evaluations, the problems did not appear to be systemic
and can be monitored at the state level. However, since there has been
staff turnover, TDEC should consider retraining the employees on their
data SOP and RCRAInfo data entry requirements. This element is
considered an Area for State Attention.

Relevant metrics

2b  Complete and accurate entry of mandatory data
    Natl Goal: n/a   Natl Avg: n/a   State N: 26   State D: 32   State % or #: 81.3%

State Response The DSWM concurs that the accuracy rate is unacceptable and that steps
need to be taken to improve it. The Division plans on having a
retraining session at the Statewide Staff meeting in October 2016 to
address the need for staff to input information accurately and completely
into RCRAInfo. This will also be addressed at the semi-annual
manager's meeting to be held in August 2016. The Enforcement and
Compliance Section will begin randomly cross-checking information in
the facility files against the information in RCRAInfo as a Quality
Assurance measure. The Division will look at the feasibility of adding a
Quality Control step to the Field Office Manager's job duties. Currently
it is difficult to add a centralized system-wide QC check due to system
configuration.

Recommendation



-------
RCRA Element 2 — Inspections

Finding 2-1	Meets or Exceeds Expectations

Summary	Tennessee met national goals for all TSD and LQG inspections.

Explanation	Element 2 measures three types of required inspection coverage that are

outlined in the EPA RCRA Compliance Monitoring Strategy: (1) 100%
coverage of operating Treatment Storage Disposal (TSD) facilities over
a two-year period, (2) 20% coverage of Large Quantity Generators
(LQGs) every year, and (3) 100% coverage of LQGs every five years.
In FY 2014, Tennessee met expectations for all inspections in these
areas. All 21 operating TSDs were inspected over the two-year time
period. The state also had excellent annual LQG inspection coverage
(30.7%) that is well above the national goal of 20%.

For the five-year LQG inspection coverage, the initial data metric of
87.8% was below the national goal of 100%. Upon reviewing the
facilities that were not inspected during this five-year time frame, it was
noted that 28 of the 41 facilities were not LQGs during the entire five
years and therefore are not part of the inspection universe. The
corrected universe would then be 295 of 308 LQGs that were inspected
in the five-year period, which is 95.7% coverage. This LQG inspection
coverage is close enough to the national goal of 100% coverage, allowing
for fluctuation of LQG status over the five-year period.
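
As an illustration of the coverage correction described above, the following minimal sketch recomputes the five-year metric after removing facilities that were not LQGs for the full period. The facility list and field names are hypothetical, not the actual Tennessee inspection universe.

    # Minimal sketch of the five-year LQG coverage correction: a facility is
    # removed from the inspection universe if it was not an LQG for the
    # entire five-year period. All records below are hypothetical.
    facilities = [
        {"id": "F1", "inspected": True, "lqg_full_period": True},
        {"id": "F2", "inspected": False, "lqg_full_period": False},  # dropped from universe
        {"id": "F3", "inspected": False, "lqg_full_period": True},
        {"id": "F4", "inspected": True, "lqg_full_period": True},
    ]

    def coverage(universe):
        inspected = sum(1 for f in universe if f["inspected"])
        return 100.0 * inspected / len(universe)

    corrected_universe = [f for f in facilities if f["lqg_full_period"]]
    print(f"Initial coverage:   {coverage(facilities):.1f}%")
    print(f"Corrected coverage: {coverage(corrected_universe):.1f}%")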

Relevant metrics

5a  Two-year inspection coverage of operating TSDFs
    Natl Goal: 100%   Natl Avg: 88.4%   State N: 21   State D: 21   State % or #: 100%

5b  Annual inspection coverage of LQGs
    Natl Goal: 20%   Natl Avg: 20.1%   State N: 103   State D: 336   State % or #: 44.2%

5c  Five-year inspection coverage of LQGs
    Natl Goal: 100%   Natl Avg: 67.1%   State N: 295   State D: 336   State % or #: 87.8%

5c Corrected  Five-year inspection coverage of LQGs
    Natl Goal: --   Natl Avg: --   State N: 295   State D: 308   State % or #: 95.7%

State Response EPA's RCRA Compliance Monitoring Strategy requirement for
inspection coverage of Large Quantity Generators (LQGs) is 20%
coverage every year and 100% coverage every five years. Some
Tennessee facilities were not LQGs during the entire 5 year period and
after making corrections for this, EPA notes in the State Review
Framework (SRF) that during this five-year time frame, 295 facilities
were inspected out of a corrected universe of 308 LQG facilities, which
is 95.7% coverage. The 95.7% coverage of LQGs is below the national
goal of 100% coverage for the five year period. EPA has considered
this close enough to the national goal that this metric is considered to
meet expectations. That said, TDEC feels that this number is too low
and has evaluated the program and has taken steps to improve
inspection results. Upon investigation, it was determined that the
shortage of LQG inspections largely occurred in the Nashville Field
Office (NFO) and can be attributed to legacy issues that occurred over
several years. Procedural modifications in the way LQG inspections are
assigned and monitored have been developed and implemented in the
last two years by the manager of the NFO. These operational changes
will improve the metrics with the goal of achieving the national target
of 100% coverage of LQGs every five years.

Recommendation



-------
RCRA Element 2 — Inspections

Finding 2-2

Meets or Exceeds Expectations

Summary

The RCRA inspection reports provided sufficient documentation to
determine compliance at the facility, and were completed in a timely
manner.

Explanation	A total of 32 inspection reports were evaluated for completeness and

sufficiency to determine compliance with the RCRA requirements. It
was found that 100% of the inspection reports met this standard.

The Tennessee Division of Solid Waste Management Hazardous Waste
Program Enforcement Policy sets forth a 45-day deadline for RCRA
inspection report completion. Thirty-one inspection reports (96.9%)
met this deadline, with an average time for report
completion of 29 days.

The completeness, sufficiency, and timeliness of the RCRA inspection
reports meet SRF requirements. The quality of the TDEC RCRA
inspection reports reviewed was excellent, with thorough descriptions
of facility processes, waste management activities, and potential violations,
and supporting photo documentation.

Relevant metrics

6a  Inspection reports complete and sufficient to determine compliance
    Natl Goal: 100%   Natl Avg: n/a   State N: 32   State D: 32   State % or #: 100%

6b  Timeliness of inspection report completion
    Natl Goal: 100%   Natl Avg: n/a   State N: 31   State D: 32   State % or #: 96.9%

State Response

TDEC appreciates EPA's acknowledgement of the Hazardous Waste
Management Program's efforts to focus on timely and quality reporting.

Recommendation



-------
RCRA Element 3 — Violations

Finding 3-1	Meets or Exceeds Expectations

Summary	Tennessee makes accurate compliance determinations and appropriately
identifies most SNC facilities.

Explanation	File Review Metric 7a assesses whether accurate compliance

determinations were made based on a file review of inspection reports
and other compliance monitoring activity (i.e., record reviews). The file
review indicated that 96.9% of the files reviewed had accurate
compliance determinations (31 of 32 files).

The majority of SNCs (90.5%) were identified correctly by the state in
the national database and in accordance with the RCRA ERP. Of the 21
SNC-caliber facility files reviewed, there were two facilities that were
not identified as SNCs by the state, and violations were addressed
through informal rather than formal enforcement actions per the RCRA
ERP.

The accuracy of the state's RCRA compliance determinations and the
appropriateness of the SNC identifications meet SRF requirements.

Relevant metrics

7a  Accurate compliance determinations
    Natl Goal: 100%   Natl Avg: --   State N: 31   State D: 32   State % or #: 96.9%

8c  Appropriate SNC determinations
    Natl Goal: 100%   Natl Avg: --   State N: 19   State D: 21   State % or #: 90.5%

State Response As noted above, two cases were addressed through informal rather than
formal enforcement actions. TDEC believes that this course of action
was appropriate, as described below.

Vitran/Central Transport:

• Vitran Memphis was not a HW transporter, but was shipped 2 trailers
of HW in November 2012 from the Missouri Vitran facility, reportedly
as a mistake, since there was no paperwork to accompany the shipment.
The trailers were stored at the Memphis facility until they could be
profiled and shipped by Vitran to Clean Harbors and Excel in two
shipments (one in July of 2013 and the other in September of 2013).
The CEI was performed on 11-5-13 (after the waste had been disposed
of), but within days after the CEI (11-18-13), Vitran was purchased by
Central Transport. The Division decided that, since the shipment of HW
originated from another state, EPA and the state of Missouri would
be better able to address the problem at the source. Alan Newman
(EPA) was made aware at the time, and a copy of our warning letter was
sent to Missouri. In addition, the new owner (Central Transport) gave
the Division a copy of procedures that should prevent another similar
situation from ever occurring. Since Vitran-Memphis/Central was
considered to be an innocent party to Vitran-Missouri's actions, DSWM
believes that enforcement action is most appropriately taken against
Vitran-Missouri.

Excel:

• Excel is a Treatment Storage and Disposal (TSD) facility that is
subject to annual inspections. A CEI was performed on March 26, 2014
which identified violations associated with secondary containment
integrity. Then on July 17, 2014 Excel notified the State of the shipment
of Hazardous Waste (HW) to a Class 1 (Subtitle D) landfill. Excel
immediately cleaned up the HW at the landfill. An NOV was issued on
September 4, 2014 for the shipment of HW to a Class 1 landfill. A show
cause meeting was held on October 8, 2014 to discuss the violations
associated with the March 26, 2014 CEI as well as the September 4,
2014 NOV. The facility submitted information (PE certification) after
the October 8, 2014 show cause meeting regarding the structural
integrity of the secondary containment and sump structure to support
their contention that they were sound and not leaking. The decision was
made by the DSWM to not pursue formal enforcement action for the
2014 inspection and September 4, 2014 NOV (for secondary
containment structural integrity and for mishandling HW). Instead a
warning letter was issued to the facility. Subsequent TDEC/EPA joint
inspections identified additional violations; EPA is taking the
enforcement lead for these issues.

Recommendation



-------
RCRA Element 3 — Violations

Finding 3-2	Area for State Improvement

Summary	The timeliness of SNC determinations continues to be an issue for the

Tennessee RCRA program.

Explanation	The RCRA ERP outlines that 100% of SNC determinations should be

entered into RCRAInfo within 150 days of the first day of the inspection
(day zero). The data metric that measures this requirement indicated that
only 52.3% (11 of the 21) SNCs identified met this criterion in FY2014.
The initial metric was 47.8% (11 of 23 SNCs timely), but two of the
cases were EPA joint inspections so those SNCs were removed from the
metric.

As outlined in the Tennessee Division of Solid Waste Management
Hazardous Waste Program Enforcement Policy, the TDEC Field Office
submits an "Enforcement Action Request" (EAR) to the Division's
Enforcement Section if the potential for a formal enforcement action is
identified. Enforcement personnel then evaluate the merits of the case,
typically through a show-cause meeting with the facility representatives.
If there is a decision to pursue formal enforcement then the facility is
identified as a SNC in RCRAInfo.

In evaluating the case timelines, EPA observed that in nine of the ten
untimely SNCs (90%), the EAR was submitted to the Enforcement
Section more than 100 days after the inspection. The Division's policy
states that the field offices should submit an EAR within 75 days of first
documenting the violation (inspection reports average 29 days for
completion). The delay in EAR submittals was also identified as a factor
in late SNC determinations during the SRF Round 2 evaluation. This
issue will continue as an "Area for State Improvement" in SRF Round 3.
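
The timeline checks described above lend themselves to a simple date comparison. The sketch below flags an SNC entry as untimely if it occurs more than 150 days after day zero and an EAR as late if submitted more than 75 days after day zero; the case, dates, and field names are hypothetical, not records from the file review.

    from datetime import date

    # Minimal sketch of the timeline checks described above. Day zero is the
    # first day of the inspection; the SNC determination should be entered
    # into RCRAInfo within 150 days, and the EAR should be submitted to the
    # Enforcement Section within 75 days. The case below is hypothetical.
    SNC_ENTRY_LIMIT_DAYS = 150
    EAR_SUBMITTAL_LIMIT_DAYS = 75

    case = {
        "day_zero": date(2014, 1, 6),        # first day of inspection
        "ear_submitted": date(2014, 4, 30),  # EAR sent to Enforcement Section
        "snc_entered": date(2014, 7, 15),    # SNC determination entered in RCRAInfo
    }

    ear_days = (case["ear_submitted"] - case["day_zero"]).days
    snc_days = (case["snc_entered"] - case["day_zero"]).days
    print(f"EAR submitted after {ear_days} days "
          f"({'late' if ear_days > EAR_SUBMITTAL_LIMIT_DAYS else 'timely'})")
    print(f"SNC entered after {snc_days} days "
          f"({'untimely' if snc_days > SNC_ENTRY_LIMIT_DAYS else 'timely'})")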

Relevant metrics

8b  Timeliness of SNC determinations
    Natl Goal: 100%   Natl Avg: --   State N: 11   State D: 23   State % or #: 47.8%

8b Corrected  Timeliness of SNC determinations
    Natl Goal: --   Natl Avg: --   State N: 11   State D: 21   State % or #: 52.3%

State Response The Division of Solid Waste Management will add a table for reporting
and tracking each Enforcement Action Request (EAR) in the Planning
and Reporting Excel spreadsheet for each Field Office. Facility name,
inspection date (day zero), NOV issuance date, and EAR submittal date
will be required to be entered for all facilities with EARs. This table will
calculate the number of days from the inspection date (day zero) to the
NOV issuance date and the EAR submittal date. If 45 days is exceeded
from inspection date (day zero) to the NOV issuance date it will be
flagged with a NO. If 75 days is exceeded from inspection date (day
zero) to the EAR submittal date it will be flagged with a NO. This
additional reporting and tracking will reemphasize the importance of
meeting these NOV and EAR time limits and will provide management
with an easy mechanism to track compliance with the established time
limits. Non-compliance with the above timeframes will be discussed
with Field Office Managers during formal performance reviews.

Additionally, at the August 2016 meeting for all Division of Solid Waste
Management managers, a member of the Enforcement and Compliance
staff will conduct refresher training on which violations and situations
necessitate the submittal of an EAR.

Recommendation EPA will monitor progress on meeting the timelines for SNC entry into
RCRAInfo via bimonthly conference calls and RCRAInfo data analyses.
Following the finalization of the TDEC Round 3 SRF Report, EPA will
close this recommendation after observing four consecutive quarters of
performance that meets national goals.



-------
RCRA Element 4 — Enforcement

Finding 4-1

Meets or Exceeds Expectations

Summary

TDEC consistently issued timely RCRA enforcement responses that
returned violating facilities to compliance.

Explanation

A total of 29 files were reviewed that included informal or formal
enforcement actions, and all of the enforcement actions returned the
facilities to compliance with the RCRA requirements.

The data metric that measures the timeliness of formal enforcement
showed that 94.7% (18 of 19) of the formal enforcement actions met the
ERP in FY 2014. The national goal is 80%. This is a significant
improvement from previous SRF reviews.

Facility noncompliance was documented in the 29 files reviewed.
In evaluating the enforcement responses taken, 89.7% (26 of 29) of cases
were addressed with the appropriate enforcement response. In two of the
remaining three cases, the facilities were not identified as SNCs and the
state addressed the violations through an informal action rather than an
appropriate formal enforcement action (referenced in Finding 3-1). In
the third case the state did identify the facility as a SNC, but the consent
agreement that was negotiated with the facility is not considered formal
enforcement since the action did not mandate compliance and is not
enforceable.

The state met the SRF expectations for the criteria for timely and
appropriate enforcement actions that return violators to compliance.

Relevant metrics

9a   Enforcement that returns violators to compliance
     Natl Goal: 100%   Natl Avg: n/a   State N: 29   State D: 29   State % or #: 100%

10a  Timely enforcement taken to address SNC
     Natl Goal: 80%   Natl Avg: 84.3%   State N: 18   State D: 19   State % or #: 94.7%

10b  Appropriate enforcement taken to address violations
     Natl Goal: 100%   Natl Avg: n/a   State N: 26   State D: 29   State % or #: 89.7%

State Response As noted above, two cases were not identified as SNCs and handled

through informal rather than formal enforcement action. The rationale
for these two decisions is discussed in DSWM's response in Section 3-1.
For the third case, TDEC elected to enter into a Consent Agreement
rather than a Consent Order (while retaining the right to enter into a
separate Commissioner or Director's Order, if necessary) to maintain a
cooperative working relationship with the waste generator. The
appropriateness of this approach was validated by the completion of the
waste disposal effort significantly ahead of schedule.

Recommendation



-------
RCRA Element 5 — Penalties

Finding 5-1	Meets or Exceeds Expectations

Summary	Tennessee has made significant progress on incorporating the
economic benefit of noncompliance into penalty assessments. There was
documentation in the files that all final penalties were collected, or that
penalty collection was being pursued.

Explanation	One of the objectives of the SRF is to ensure equitable treatment of
violators through national policy and guidance, including systematic
methods of penalty calculation. As provided in the 1993 EPA
"Oversight of State and Local Penalty Assessments: Revisions to the
Policy Framework for State/EPA Enforcement Agreements," it is EPA
policy not to settle for less than the amount of the economic benefit of
noncompliance (EBN) and a gravity portion of the penalty.

Following the SRF Round 1 and 2 evaluations, Tennessee made progress
on the documentation of EBN considerations. The state incorporated an
"Economic Benefit Review Checklist" in the penalty documentation, and
if EBN was not pursued a supporting justification was included in the
penalty worksheet narrative. There were 17 penalty calculations
reviewed and 16 of the cases (94.1%) had the appropriate EBN
consideration and documentation included in the file. There was one
case where there was a clear economic benefit realized by shipping
hazardous wastes to a solid waste landfill, but there was no EBN
included in the penalty calculation. EPA encourages TDEC to continue
to emphasize the importance of EBN consideration in future penalty
assessments. EPA is available to assist in this effort.

TDEC does not typically negotiate penalties in RCRA enforcement
administrative cases, so there was no requirement for documenting
adjustments to penalty calculations in the 17 enforcement cases
reviewed.

In 94.1% of the penalty files reviewed (16 of 17), there was
documentation in the file indicating that final penalties had been
collected. There was one case where the respondent had not paid the
penalty, and the state is pursuing collection.

Relevant metrics

11a  Penalty calculations include gravity and economic benefit
     Natl Goal: 100%   Natl Avg: N/A   State N: 16   State D: 17   State % or #: 94.1%

12a  Documentation on difference between initial and final penalty
     Natl Goal: 100%   Natl Avg: N/A   State N: --   State D: --   State % or #: --

12b  Penalties collected
     Natl Goal: 100%   Natl Avg: N/A   State N: 16   State D: 16   State % or #: 100%

State Response Jackson-Madison County Hospital disposed of empty, used vials and
packaging and did not dispose of any actual pharmaceuticals. As the
estimated amount of residual waste was miniscule, the economic benefit
derived from the hospital's activities was determined to be negligible (as
documented in the penalty calculation worksheet).

Recommendation



-------
State Review Framework

Shelby County, Tennessee

Clean Air Act
Implementation in Federal Fiscal Year 2015

U.S. Environmental Protection Agency
Region 4, Atlanta

Final Report
September 28, 2017


-------
(Page left intentionally blank)


-------
Executive Summary

Introduction

EPA Region 4 enforcement staff conducted a State Review Framework (SRF) enforcement
program oversight review of the Shelby County Health Department (SCHD).

EPA bases SRF findings on data and file review metrics, and conversations with program
management and staff. EPA will track recommended actions from the review in the SRF Tracker
and publish reports and recommendations on EPA's ECHO web site.

Areas of Strong Performance

•	SCHD met the negotiated frequency for inspection of sources for most major and SM-80
sources during the review year.

•	Compliance monitoring reports and full compliance evaluations included all elements
required by EPA's Compliance Monitoring Strategy (CMS) Guidance.

•	SCHD documented any differences in initial and final penalty and maintained
documentation of penalty payments made.

Priority Issues to Address

The following are the top-priority issues affecting the local program's performance:

•	SCHD needs to improve the timeliness and accuracy of data reported into the National
Data System (ICIS-Air). Data discrepancies were identified in 11 of the 20 files reviewed,
and none of the data reported in FY15 was timely.

•	SCHD needs to ensure that all Title V Annual Compliance Certifications (ACCs) are
completed and recorded in ICIS-Air.

•	SCHD needs to strengthen the enforceability of their formal enforcement actions to
ensure that sources are returned to compliance within a specified timeframe.

•	SCHD needs to document the consideration of economic benefit in their penalty
calculations.

Most Significant CAA Stationary Source Program Issues

•	The accuracy and timeliness of enforcement and compliance data entered by SCHD in
ICIS-Air needs improvement.

•	SCHD's use of a notice of violation (NOV) to assess penalties does not appear to be
enforceable in court, and may not return sources to compliance.

•	SCHD's penalty assessments did not include the consideration of an economic benefit
component.


-------
Table of Contents

I.	Background on the State Review Framework	4

II.	SRF Review Process	5

III.	SRF Findings	6

Clean Air Act Findings	7


-------
I. Background on the State Review Framework

The State Review Framework (SRF) is designed to ensure that EPA conducts nationally
consistent oversight. It reviews the following local, state, and EPA compliance and enforcement
programs:

•	Clean Water Act National Pollutant Discharge Elimination System

•	Clean Air Act Stationary Sources (Title V)

•	Resource Conservation and Recovery Act Subtitle C

Reviews cover:

•	Data — completeness, accuracy, and timeliness of data entry into national data systems

•	Inspections — meeting inspection and coverage commitments, inspection report quality,
and report timeliness

•	Violations — identification of violations, determination of significant noncompliance
(SNC) for the CWA and RCRA programs and high priority violators (HPV) for the CAA
program, and accuracy of compliance determinations

•	Enforcement — timeliness and appropriateness, returning facilities to compliance

•	Penalties — calculation including gravity and economic benefit components, assessment,
and collection

EPA conducts SRF reviews in three phases:

•	Analyzing information from the national data systems in the form of data metrics

•	Reviewing facility files and compiling file metrics

•	Development of findings and recommendations

EPA builds consultation into the SRF to ensure that EPA and the state or local program
understand the causes of issues and agree, to the degree possible, on actions needed to address
them. SRF reports capture the agreements developed during the review process in order to
facilitate program improvements. EPA also uses the information in the reports to develop a better
understanding of enforcement and compliance nationwide, and to identify issues that require a
national response. Reports provide factual information. They do not include determinations of
overall program adequacy, nor are they used to compare or rank state and local programs.

Each state's programs are reviewed once every five years. Local programs are reviewed less
frequently, at the discretion of the EPA Regional office. The first round of SRF reviews began in
FY 2004, and the second round began in FY 2009. The third round of reviews began in FY 2013
and will continue through 2017.



-------
II. SRF Review Process

Review period: 2015

Key dates: August 16, 2016, letter sent to Local program kicking off the Round 3 review
October 24 - 26, 2016, on-site file review for CAA

Local Program and EPA key contacts for review:

                    Shelby County     EPA Region 4
SRF Coordinator     Robert Rogers     Kelly Sisario, OEC
CAA                 Bill Smith        Ahmad Dromgoole, OEC
                                      Mark Fite, OEC
                                      Chetan Gala, APTMD



-------
III. SRF Findings

Findings represent EPA's conclusions regarding state or local program performance and are
based on observations made during the data and/or file reviews and may also be informed by:

•	Annual data metric reviews conducted since the program's last SRF review

•	Follow-up conversations with agency personnel

•	Review of previous SRF reports, Memoranda of Agreement, or other data sources

•	Additional information collected to determine an issue's severity and root causes

There are three categories of findings:

Meets or Exceeds Expectations: The SRF was established to define a base level or floor for
enforcement program performance. This rating describes a situation where the base level is met
and no performance deficiency is identified, or a state or local performs above national program
expectations.

Area for State1 Attention: An activity, process, or policy that one or more SRF metrics show as
a minor problem. Where appropriate, the state or local should correct the issue without additional
EPA oversight. EPA may make recommendations to improve performance, but it will not
monitor these recommendations for completion between SRF reviews. These areas are not
highlighted as significant in an executive summary.

Area for State Improvement: An activity, process, or policy that one or more SRF metrics
show as a significant problem that the agency is required to address. Recommendations should
address root causes. These recommendations must have well-defined timelines and milestones
for completion, and EPA will monitor them for completion between SRF reviews in the SRF
Tracker.

Whenever a metric indicates a major performance issue, EPA will write up a finding of Area for
State Improvement, regardless of other metric values pertaining to a particular element.

The relevant SRF metrics are listed within each finding. The following information is provided
for each metric:

•	Metric ID Number and Description: The metric's SRF identification number and a
description of what the metric measures.

•	Natl Goal: The national goal, if applicable, of the metric, or the CMS commitment that
the state or local has made.

•	Natl Avg: The national average across all states, territories, and the District of Columbia.

•	State N: For metrics expressed as percentages, the numerator.

•	State D: The denominator.

•	State % or #: The percentage, or if the metric is expressed as a whole number, the count.

1 Note that EPA uses a national template for producing consistent reports throughout the country. References to
"State" performance or responses throughout the template should be interpreted to apply to the Local Program.



-------
Clean Air Act Findings

CAA Element 1 — Data

Finding 1-1	Area for State Improvement

Summary	The timeliness and accuracy of minimum data requirement (MDR) data

reported by SCHD into ICIS-Air needs improvement. None of the data
was entered timely, and discrepancies between the files and ICIS-Air
were identified in 45% of the files reviewed.

Explanation	File Review Metric 2b indicated that only 45% (9 of 20) of the files

reviewed reflected accurate entry of all MDRs into ICIS-Air. The
remaining 11 files had one or more discrepancies between information in
the files and data entered into ICIS-Air. For example, six sources had
activities missing from or inaccurate in ICIS-Air, such as full
compliance evaluations (FCEs), annual compliance certifications, stack
tests, or enforcement actions. In addition, five sources had missing or
inaccurate air programs or subparts for Maximum Achievable Control
Technology (MACT) or other regulations in ICIS-Air. Another eight
files had miscellaneous inaccuracies related to facility data.

Data Metrics 3a2, 3b1, and 3b3 indicated that none of the MDRs for
compliance and enforcement activities were reported into ICIS-Air
within 60 days. Data Metric 3b2 indicated that none of the 35 stack tests
were entered into ICIS-Air within 120 days.

At the beginning of FY2015, EPA transitioned the national database for
CAA compliance and enforcement data from the AFS legacy system to
ICIS-Air. During the initial transition period in October 2014, historical
data was migrated from AFS to ICIS-Air, and no new data could be
entered either directly or through electronic data transfer (EDT).
Following the migration, "direct reporting agencies" like SCHD could
begin accessing the new data system through the web beginning in late
November 2014. An analysis of the county's timeliness data indicates
that all of the data was entered into the new system in January 2016.
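
For reference, the timeliness tests behind Metrics 3a2 through 3b3 reduce to a simple date comparison: an MDR is untimely when its ICIS-Air entry date falls more than 60 days after the activity date, or more than 120 days for stack tests. The sketch below illustrates the check; the records and field names are hypothetical assumptions, not data from this review.

    from datetime import date

    # Minimal sketch of the MDR timeliness checks described above.
    # Compliance monitoring, HPV, and enforcement MDRs are due in ICIS-Air
    # within 60 days of the activity; stack tests are allowed 120 days.
    # The records below are hypothetical.
    WINDOWS = {"stack test": 120}  # days allowed, by activity type
    DEFAULT_WINDOW = 60

    mdrs = [
        {"activity": "FCE", "activity_date": date(2014, 11, 3), "entered": date(2016, 1, 12)},
        {"activity": "stack test", "activity_date": date(2015, 2, 10), "entered": date(2015, 5, 1)},
    ]

    for m in mdrs:
        allowed = WINDOWS.get(m["activity"], DEFAULT_WINDOW)
        lag = (m["entered"] - m["activity_date"]).days
        status = "timely" if lag <= allowed else "untimely"
        print(f"{m['activity']}: entered after {lag} days (limit {allowed}) -> {status}")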

Relevant metrics

2b   Accurate MDR data in ICIS-Air
     Natl Goal: 100%   Natl Avg: --   State N: 9   State D: 20   State % or #: 45%

3a2  Timely reporting of HPV determinations
     Natl Goal: 100%   Natl Avg: 99.6%   State N: 0   State D: 0   State % or #: NA

3b1  Timely reporting of compliance monitoring MDRs
     Natl Goal: 100%   Natl Avg: 64.4%   State N: 0   State D: --   State % or #: 0%

3b2  Timely reporting of stack test MDRs
     Natl Goal: 100%   Natl Avg: 65.2%   State N: 0   State D: 35   State % or #: 0%

3b3  Timely reporting of enforcement MDRs
     Natl Goal: 100%   Natl Avg: 56.6%   State N: 0   State D: --   State % or #: 0%

State response The SRF review occurred a short time after EPA had transitioned the

National Data System from AIRS-AFS to ICIS-Air. Staff handling data
input had received no training on the new system and had not been
granted access. The access problem was resolved during the review and
staff was given preliminary training on the new system to begin
inputting data. In addition to the access problem, it appears some data
did not properly transfer from the legacy system to ICIS-Air. SCHD has
updated and corrected the information needed for ICIS-Air and has
implemented a standard operating procedure (SOP) that allows for the
tracking, input and confirmation of data into ICIS-Air.

Recommendation By December 31, 2017, SCHD should make corrections to existing data
to address discrepancies identified by EPA and take steps to ensure that
all MDRs are entered accurately and timely into ICIS-Air. If by
December 31, 2018, EPA's annual data metric analysis and other
periodic reviews confirm that SCHD's efforts appear to be adequate to
meet the national goal, the recommendation will be considered complete.



-------
CAA Element 2 — Inspections

Finding 2-1	Meets or Exceeds Expectations

Summary	FCEs and CMRs addressed all required elements.

Explanation	Metric 6a indicates that 18 of 20 FCEs reviewed (90%) included the seven elements required by the Clean Air Act Stationary Source Compliance Monitoring Strategy (CMS Guidance).

Metric 6b indicates that 18 of 20 (90%) CMRs included all seven elements required by the CMS Guidance.

Relevant metrics

Metric ID Number and Description                                        Natl Goal   Natl Avg   State N   State D   State % or #
6a  Documentation of FCE elements                                        100%                   18        20        90%
6b  Compliance monitoring reports reviewed that provide
    sufficient documentation to determine facility compliance           100%                   18        20        90%

State response

Recommendation



-------
CAA Element 2 — Inspections

Finding 2-2	Area for State Improvement

Summary	SCHD should ensure that all Title V Annual Compliance Certification (ACC) reviews are completed and entered into ICIS-Air.

Explanation	Metric 5e initially indicated that none of the 29 Title V ACCs (0%) were reviewed by the local program and recorded in ICIS-Air. However, EPA reviewers found that SCHD had actually conducted ACC reviews for the 9 Title V sources evaluated during the file review. After the file review, EPA evaluated data in ICIS-Air for all 29 sources with an ACC due in the review year (this information was entered after the data was frozen). The analysis confirmed that 4 sources were not required to submit an ACC. Of the remaining 25 sources, 17 had an ACC review recorded in ICIS-Air, while 8 did not. This results in a revised metric for 5e of 68% (17 of 25).(1) While this reflects some improvement in the conduct and recording of ACC reviews, it still represents an area for improvement.
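
As a point of reference, the revised 68% figure is a straightforward recalculation after removing sources with no ACC obligation from the universe. The sketch below is illustrative only; the variable names are assumptions, and the counts are taken from this finding.

# Illustrative recalculation of Metric 5e (not an official EPA formula):
# sources with no ACC obligation are removed from the universe before
# computing the share of sources with a recorded ACC review.
total_sources = 29          # Title V sources with an ACC due in the review year
not_required = 4            # confirmed not required to submit an ACC
reviews_recorded = 17       # ACC reviews recorded in ICIS-Air

adjusted_universe = total_sources - not_required           # 25
metric_5e = reviews_recorded / adjusted_universe            # 0.68
print(f"Revised Metric 5e: {metric_5e:.0%} ({reviews_recorded} of {adjusted_universe})")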





Relevant metrics

Metric ID Number and Description                          Natl Goal   Natl Avg   State N   State D   State % or #
5e  Review of Title V annual compliance certifications     100%                   17        25        68%(1)

State response Previously, when an ACC was received, inspectors would review it and place it in the file, then acknowledge receipt and review in the annual compliance inspection report. This led to occasions where an ACC was not picked up for entry into ICIS-Air. ACCs have been added to the SOP and document tracking system. The tracking document identifies the Title V ACC, including date received, date reviewed, compliance status, and any deviations, exceedances, or excursions that occurred during the reporting period. Additionally, as part of quality control, a spreadsheet will be developed that lists all of these documents and will be presented to management for verification prior to uploading into ICIS-Air.

Recommendation By December 31, 2017, SCHD should take steps to ensure that all ACC
reviews for Title V sources are conducted and recorded in ICIS-Air. If
by December 31, 2018, EPA's annual data metric analysis and other
periodic reviews confirm that SCHD's efforts appear to be adequate to
meet the national goal, the recommendation will be considered complete.



-------
CAA Element 2 — Inspections

Finding 2-3	Meets or Exceeds Expectations

Summary	SCHD met the negotiated inspection frequency for most major and SM-80 sources during the review year.

Explanation	Metric 5a indicated that 22 of 29 major sources (75.9%) were inspected at least once every 2 years. Of the 7 sources not inspected, 2 were permanently closed, bringing the local percentage to 81.5% (22 of 27).(2)

Metric 5b indicated that 60 of 71 (84.5%) SM-80 sources were inspected at least once every 5 years, in accordance with EPA's CMS Guidance. However, a closer review of the 11 sources that were not inspected indicated that 9 of them were permanently closed, and another is under construction. Adjusting for these sources brings SCHD's metric to 98.4% (60 of 61).(3)

Metric 5c indicated that SCHD did not inspect any non-SM-80 synthetic minors, since SCHD follows a traditional CMS plan.

A review of FY16 frozen data shows that coverage rates under metrics
5a and 5b have improved to 96.3% and 98.8%, respectively, indicating
that the local program continues to provide adequate inspection
coverage.
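
The adjusted coverage figures in this finding follow the same pattern: permanently closed or not-yet-operating sources are removed from the denominator before the rate is recomputed. A minimal sketch of that arithmetic is shown below; the function is hypothetical, and the counts are taken from this finding.

# Illustrative adjustment of FCE coverage rates (counts taken from this finding).
def adjusted_coverage(inspected: int, universe: int, excluded: int) -> float:
    """Coverage after removing closed or under-construction sources from the universe."""
    return inspected / (universe - excluded)

print(f"5a majors: {adjusted_coverage(22, 29, 2):.1%}")   # 81.5% (22 of 27)
print(f"5b SM-80s: {adjusted_coverage(60, 71, 10):.1%}")  # 98.4% (60 of 61)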

Relevant metrics

Metric ID Number and Description                                        Natl Goal   Natl Avg   State N   State D   State % or #
5a  FCE coverage: majors and mega-sites                                  100%        63.2%      22        27        81.5%(2)
5b  FCE coverage: SM-80s                                                 100%        79.5%      60        61        98.4%(3)
5c  FCE coverage: synthetic minors (non-SM-80s) that are
    part of CMS plan                                                     100%        42.6%      0         0         NA

State response

Recommendation



-------
CAA Element 3 — Violations

Finding 3-1	Area for State Attention

Summary	SCHD made accurate compliance determinations in most instances, but some violations were not classified and reported into ICIS-Air.

Explanation	Metric 7a indicated that SCHD made accurate compliance determinations in 16 of 20 files reviewed (80%). In one instance, a violation was identified and an informal action (warning letter) was issued, but the federally reportable violation (FRV) was not recorded in ICIS-Air. In other situations, file reviewers found compliance issues described in an inspection report or other periodic report, but these were not formally classified as violations, and no enforcement action was taken. Although some FRVs were entered into ICIS-Air, they were entered late. EPA recommends that SCHD develop an improved process for FRV and HPV determination and data entry.

Metric 8c confirmed that for all 3 files reviewed with violations
identified (100%), SCHD's determination that these were not HPVs was
accurate.

Metric 13 indicated that SCHD did not identify any HPVs during the
review year.

Relevant metrics

Metric ID Number and Description                          Natl Goal   Natl Avg   State N   State D   State % or #
7a  Accuracy of compliance determinations                  100%                   16        20        80%
8c  Accuracy of HPV determinations                         100%                   3         3         100%
13  Timeliness of HPV identification                       100%        82.6%      0         0         NA

State response SCHD updated the Major Source SOP to include two new document tracking forms. The first form records the Technical Manager's enforcement decision, and the second form establishes the type of enforcement action, including whether the action is an FRV or HPV.

Recommendation



-------
CAA Element 4 — Enforcement

Finding 4-1	Area for State Improvement

Summary	Enforcement actions do not always bring sources back into compliance within a specified timeframe.

Explanation	Metric 9a indicated that 3 of 4 formal enforcement actions reviewed (75%) brought sources back into compliance through corrective actions in the order, or compliance was achieved prior to issuance of the order.
However, one source did not submit the required permit application or
pay the penalty, and the county ultimately closed the case. In addition,
reviewers observed that SCHD uses a Notice of Violation (NOV) that
includes a penalty assessment, which is essentially a combined informal
and formal enforcement action. This document does not appear to
include legally enforceable compliance obligations and an applicable
schedule, which led EPA to develop a recommendation for this finding.

Metrics 10a, 10b & 14 do not apply since SCHD did not have any HPVs
during the review year.





Relevant metrics

Metric ID Number and Description                                        Natl Goal   Natl Avg   State N   State D   State % or #
9a  Formal enforcement responses that include required corrective
    action that will return the facility to compliance in a
    specified time frame, or the facility fixed the problem
    without a compliance schedule                                        100%                   3         4         75.0%
10a Timeliness of addressing HPVs or alternatively having a case
    development and resolution timeline in place                         100%                   0         0         NA
10b Percent of HPVs that have been addressed or removed
    consistent with the HPV Policy                                       100%                   0         0         NA
14  HPV case development and resolution timeline in place when
    required that contains required policy elements                      100%                   0         0         NA

State response SCHD is adopting two model enforcement documents based on those
used in the State of Tennessee's Air Pollution Control program. These
documents are: "Technical Manager's Order and Assessment of Civil
Penalty" and "Technical Manager's Order and Assessment of Civil
Penalty and Imposition of Compliance Schedule".

•	The new enforcement letter contains a line stating that economic impact was considered in the penalty assessment.

•	The enforcement letter will also include a reference to our enforcement authority in our local codes and a deadline for payment of the assessment, or of the assessment and compliance schedule where applicable.

•	Consent Orders will still be utilized where appropriate.

•	These changes will be incorporated in the Department's compliance policy manual.

Recommendation By December 31, 2017, SCHD should strengthen the enforceability of
the NOV currently in use, or consider utilizing another instrument, such
as a compliance order, for securing compliance. Revised procedures
which formalize these changes should be submitted to EPA for review. If
by December 31, 2018, EPA determines that these procedures appear
adequate to bring sources back into compliance, the recommendation
will be considered complete.



-------
CAA Element 5 — Penalties

Finding 5-1	Area for State Improvement

Summary	SCHD utilized a matrix for assessing the gravity portion of penalties, but the consideration or assessment of economic benefit was not documented.

Explanation	Metric 11a indicated that although SCHD considered gravity in all penalty assessments reviewed, none of these (0%) documented whether economic benefit was considered. EPA acknowledges that SCHD has developed a process for assessing economic benefit in its draft Environmental Penalty Policy dated September 1, 2004. However, this process does not appear to be used consistently.
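
For illustration, documenting economic benefit even when it is zero can be as simple as carrying an explicit field in each penalty calculation record. The structure below is entirely hypothetical and does not reflect SCHD's penalty policy or EPA's penalty guidance; it only sketches the documentation idea.

from dataclasses import dataclass

# Hypothetical penalty-calculation record; the point is that the economic
# benefit component is recorded explicitly (even when zero) rather than omitted.
@dataclass
class PenaltyCalculation:
    facility: str
    gravity_component: float      # from the gravity matrix
    economic_benefit: float       # documented even if no benefit was identified
    notes: str = ""

    @property
    def total(self) -> float:
        return self.gravity_component + self.economic_benefit

calc = PenaltyCalculation("Example Facility", gravity_component=5000.0,
                          economic_benefit=0.0,
                          notes="No economic benefit identified for this violation.")
print(f"Total penalty: ${calc.total:,.2f}")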

Relevant metrics

Metric ID Number and Description                          Natl Goal   Natl Avg   State N   State D   State % or #
11a Penalty calculations reviewed that document
    gravity and economic benefit                           100%                   0         4         0%

State response	SCHD does consider economic benefit on each penalty action taken. However, for penalty actions where no economic benefit was identified, this fact has not been stated. The new enforcement letter (referenced in our response to CAA Element 4 above) will include a line stating that economic impact was considered.

Recommendation	By December 31, 2017, SCHD should submit revised procedures which ensure that the consideration of economic benefit is documented for all penalty calculations. In addition, sample penalty calculations for actual cases which follow the new procedures should be submitted to EPA for review. If by December 31, 2018, EPA determines that these procedures and their implementation adequately address the necessary penalty documentation, the recommendation will be considered complete.



-------
CAA Element 5 — Penalties

Finding 5-2	Meets or Exceeds Expectations

Summary	The collection of penalties and any differences between initial and final penalty assessments were documented in facility files.

Explanation	Metric 12a indicated that all 4 penalty calculations reviewed (100%) documented any difference between the initial and the final penalty assessed, or there was no difference.

Metric 12b indicated that for 4 of 4 penalties (100%), documentation of penalty payments made by the source was included in the file. In one instance, the source contested the penalty, and SCHD ultimately rescinded its Notice of Violation and penalty assessment, which was documented in a letter to the source.





Relevant metrics

Metric ID Number and Description                                        Natl Goal   Natl Avg   State N   State D   State % or #
12a Documentation of rationale for difference between initial
    penalty calculation and final penalty                                100%                   4         4         100%
12b Penalties collected                                                  100%                   4         4         100%

State response

Recommendation



-------