State Review Framework

Kansas

Resource Conservation and Recovery Act
Implementation in Federal Fiscal Year 2013

U.S. Environmental Protection Agency
Region 7, Kansas City

Final Report
October 28, 2014


-------
Executive Summary

Introduction

The EPA Region 7 enforcement staff conducted a RCRA Subtitle C oversight review of the
Kansas Department of Health and Environment, Bureau of Waste Management,
Compliance/Enforcement Unit using the State Review Framework guidance on June 16-20,
2014.

The EPA bases SRF findings on data and file review metrics, and conversations with program
management and staff. The EPA will track recommended actions from the review in the SRF
Tracker and publish reports and recommendations on the EPA's ECHO web site.

Areas of Strong Performance

•	Kansas is effective at identifying violations of its RCRA regulations, bringing facilities back into compliance, and obtaining penalties from significant non-compliers through formal enforcement actions using a well-written state penalty policy.

•	Kansas is good at providing compliance assistance to the regulated community.

Priority Issues to Address

The following are the top-priority issues affecting the state program's performance:

•	Element 5: Kansas follows its penalty policy very well, but when calculating penalties,
the state does not calculate, document, or seek the economic benefit of non-compliance
(EBN). Its policy assumes the gravity component of the penalty will address EBN.

•	Element 3: Although Kansas took appropriate enforcement actions, it was somewhat lax
in identifying significant non-compliers in the data system.

Most Significant RCRA Subtitle C Program Issues1

1 EPA's "National Strategy for Improving Oversight of State Enforcement Performance" identifies the following as
significant recurrent issues: "Widespread and persistent data inaccuracy and incompleteness, which make it hard to
identify when serious problems exist or to track state actions; routine failure of states to identify and report
significant noncompliance; routine failure of states to take timely or appropriate enforcement actions to return
violating facilities to compliance, potentially allowing pollution to continue unabated; failure of states to take
appropriate penalty actions, which results in ineffective deterrence for noncompliance and an unlevel playing field
for companies that do comply; use of enforcement orders to circumvent standards or to extend permits without
appropriate notice and comment; and failure to inspect and enforce in some regulated sectors."

State Review Framework Report | Kansas | Executive Summary | Page 1


-------
•	The State does not calculate the economic benefit of noncompliance in penalty calculations nor document this in its files; this problem continues from Rounds 1 and 2.

State Review Framework Report | Kansas | Executive Summary | Page 2


-------
Table of Contents

I.	Background on the State Review Framework	1

II.	SRF Review Process	2

III.	SRF Findings	3

Resource Conservation and Recovery Act Findings	4

Appendix	15

KDHE Bureau of Waste Management Response letter	15


-------
I. Background on the State Review Framework

The State Review Framework (SRF) is designed to ensure that the EPA conducts nationally
consistent oversight. It reviews the following local, state, and EPA compliance and enforcement
programs:

•	Clean Water Act National Pollutant Discharge Elimination System

•	Clean Air Act Stationary Sources (Title V)

•	Resource Conservation and Recovery Act Subtitle C

Reviews cover:

•	Data — completeness, accuracy, and timeliness of data entry into national data systems

•	Inspections — meeting inspection and coverage commitments, inspection report quality,
and report timeliness

•	Violations — identification of violations, determination of significant noncompliance
(SNC) for the CWA and RCRA programs and high priority violators (HPV) for the CAA
program, and accuracy of compliance determinations

•	Enforcement — timeliness and appropriateness, returning facilities to compliance

•	Penalties — calculation including gravity and economic benefit components, assessment,
and collection

The EPA conducts SRF reviews in three phases:

•	Analyzing information from the national data systems in the form of data metrics

•	Reviewing facility files and compiling file metrics

•	Development of findings and recommendations

The EPA builds consultation into the SRF to ensure that the EPA and the state understand the
causes of issues and agree, to the degree possible, on actions needed to address them. SRF
reports capture the agreements developed during the review process in order to facilitate program
improvements. The EPA also uses the information in the reports to develop a better
understanding of enforcement and compliance nationwide, and to identify issues that require a
national response.

Reports provide factual information. They do not include determinations of overall program
adequacy, nor are they used to compare or rank state programs.

Each state's programs are reviewed once every five years. The first round of SRF reviews began
in FY 2004. The third round of reviews began in FY 2013 and will continue through FY 2017.

State Review Framework Report | Kansas | Page 1


-------
II. SRF Review Process

Review period: FY 2013
Key dates:

Data metric analysis and file selection list sent to KDHE: May 2, 2014
On-site and internet file review conducted: June 16-19, 2014
Draft report sent to headquarters: August 18, 2014
Draft report sent to KDHE: September 16, 2014
Final report issued: October 28, 2014

State and EPA key contacts for review:

EPA Region 7 SRF Coordinator: Kevin Barthol

EPA Region 7 Kansas RCRA Coordinator: Edwin Buckner

EPA Region 7 Reviewer: Elizabeth Koesterer

KDHE/BWM Compliance/Enforcement Unit Chief: Rebecca Wenner

KDHE/BWM Data Manager: Phyllis Funk

State Review Framework Report | Kansas | Page 2


-------
III. SRF Findings

Findings represent the EPA's conclusions regarding state performance and are based on findings
made during the data and/or file reviews and may also be informed by:

•	Annual data metric reviews conducted since the state's last SRF review

•	Follow-up conversations with state agency personnel

•	Review of previous SRF reports, Memoranda of Agreement, or other data sources

•	Additional information collected to determine an issue's severity and root causes

There are three categories of findings:

Meets or Exceeds Expectations: The SRF was established to define a base level or floor for
enforcement program performance. This rating describes a situation where the base level is met
and no performance deficiency is identified, or a state performs above national program
expectations.

Area for State Attention: An activity, process, or policy that one or more SRF metrics show as
a minor problem. Where appropriate, the state should correct the issue without additional EPA
oversight. The EPA may make recommendations to improve performance, but it will not monitor
these recommendations for completion between SRF reviews. These areas are not highlighted as
significant in an executive summary.

Area for State Improvement: An activity, process, or policy that one or more SRF metrics
show as a significant problem that the agency is required to address. Recommendations should
address root causes. These recommendations must have well-defined timelines and milestones
for completion, and the EPA will monitor them for completion between SRF reviews in the SRF
Tracker.

Whenever a metric indicates a major performance issue, the EPA will write up a finding of Area
for State Improvement, regardless of other metric values pertaining to a particular element.

The relevant SRF metrics are listed within each finding. The following information is provided
for each metric:

•	Metric ID Number and Description: The metric's SRF identification number and a
description of what the metric measures.

•	Natl Goal: The national goal, if applicable, of the metric, or the CMS commitment that
the state has made.

•	Natl Avg: The national average across all states, territories, and the District of Columbia.

•	State N: For metrics expressed as percentages, the numerator.

•	State D: The denominator.

•	State % or #: The percentage, or if the metric is expressed as a whole number, the count.
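The State N, State D, and State % fields are related by simple percentage arithmetic. The short Python sketch below shows that relationship; it is illustrative only, the helper name is ours, and the sample numerator and denominator are assumptions chosen to reproduce the 79.3% figure KDHE cites for metric 2b, since the actual counts are not legible in the source tables.

```python
def metric_percentage(state_n: int, state_d: int) -> float:
    """State % as reported in SRF metric tables: numerator over denominator."""
    if state_d == 0:
        raise ValueError("State D (the denominator) must be nonzero")
    return round(100.0 * state_n / state_d, 1)


# Illustrative values only: 23 of 29 files reproduces the 79.3% rate KDHE
# cites for metric 2b; the actual State N and State D are assumptions here.
print(metric_percentage(23, 29))  # 79.3
```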

State Review Framework Report | Kansas | Page 3


-------
Resource Conservation and Recovery Act Findings

RCRA Element 1 — Data

Finding 1-1	Area for State Attention

Summary	KDHE did not enter some of the mandatory data in the enforcement area.

Explanation	Four SNCs were not recorded, one set of violations was not linked to the formal enforcement action, one penalty payment schedule was not entered, and one follow-up inspection was not recorded. The four SNCs not recorded were appropriately addressed through formal enforcement. The other instances of missing data were minor oversights.

Relevant metrics

2b Complete and accurate entry of mandatory data: Natl Goal 100%; State % 79.3%



State response KDHE will review established procedures with Compliance and Enforcement staff to try to improve our rate from 79.3% to 100% in the future.

Recommendation During monthly coordination calls, the EPA and KDHE enforcement
staff will discuss current enforcement actions to assure SNC status and
other pertinent information is recorded.

State Review Framework Report | Kansas | Page 4


-------
RCRA Element 2 — Inspections

Finding 2-1	Area for State Attention

Summary	KDHE inspected fewer than the expected number of LQGs, but inspected many other facilities during the year. The EPA inspections raised the total to expected levels.

Explanation	KDHE faced a staffing shortfall during 2013 and concentrated efforts in areas of greater potential environmental harm, such as SQGs and facilities that had never been inspected. KDHE is very responsive to citizen complaints, which typically do not occur at LQGs. To avoid unnecessary duplication of effort, KDHE did not inspect TSDFs and LQGs that were inspected by the EPA during the year. The EPA inspections are not counted toward the state totals below, but if included would raise levels to meet the national goals. The EPA does not plan to change its level of inspection activity because maintaining a federal inspection presence is an EPA priority. KDHE should not expend additional resources to inspect facilities already inspected by the EPA, but it should still fill the three vacant inspector positions.

Relevant metrics

5a Two-year inspection coverage of operating TSDFs: Natl Goal 100%; Natl Avg 87.6%
5b Annual inspection coverage of LQGs: Natl Goal 20%; Natl Avg 21%
5c Five-year inspection coverage of LQGs: Natl Goal 100%; Natl Avg 66.6%
5d Five-year inspection coverage of active SQGs: Natl Avg 11.0%
5e1 Five-year inspection coverage of active conditionally exempt SQGs
5e2 Five-year inspection coverage of active transporters
5e3 Five-year inspection coverage of active non-notifiers
5e4 Five-year inspection coverage of active sites not covered by metrics 2c through 2f3
(State values are not legible in the source table.)

State response When planning our inspection schedule, KDHE always considers inspections planned by EPA. This eliminates duplication of resources and frustration from the regulated community because of multiple inspections. If EPA's inspections were considered in the numbers, this

State Review Framework Report | Kansas | Page 5


-------
would not be an area for state attention. KDHE will continue to fill
vacant positions as long as funding allows.

Recommendation KDHE should plan for and maintain adequate staffing levels to meet its
inspection commitments.

State Review Framework Report | Kansas | Page 6


-------
RCRA Element 2 — Inspections

Finding 2-2	Area for State Attention

Summary	Several inspection reports lack a narrative, or sufficient detail in the narrative, to make a compliance determination. Inspection reports are not signed or dated, which affects the credibility of the report.

Explanation	The narrative in the reports needs to describe the waste generation process sufficiently to allow accurate hazardous waste determinations. The EPA reviewers observed that eight of the 29 reports lacked sufficient narrative or waste stream descriptions. The state recently started using electronic checklists on tablets to document inspection findings. The tablets can record narrative on the checklists as necessary; however, some inspectors have neglected the narrative in this electronic format. Undated reports allow those arguing against a report to suggest the information in the report was not recorded in a timely manner, casting suspicion on its accuracy. Signing and dating reports helps verify that the documentation of the inspector's observations has not changed since they were made. Metric 6b was determined by reviewing the narrative and attachments to reports and other documents. None of the 29 reports were dated, but the EPA reviewers were able to determine that 20 of 22 reports were written in a timely manner by observing evidence such as the date of the facility's response to the report or when KDHE issued a compliance letter or initiated enforcement.

Relevant metrics

6a Inspection reports complete and sufficient to determine compliance: Natl Goal 100%; State N 21; State D 29; State % 72.4%
6b Timeliness of inspection report completion: Natl Goal 100%; State N 20; State D 22; State % 90.9%

State response KDHE has added to its inspection reports, next to the field listing the
name of the inspector completing the report, a date field to record the
date the inspection report is completed. This should suffice in lieu of a
signature, which would be expensive to add to the reports because it
would require a change to our electronic system. KDHE will also alter
the waste stream table and/or other areas of the report to discuss or list
processes generating wastes.

Recommendation The EPA recommends that KDHE provide refresher training to all inspectors to assure each inspector records complete narratives of their observations.

State Review Framework Report | Kansas | Page 7


-------
The EPA concurs that adding fields for the name of the inspector
completing the report and the date of report completion should suffice to
authenticate each inspector's testimony regarding the inspection report.
The EPA will verify this recommendation has been implemented within
180 days of this final report being issued.

State Review Framework Report | Kansas | Page 8


-------
RCRA Element 3 — Violations

Finding 3-1	Meets or Exceeds Expectations

Summary	The state excels at identifying violations and returning facilities to compliance.

Explanation	Kansas inspectors are meticulous in documenting all violations identified during inspections and are adept at discovering those violations. In the one case where the reviewers identified an inaccurate compliance determination, it was because the state did not cite violations of a previous administrative order in its actions. In one case, the EPA felt the identified violations should have been a SNC and formal enforcement initiated, but the state demonstrated that it was acting in concert with its written policies for enforcement in that case.

Relevant metrics

7a Accurate compliance determinations: Natl Goal 100%
7b Violations found during inspections: Natl Avg 34.8%
8a SNC identification rate: Natl Avg 1.7%
(State values are not legible in the source table.)

State response None

Recommendation None

State Review Framework Report | Kansas | Page 9


-------
RCRA Element 3 — Violations

Finding 3-2	Area for State Improvement

Summary	The state excels at identifying violations but is lax in documenting relevant violations as SNCs in the database. Existing SNC determinations are timely.

Explanation	The low value for metric 8c comes from the state not identifying the facilities as SNCs in the database; the state took appropriate enforcement actions in spite of lacking the formal determination. The issue is a lack of SNC documentation, not a lack of appropriate action.

Relevant metrics

2a Long-standing secondary violators
8b Timeliness of SNC determinations: Natl Goal 100%; Natl Avg 77.8%
8c Appropriate SNC determinations: Natl Goal 100%
(State values are not legible in the source table.)

State response KDHE will review all established procedures with enforcement staff. This should help improve our entry of SNC and SNN evaluations in RCRAInfo.

Recommendation The state should institute a periodic database review of violations and enforcement actions to make certain the appropriate SNC determination has been documented. This process will be discussed during KDHE/EPA enforcement coordination calls.

State Review Framework Report | Kansas | Page 10


-------
RCRA Element 4 — Enforcement

Finding 4-1	Meets or Exceeds Expectations

Summary	The state closely follows its policies regarding enforcement and follows up on all inspections to assure facilities return to compliance.

Explanation	For metric 9a, the state is still pursuing compliance in one case that received formal enforcement. This situation is atypical. For metric 10b, the EPA felt formal enforcement was appropriate for one case, but KDHE appropriately followed its own guidance in using informal enforcement in that case.

Relevant metrics

9a Enforcement that returns violators to compliance: Natl Goal 100%
10b Appropriate enforcement taken to address violations: Natl Goal 100%
(State values are not legible in the source table.)
-------
RCRA Element 4 — Enforcement

Finding 4-2	Meets or Exceeds Expectations

Summary	The state takes expeditious enforcement actions and closely monitors the respondent to assure penalties are paid on time and compliance is achieved.

Explanation	Field inspectors follow up with the facility independent of enforcement staff to assure facility compliance with the regulations. Inspection reports are sent to enforcement staff in Topeka for review, and potential SNCs are indicated. Enforcement staff review the cases and immediately initiate prefiling negotiations with SNC facilities. Penalties are calculated using the state's penalty policy, which is precise, simple, and thorough, except for the lack of an economic benefit of noncompliance (EBN) calculation. KDHE vigorously pursues negotiations with the aid of Attorney General staff specifically assigned to KDHE. This results in quick and appropriate resolution of enforcement actions.

Relevant metrics

10a Timely enforcement taken to address SNC: Natl Goal 80%; Natl Avg 77.3%; State % 100%

State response

None

Recommendation

None

State Review Framework Report | Kansas | Page 12


-------
RCRA Element 5 — Penalties

Finding 5-1	Area for State Improvement

Summary	The state closely follows its guidance documents for calculating penalties, but those documents do not address the economic benefit of noncompliance (EBN). It typically obtains the penalties issued and documents its calculations and justifications for the amounts well.

Explanation	State penalty calculations for the gravity component are accurate and follow state guidance. The state's penalty matrix is easy to use and produces unbiased, appropriate numbers, but the policy does not address EBN and the state does not calculate or seek it. State law directs the Department to consider EBN in its penalty calculations. Metric 12b includes an ongoing enforcement action.

During the close-out meeting the state said it believes that EBN in RCRA cases is typically very small in comparison to the gravity component. It believes the amount calculated for the gravity component is adequate to address the EBN as well as the gravity. Often the cost of correcting the violations outweighs any benefit the facility might have gained through noncompliance. Further, KDHE RCRA management thinks EBN should be calculated consistently across the different enforcement programs, and KDHE Air and Water apparently also do not calculate EBN.

Although EBN in RCRA penalty calculations is often quite small in comparison to the gravity component, in some cases, especially illegal disposal or avoided actions such as training, it can be a comparatively large sum. The state should at least do a cursory calculation of EBN before entering negotiations so it will not fail to obtain EBN if it is significant. This is a longstanding issue that was identified during SRF Rounds 1 and 2 and will remain unresolved until state upper management decides to calculate EBN as part of the state's penalty.
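To make the arithmetic concrete, a minimal sketch follows of a penalty that adds an EBN term to the gravity component; the function and the dollar figures are hypothetical illustrations and are not drawn from the KDHE penalty policy or from any actual case.

```python
def total_penalty(gravity: float, economic_benefit: float) -> float:
    """Illustrative penalty: gravity component plus the economic benefit of
    noncompliance (EBN), so a violator does not come out ahead by delaying
    compliance. Inputs are hypothetical dollar amounts, not KDHE policy values.
    """
    return gravity + economic_benefit


# Hypothetical example: avoided hazardous waste training costs treated as EBN.
gravity_component = 6_000.00   # assumed gravity amount from a penalty matrix
avoided_training = 9_500.00    # assumed avoided cost (EBN)
print(total_penalty(gravity_component, avoided_training))  # 15500.0
```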

Relevant metrics

11a Penalty calculations include gravity and economic benefit: Natl Goal 100%; State N 0; State D 10; State % 0%
12a Documentation on difference between initial and final penalty: Natl Goal 100%
12b Penalties collected: Natl Goal 100%
(Remaining values are not legible in the source table.)

State response KDHE believes that our penalty matrix takes into consideration economic benefit by penalizing more for violations that could have a

State Review Framework Report | Kansas | Page 13


-------
direct economic benefit. Further, our statutes (Kansas Statutes Annotated
(K.S.A.) 65-3446) authorizes us only to impose a penalty which "shall
constitute an actual and substantial economic deterrent to the violation
for which it is assessed."

Recommendation The KDHE needs to develop a standard procedure whereby EBN is consistently considered and calculated for each penalty action. Although K.S.A. 65-3446 does not specify recovery of economic benefit in calculating penalties, it does require a penalty that is "an actual and substantial economic deterrent to the violation." It does not forbid calculation of economic benefit. The EPA believes calculating and recovering the violator's economic benefit of noncompliance in addition to a gravity component better meets the goal of an actual and substantial economic deterrent. In K.S.A. 65-3444(b)(4), which addresses civil penalties as opposed to administrative penalties, the statute calls for the district court to consider "the economic savings realized by the person in not complying with the provisions for which a violation is charged. . . " The statute's intended result of seeking EBN in civil actions translates to administrative penalties as well.

The KDHE BWM should coordinate with other KDHE media
enforcement programs to develop an equitable policy for seeking EBN in
each program's penalties. Further discussions between upper
management of the KDHE and the EPA will be necessary to make this
change across all media enforcement programs.

State Review Framework Report | Kansas | Page 14


-------
Appendix

KDHE Bureau of Waste Management Response letter

Department of Health & Environment

October 28, 2014

Mr. Donald Toensing
Chief, Waste Enforcement and Materials Management Branch
U.S. Environmental Protection Agency
Region 7
11201 Renner Boulevard
Lenexa, Kansas 66219

Dear Mr. Toensing:

The KDHE Bureau of Waste Management has reviewed EPA's Draft Report of the State Review Framework of the Kansas RCRA Enforcement Program dated September 2, 2014. We appreciate EPA's comments and guidance as we seek to continuously improve our program. I have attached our response to the draft report addressing the "Areas for Improvement" and findings that specify "Areas for State Attention."

Please let me know if you have any questions about our response. Thank you for conducting this review efficiently and professionally.

Sincerely,

William L. Bider

Director

Bureau of Waste Management

c: John Mitchell, Director, Division of Environment

Rebecca Wenner, Chief, Compliance Assistance and Enforcement Unit
Christine Mennicke, Chief, Regulations and Data Unit

State Review Framework Report | Kansas | Page 15


-------
KDHE's Response/Comments on State Review Framework
October 27, 2014

Areas for Improvement:

1.	As previously discussed, KDHE believes that our penalty matrix takes into consideration economic benefit by penalizing more for violations that could have a direct economic benefit. Further, our statutes (Kansas Statutes Annotated (K.S.A.) 65-3446) authorizes us only to impose a penalty which "shall constitute an actual and substantial economic deterrent to the violation for which it is assessed."

2.	KDHE will work to improve the timeliness of entering the SNN and SNC evaluations.

3.	KDHE has added a date field to our reports next to the field listing the name of the inspector completing the report. This should suffice in lieu of a signature, which would be expensive because it would require a change to our electronic system. We will also modify our waste stream table to include additional information regarding processes generating the waste.

RCRA Findings:

Finding 1-1	KDHE will review established procedures with Compliance and Enforcement staff to try to improve our rate from 79.3% to 100% in the future.

Finding 2-1	When planning our inspection schedule, KDHE always considers inspections planned by EPA. This eliminates duplication of resources and frustration from the regulated community because of multiple inspections. If EPA's inspections were considered in the numbers, this would not be an area for state attention. KDHE will continue to fill vacant positions as long as funding allows.

Finding 2-2	As stated above, KDHE has added a date field to its inspection reports to record the date the inspection report is completed. KDHE will also alter the waste stream table and/or other areas of the report to discuss or list processes generating waste.

Finding 3-2	KDHE will review all established procedures with enforcement staff. This should help improve our entry of SNC and SNN evaluations in RCRAInfo.

Finding 5-1	As previously discussed, KDHE believes that our penalty matrix takes into consideration economic benefit by penalizing more for violations that could have a direct economic benefit. Further, our statutes (Kansas Statutes Annotated (K.S.A.) 65-3446) authorizes us only to impose a penalty which "shall constitute an actual and substantial economic deterrent to the violation for which it is assessed."

State Review Framework Report | Kansas | Page 16


-------
State Review Framework

Kansas

Clean Air Act
Implementation in Federal Fiscal Year 2014

U.S. Environmental Protection Agency
Region 7, Kansas City

Final Report
December 21, 2015


-------
Executive Summary

Introduction

EPA Region 7 enforcement staff conducted a Clean Air Act oversight review of the Kansas
Department of Health and Environment enforcement and compliance program in June 2015
using the State Review Framework (SRF).

EPA bases SRF findings on data and file review metrics, and conversations with program
management and staff. EPA will track recommended actions from the review in the SRF Tracker
and publish reports and recommendations on EPA's ECHO web site.

Areas of Strong Performance

•	Finding 2-1. KDHE is exceeding national averages for Full Compliance Evaluation
(FCE) inspection targets and review of Title V Annual Compliance Certifications.

•	Finding 2-2. KDHE's documentation of FCE elements in inspection reports was
exemplary. KDHE review of compliance monitoring reports to ensure completeness was
likewise noteworthy.

•	Finding 3-1. Accuracy of compliance and High Priority Violator (HPV) determinations
were at or near the national goal of 100%.

•	Finding 3-2. The KDHE is properly identifying HPV violations.

•	Finding 4-1. All formal enforcement responses reviewed included language requiring the
facility return to compliance.

•	Finding 5-1. KDHE files demonstrate the state's documentation of the consideration of
economic benefit in the calculations has improved significantly.

•	Finding 5-2. KDHE has a strong performance record for penalty collection. KDHE
consistently documents rationale for reducing the initial penalty.

Areas for State Attention

The following are the priority issues affecting the state's program performance:

•	Finding 1-1. The review revealed several inaccuracies in the CAA database as compared
to the facility file.

•	Supplemental Finding. EPA experienced several issues with the KDHE electronic file
review system during the SRF review.

State Review Framework Report | Kansas | Executive Summary | Page 1


-------
Area for State Improvement - Significant CAA Stationary Source Program
Issues1

• Finding 1-2. The review revealed issues with timely data entry; most notably, the reporting of stack test data is substantially below the national average and goal.

1 EPA's "National Strategy for Improving Oversight of State Enforcement Performance" identifies the following as
significant recurrent issues: "Widespread and persistent data inaccuracy and incompleteness, which make it hard to
identify when serious problems exist or to track state actions; routine failure of states to identify and report
significant noncompliance; routine failure of states to take timely or appropriate enforcement actions to return
violating facilities to compliance, potentially allowing pollution to continue unabated; failure of states to take
appropriate penalty actions, which results in ineffective deterrence for noncompliance and an unlevel playing field
for companies that do comply; use of enforcement orders to circumvent standards or to extend permits without
appropriate notice and comment; and failure to inspect and enforce in some regulated sectors."

State Review Framework Report | Kansas | Executive Summary | Page 2


-------
Table of Contents

I.	Background on the State Review Framework	1

II.	SRF Review Process	2

III.	SRF Findings	3

Clean Air Act Findings	4

Appendix	16

KDHE Bureau of Air Response Letter	18


-------
I. Background on the State Review Framework

The State Review Framework (SRF) is designed to ensure that EPA conducts nationally
consistent oversight. It reviews the following local, state, and EPA compliance and enforcement
programs:

•	Clean Water Act National Pollutant Discharge Elimination System

•	Clean Air Act Stationary Sources (Title V)

•	Resource Conservation and Recovery Act Subtitle C

Reviews cover:

•	Data — completeness, accuracy, and timeliness of data entry into national data systems

•	Inspections — meeting inspection and coverage commitments, inspection report quality,
and report timeliness

•	Violations — identification of violations, determination of significant noncompliance
(SNC) for the CWA and RCRA programs and high priority violators (HPV) for the CAA
program, and accuracy of compliance determinations

•	Enforcement — timeliness and appropriateness, returning facilities to compliance

•	Penalties — calculation including gravity and economic benefit components, assessment,
and collection

EPA conducts SRF reviews in three phases:

•	Analyzing information from the national data systems in the form of data metrics

•	Reviewing facility files and compiling file metrics

•	Development of findings and recommendations

EPA builds consultation into the SRF to ensure that EPA and the state understand the causes of
issues and agree, to the degree possible, on actions needed to address them. SRF reports capture
the agreements developed during the review process in order to facilitate program improvements.
EPA also uses the information in the reports to develop a better understanding of enforcement
and compliance nationwide, and to identify issues that require a national response.

Reports provide factual information. They do not include determinations of overall program
adequacy, nor are they used to compare or rank state programs.

Each state's programs are reviewed once every five years. The first round of SRF reviews began
in FY 2004. The third round of reviews began in FY 2013 and will continue through FY 2017.

State Review Framework Report | Kansas | Page 1


-------
II. SRF Review Process

Review period: Federal Fiscal year 2014
Key dates:

•	SRF Kickoff letter mailed to KDHE: March 9, 2015

•	Data Metric Analysis sent to KDHE: April 14, 2015

•	File selection list sent to KDHE: April 14, 2015

•	Entrance interview conducted April 28, 2015

•	File review conducted: May - June, 2015

•	Exit interview conducted: August 26, 2015

•	Draft report sent to headquarters: September 8, 2015

•	Draft report sent to KDHE: November 10, 2015

•	Final report issued: December 21, 2015

State and EPA key contacts for review:

•	Russ Brichacek, KDHE Air Compliance and Enforcement Section

•	Javier Ahumada, KDHE Air Compliance and Enforcement Section

•	Lisa Gotto, EPA Region 7, SRF Review Lead

•	Joe Terriquez, EPA Region 7 Air Compliance and Enforcement Section

•	Hugh McCullough, EPA Region 7 Air Compliance and Enforcement Section

•	Kevin Barthol, EPA Region 7 SRF Coordinator

State Review Framework Report | Kansas | Page 2


-------
III. SRF Findings

Findings represent the EPA's conclusions regarding state performance and are based on findings
made during the data and/or file reviews and may also be informed by:

•	Annual data metric reviews conducted since the state's last SRF review

•	Follow-up conversations with state agency personnel

•	Review of previous SRF reports, Memoranda of Agreement, or other data sources

•	Additional information collected to determine an issue's severity and root causes

There are three categories of findings:

Meets or Exceeds Expectations: The SRF was established to define a base level or floor for
enforcement program performance. This rating describes a situation where the base level is met
and no performance deficiency is identified, or a state performs above national program
expectations.

Area for State Attention: An activity, process, or policy that one or more SRF metrics show as
a minor problem. Where appropriate, the state should correct the issue without additional EPA
oversight. EPA may make recommendations to improve performance, but it will not monitor
these recommendations for completion between SRF reviews. These areas are not highlighted as
significant in an executive summary.

Area for State Improvement: An activity, process, or policy that one or more SRF metrics
show as a significant problem that the agency is required to address. Recommendations should
address root causes. These recommendations must have well-defined timelines and milestones
for completion, and EPA will monitor them for completion between SRF reviews in the SRF
Tracker.

Whenever a metric indicates a major performance issue, EPA will write up a finding of Area for
State Improvement, regardless of other metric values pertaining to a particular element.

The relevant SRF metrics are listed within each finding. The following information is provided
for each metric:

•	Metric ID Number and Description: The metric's SRF identification number and a
description of what the metric measures.

•	Natl Goal: The national goal, if applicable, of the metric, or the CMS commitment that
the state has made.

•	Natl Avg: The national average across all states, territories, and the District of Columbia.

•	State N: For metrics expressed as percentages, the numerator.

•	State D: The denominator.

•	State % or #: The percentage, or if the metric is expressed as a whole number, the count.

State Review Framework Report | Kansas | Page 3


-------
Clean Air Act Findings

CAA Element 1 — Data

Finding 1-1	Area for State Attention

Summary	KDHE maintains the Clean Air Act data in the Air Facility System2 (AFS). The review revealed several inaccuracies in the CAA database as compared to the facility file.

Explanation	Database accuracy was evaluated by comparing the KDHE electronic files with the Enforcement and Compliance History Online (ECHO) detailed facility reports. Twenty-eight of the 36 files reviewed had complete and accurate data entered into AFS. The remaining files revealed relatively minor discrepancies between AFS and the files. The common discrepancies between AFS and the facility files included inaccurate event dates, typographical errors, inaccurate compliance status, and missing events. EPA also notes that alleged violations reported per informal enforcement actions were below the national average of 65.60%, indicating the state may have been issuing Notices of Violation (NOVs) without reporting the minimum data requirements in AFS for compliance status.

EPA notes KDHE has demonstrated a trajectory of improvement in database accuracy over time. EPA expects KDHE will continue the arc of improvement and will continue to monitor this data element in the future.

Relevant metrics

2b Accurate MDR data in AFS: Natl Goal 100%; State N 28; State D 36; State % 77.8%
3a2 Untimely entry of HPV determinations: Natl Goal 0
3b1 Timely reporting of compliance monitoring MDRs: Natl Goal 100%; Natl Avg 83%
3b3 Timely reporting of enforcement MDRs: Natl Goal 100%; Natl Avg 77.9%
7b1 Violations reported per informal actions: Natl Goal 100%; Natl Avg 65.60%
7b3 Violations reported per HPV identified: Natl Goal 100%; Natl Avg 63.2%
(Remaining values are not legible in the source table.)

2 The AFS data system has been retired and is now a part of the Integrated Compliance and Information System
(ICIS-AIR).

State Review Framework Report | Kansas | Page 4


-------
State response The report noted BOA's improvement in this metric since the previous SRF and we intend to continue improving. With the introduction of ICIS-Air, there was a period of time when staff was learning the system and may have made some initial errors, but overall, the new system allows our staff the ability to directly enter data into the CAA database, which will further reduce any discrepancies between our file and the CAA database. The report mentions a possibility that not all notices of noncompliance (NONs) may have been uploaded into AFS. BOA is not sure if that is the case, or if it was a statistical anomaly, but will put additional emphasis on entering NONs into the CAA database going forward.

Recommendation

State Review Framework Report | Kansas | Page 5


-------
CAA Element 1 — Data

Finding 1-2	Area for State Improvement

Summary	KDHE maintains CAA data in AFS. The review revealed issues with timely data entry; most notably, the reporting of stack test data is substantially below the national average and goal. EPA is concerned with data flow and timeliness.

Explanation	Untimely stack test data reporting into AFS is a likely function of the size of the current KDHE universe; KDHE staffing resource challenges; and 2014 procedures for receiving, prioritizing, and entering data. During the review, EPA noted KDHE has challenges getting documents scanned into the facility file in a timely manner. The majority of inspections are conducted by the KDHE regional offices. Inspection reports are then submitted to the KDHE main office, which may result in a data entry time lag of 45 days or more. The delay in receiving inspection reports has potential impacts on the timely issuance of enforcement activities. During the time period under review, CAA data entry was accomplished by a single KDHE staff member. Physical copies of the documents (inspection reports, enforcement documents, stack test observations, etc.) were provided to the data entry staff member, who reviewed the documents and identified the information to be recorded in AFS. EPA notes KDHE met the standard for timely reporting of stack test dates and results only 2.8% of the time. KDHE averages 229 days to complete the reporting of stack test dates and results in the database, 109 days more than the required 120 days from the date of the stack test.
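The timeliness standard described above is a matter of date arithmetic. The short sketch below checks a single entry against the 120-day window (the 60-day case KDHE raises in its response is just a different threshold); the dates and the function are illustrative assumptions, not values pulled from AFS or ICIS-Air.

```python
from datetime import date


def days_past_window(test_date: date, entry_date: date, window_days: int = 120) -> int:
    """Days beyond the reporting window; zero or negative means the entry was timely.

    window_days defaults to the 120-day standard used by metric 3b2; a 60-day
    rule (e.g., for some MACT/NESHAP reports) simply changes the threshold.
    """
    return (entry_date - test_date).days - window_days


# Illustrative dates only: an entry 229 days after the stack test is 109 days
# past the 120-day window, matching the average delay the EPA cites.
print(days_past_window(date(2014, 1, 10), date(2014, 8, 27)))  # 109
```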

Relevant metrics

3b2 Timely reporting of stack test dates and results: Natl Goal 100%; Natl Avg 80.80%; State % 2.8%

State response EPA metrics in this category specify that performance test results should
be entered into the CAA database within 120 days of the end of the
performance test. Currently, this information is entered into ICIS-Air but
in FFY2014 the database was AFS. BOA strives to enter test data into
the CAA database as quickly as possible but asks consideration of the
fact that 120 days is actually reduced to 60 days when a federal
regulation, such as a MACT or NESHAP, allows the facility to submit
the final test report no later than 60 days after the end of testing. During

State Review Framework Report | Kansas | Page 6


-------
FFY2014, Kansas had a very large number of reciprocating internal
combustion engine performance tests conducted due to implementation
of the new RICE MACT, 40 CFR Part 63, Subpart ZZZZ, regulation
which happened to coincide with the oil exploration boom. Not only
were the total number of performance tests conducted in that year
between two to three times higher than average due to this new rule, but
this regulation is one which allows 60 days for final report submittal.
Not only was our program asked by EPA to absorb a huge increase in
work, but we were then told it had to be completed within 60 days of our
receipt. In addition, due to budgetary constraints at the time, there was
only a single staff member tasked with running the entire performance
testing program for the entire state. To further complicate this, in
FFY2012, BOA did batch uploads into AFS from our internal database
once a month. Therefore, performance tests that were reviewed the day
after the upload would not be reflected in AFS for another month. BOA
would like to note that although the arbitrary 120 day, effectively 60 day,
deadline was not met on most stack test reports in FFY2014, 100% of
stack test and RATA reports, including Acid Rain reports which we
review out of courtesy to EPA, were thoroughly reviewed for scientific
accuracy and compliance demonstration.

BOA believes this was a "perfect storm" event which has already been
alleviated by a number of factors. BOA preemptively took action to
solve this problem prior to it being called to our attention in the Data
Metric Analysis, received in April 2015, by hiring additional staff in late
2014 to help process the increased workload created from this
regulation. Input into AFS was also discontinued when ICIS-Air went
live. Staff now inputs performance test results directly into ICIS-Air
when review is complete, which has helped our timeliness. Finally, the
number of newly subject engines dropped in the last federal fiscal year
due in part to a decline in oil prices. BOA still contends that a deadline
of 120 days after the stack test date, which is effectively reduced to 60
days after CAA regulation allowances, is not conducive to thorough and
thoughtful review and we question whether other states are simply
reporting the stack test data without proper review in order to meet this
deadline.

Recommendation Region 7 recommends KDHE continue to evaluate current data entry procedures with the goal of improving speed by identifying opportunities to collect and enter data from the Regional Offices and Local Government Agencies more efficiently so data entry may occur in a timely manner. KDHE should consider the use of a data entry form which may be provided electronically to data entry staff upon completion of reportable activities. KDHE should provide Region 7 with a draft of the process improvements for review within 60 days of

State Review Framework Report | Kansas | Page 7


-------
completion of this SRF Report. If review of KDHE data at the end of
FY2016 shows that timeliness has sufficiently improved, the
Recommendation will be deemed completed.

State Review Framework Report | Kansas | Page 8


-------
CAA Element 2 — Inspection

Finding 2-1	Meets or Exceeds Expectations

Summary	KDHE is exceeding national averages for FCE inspection targets and review of Title V Annual Compliance Certifications.

Explanation	KDHE is above the national average for FCE coverage of Title V Major and Synthetic Minor (SM-80) facilities, along with review of Title V Annual Compliance Certifications. FCE coverage of Major facilities was 95.50% (national average of 85.70%) and FCE coverage of SM-80s was 98.60% (national average of 91.70%). Kansas Title V facilities are inspected annually. The larger Title V facilities receive multiple Partial Compliance Evaluations (PCEs) in one year, which combine to meet the annual FCE requirement. KDHE inspectors accompany the EPA inspectors on inspections in Kansas whenever possible. Inspectors are also called upon to execute complaint investigations when necessary. The KDHE air program inspectors perform over 800 assigned facility inspections each year. The KDHE field inspectors perform 100 to 150 additional inspections and investigations beyond the assigned inspections. This substantial workload is accomplished with a high degree of communication and coordination with the six KDHE Regional Offices and local government offices to ensure inspection targets are met.

Relevant metrics

5a FCE coverage: majors and mega-sites: Natl Goal 100% of commitment; Natl Avg 85.70%; State % 95.50%
5b FCE coverage: SM-80s: Natl Goal 100% of commitment; Natl Avg 91.70%; State % 98.60%
5e Review of Title V annual compliance certifications: Natl Goal 100%; Natl Avg 78.80%
(Remaining values are not legible in the source table.)

State response

Recommendation

State Review Framework Report | Kansas | Page 9


-------
CAA Element 2 — Inspection

Finding 2-2	Meets or Exceeds Expectations

Summary	KDHE's documentation of FCE elements in inspection reports was exemplary. KDHE's review of compliance monitoring reports to ensure completeness was likewise noteworthy.

Explanation	KDHE performed well on the SRF inspection elements and inspection metrics 6a and 6b. In the subset of reports reviewed, 96.7% of the FCEs effectively documented the full complement of FCE elements. During the review year, 33 of the 34 compliance monitoring reports reviewed provided sufficient documentation to determine facility compliance.

Relevant metrics

6a Documentation of FCE elements: Natl Goal 100%; State N 29; State D 30; State % 96.7%
6b Compliance monitoring reports reviewed that provide sufficient documentation to determine facility compliance: Natl Goal 100%; State N 33; State D 34; State % 97.1%

State response

Recommendation

State Review Framework Report | Kansas | Page 10


-------
CAA Element 3 — Violations

Finding 3-1	Meets or Exceeds Expectations

Summary	Accuracy of compliance and HPV determinations were at or near the national goal of 100%.

Explanation	Thirty of the 32 files reviewed appeared to have accurate compliance determinations, and 13 of the 13 files reviewed appear to have accurate HPV determinations, indicating that among the violations reviewed, KDHE is accurately identifying the violations and interpreting the HPV policy. EPA reached beyond the scope of the 2014 review period to gain a broader picture of KDHE's HPV determinations and policy interpretation by reviewing enforcement files for a facility identified in a previous year as an HPV. EPA concluded KDHE is appropriately applying the HPV policy.

Relevant metrics

7a Accuracy of compliance determinations: Natl Goal 100%; State N 30; State D 32; State % 93.8%
8c Accuracy of HPV determinations: Natl Goal 100%; State N 13; State D 13; State % 100%

State response

Recommendation

State Review Framework Report | Kansas | Page 11


-------
CAA Element 3 — Violations

Finding 3-2	Meets or Exceeds Expectations

Summary	The KDHE is properly identifying HPV violations.

Explanation	KDHE management discusses HPV cases and HPV identification with Region 7 staff during their scheduled monthly conference calls. The data demonstrate proper application of the HPV policy. Although the KDHE HPV discovery rate is lower than the national average, KDHE is properly identifying HPV violations.

Relevant metrics

HPV discovery rate: state rate below the national average (individual values are not legible in the source table)

State response

Recommendation

State Review Framework Report | Kansas | Page 12


-------
CAA Element 4 — Enforcement

Finding 4-1	Meets or Exceeds Expectations

Summary	All formal enforcement responses reviewed included language requiring the facility to return to compliance.

Explanation	All formal enforcement settlement documents reviewed included a condition that required the facility to return to compliance. When practical, the return to compliance was required immediately. In situations where immediate compliance was not feasible, a compliance schedule was incorporated into the settlement document.

Relevant metrics

9a Formal enforcement responses that include required corrective action that will return the facility to compliance in a specified timeframe: Natl Goal 100%; State N 16; State D 16; State % 100%
10b Appropriate enforcement responses for HPVs: Natl Goal 100%; State N 4; State D 4; State % 100%

State response

Recommendation

State Review Framework Report | Kansas | Page 13


-------
CAA Element 5 — Penalties

Finding 5-1	Meets or Exceeds Expectations

Summary	KDHE files demonstrate that the state's documentation of the consideration of economic benefit in penalty calculations has improved significantly.

Explanation	The 2010 SRF review indicated that a number of the enforcement actions taken by KDHE in the public files did not include a penalty calculation worksheet with a specific statement on consideration of economic benefit. The 2015 SRF review demonstrates KDHE has made significant progress in addressing this issue. As part of the 2010 review recommendation, KDHE instituted a requirement for a statement at the end of each Penalty Work Sheet pertaining to economic benefit that may have been gained by the facility for failure to comply. KDHE protocol for consideration and documentation of economic benefit has been included in the KDHE Air Program Enforcement Policy. The policy includes setting base penalties within the matrix at the end of the policy; more serious violations have higher base penalties. KDHE also applies a multiplier appropriate to the situation: one instance, or weeks, months, or years in violation. A history of compliance is noted for each facility, and the degree of cooperation in returning to a state of compliance is likewise evaluated. For the KDHE files reviewed in 2015, thirteen out of fourteen penalty calculation worksheets included documentation of the consideration of economic benefit.
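The worksheet arithmetic described above can be sketched as a base penalty scaled for duration, history, and cooperation. The figures, multipliers, and function below are hypothetical illustrations and do not come from the KDHE Air Program Enforcement Policy or from any reviewed file.

```python
def worksheet_penalty(base: float, duration_multiplier: float,
                      history_factor: float = 1.0,
                      cooperation_factor: float = 1.0) -> float:
    """Illustrative sketch of the worksheet arithmetic described above.

    base: hypothetical base penalty from a severity matrix
    duration_multiplier: e.g., 1 for a single instance, larger for weeks,
        months, or years in violation
    history_factor / cooperation_factor: assumed adjustments for compliance
        history and for cooperation in returning to compliance
    """
    return base * duration_multiplier * history_factor * cooperation_factor


# Hypothetical numbers only: a mid-range base penalty, months in violation,
# a prior-violation history, and full cooperation.
print(worksheet_penalty(base=2_500.00, duration_multiplier=3.0,
                        history_factor=1.2, cooperation_factor=0.9))  # 8100.0
```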

Relevant metrics

11a Penalty calculations include gravity and economic benefit: Natl Goal 100%; State N 13; State D 14; State % 92.9%

State response

Recommendation

State Review Framework Report | Kansas | Page 14


-------
CAA Element 5 — Penalties

Finding 5-2	Meets or Exceeds Expectations

Summary	KDHE consistently documents the rationale for reducing the initial penalty. KDHE has a strong performance record for penalty collection.

Explanation	KDHE consistently and adequately documents the rationale for reducing an initial penalty; 10 of the 11 files reviewed included the appropriate documentation. KDHE has a strong performance record for penalty collection; 12 of the 13 files reviewed demonstrated penalties were collected.

Relevant metrics

12a Documentation on difference between initial and final penalty: Natl Goal 100%; State N 10; State D 11; State % 90.9%
12b Penalties collected: Natl Goal 100%; State N 12; State D 13; State % 92.3%

State response

Recommendation

State Review Framework Report | Kansas | Page 15


-------
Appendix

Supplemental Finding Summary: EPA experienced several issues with the KDHE electronic
file review system during the SRF review

Explanation: As a means of assessing the access, capabilities and potential public user
experience of the KDHE's online electronic file system, EPA elected to conduct the file review
remotely by accessing the KDHE's file system off-site. Due to software incompatibilities and
limitations, EPA found it difficult for off-site users to access and navigate in the system. A
portion of the file review was therefore conducted off-site, and a portion was conducted on-site.
EPA encountered several issues with the electronic filing system, as follows:

1.	KDHE files are organized chronologically, resulting in the occasional inability to
follow the status and/or resolution of individual issues. Overall, EPA had a measure
of difficulty following threads of information when all site-related issues were
clustered together.

2.	The electronic file system is cumbersome and difficult for users outside of KDHE to
navigate. EPA encountered software incompatibilities, while attempting to review the
files off site. Discussions with the KDHE district office revealed similar issues.
Substantial amounts of time were required for the SRF reviewers to navigate the
documents using the Webnow software outside the agency.

3.	EPA had difficulties searching the electronic files for specific documents.

4.	EPA encountered misfiled sets of documents (i.e., the files for a facility were filed in the wrong facility file).

5.	EPA is concerned about accessibility of the KDHE compliance and enforcement files
to the general public, as well as other agencies (EPA included).

To address these issues, EPA recommends that KDHE develop a Standard Operating Procedure (SOP) or guidance for outside users detailing how the search function works, and that KDHE pursue updating the Webnow software.

State Response: The SRF report contained an appendix which states several issues with the
BOA electronic file review system. BOA believes the two main reasons for difficulty in using
the system had to do with the EPA computers not being fully compatible with our software and
the lack of user familiarity with the software. BOA receives numerous Kansas Open Records Act
(KORA) requests every year and we have not been made aware of any problems accessing the
requested files. After receiving these complaints, BOA invited the Region 7 SRF team to our
office in order to use our computers and to receive some basic instruction in use of the software.
We were told that the review went much faster at that point. It is not uncommon for an SRF team

State Review Framework Report | Kansas | Page 16


-------
to visit the state office in order to do their review, especially if paper files are still used.
Therefore, BOA does not think this complaint warrants mention in this audit since our system,
while it may have inconvenienced the SRF team by forcing them to travel to our office, did not
prevent them from actually seeing the files they requested.

State Review Framework Report | Kansas | Page 17


-------
KDHE Bureau of Air Response Letter

Bureau of Air

Curtis State Office Building
1000 SW Jackson, Suite 310
Topeka, KS 66612-1366

Phone: 785-296-0243
Fax: 785-296-7455
JAhumada@kdheks.gov
www.kdheks.gov/bar

Susan Mosier, MD, Secretary

Department of Health & Environment

Sam Brownback, Governor

November 17, 2015

Becky Weber

Air and Waste Management Division
U.S. EPA, Region 7
11201 Renner Blvd.

Lenexa, KS 66219

Dear Ms. Weber:

On November 16, 2015, the Kansas Department of Health and Environment (KDHE) received the draft report of
the State Review Framework (SRF) of the KDHE Bureau of Air (BOA) Compliance and Enforcement Program conducted
by EPA Region 7 staff. This SRF audited federal fiscal year 2014 activities. BOA would like to comment on portions of
the draft report.

The SRF report identified one area for improvement relating to the lack of timely data entry of performance test
results. EPA metrics in this category specify that performance test results should be entered into the CAA database within
120 days of the end of the performance test. Currently, this information is entered into ICIS-Air but in FFY2014 the database
was AFS. BOA strives to enter test data into the CAA database as quickly as possible but asks consideration of the fact that
120 days is actually reduced to 60 days when a federal regulation, such as a MACT or NESHAP, allows the facility to
submit the final test report no later than 60 days after the end of testing. During FFY2014, Kansas had a very large number
of reciprocating internal combustion engine performance tests conducted due to implementation of the new RICE MACT,
40 CFR Part 63, Subpart ZZZZ, regulation which happened to coincide with the oil exploration boom. Not only were the
total number of performance tests conducted in that year between two to three times higher than average due to this new
rule, but this regulation is one which allows 60 days for final report submittal. Not only was our program asked by EPA to
absorb a huge increase in work, but we were then told it had to be completed within 60 days of our receipt. In addition, due
to budgetary constraints at the time, there was only a single staff member tasked with running the entire performance testing
program for the entire state. To further complicate this, in FFY2012, BOA did batch uploads into AFS from our internal
database once a month. Therefore, performance tests that were reviewed the day after the upload would not be reflected in
AFS for another month. BOA would like to note that although the arbitrary 120 day, effectively 60 day, deadline was not
met on most stack test reports in FFY2014, 100% of stack test and RATA reports, including Acid Rain reports which we
review out of courtesy to EPA, were thoroughly reviewed for scientific accuracy and compliance demonstration.

BOA believes this was a "perfect storm" event which has already been alleviated by a number of factors. BOA
preemptively took action to solve this problem prior to it being called to our attention in the Data Metric Analysis, received
in April 2015, by hiring additional staff in late 2014 to help process the increased workload created from this regulation.
Input into AFS was also discontinued when ICIS-Air went live. Staff now inputs performance test results directly into ICIS-
Air when review is complete, which has helped our timeliness. Finally, the number of newly subject engines dropped in
the last federal fiscal year due in part to a decline in oil prices. BOA still contends that a deadline of 120 days after the
stack test date, which is effectively reduced to 60 days after CAA regulation allowances, is not conducive to thorough and
thoughtful review and we question whether other states are simply reporting the stack test data without proper review in
order to meet this deadline.

The SRF report identified one area for state attention: inaccuracies in the CAA database. The report noted BOA's
improvement in this metric since the previous SRF and we intend to continue improving. With the introduction of ICIS-
Air, there was a period of time where staff was learning the system and may have made some initial errors, but overall, the
new system allows our staff the ability to directly enter data into the CAA database, which will further reduce any
discrepancies between our file and the CAA database. The report mentions a possibility that all notices of noncompliance

State Review Framework Report | Kansas | Page 18


-------
(NONs) may have not been uploaded into AFS. BOA is not sure if that is the case, or if it was a statistical anomaly, but
will put additional emphasis on entering NONs into the CAA database going forward.

The SRF report contained an appendix which states several issues with the BOA electronic file review system.
BOA believes the two main reasons for difficulty in using the system had to do with the EPA computers not being fully
compatible with our software and the lack of user familiarity with the software. BOA receives numerous Kansas Open
Records Act (KORA) requests every year and we have not been made aware of any problems accessing the requested files.
After receiving these complaints, BOA invited the Region 7 SRF team to our office in order to use our computers and to
receive some basic instruction in use of the software. We were told that the review went much faster at that point. It is not
uncommon for an SRF team to visit the state office in order to do their review, especially if paper files are still used.
Therefore, BOA does not think this complaint warrants mention in this audit since our system, while it may have
inconvenienced the SRF team by forcing them to travel to our office, did not prevent them from actually seeing the files
they requested.

We would like to thank the Region 7 SRF team for being courteous, patient and considerate of our time in this audit.
If you have any questions concerning these comments, please contact me at JAhumada@kdheks.gov or call (785) 296-0243.

Sincerely,

Javier Ahumada

Chief, Compliance and Enforcement Section
Bureau of Air

c: Rick Brunetti, BOA Director

State Review Framework Report | Kansas | Page 19


-------