STATE REVIEW FRAMEWORK

Oregon

Clean Air Act
Clean Water Act
Resource Conservation and Recovery Act

Implementation in Federal Fiscal Year 2018

U.S. Environmental Protection Agency

Region 10

Final Report
March 6, 2020


-------
I. Introduction

A.	Overview of the State Review Framework

The State Review Framework (SRF) is a key mechanism for EPA oversight, providing a
nationally consistent process for reviewing the performance of state-delegated compliance and
enforcement programs under three core federal statutes: the Clean Air Act, the Clean Water Act,
and the Resource Conservation and Recovery Act. Through the SRF, EPA periodically reviews
these programs using a standardized set of metrics, evaluating their performance against
standards laid out in federal statute, EPA regulations, policy, and guidance. When states do not
meet these standards, EPA works with them to improve performance.

Established in 2004, the review was developed jointly by EPA and the Environmental Council
of the States (ECOS) in response to calls both inside and outside the agency for improved, more
consistent oversight of state-delegated programs. The goals agreed upon at the review's
formation remain relevant and unchanged today:

1.	Ensure delegated and EPA-run programs meet federal policy and baseline performance
standards

2.	Promote fair and consistent enforcement necessary to protect human health and the
environment

3.	Promote equitable treatment and a level interstate playing field for business

4.	Provide transparency with publicly available data and reports

B.	The Review Process

The review is conducted on a rolling five-year cycle, such that all programs are reviewed
approximately once every five years. The EPA evaluates programs over a one-year period of
performance, typically the year prior to the review, using a standard set of metrics to make
findings in the five areas (elements) around which the report is organized: data,
inspections, violations, enforcement, and penalties. Where program performance is found to
deviate significantly from federal policy or standards, the EPA issues recommendations for
corrective action, which it monitors until they are completed and program performance
improves.

The SRF is currently in its fourth round of reviews (FY2018-2022), preceded by Round 3
(FY2012-2017), Round 2 (FY2008-2011), and Round 1 (FY2004-2007). Additional information
and final reports can be found on the EPA website under State Review Framework.

II. Navigating the Report

The final report contains the results and relevant information from the review including EPA and
program contact information, metric values, performance findings and explanations, program
responses, and EPA recommendations for corrective action where any significant deficiencies in
performance were found.


-------
A. Metrics

There are two general types of metrics used to assess program performance. The first are data
metrics, which reflect verified inspection and enforcement data from the national data system
for each medium, or statute. The second, and generally more significant, are file metrics, which
are derived from the review of individual facility files to determine whether the program is
performing its compliance and enforcement responsibilities adequately.

In addition to the metrics, the EPA considers other information in making performance
findings, including results from previous SRF reviews, data metrics from the years in between
reviews, and multi-year metric trends.
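The metric tables throughout this report follow a single pattern: a State N (numerator) is counted against a State D (denominator), expressed as a State % value, and compared to a national goal. As an illustrative sketch only, using example values taken from the CAA data tables in this report (the function names are hypothetical, not EPA tooling):

```python
def metric_percent(n: int, d: int) -> float:
    """State % as used in SRF metric tables: N divided by D, as a percentage."""
    if d == 0:
        return 0.0  # empty universe; the tables report these metrics as 0
    return round(100.0 * n / d, 1)

def meets_goal(n: int, d: int, goal: float) -> bool:
    """True when the computed State % meets or exceeds the national goal."""
    return metric_percent(n, d) >= goal

# CAA metric 3b1 (timely compliance monitoring MDRs): 143 of 151 reported on time
print(metric_percent(143, 151))      # 94.7
print(meets_goal(143, 151, 100.0))   # False: the national goal is 100%
```

Note that most SRF metrics carry a national goal of 100%, so any shortfall in the numerator fails the goal; the national average columns provide context rather than a pass/fail threshold.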

B.	Performance Findings

The EPA makes findings on performance in five program areas:

•	Data - completeness, accuracy, and timeliness of data entry into national data systems

•	Inspections - meeting inspection and coverage commitments, inspection report quality,
and report timeliness

•	Violations - identification of violations, accuracy of compliance determinations, and
determination of significant noncompliance (SNC) or high priority violators (HPV)

•	Enforcement - timeliness and appropriateness of enforcement, returning facilities to
compliance

•	Penalties - calculation including gravity and economic benefit components, assessment,
and collection

Though performance generally varies across a spectrum, for the purposes of conducting a
standardized review, SRF categorizes performance into three findings levels:

Meets or Exceeds: No issues are found. Base standards of performance are met or exceeded.

Area for Attention: Minor issues are found. One or more metrics indicates performance
issues related to quality, process, or policy. The implementing agency is considered able to
correct the issue without additional EPA oversight.

Area for Improvement: Significant issues are found. One or more metrics indicates routine
and/or widespread performance issues related to quality, process, or policy. A
recommendation for corrective action is issued which contains specific actions and schedule
for completion. The EPA monitors implementation until completion.

C.	Recommendations for Corrective Action

Whenever the EPA makes a performance finding of Area for Improvement, it will include a
recommendation for corrective action ("recommendation") in the report. The purpose of
recommendations is to address significant performance issues and bring program performance
back in line with federal policy and standards. All recommendations should include


-------
specific actions and a schedule for completion, and their implementation is monitored by the
EPA until completion.

III. Review Process Information

Kickoff meeting held: March 13, 2019

Data Metric Analysis and file selections sent to DEQ: April 11 and May 8, 2019

File reviews completed: November 8, 2019

Draft report sent to DEQ: December 12, 2019

Comments from DEQ received by EPA: February 10, 2020

Report Finalized: March 6, 2020

DEQ and EPA key contacts:

Becka Puskas, J.D., DEQ Office of Compliance and Enforcement
Scott Wilder, EPA SRF Coordinator

Clean Air Act (CAA)

Jaclyn Palermo, DEQ

Elizabeth Walters, EPA CAA file reviewer

John Pavitt, EPA CAA file reviewer

Clean Water Act (CWA)

Martina Frey, DEQ

Rob Grandinetti, EPA CWA file reviewer

Resource Conservation and Recovery Act (RCRA)

Jeannette Acomb, DEQ

Cheryl Williams, EPA file reviewer


-------
Executive Summary

Introduction

Clean Air Act (CAA)

Areas of Strong Performance

The following are aspects of the program that, according to the review, are being implemented at
a high level:

Clean Air Act (CAA)

Formal enforcement responses consistently included the required corrective action to return the
facility to compliance.

Penalty calculations consistently documented gravity and economic benefit.

Rationales for differences between initial penalty and final penalty were always documented.

Penalties were consistently collected and documented.

Priority Issues to Address

The following are aspects of the program that, according to the review, are not meeting federal
standards and should be prioritized for management attention:

Clean Air Act (CAA)

ICIS-Air data are inaccurate and do not reliably match MDRs in file documentation.

FCE reports occasionally lacked enough information to determine compliance.

The State frequently misidentified violations as non-HPV when they met the criteria in the HPV
Policy.

High Priority Violations (HPVs) were mostly not addressed in a timely manner, or alternatively
did not have a Case Development Resolution Timeline (CDRT) in place, in accordance with the
HPV policy.


-------
Clean Water Act (CWA)

Areas of Strong Performance

The following are aspects of the program that, according to the review, are being implemented at
a high level:

Clean Water Act (CWA)

Oregon performed at a high level on the metrics for inspection report completeness, sufficiency
of documentation to determine facility compliance, and accuracy of compliance determinations.

Priority Issues to Address

The following are aspects of the program that, according to the review, are not meeting federal
standards and should be prioritized for management attention:

Clean Water Act (CWA)

Oregon's data issues continue to be an area of concern during the SRF review period.

Oregon relies on several municipalities and districts as its agents to conduct inspections of
sources regulated by some stormwater and other general permits. However, ODEQ does not
routinely collect data from these agents regarding inspections that are planned, inspections that
have been conducted, and violations found during these inspections.

EPA recognizes that Oregon has provided an MOA update for one of its agents, the Oregon
Department of Geology and Mineral Industries. By June 30, 2021, Oregon shall provide a plan
and timeline for submitting the remaining MOAs to EPA for review and comment.

Resource Conservation and Recovery Act (RCRA)

Areas of Strong Performance

The following are aspects of the program that, according to the review, are being implemented at
a high level:

Resource Conservation and Recovery Act (RCRA)


-------
Appropriate SNC determination.

Timely and appropriate enforcement.

Economic benefit included in all penalty calculations/justifications.

Priority Issues to Address

The following are aspects of the program that, according to the review, are not meeting federal
standards and should be prioritized for management attention:

Resource Conservation and Recovery Act (RCRA)

Missing Data Elements

Inspection Report Accuracy/Completeness


-------
Clean Air Act Findings

CAA Element 1 - Data

Finding 1-1

Area for Improvement

Summary:

ICIS-Air data are inaccurate and do not reliably match MDRs in file documentation.

Explanation:

Facility identifiers such as programmatic ID, address, zip code, type of ownership, and NAICS
code were consistently inaccurate or missing in FCE reports. Stack tests and stack test results
were generally not reported in a timely manner; only 60% were reported on time. Stack tests
were also frequently inaccurate in ICIS-Air and did not match file documentation. Most stack
tests were submitted to ICIS-Air with incorrect dates, pollutants measured, and test results. For
example, Boise Cascade Medford performed a stack test that was entered into ICIS-Air with a
"PASS" for CO emissions, but file documentation showed that CO measurements were 2 times
over the limit and that the results were actually a "FAIL." After preparing the draft report, EPA
Region 10 learned from ODEQ that these examples were tests conducted for the purpose of
verifying emission factors and were not for the purpose of making a compliance determination.
State and local agencies are only required to report stack tests and their results for tests
performed for the purpose of making a compliance determination. Realizing that some agencies
reported all stack tests performed, EPA gave agencies the ability to report the purpose (e.g.,
RATA, or "Other"). EPA does not require states to enter results for such tests conducted for
other purposes.

A few stack tests were submitted to ICIS-Air twice, such as the 7/26/2016 stack test at Collins
Products, which also had the wrong pollutants listed. The 11/1/2017 Collins Pine Company,
10/16/2018 Interfor, and 9/29/2017 and 11/16/2017 Stimson Lumber stack tests were not
submitted to ICIS-Air. While air programs and subparts were generally accurate in ICIS-Air for
the facilities, the applicable pollutants section often included "Pollutant X" and "FACIL." Out of
the 26 facilities/files reviewed, only 1 facility had the CMS source category and frequency
entered in ICIS-Air. Title V annual compliance certifications (ACC) were sometimes incorrectly
coded into ICIS-Air as "Facility Reported No Deviations" when file documentation included
deviations submitted by the facility, as was the case for Portland General Electric. JeldWen Bend
also had a Title V ACC that was not submitted to ICIS-Air. The agency also could not produce
the FCE report for Freres Lumber Co. Inc. While the agency did report 94.7% of compliance
monitoring MDRs in a timely manner, it reported only 0% of HPV determinations (0 of 2) and
10.5% of enforcement MDRs in a timely manner into ICIS-Air.

Relevant metrics:


-------
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
2b Files reviewed where data are accurately reflected in the national data system [GOAL] | 100% | -- | 0 | 26 | 0%
3a2 Timely reporting of HPV determinations [GOAL] | 100% | 44.9% | 0 | 2 | 0%
3b1 Timely reporting of compliance monitoring MDRs [GOAL] | 100% | 85.2% | 143 | 151 | 94.7%
3b2 Timely reporting of stack test dates and results [GOAL] | 100% | 65.1% | 30 | 50 | 60%
3b3 Timely reporting of enforcement MDRs [GOAL] | 100% | 71.8% | 2 | 19 | 10.5%

State Response:

In December 2019, DEQ identified a training need for source test data entry into ACES and
trained source test coordinators. In addition, as of 12/31/2019, DEQ entered missing source/stack
test information into ACES. Monthly updates are now sent to all source test coordinators as a
QA/QC step to help ensure the accuracy of data entry. DEQ is completing a thorough review of
FY19 source/stack tests to compare with the data entry system and will update EPA with
findings by 7/31/2020.

In some cases, the facility name is different in ECHO than in ICIS-Air. DEQ's database (ACES)
and ICIS-Air are correct, however ECHO reflects a different source name. DEQ is working with
EPA to resolve this issue.

In at least two instances (Boise Cascade Medford and Collins Pine Company), source test
information reviewed by EPA was for emissions factors, which are not permit limits, and thus
should not be considered a FAIL of a source test. DEQ will update its source test review memo
template to more clearly differentiate between emission factors and emission limits.

Currently, DEQ provides specific pollutant information, not a simple "pass" or "fail" for
source/stack tests. DEQ is reviewing source test data entry to meet EPA's MDR. DEQ is also
working to address the ACC findings. DEQ reviewed its FY18 CMS data and found that the
DEQ ACES system transmitted the inspection information into ICIS-Air, which confirmed its
acceptance. DEQ understands that EPA is working to address issues translating the information
from ICIS-Air to ECHO.

The DEQ's upcoming implementation of the Environmental Database Management System
(EDMS) will help to resolve certain issues with source reporting. Sources will be able to upload
their annual reports directly into the database, helping to streamline the reporting process for


-------
DEQ staff. Internal testing of these modules is currently estimated to occur in April 2021, and
the final release of EDMS for air quality permitting is projected for the end of 2nd QTR 2021. As
part of EDMS development and testing, DEQ will work with the vendor to ensure that MDR
information can be translated accurately from EDMS into ICIS-Air.

Recommendation:

#: 1
Due Date: 07/31/2020
Recommendation: The State has informed EPA that it does not enter data directly into ICIS-Air,
but into a separate database that is linked to ICIS-Air. The State currently has a program in
place to review the test reports and identify omissions and inaccurate data; results are entered
into a database system that is linked to ICIS-Air. At this time, EPA does not know if the stack
testing issues are related to incorrect data entry into the State's database system or a translation
issue between the State's database system and ICIS-Air. The State needs to determine if these
stack testing issues are occurring at data entry, at data transfer, or both. By May 29, 2020, the
State will conduct a thorough review of the FY19 stack tests and a data quality check in its
database system. By July 31, 2020, the State will provide an update to EPA on the results, as
well as an update on the translation issues between its database system and ICIS-Air. EPA also
expects the State to develop a plan and timelines for resolving the stack testing issues and for
ensuring that the new database system (EDMS) will translate accurately into ICIS-Air.

CAA Element 2 - Inspections

Finding 2-1

Area for Attention

Summary:

Inspection reports occasionally lacked required documentation of FCE elements.

Explanation:

The agency conducted FCEs at 91.8% of the major and mega-sites, and 83.3% of the SM-80s
located within the state. However, inspection reports occasionally lacked the required
documentation of FCE elements. Inspectors were generally thorough in their review of all
required reports, such as Title V self-certifications and excess emissions reports, and included a
summary of what was reviewed in their reports. A majority of FCE reports documented the required reports and records

-------
reviewed. However, a few FCE reports did not include an assessment of, or important facts from,
the underlying records. Boise Cascade Medford, for example, had a history of stack tests that
exceeded CO limits, yet the inspector did not discuss the stack tests in the report. A few FCE
reports were lacking in detail. While inspectors listed what was reviewed on-site (i.e., facility
records and operating records), the reports did not include an assessment of those records. The
Owens Brockway report, for example, did not include assessments of process parameters or
equipment performance and did not state whether other facility records were reviewed.

Relevant metrics:











Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
5a FCE coverage: majors and mega-sites [GOAL] | 100% | 88.1% | 45 | 49 | 91.8%
5b FCE coverage: SM-80s [GOAL] | 100% | 93.7% | 5 | 6 | 83.3%
5c FCE coverage: minors and synthetic minors (non-SM 80s) that are part of CMS plan or alternative CMS Plan [GOAL] | 100% | 70.1% | 0 | 0 | 0
5e Reviews of Title V annual compliance certifications completed [GOAL] | 100% | 82.5% | 97 | 106 | 91.5%
6a Documentation of FCE elements [GOAL] | 100% | -- | 16 | 22 | 72.7%

State Response:

DEQ staff will be trained on the expectation of complete FCE reports, thorough review of
supporting documents along with each report, and documentation of record assessment in
reports. As discussed in Finding 2-2 below, DEQ recently convened an Air Quality Lead
Inspectors Group. One task to be completed by this group includes updating inspection templates
to help ensure that FCE information is complete. This task will be conducted with input from
DEQ management, staff, and EPA.

CAA Element 2 - Inspections

Finding 2-2

Area for Improvement


-------
Summary:

FCE reports occasionally lacked enough information to determine compliance.

Explanation:

FCE reports generally documented the general and facility information, but there was a recurring
lack of the information necessary to determine compliance of the facility. Many FCE reports
were not thorough and did not include the federal requirements, an inventory and description of
regulated emission units, and on-site observations. A few FCE reports also listed only the permit
condition number, without an explanation of what the permit condition or regulatory requirement
was. Inspectors also sometimes stated that a facility was following permit conditions or
regulatory requirements without documenting their observations or rationale for that
determination. The FCE report for Evraz Inc. likewise listed "N/A" for many permit conditions
and did not explain what "N/A" meant or why the requirement was not applicable. FCE reports
generally lacked the inspector's on-site observations of the facility during the compliance
evaluation and did not always include what was relayed to the facility. For example, the inspector
for the EP Minerals FCE did not document the observations or findings that were discussed with
the facility and did not include federal regulatory requirements in the report.

Relevant metrics:











Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
6b Compliance monitoring reports (CMRs) or facility files reviewed that provide sufficient documentation to determine compliance of the facility [GOAL] | 100% | -- | 14 | 22 | 63.6%

State Response:

In an effort to improve inspections, templates, and procedures, DEQ recently formed an AQ
Lead Inspector Group. One of the first tasks for this group is to improve DEQ's air quality
inspection templates, in coordination with EPA Region 10. These templates will be used agency-
wide for FCE reports. Going forward, DEQ will use them to help ensure that inspection reports
are consistent and include on-site observations, findings that were discussed with the facility,
and other items of importance as determined by EPA. DEQ looks forward to working with EPA
to improve these inspection templates, and that work will begin in March 2020. DEQ air quality
staff will be provided with the first iteration of the new inspection templates, presented by the
AQ Lead Inspector Group with assistance from EPA Region 10, at the April 22-23, 2020 Permit
Writer and Inspector Forum. During this event, training will take place on documentation and
expectations for FCEs. Improvement of air inspections has been identified as an Air Division
priority and will undergo continuous improvement. As part of DEQ's process improvement
efforts, this template will continue to be developed with input from AQ inspectors and will be
continually improved and adjusted as needed.


-------
Recommendation:

Due Date: 06/26/2020
Recommendation: By June 26, 2020, the State will create SOPs and provide training to
inspectors on how to sufficiently document applicable requirements, observations, and
information in FCE reports so that a case developer or attorney can determine compliance. The
State will also provide the SOPs and training documentation to EPA.

CAA Element 3 - Violations

Finding 3-1

Area for Attention

Summary:

Compliance determinations were generally accurate in cases where there was enough
documentation.

Explanation:

The agency's compliance determinations were generally accurate in cases where there was enough
documentation in the FCE report and other information in the source file. Compliance
determinations were consistently reported to and accurate in ICIS-Air.

Relevant metrics:

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
7a Accurate compliance determinations [GOAL] | 100% | -- | 20 | 25 | 80%

State Response:

DEQ expects that compliance determinations will be further improved by updating inspection
templates and conducting training on FCEs, as described above in Finding 2-2.

CAA Element 3 - Violations


-------
Finding 3-2

Area for Improvement

Summary:

The State frequently misidentified violations as non-HPV when they met the criteria in the HPV
Policy.

Explanation:

High Priority Violations (HPVs) were frequently misidentified as non-HPV and were not
reported to the EPA in accordance with the HPV policy. Reviewers discovered many violations
from stack tests, inspection reports, Title V ACCs, and informal and formal enforcement actions
that were never identified and addressed as HPVs. Several Federally Reportable Violations
(FRVs) were also inaccurately determined by the agency to be non-HPV: of the 11 files that
contained FRVs during the review period, 4 were inaccurately determined as non-HPV.

Relevant metrics:

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
13 Timeliness of HPV Identification [GOAL] | 100% | 89.5% | 0 | 0 | 0
8c Accuracy of HPV determinations [GOAL] | 100% | -- | 7 | 11 | 63.64%

State Response:

Training for all DEQ AQ inspectors will begin at the Permit Writer and Inspector Forum on
April 22 - 23, 2020. It will include presentations by a representative from EPA Region 10 on
HPVs/FRVs, the discovery process for these violations, and tracking and reporting expectations,
including data entry. DEQ intends to work with EPA Region 10 throughout this process to
ensure HPV training is effective, and templates will be continually improved with input from
EPA and DEQ air quality staff.

Updated templates and forms used for tracking and reporting FRVs and HPVs will be captured
in the new database system (EDMS).

Recommendation:


-------
Due Date: 05/01/2020
Recommendation: Reviewers found a PDF form used at several of the State offices to document
the discovery of FRVs and HPVs. The form included the criteria for HPV as outlined in the
policy. By May 1, 2020, the State will standardize the HPV discovery process and the use of this
form (or a similar form) across all offices to identify and document HPVs.

CAA Element 4 - Enforcement

Finding 4-1

Meets or Exceeds Expectations

Summary:

Formal enforcement responses consistently included the required corrective action to return the
facility to compliance.

Explanation:

All 7 files that contained formal enforcement responses during the review period included the
required corrective action that returned or will return the facility to compliance. Documentation
of the facility's timeline to return to compliance, or of its already having returned to compliance,
was consistently included in the file.

Relevant metrics:

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
9a Formal enforcement responses that include required corrective action that will return the facility to compliance in a specified time frame or the facility fixed the problem without a compliance schedule [GOAL] | 100% | -- | -- | -- | 100%

State Response:

No state response.


-------
CAA Element 4 - Enforcement

Finding 4-2

Area for Improvement

Summary:

High Priority Violations (HPVs) were mostly not addressed in a timely manner, or alternatively
did not have a Case Development Resolution Timeline (CDRT) in place, in accordance with the
HPV policy.

Explanation:

The State did not follow the High Priority Violation (HPV) policy and frequently did not address
HPVs in a timely manner. 8 files contained HPVs during the review period; of these, the
Georgia-Pacific Consumer Operations, Interfor U.S., Boise Cascade Wood Products, and Collins
Pine Company HPVs were not addressed within 180 days or did not have a Case Development
Resolution Timeline (CDRT) in place within 225 days.
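The timeliness test applied in this finding reduces to simple date arithmetic: an HPV is timely if it is addressed within 180 days of discovery, or if a CDRT is in place within 225 days. A minimal, hypothetical sketch of those two timeframes (illustrative only; not DEQ or EPA tooling, and the function name is an assumption):

```python
from datetime import date, timedelta
from typing import Optional

def hpv_timely(day_zero: date,
               addressed_on: Optional[date],
               cdrt_on: Optional[date]) -> bool:
    """Timely per the timeframes cited in this finding: the HPV is addressed
    within 180 days of day zero, or a CDRT is in place within 225 days."""
    if addressed_on is not None and addressed_on <= day_zero + timedelta(days=180):
        return True
    if cdrt_on is not None and cdrt_on <= day_zero + timedelta(days=225):
        return True
    return False

# Addressed 200 days after day zero with no CDRT in place: fails both tests
print(hpv_timely(date(2018, 1, 1), date(2018, 1, 1) + timedelta(days=200), None))  # False
```

Under this rule, a file addressed after 180 days can still be counted timely if a written CDRT was put in place within 225 days, which is why the finding treats the two conditions as alternatives.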

Relevant metrics:

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
10a Timeliness of addressing HPVs or alternatively having a case development and resolution timeline in place | 100% | -- | -- | -- | 50%

State Response:

DEQ agrees that the timely identification of and response to High Priority Violations (HPVs) is a
priority, and appreciates the ongoing dialogue with EPA to improve DEQ's processes and
procedures in this area.

DEQ is requesting an extension until the end of April 2020 to begin full-staff training on HPVs.
DEQ will hold its twice-annual Permit Writer and Inspector Forum on April 22 - 23, 2020. In
collaboration with DEQ, a representative from EPA Region 10 will conduct training on the HPV
policy during the April forum. DEQ recognizes that there may be a need for additional HPV
training beyond the April forum and will keep EPA apprised of any additional needs.

Based on prior discussions with EPA, DEQ had previously understood that the HPV list, along
with the quarterly HPV calls held with EPA, was the mechanism for tracking HPVs that take
more than 180 days to address. Based on this SRF review and recent conversations with EPA,
DEQ understands that a written Case Development Resolution Timeline (CDRT) is needed in the
file for cases that will take more than 180 days to address, even if the CDRT is not submitted to
EPA. DEQ is considering the best way to include this information in the file, including adding


-------
the CDRT to the updated HPV form referenced in Recommendation 3-2 and tracking this
information in DEQ's new Environmental Data Management System (EDMS).

Recommendation:

#: 1
Due Date: 04/30/2020
Recommendation: By April 30, 2020, the State will develop, plan, and provide training on the
EPA High Priority Violation policy to inspectors, case developers, attorneys, and permit writers.
EPA can also assist with the training. The State will also provide confirmation of the completion
of HPV training to EPA Region 10.

CAA Element 4 - Enforcement

Finding 4-3

Area for Attention

Summary:

High Priority Violations (HPV) were occasionally not addressed with the appropriate enforcement
response in accordance with the HPV policy.

Explanation:

High Priority Violations (HPVs) were generally removed or addressed in accordance with the
EPA HPV policy. 7 files contained HPVs identified by the State that were no longer in the
process of being addressed. 5 of the 7 files contained HPVs that were appropriately addressed or
removed by the State.

Relevant metrics:











Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
10b Percent of HPVs that have been addressed or removed consistent with the HPV Policy [GOAL] | 100% | -- | 5 | 7 | 71.4%

State Response:



-------
DEQ is working to improve its identification and timely response to HPVs, as described in
Findings 3-2 and 4-2, above.

CAA Element 5 - Penalties

Finding 5-1

Meets or Exceeds Expectations

Summary:

Penalty calculations consistently documented gravity and economic benefit.

Explanation:

The State consistently documented how gravity and economic benefit values were assessed in
the penalty. Penalty calculations always included a gravity component and the calculation of
economic benefit. This documentation was present in all files reviewed where a penalty was
assessed.

Relevant metrics:

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
11a Penalty calculations reviewed that document gravity and economic benefit [GOAL] | 100% | -- | -- | -- | 100%

State Response:

No state response.

CAA Element 5 - Penalties

Finding 5-2

Meets or Exceeds Expectations

Summary:

Rationales for differences between initial penalty and final penalty were always documented.

Explanation:


-------
The State consistently documented the rationale for differences between initial penalty
calculations and final penalty calculations. Penalties were assessed in 7 files during the review
period, and a memo or other documentation of penalty differences was included.

Relevant metrics:

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
12a Documentation of rationale for difference between initial penalty calculation and final penalty [GOAL] | 100% | -- | 6 | 6 | 100%

State Response:

No state response.

CAA Element 5 - Penalties

Finding 5-3

Meets or Exceeds Expectations

Summary:

Penalties were consistently collected and documented.

Explanation:

Photocopies of checks or other correspondence documenting check transmittal were consistently
included in the files where a penalty was collected. The State also documented the costs,
timelines, and progress of Supplemental Environmental Projects (SEPs) when one was included
in the penalty.

Relevant metrics:

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
12b Penalties collected [GOAL] | 100% | -- | 7 | 7 | 100%

State Response:


-------
No state response.

Clean Water Act Findings

CWA Element 1 - Data

Finding 1-1

Area for Improvement

Summary:

Not all data in ICIS reflects the files that were reviewed.

Explanation:

Many inspection reports had the wrong dates in ICIS, and a couple of enforcement action dates
were inconsistent with the dates on the enforcement actions in the files. Some facilities had
multiple inspections entered into ICIS, and the number in ICIS did not match the inspection
reports in the files. The Rainier STP (OR0020389) enforcement action dates in ICIS did not
match the date in the file. The Tillamook County Creamery (OR0000141) did not have all of its
enforcement actions entered into ICIS.

Relevant metrics:

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
2b Files reviewed where data are accurately reflected in the national data system [GOAL] | 100% | | 11 | 30 | 36.7%

State Response:

DEQ recently undertook a process improvement addressing how compliance and enforcement
information is generated and tracked. New SOPs are drafted that dictate how enforcements will
be entered into DEQ's compliance and enforcement database, ACES, to reflect the date on the
letter. These SOPs will be provided to EPA as part of DEQ's plan for addressing data entry
issues identified in the Recommendation below.

Regarding inspections, DEQ will update its data flow specifications to correctly characterize
inspection dates. Specifically, DEQ will populate the ICIS Actual End Date with the Inspection
Actual Date in ACES. Duplicate inspection records will be addressed through inspection staff
training on ACES data entry.

Recommendation:

Rec # | Due Date | Recommendation
1 | 05/29/2020 | By May 29, 2020, the State will provide a plan to EPA on creating standard operating procedures (SOPs) that will dictate which dates will be entered into ICIS, a process to enter the correct inspection dates, and a process to avoid creating multiple inspection reports in ICIS.

CWA Element 1 - Data

Finding 1-2

Area for Improvement

Summary:

The Discharge Monitoring Report (DMR) data entry rate for major and non-major facilities is
30%, while the national goal is greater than 95%.

Explanation:

Oregon is not flowing all DMR data to ICIS from its database. This is a known issue that EPA
and Oregon have been working on for some time.

Relevant metrics:

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
1b5 Completeness of data entry on major and non-major permit limits [GOAL] | 95% | 90.6% | 327 | 327 | 100%
1b6 Completeness of data entry on major and non-major discharge monitoring reports [GOAL] | 95% | 93.3% | 1778 | 6058 | 29.3%









State Response:

DEQ is in the process of configuring an Environmental Data Management System (EDMS) that
will be deployed incrementally over the next two years. At this time, 85% of individual NPDES
permit holders are reporting in NetDMR (State N = 5,215, State D = 6,155). DEQ expects to
enroll the remaining individual NPDES facilities in the next few months. The remaining NPDES
permit registrants (general permits and agent-administered permits) will begin electronic
reporting when their program is deployed in EDMS. DEQ expects to develop and fully deploy
EDMS for these water quality permits by mid-2021.

Recommendation:

Rec # | Due Date | Recommendation
1 | 09/30/2020 | By September 30, 2020, Oregon shall provide a plan to get a system in place to be able to flow data to ICIS. The plan shall have a time frame and proposed solution.

CWA Element 2 - Inspections

Finding 2-1

Meets or Exceeds Expectations

Summary:

Oregon performed at a high level on the metrics for inspection report completeness, sufficiency
to determine facility compliance, and accuracy of compliance determinations.

Explanation:

All of the inspection reports reviewed during the file review were of sufficient quality to
determine the compliance of the facility, were complete, and made accurate compliance
determinations.

Relevant metrics:


Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
6a Inspection reports complete and sufficient to determine compliance at the facility [GOAL] | 100% | | 17 | 17 | 100%
6b Timeliness of inspection report completion [GOAL] | 100% | | 16 | 17 | 94.1%

State Response:

No state response.

CWA Element 2 - Inspections

Finding 2-2

Area for Improvement

Summary:

Oregon should be gathering inspection reports from, and providing oversight of, the agents that
perform inspections on its behalf.

Explanation:

Oregon relies on several municipalities and districts as its agents to conduct inspections of
sources regulated by some stormwater and other general permits. However, ODEQ does not
routinely report data from these agents regarding inspections that are planned, inspections that
have been conducted, and violations found during these inspections (except when they are
referred to DEQ's Office of Compliance and Enforcement for formal enforcement). The agents
that perform these inspections have Memorandums of Agreement (MOAs) that specify that the
agents have authority to implement the permit(s) on DEQ's behalf, which includes reviewing
applications, performing inspections, setting inspection goals, and undertaking informal
enforcement and referring formal enforcement to DEQ. Some of these MOAs need to be
updated. EPA acknowledges that Oregon finalized an MOA with the Oregon Department of
Geology and Mineral Industries (DOGAMI) in December 2019, following EPA review.

Relevant metrics:

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %

State Response:

EPA recently closed out Recommendation 2-3 from the last SRF review regarding Oregon
DEQ's administrative agents. Since March 2017, when DEQ realigned how our wastewater and
stormwater programs are managed, DEQ has done a great deal of work with our agents to ensure
the permits are implemented consistently throughout Oregon. Specifically, DEQ's oversight of
stormwater agents includes hosting training sessions and conducting direct outreach with each of
the stormwater agents regarding permit implementation, record-keeping, and compliance and
enforcement actions. In addition, DEQ joins the agents on inspections periodically and works to
ensure the permits are implemented consistently throughout the state. We have continued these
communication actions throughout 2019 with all agents and hosted an in-person meeting with
those that implement the 1200-Z industrial stormwater general permit in mid-2019 and meetings
with each of the agents that implement the 1200-C construction stormwater general permit in
December 2019 and January 2020.

In 2019, DEQ worked with the Oregon Department of Geology and Mineral Industries
(DOGAMI) to update the MOA for their implementation of the NPDES 1200-A industrial
stormwater and mine dewatering discharge general permit and the WPCF-1000 general permit.
EPA reviewed the MOA, and DEQ addressed EPA's comments prior to finalizing the MOA on
December 2, 2019. The MOA includes all activities and information associated with inspection
plans, inspection outcomes, and compliance and enforcement activities. One specific condition of
oversight is that the MOA requires DOGAMI to send DEQ's Office of Compliance and
Enforcement all informal enforcement letters, including warning letters, to ensure DOGAMI's
consistent implementation of DEQ's enforcement guidance.

Now that work on the DOGAMI MOA is completed, DEQ will begin work on updating the other
Agent Agreements in coordination with DEQ's work to implement EDMS, which will serve as a
common platform by which both DEQ and agents manage permit issuance, compliance, and
enforcement with full transparency and oversight. The EDMS project for water quality general
permits is currently scheduled to be implemented by June 30, 2021. Due to the anticipated
timeline for EDMS, and the fact that EDMS will be a critical part of how DEQ shares
information with and provides oversight for its agent partners, DEQ's plan for updating the
MOAs needs to align with the timing of EDMS implementation. Thus, DEQ's plan for updating
the MOAs will be completed and shared with EPA by June 30, 2021.

Recommendation:

Rec # | Due Date | Recommendation
1 | 06/30/2021 | EPA recognizes that Oregon has provided an MOA update for one of its agents, the Oregon Department of Geology and Mineral Industries. By June 30, 2021, Oregon shall provide a plan and timeline to get the remaining MOAs to EPA for review and comment.


-------
CWA Element 2 - Inspections

Finding 2-3

Area for Improvement

Summary:

Oregon did not meet the 50% inspection coverage criterion for major facilities or the 20%
criterion for minor facilities during the review period.

Explanation:

The Clean Water Act monitoring strategy sets annual inspection percentages that EPA expects
each state to meet: 50% annually for major NPDES permits and 20% annually for minor NPDES
permits. Oregon did not meet these percentages during the review period. Oregon's major
NPDES inspection coverage was 26.7%, and its minor NPDES inspection coverage was 14%.

Relevant metrics:











Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
5a1 Inspection coverage of NPDES majors [GOAL] | 100% | 52.8% | 20 | 75 | 26.7%
5b1 Inspection coverage of NPDES non-majors with individual permits [GOAL] | 100% | 22.6% | 39 | 278 | 14%
5b2 Inspection coverage of NPDES non-majors with general permits [GOAL] | 100% | 5.6% | 120 | 2973 | 4%

State Response:

In 2018, DEQ directed significant resources toward permit issuance process improvements with
the goal of reducing our NPDES individual permit backlog. At that time, 83% of permits were
administratively continued - well behind most other states. DEQ reassigned several compliance
staff to write permits and implemented permit writing process improvements. As a result, the
backlog is now 75% and on track for further reductions. DEQ has also re-directed significant
compliance staff resources for the purpose of transitioning NPDES permittees to electronic DMR
reporting. In 2017, 21% of NPDES individual permittees were eReporting; now 85% of NPDES
individual permittees report electronically in NetDMR.

As part of broad scale permitting process improvement efforts, DEQ successfully communicated
the need for additional resources to the state legislature and stakeholders during the 2019
legislative session. The legislature granted 10 new positions for the Water Quality Permit


-------
Program, three of which are allocated specifically to compliance monitoring and inspections. The
positions will be phased in during 2020. Looking ahead, DEQ expects that these resources will
contribute to improving inspections rates. In addition, DEQ anticipates requesting additional
resources during the 2021 legislative session for compliance and inspection efforts during the
2021-2023 biennium.

In 2019 the DEQ Water Quality Program examined our processes for managing compliance and
enforcement data and identified several improvements that will streamline the way we capture
reportable data. We expect these improvements to free up staff time for additional compliance
monitoring work. Also, when DEQ deploys EDMS, we expect processes associated with
permitting, compliance monitoring, and reporting to be streamlined significantly. Prior to EDMS
deployment, DEQ plans to dedicate substantial staff resources to system development, testing,
registration, and internal and external user training. As a result, DEQ may plan for a short-term
reduction in compliance monitoring before the Agency can realize the multiple benefits of the
more efficient permit and data management system.

With regard to the recommendation below, DEQ requests more time to provide this plan to EPA,
because it involves evaluating priorities across the water quality program, hiring and training
new staff, and planning for additional staff resources in the coming budget cycle. DEQ requests
that the deadline be moved to 12/31/2020.

Recommendation:

Rec # | Due Date | Recommendation
1 | 12/31/2020 | By December 31, 2020, the state should put together a plan that will ensure the Clean Water Act monitoring strategy goals are met on an annual basis. This plan should be submitted to EPA for review.

CWA Element 3 - Violations

Finding 3-1

Meets or Exceeds Expectations

Summary:

The files reviewed and the data metric analysis indicate the state is appropriately determining
violations.

Explanation:

All of the primary metrics for this element were found to be satisfactory in the file review.


-------
Relevant metrics:

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
7e Accuracy of compliance determinations [GOAL] | 100% | | 17 | 17 | 100%

State Response:

No state response.

CWA Element 4 - Enforcement

Finding 4-1

Area for Attention

Summary:

The percentage of major NPDES facilities in SNC with formal enforcement actions initiated is low.

Explanation:

The data metric analysis shows three major NPDES facilities in SNC during the review period.
None of the three had formal enforcement actions initiated. Comparing the frozen data to
production data in ICIS shows that only one of those facilities was actually in SNC status during
the review period. This finding therefore reflects both the failure to initiate formal enforcement
actions against facilities in SNC and data errors from flowing data to ICIS.

Relevant metrics:

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
10a1 Percentage of major NPDES facilities with formal enforcement action taken in a timely manner in response to SNC violations | 100% | 15.4% | 0 | 1 | 0%

State Response:

DEQ recognizes the importance of significant non-compliers and continues to work with EPA
Region 10 to address SNC rates in Oregon. DEQ plans to share SNC ratings on a monthly basis


-------
with DEQ managers and compliance and enforcement staff so that SNC can be considered when
prioritizing work.

CWA Element 4 - Enforcement

Finding 4-2

Meets or Exceeds Expectations

Summary:

Over 90% of the facilities reviewed were returned to compliance, and enforcement responses
addressed the violations in an appropriate manner.

Explanation:

The state was able to return most facilities to compliance through its enforcement actions, which
shows the enforcement actions are achieving the desired result. Similarly, from the files
reviewed, EPA found that the enforcement actions taken were appropriate for the given violations.

Relevant metrics:

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
10b Enforcement responses reviewed that address violations in an appropriate manner [GOAL] | 100% | | 19 | 21 | 90.5%
9a Percentage of enforcement responses that returned, or will return, a source in violation to compliance [GOAL] | 100% | | 19 | 21 | 90.5%

State Response:

No state response.

CWA Element 5 - Penalties

Finding 5-1

Meets or Exceeds Expectations


-------
Summary:

There was proper documentation to justify a difference in the penalty amount from the initial to
the final settled amount, and documentation to show the penalty was paid. There was also
justification of gravity and economic benefit.

Explanation:

Relevant metrics:

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
11a Penalty calculations reviewed that document and include gravity and economic benefit [GOAL] | 100% | | 9 | 9 | 100%
12a Documentation of rationale for difference between initial penalty calculation and final penalty [GOAL] | 100% | | 9 | 9 | 100%
12b Penalties collected [GOAL] | 100% | | 9 | 9 | 100%

State Response:

No state response.


-------
Resource Conservation and Recovery Act Findings

RCRA Element 1 - Data

Finding 1-1

Area for Improvement

Summary:

Missing Data Elements

Explanation:

For all enforcement and penalty actions, the State is not inputting all required data or is putting the
data into the wrong fields. For example, when the State initially sends a compliance order and
penalty assessment (either as an EEO or a formal action) to the facility, it codes this as a final
action 314. This results in missing data such as the date the action becomes final, initial versus
assessed penalty, etc. EPA and the State have already entered into discussions surrounding this
finding. It appears that the State's current database (ACES), which translates into RCRAInfo, is
not able to capture all the required data, or, if it does, does not translate it correctly. The State is
currently building a new Agency-wide data platform called EDMS. The State RCRA program will
begin using this new database in 2020.

Relevant metrics:

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
2b Accurate entry of mandatory data [GOAL] | 100% | | 9 | 30 | 30%

State Response:

EPA and DEQ have already begun discussions on this finding. DEQ acknowledges the state
database, Agency Compliance and Enforcement System (ACES) is currently not capable of
capturing all the required data to translate correctly to RCRAInfo. Since DEQ is building a new
Environmental Data Management System (EDMS) to launch for the hazardous waste program in
2020, DEQ will address these EPA-identified data elements in the new data system. Based on the
current EDMS project timeline, DEQ anticipates that it will be able to meet the milestone for
ensuring proper data flow between the new EDMS system and RCRAInfo as outlined in the
Recommendations below. Based on the current EDMS project timeline, DEQ expects the data
translation beta testing can begin by December 30, 2020.

Specifically, improvements in the new EDMS system will:

1) Increase the data metric percentage by adjusting the data translation process in the new EDMS
database. This will ensure the return to compliance qualifiers reflect the actual inspector
compliance verifications;


-------
2)	Include Oregon rule references in the new EDMS database; and

3)	Within the new EDMS translation, include the:

•	Informal enforcement (Warning Letters and Pre-Enforcement Notices) (100 level);

•	Initial formal enforcement (Expedited Enforcement Offers (EEOs) and Notices of Civil
Penalty Assessment and Orders (NCPOs) with proposed penalty (200 level) with the
enforcement issued date, and

•	The final formal enforcement with final penalty assessed (including accepted EEOs,
default orders, Mutual Agreement and Orders (MAOs), Remedial Action Orders (RAOs)
and final orders obtained through appeals (by ALJ, EQC, courts) (300 level) to translate
once the penalty is paid or date compliance is achieved.

DEQ will conduct training for all field staff on the new data entry procedures after the 2020
launch of the EDMS hazardous waste module.

Recommendation:

Rec # | Due Date | Recommendation
1 | 02/28/2020 | No later than February 28, 2020, EPA and ODEQ will conclude discussions regarding the required compliance and enforcement elements in RCRAInfo, and the data flow between EDMS and RCRAInfo (including data entry in EDMS) to ensure proper translation into RCRAInfo. ODEQ will use this meeting/these meetings to provide information to the IT staff building the RCRA module of EDMS.
2 | 12/30/2020 | No later than December 30, 2020, Oregon will begin beta testing EDMS translation into RCRAInfo and will seek EPA input to ensure data is translating correctly.
3 | 12/30/2020 | No later than December 30, 2020, Oregon will conduct refresher training on required data elements that will include how to accurately enter that data into EDMS.

RCRA Element 2 - Inspections

Finding 2-1

Area for Improvement

Summary:

Inspection Report Accuracy/Completeness


-------
Explanation:

Inspection reports have greatly improved since Round 3; however, a few lingering issues impact
report accuracy and/or completeness. Specifically:

•	The statement that one of the purposes of inspections is to evaluate compliance with
federal rules

•	Referencing and linking supporting evidence, such as photos, to the report narrative

•	Drawing conclusions in the report rather than citing observations

Relevant metrics:











Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
5a Two-year inspection coverage of operating TSDFs [GOAL] | 100% | 85% | 2 | 3 | 66.7%
5b Annual inspection of LQGs using BR universe [GOAL] | 20% | 15.6% | 37 | 252 | 14.7%
5b1 Annual inspection coverage of LQGs using RCRAInfo universe [GOAL] | 20% | 9.9% | 35 | 180 | 19.4%
6a Inspection reports complete and sufficient to determine compliance [GOAL] | 100% | | 20 | 29 | 69%
6b Timeliness of inspection report completion [GOAL] | 100% | | 24 | 29 | 82.8%

State Response:

EPA and DEQ have discussed these findings. Two of EPA's recommendations have already
been achieved, as noted in EPA's Recommendations below. As indicated below, if appropriate,
DEQ intends to conduct refresher training on revisions to the inspection reports in conjunction
with EDMS training, by December 30, 2020. DEQ considers that periodic refresher training on
these elements will address EPA's identified concerns.

DEQ has the following additional specific comments regarding the metrics below:

5a: This measure (two-year inspection coverage of operating TSDFs) included the Umatilla
Military Depot facility, which ceased to have an operating unit prior to FY2018. Previous
communication from EPA Region 10 indicates that EPA did not believe a CEI or any RCRA
inspection was necessary given that no wastes were generated or stored onsite (see email from
Scott Downey, EPA, to DEQ, April 20, 2016). Oregon is working to ensure RCRAInfo
accurately reflects the current status of this facility.


-------
5b & 5b1: DEQ understands we can use the biennial report (BR) LQG universe or the RCRAInfo
LQG universe to meet the LQG inspection goals. The 2015 BR was used for the FY2018
inspection goal review, which lags in updated information when compared to the current active
LQGs in DEQ's RCRA hazardous waste universe.

6a: DEQ recently revised the hazardous waste inspection report template to remind inspectors of
the following: 1) Use first-person to state facts and note observations; 2) Avoid using terms such
as "identified" or "documented;" 3) The site inspection portion should focus on what is observed
or what is seen; 4) Take pictures of what is being observed; 5) Link or reference the photo, if
possible, to the narrative text in the inspection report; 6) State if the observation is a violation; 7)
Emphasize: If observations show violations, take photos and reference the photos in the report
write up.

Recommendation:

Rec # | Due Date | Recommendation
1 | 07/30/2020 | No later than July 30, 2020, Oregon will change the purpose section of its inspection report template so that it no longer indicates that Oregon evaluated compliance with federal regulation. This action item was completed by September 1, 2019.
2 | 01/31/2020 | Oregon will identify and update all appropriate inspection templates and guidance for Oregon statute and rule citations. This action item was completed by September 1, 2019.
3 | 12/30/2020 | No later than December 30, 2020, in coordination with database training, if appropriate, Oregon will conduct training on revisions to written inspection reports. The training will address, among other things: inspection report writing, updated templates, Oregon regulation citations, and inspection observations and documentation.

RCRA Element 3 - Violations

Finding 3-1

Meets or Exceeds Expectations

Summary:

Appropriate SNC determination.


-------
Explanation:

EPA found two instances in which an SNC determination at first appeared appropriate but, upon
review of the evidence and information provided, EPA agreed with the State's conclusion that
the evidence did not support an SNC determination. Nonetheless, in each instance Oregon took
an appropriate enforcement action that returned the facility to compliance.

Relevant metrics:

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
7a Accurate compliance determinations [GOAL] | 100% | | 25 | 29 | 86.2%
8b Timeliness of SNC determinations [GOAL] | 100% | 76.5% | 5 | 5 | 100%
8c Appropriate SNC determinations [GOAL] | 100% | | 27 | 29 | 93.1%

State Response:

No state response.

RCRA Element 4 - Enforcement

Finding 4-1

Meets or Exceeds Expectations

Summary:

Timely and appropriate enforcement.

Explanation:

In all cases except one, the State's chosen enforcement action returned the facility to compliance.
In the one case where the facility did not immediately return to compliance, Oregon immediately
issued a penalty for failure to comply. In another action, the State issued a penalty for small
violations at a small facility because the violations were repeat in nature. As part of this review,
EPA looked for instances where Oregon deferred to its Technical Assistance program over its
compliance program; EPA did not see any instance of this bias and instead saw consistent use of
both elements of the program without muddling the appropriate use of the tools.

Relevant metrics:


-------
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
10a Timely enforcement taken to address SNC [GOAL] | 100% | 87.7% | 1 | 2 | 50%
10b Appropriate enforcement taken to address violations [GOAL] | 100% | | 28 | 28 | 100%
9a Enforcement that returns sites to compliance [GOAL] | 100% | | 28 | 28 | 100%

State Response:

No state response.

RCRA Element 5 - Penalties

Finding 5-1

Meets or Exceeds Expectations

Summary:

Economic benefit included in all penalty calculations/justifications.

Explanation:

Oregon provided penalty justifications that include economic benefit in all instances. However,
the justification for a finding of no economic benefit at times did not appear to be as thoroughly
considered as may be necessary. EPA and Oregon have already discussed this topic, and Oregon
is working on how to better explain findings of no economic benefit.

Relevant metrics:


-------
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
11a Gravity and economic benefit [GOAL] | 100% | | 12 | 15 | 80%
12a Documentation of rationale for difference between initial penalty calculation and final penalty [GOAL] | 100% | | 4 | 4 | 100%
12b Penalty collection [GOAL] | 100% | | 15 | 15 | 100%

State Response:

No state response.


-------
STATE REVIEW FRAMEWORK

Oregon

Clean Air Act
Implementation in Federal Fiscal Year 2018

U.S. Environmental Protection Agency

Region 10

Final Report
March 10, 2020


-------
I. Introduction

A.	Overview of the State Review Framework

The State Review Framework (SRF) is a key mechanism for EPA oversight, providing a
nationally consistent process for reviewing the performance of state delegated compliance and
enforcement programs under three core federal statutes: Clean Air Act, Clean Water Act, and
Resource Conservation and Recovery Act. Through SRF, EPA periodically reviews such
programs using a standardized set of metrics to evaluate their performance against performance
standards laid out in federal statute, EPA regulations, policy, and guidance. When states do not
achieve standards, the EPA will work with them to improve performance.

Established in 2004, the review was developed jointly by EPA and Environmental Council of the
States (ECOS) in response to calls both inside and outside the agency for improved, more
consistent oversight of state delegated programs. The goals of the review that were agreed upon
at its formation remain relevant and unchanged today:

1.	Ensure delegated and EPA-run programs meet federal policy and baseline performance
standards

2.	Promote fair and consistent enforcement necessary to protect human health and the
environment

3.	Promote equitable treatment and level interstate playing field for business

4.	Provide transparency with publicly available data and reports

B.	The Review Process

The review is conducted on a rolling five-year cycle such that all programs are reviewed
approximately once every five years. The EPA evaluates programs on a one-year period of
performance, typically the one-year prior to review, using a standard set of metrics to make
findings on performance in five areas (elements) around which the report is organized: data,
inspections, violations, enforcement, and penalties. Wherever program performance is found to
deviate significantly from federal policy or standards, the EPA will issue recommendations for
corrective action which are monitored by EPA until completed and program performance
improves.

The SRF is currently in its 4th Round (FY2018-2022) of reviews, preceded by Round 3
(FY2012-2017), Round 2 (2008-2011), and Round 1 (FY2004-2007). Additional information
and final reports can be found at the EPA website under State Review Framework.

II. Navigating the Report

The final report contains the results and relevant information from the review including EPA and
program contact information, metric values, performance findings and explanations, program
responses, and EPA recommendations for corrective action where any significant deficiencies in
performance were found.


-------
A. Metrics

There are two general types of metrics used to assess program performance. The first are data
metrics, which reflect verified inspection and enforcement data from the national data systems
of each media, or statute. The second, and generally more significant, are file metrics, which are
derived from the review of individual facility files in order to determine whether the program is
performing its compliance and enforcement responsibilities adequately.

Other information considered by EPA to make performance findings, in addition to the metrics,
includes results from previous SRF reviews, data metrics from the years in between reviews, and
multi-year metric trends.

B.	Performance Findings

The EPA makes findings on performance in five program areas:

•	Data - completeness, accuracy, and timeliness of data entry into national data systems

•	Inspections - meeting inspection and coverage commitments, inspection report quality,
and report timeliness

•	Violations - identification of violations, accuracy of compliance determinations, and
determination of significant noncompliance (SNC) or high priority violators (HPV)

•	Enforcement - timeliness and appropriateness of enforcement, returning facilities to
compliance

•	Penalties - calculation including gravity and economic benefit components, assessment,
and collection

Though performance generally varies across a spectrum, for the purposes of conducting a
standardized review, SRF categorizes performance into three findings levels:

Meets or Exceeds: No issues are found. Base standards of performance are met or exceeded.

Area for Attention: Minor issues are found. One or more metrics indicates performance
issues related to quality, process, or policy. The implementing agency is considered able to
correct the issue without additional EPA oversight.

Area for Improvement: Significant issues are found. One or more metrics indicates routine
and/or widespread performance issues related to quality, process, or policy. A
recommendation for corrective action is issued which contains specific actions and schedule
for completion. The EPA monitors implementation until completion.

C.	Recommendations for Corrective Action

Whenever the EPA makes a finding on performance of Area for Improvement, the EPA will
include a recommendation for corrective action, or recommendation, in the report. The purpose
of recommendations is to address significant performance issues and bring program
performance back in line with federal policy and standards. All recommendations should include


-------
specific actions and a schedule for completion, and their implementation is monitored by the
EPA until completion.

III. Review Process Information

Clean Air Act (CAA)

Kickoff letter sent: April 2, 2019

Data Metric Analysis and file selections sent to LRAPA: September 18, 2019

File reviews completed: November 8, 2019

Draft report sent to LRAPA: December 16, 2019

Comments from LRAPA received by EPA: February 25, 2020

Report Finalized: March 10, 2020

LRAPA and EPA key contacts:

Colleen Wagstaff, LRAPA
Elizabeth Walters, EPA CAA file reviewer
John Pavitt, EPA CAA file reviewer
Scott Wilder, EPA SRF Coordinator


-------
Executive Summary

Introduction

Clean Air Act (CAA)

Areas of Strong Performance

The following are aspects of the program that, according to the review, are being implemented at
a high level:

•	A Federally Reportable Violation (FRV) was accurately determined as non-HPV status.

•	Penalties were collected and documented properly.

Priority Issues to Address

The following are aspects of the program that, according to the review, are not meeting federal
standards and should be prioritized for management attention:

Clean Air Act (CAA)

•	Facility identifiers were consistently inaccurate, absent from facility files, and/or did not
match with ICIS-Air.

•	Compliance determinations were generally inaccurate based on documentation in the FCE
report and facility files.

•	An HPV was not addressed in a timely manner or alternatively did not have a Case
Development Resolution Timeline (CDRT) in accordance with the HPV policy.

•	1 out of 2 formal enforcement responses did not include the required corrective action to
return the facility to compliance.


-------
Clean Air Act Findings

CAA Element 1 - Data

Finding 1-1

Area for Improvement

Summary:

Facility identifiers were consistently inaccurate, absent from facility files and/or did not match
with ICIS-Air.

Explanation:

Facility identifiers such as dates, programmatic ID, address, zip code and NAICS code were
consistently inaccurate or not included in FCE reports. A few FCE reports lacked basic details
such as the date(s) of inspection and whether it was an announced or unannounced inspection.

In reviewing metric 3b2 on stack test timeliness, it was found that several stack tests did not specify
which pollutants were measured in ICIS-Air. While pollutants were generally accurate in ICIS-
Air, 4 out of 12 facilities did not have the applicable air programs and subparts.

Relevant metrics:

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
2b Files reviewed where data are accurately reflected in the national data system [GOAL] | 100% | | 0 | 12 | 0%

LRAPA Response: LRAPA accepts EPA finding and agrees to, where deficient, implement the
EPA recommendations by the dates specified.

Recommendation:


-------
Rec # | Due Date | Recommendation
1 | 10/31/2020 | The current FCE form used by LRAPA is a great template for completing a report. However, EPA recommends that LRAPA rework the front pages to include all of the necessary Minimum Data Requirements (MDRs) and share a draft of the reworked template with EPA by March 31, 2020. Following FY20, EPA will review a selection of LRAPA inspection reports (also for Finding 3-1) and if EPA finds that the inspection reports include accurate facility information, this recommendation will be closed.
2 | 10/31/2020 | EPA recommends that LRAPA develop a plan and timeline on how to input "pollutants measured" for source tests in ICIS-Air. Following FY20, EPA will review a selection of stack tests and if 85% or greater specify pollutants measured in ICIS-Air, this recommendation will be closed.

CAA Element 1 - Data

Finding 1-2

Meets or Exceeds Expectations

Summary:

Stack tests, compliance monitoring MDRs, and enforcement MDRs were reported in a timely
manner.
Explanation:

LRAPA reported 100% of stack tests and stack test results in a timely manner. However, several
stack tests did not specify which pollutants were measured in ICIS-Air. While pollutants were
generally accurate in ICIS-Air, 4 out of 12 facilities did not have the applicable air programs and
subparts.

CMS source category and frequency had also not been entered into ICIS-Air for all 12 facilities.
LRAPA reported 100% of compliance monitoring MDRs and enforcement MDRs in a timely
manner.

Relevant metrics:


-------
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
3b1 Timely reporting of compliance monitoring MDRs [GOAL] | 100% | 85.2% | 41 | 46 | 89.1%
3b2 Timely reporting of stack test dates and results [GOAL] | 100% | 65.1% | 11 | 11 | 100%
3b3 Timely reporting of enforcement MDRs [GOAL] | 100% | 71.8% | 1 | 1 | 100%

LRAPA Response: LRAPA accepts EPA finding and agrees to, where deficient, implement the
EPA recommendations by the dates specified.

Rec # | Due Date | Recommendation
1 | 04/17/2020 | The current FCE form used by LRAPA is a great template for completing a report. EPA is recommending that LRAPA rework the front pages to include all of the necessary MDRs and share a draft of the reworked template with EPA by March 31, 2020. The State will then begin using the new template by April 17, 2020.
2 | 05/29/2020 | By May 29, 2020, the Agency must determine how to input "pollutants measured" for source tests in ICIS-Air and provide a timeline/plan to EPA.

CAA Element 2 - Inspections

Finding 2-1

Area for Attention

Summary:

2 out of 12 inspection reports did not document all the required FCE elements.

Explanation:

2 out of 12 FCE reports lacked documentation of required FCE elements. Inspectors were generally
thorough in their review of reports and documents and included a summary of the documents


-------
reviewed in their reports. However, the 2 FCE reports did not review all the necessary underlying
documents and reports and/or did not include an assessment of those documents.

Relevant metrics:

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
6a Documentation of FCE elements [GOAL] | 100% | | 10 | 12 | 83.3%

LRAPA Response: LRAPA accepts EPA finding and agrees to improve inspection reports
so they document all the required FCE elements.

CAA Element 2 - Inspections

Finding 2-2

Area for Attention

Summary:

FCE reports occasionally lacked enough information to determine compliance.

Explanation:

The Agency conducted FCEs at 100% of the major, mega-site, and SM-80 sources within its
jurisdiction, and 94.1% of Title V Annual Compliance Certification reviews were completed in
FY18. FCE reports generally documented facility and regulatory information, but 3 out of 12 files
lacked the information necessary to determine the compliance of the facility. These 3 FCE reports
did not thoroughly assess the federal requirements, the regulated emission units, or the inspector's
on-site observations. A few FCE reports also covered multiple inspections conducted over a
period of time. These inspections are logged into what appears to be a database as a narrative of
the inspector's on-site observations, records review, and discussions with the facility; however,
this information often did not make it into the final FCE report, which as a result lacked
significant information and major details. Additionally, a few final FCE reports switched
frequently between tenses (maintain, maintained, maintaining) and/or did not include the
inspector's complete name (i.e., initials only).

Relevant metrics:


-------
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
5a FCE coverage: majors and mega-sites [GOAL] | 100% | 88.1% | 11 | 11 | 100%
5b FCE coverage: SM-80s [GOAL] | 100% | 93.7% | 2 | 2 | 100%
5c FCE coverage: minors and synthetic minors (non-SM-80s) that are part of CMS plan or alternative CMS plan [GOAL] | 100% | 70.1% | 0 | 0 | 0
5e Reviews of Title V annual compliance certifications completed [GOAL] | 100% | 82.5% | 16 | 17 | 94.1%
6b Compliance monitoring reports (CMRs) or facility files reviewed that provide sufficient documentation to determine compliance of the facility [GOAL] | 100% | | 9 | 12 | 75%

LRAPA Response: LRAPA accepts EPA finding and agrees to include sufficient
information, including reference to permit conditions, in the FCE reports, such that facility
compliance status may be readily ascertained by the EPA reviewer.

CAA Element 3 - Violations

Finding 3-1

Area for Improvement

Summary:

Compliance determinations were generally inaccurate based on documentation in the FCE report
and facility files.

Explanation:

6 out of 12 FCE reports inaccurately determined that the facility was "in compliance" when it had
violated permit conditions and/or permit requirements. These FCE reports identified a violation of
a permit condition and/or federal requirement but incorrectly determined that the facility was "in
compliance." Overall, inspectors were not thorough in reviewing and assessing federal
requirements during the FCE and in their final reports, and did not make accurate compliance
determinations.


-------
Relevant metrics:

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
7a Accurate compliance determinations [GOAL] | 100% | | 6 | 12 | 50%

LRAPA Response: LRAPA accepts EPA finding, and will improve documentation of
compliance status in the FCE reports. It is LRAPA's understanding that, to remedy the
recommendation, an instance of non-compliance documented during the reporting period
will be readily apparent to the EPA report reviewer, even though the facility may be in
compliance at the time of FCE report submittal. LRAPA will also observe the EPA-led
inspections in Lane County and anticipates an EPA visit the week of March 30, 2020.
LRAPA inspectors and compliance staff will review the HPV Policy and HPV training
prior to the inspections.

Recommendation:

Rec # | Due Date | Recommendation
1 | 07/31/2020 | By July 31, 2020, LRAPA will observe 3 EPA-led FCEs in Lane County. EPA recommends that the LRAPA inspectors review the High Priority Violation Policy and the HPV training developed by EPA prior to each inspection.

CAA Element 3 - Violations

Finding 3-2

Meets or Exceeds Expectations

Summary:

1 file included a Federally Reportable Violation (FRV) which was accurately determined as non-
HPV status.

Explanation:

1 facility file documented a Federally Reportable Violation (FRV). The FRV was accurately
determined as non-High Priority Violation (HPV) status by the agency.


-------
Relevant metrics:

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
13 Timeliness of HPV Identification [GOAL] | 100% | 89.5% | 0 | 0 | 0
8c Accuracy of HPV determinations [GOAL] | 100% | | 1 | 1 | 100%

LRAPA Response: LRAPA accepts EPA finding.

CAA Element 4 - Enforcement

Finding 4-1

Area for Improvement

Summary:

1 out of 2 formal enforcement responses did not include the required corrective action to return
the facility to compliance.

Explanation:

2 FCEs resulted in a formal enforcement response. 1 out of 2 files did not include the required
corrective action. The violation was an unidentified High Priority Violation (HPV) of a MACT
subpart. A penalty was not assessed, and the facility file therefore did not include the corrective
action required by the HPV policy.

Relevant metrics:

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
9a Formal enforcement responses that include required corrective action that will return the facility to compliance in a specified time frame, or the facility fixed the problem without a compliance schedule [GOAL] | 100% | | 1 | 2 | 50%


-------
LRAPA Response: LRAPA accepts EPA finding and agrees to include in the formal
enforcement responses, the corrective action required or taken to return the facility to
compliance.

Recommendation:

Rec # | Due Date | Recommendation
1 | 03/27/2020 | By March 27, 2020, LRAPA will organize and provide HPV training to staff. EPA will provide HPV training already developed by EPA HQ.

CAA Element 4 - Enforcement

Finding 4-2

Area for Improvement

Summary:

An HPV was not addressed in a timely manner or alternatively did not have a Case Development
Resolution Timeline (CDRT) in accordance with the HPV policy.

Explanation:

1 facility file contained a High Priority Violation (HPV) that was identified by the agency.
However, the HPV was not addressed in a timely manner or alternatively did not have a CDRT in
accordance with the HPV policy.

Relevant metrics:

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
10a Timeliness of addressing HPVs or alternatively having a case development and resolution timeline in place | 100% | | 0 | 1 | 0%

LRAPA Response: LRAPA accepts EPA finding and agrees to address HPVs in a timely
manner or alternatively have a Case Development Resolution Timeline (CDRT), and agrees


-------
to improve efforts, in consultation with EPA, to determine accurate HPV reporting and to
develop the HPV timeline as recommended.

Recommendation:

Rec # | Due Date | Recommendation
1 | 09/30/2020 | By September 30, 2020, LRAPA will develop and use a system to track the progression of an HPV (discovery, notify EPA, address, resolution). This system should include the actual dates, the timelines listed in the HPV policy, and a CDRT if the HPV is not resolved in a timely manner. This system can be developed in Excel, Word, a database, etc., but should be designed so that it can be printed and included in the facility file.

CAA Element 5 - Penalties

Finding 5-1

Meets or Exceeds Expectations

Summary:

Penalties were collected and documented.

Explanation:

1 out of 1 facility file had documentation of check transmittal for a collected penalty.

Relevant metrics:

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
11a Penalty calculations reviewed that document gravity and economic benefit [GOAL] | 100% | | 1 | 1 | 100%

LRAPA Response: LRAPA accepts EPA finding and will continue to document violations
and collect penalties properly.


-------
CAA Element 5 - Penalties

Finding 5-2

Meets or Exceeds Expectations

Summary:

The initial penalty value and final penalty value did not differ.

Explanation:

1 facility file contained an assessed penalty. The initial penalty value and final penalty did not
differ, so documentation of rationale for difference is not required.

Relevant metrics:

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
12a Documentation of rationale for difference between initial penalty calculation and final penalty [GOAL] | 100% | | 1 | 1 | 100%

LRAPA Response: LRAPA accepts EPA finding.

CAA Element 5 - Penalties

Finding 5-3

Meets or Exceeds Expectations

Summary:

Penalties were collected and documented.

Explanation:

1 out of 1 facility file had documentation of check transmittal for a collected penalty.


-------
Relevant metrics:

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State %
12b Penalties collected [GOAL] | 100% | | 1 | 1 | 100%

LRAPA Response: LRAPA accepts EPA finding.


-------