State Review Framework

Compliance and Enforcement Oversight

SRF Reviewer's Guide
Round 5 (2024-2028)




U.S. Environmental Protection Agency
Office of Compliance, Office of Enforcement and Compliance Assurance (OECA)



Table of Contents

I.	INTRODUCTION	4

A.	Background on the State Review Framework	4

B.	The Importance of a High-Quality Review	5

C.	Overview of the SRF Review Process	5

II.	Preparing for the SRF Review	6

A.	Data Verification	6

B.	Selecting Local Agencies and State District Offices (if applicable)	7

1.	Reviewing Local Agencies	7

2.	Reviewing State District Offices	8

C.	Kickoff Letter / Conference	10

1.	Kickoff Letter	10

2.	Kickoff Conference (optional)	10

D.	Regional Coordination with Headquarters During the SRF Review	11

E.	Other Considerations for the Review	12

1. Environmental Justice and Climate Change	12

III.	Conducting the Review	12

A.	Data Metric Analysis (DMA)	12

B.	Overview of the Annual Review	13

C.	Sources of Data and Information	14

D.	Conducting the Annual Review (process and end products)	17

E.	When Potential Issues are Identified (Follow-up)	18

IV.	OECA's Responsibility	19

A.	File Selection	19

1.	File Selection Preparation	19

2.	Determining Minimum Number of Facilities to Review	20

3.	Selecting a Representative Sample of Files	21

4.	Supplemental File Selection	22

5.	Transmit File Selection List	23

B.	File Review	23

1.	File Review Preparation	23

2.	Coordination with State	24

3.	Conducting the File Review	24


4.	Developing Preliminary Findings	26

5.	Exit Conference	26

V.	Drafting and Finalizing the SRF Report	27

A.	Developing the First Draft	27

1.	Administrative Information	28

2.	Performance Findings	29

3.	Recommendations	31

4.	Executive Summary	32

B.	Finalizing the Report	34

1.	HQ Review of Initial Draft Report	34

2.	HQ Review of Subsequent Draft SRF Reports	34

3.	State Comment on Draft SRF Report	34

VI.	Post-Review Recommendation Monitoring and Closeout	35

A.	Annual Recommendation Inventory	35

B.	Monitoring Ongoing Recommendations (known as "working" in previous rounds)	36

C.	Prioritizing and Elevating Overdue Recommendations	36

D.	Verifying Recommendation Completion	37

1.	Policies, Guidance, and Procedures	37

2.	Incomplete, Inaccurate, and Untimely Entry of Data	37

3.	Insufficient knowledge, skills, and abilities	38

4.	Inadequate Inspection Reports and Documentation of Penalty Calculations	38

5.	Inadequate SNC-HPV determination, Return to Compliance, and Appropriate and
Timely Enforcement Action	38

Appendix A: SRF Key Information	40

Appendix B: Data Verification	42

Appendix C: Regional and Headquarters Coordination During the SRF Review	43

Appendix D: Kick-off Letter Template	46

Appendix E: Data Metric Analysis (DMA) Procedures	48

Appendix F: File Selection Procedures	50

Appendix G: Checklist of Key Items for Conducting File Review	52

Appendix H: SRF Draft Report Completeness Checklist	53

Appendix I: Sample Finding and Recommendation	54

Appendix J: Establishing Finding Levels	55

Appendix K: Tips for Conducting Electronic File Reviews Under the State Review Framework	56

Appendix L: Annual Data Metric Analysis	59


I. INTRODUCTION

This document, "The SRF Reviewer's Guide," serves as the comprehensive guidance for the
State Review Framework (SRF), meant to provide regional and headquarters personnel
involved in an SRF review — managers, liaisons, coordinators, and program leads — with a
description of the process for conducting the review, roles and responsibilities, and essential
content to include in the final report.

Though the Reviewer's Guide covers a considerable amount of material, it is not exhaustive and
therefore is intended to be used in conjunction with additional material when conducting a
review, such as:

•	Media-Specific Plain Language Guides (PLGs) - in-depth descriptions of the review
elements and metrics along with instructions on using the metrics to make appropriate
performance findings

•	Metric Quick Reference Guides - spreadsheets with SRF metric descriptions

•	File Review Checklist - template to document file specific information for each file
review metric during a file review

•	File Review Worksheet - template to compile results and preliminary findings based on
the file review metrics

•	Training Videos - visual step-by-step explanations of key parts of a review (e.g., file
selection)

•	SRF Manager Database: User Guide - Instructions on how to use the database

The documents above can be found on the SRF Manager Database, while the training videos
are available on ECHO's SRF web page.

A. Background on the State Review Framework

The State Review Framework (SRF) is the primary means by which EPA conducts oversight of
state delegated and EPA directly implemented compliance and enforcement programs under
three core federal environmental statutes covering air, water, and land (the Clean Air Act, Clean
Water Act, and Resource Conservation and Recovery Act). SRF was established in 2004, developed jointly
by EPA and the Environmental Council of the States (ECOS) in response to calls both inside and
outside the agency for improved and more consistent oversight. The key goals that were agreed
upon at its formation are:

1.	Ensure delegated programs and EPA Direct Implementation (DI) programs meet minimum
performance standards outlined in federal policies and guidance

2.	Promote fair and consistent enforcement necessary to protect human health and the
environment

3.	Promote equitable treatment and a level interstate playing field for business

4.	Provide transparency with publicly available data and reports.


The review is conducted on a five-year cycle, so programs are reviewed once every five years.
Programs are evaluated on a one-year period of performance, typically the fiscal year prior to
the review. The review is based on a standardized set of metrics used to make findings on
performance in five categories: data, inspections, violations, enforcement, and penalties. EPA
issues recommendations for deficiencies and tracks them through completion. Results of the
review are organized into a final report, which is published on EPA's public web site.

B.	The Importance of a High-Quality Review

Conducting a thorough, high-quality review is essential if the findings on program performance
are to be considered accurate and credible - an important factor in terms of oversight and public
transparency. Furthermore, a high-quality review increases the likelihood that if or when
performance issues are identified, EPA and the authorized program can effectively work together
to improve performance and return to conformance with federal policy and standards.

What comprises a high-quality review?

•	States, regional DI programs, and local agencies have the opportunity to verify data to ensure reviews use complete and accurate data to establish findings.

•	The selection of files is sufficient in number and, to the degree possible, representative of the universe and program activities.

•	Program reviewers are adequately trained (refresher training modules for reviewers can be found at: https://echo.epa.gov/help/state-review-framework/training).

•	Findings on performance are accurate and substantiated.

•	The report is clear and concise.

•	Recommendations are SMART (specific, measurable, achievable, results-oriented, time-bound).

•	Recommendation implementation is monitored (https://www.epa.gov/compliance/state-review-framework-results-table) and completion verified.

C.	Overview of the SRF Review Process

The review typically takes one year to complete, from verification of data in the national systems
by state and regional data stewards until publication of the final report on the public SRF web
site. The diagram below outlines general stages in the review process and a suggested schedule
for completion. The three fixed dates pertain to data verification, the draft report, and final
report.


Figure 1. Overview of SRF Process

Action	Time Period

Preparing for the File Review	November-February

Conducting the Review	March-August
•	Data Metric Analysis	60 days before review
•	CWA Inspection Coverage Table	60 days before review
•	File Selection	30 days before review
•	On-Site or Remote Review of Files
•	File Review Worksheet	30 days after review

Drafting and Finalizing Report	September-December
•	Draft Report	By September 30
•	HQ Comment Period	15 working days
•	Send Revised Report to HQ
•	State/Region Program Comment Period	30 calendar days
•	Final Report	By December 31

Recommendation Monitoring and Close Out	Ongoing
•	Track recommendation implementation process
•	Work with reviewed program to document progress and develop completion verification statement
•	Completion Verification and Close Out	Ongoing

II. Preparing for the SRF Review



Before reviewers can begin the substantive review - analyzing the compliance and enforcement
data metrics and reviewing facility files - a series of preparatory steps is required. These steps
include ensuring the accuracy of data in the national databases; if applicable, selecting the
appropriate local agencies or state district offices to review; and establishing communication
with the agency being reviewed to officially notify them of the upcoming review and coordinate
the review process. These steps are described in more detail below.

A. Data Verification

Data verification is an annual process by which states, and regions responsible for Direct
Implementation (DI) programs, can review and correct data in the national data systems (e.g.,
facility and activity counts) to ensure completeness and accuracy.

Steps undertaken by data stewards to verify data are listed in Appendix B.

Since the SRF review relies on verified data, all data stewards for state delegated programs and
EPA DI programs need to complete the steps outlined in Appendix B. This typically occurs during
the November to February timeframe following the SRF review year.

Important: Regional SRF Coordinators should work with their regional data stewards to
ensure completion of the data verification process.

Once the period of data verification concludes, "frozen data," namely the final SRF data metrics
based on verified numbers, should be made available soon after. At that time, reviewers can
begin the main portion of the review by conducting the data metric analyses and selecting
facility files for the file review. Should a region decide to use production data (unfrozen data)
to initiate the review sooner, the region will be required to reconcile that production data with
the frozen data once it becomes available.

Note: EPA's ECHO State Dashboards - Historically, the state dashboards that track state data
on key enforcement performance indicators relied on a verified or "frozen" dataset. Based on
an agreement between EPA and states in 2018, ECHO switched to using live or unverified data
to populate the dashboards. As a result, data updates can be made in the data systems and
will be reflected on the dashboard after the data verification period has ended.

Important: Ideally, SRF reviews will be based only on data that has been verified or frozen
through the data verification process. If frozen data isn't available or accurate, state database
or production data can be used to start the review process.

The region will need to coordinate with the state more closely to make sure the data will
support the appropriate elements and files to reflect the activities during the review year. The
region may need to consider supplemental files to ensure an appropriate number of valid files
is available to complete the review for each element. The review team will be required to
reconcile the data with the frozen and production data. State database information may be
mentioned as a recommendation for the data element of the report (see Appendix B on the data
verification process and the recommendation section on providing S.M.A.R.T. recommendations
for data clean-up).

B. Selecting Local Agencies and State District Offices (if applicable)

This section applies only to reviews of states with authorized local agencies or states with a
decentralized state district office structure. If this situation doesn't apply, you can skip this
section.

1. Reviewing Local Agencies

In some states, local agencies and state districts may have a delegation authority role in
implementing compliance monitoring and enforcement programs. Therefore, as part of the
State Review Framework, EPA reviews local agencies and state districts to ensure they are
implementing their inspection and/or enforcement programs consistent with national
policy and guidance.

Local agencies, as described in this section, are those that implement their programs in lieu
of a state agency or EPA. They are different from state district offices because local agencies
are not staffed by state employees and generally only have jurisdiction within a city, county,
or metropolitan area.

a.	Determining which local agencies to review

Generally, in each SRF round EPA should review local agencies that regulate moderately
to heavily populated cities, counties, or metropolitan areas, as well as those serving
areas with a smaller universe. If a state has several local agencies, it may be appropriate
to review some in one review cycle and the others during a later cycle with the goal of
covering each jurisdiction over time to ensure oversight coverage across the entire
state. This might depend on the size, location, and responsibilities of the local agencies.
If local agencies are to be reviewed in a staggered fashion, regions should indicate their
plans for selecting local agencies, including the criteria and analysis involved in targeting
selected local agencies and state district offices, as part of the discussions with
Headquarters at the beginning of the SRF review cycle.

b.	Conducting a local agency review

For a local agency review, EPA must include data metric analysis, file selection lists, files, findings,
recommendations, and metrics specific to the local agency and separate from the state
agency in the final report. Once EPA completes the state and local agency reviews, both
the state and local agency findings will be included in the final report and uploaded onto
the public SRF Website. If these reviews are done in separate years, please send reports
separately, and HQ will combine the content into one consolidated report for each cycle
on the public SRF website.

2. Reviewing State District Offices

Many state agencies administer their compliance and enforcement programs out of district
offices (these may also be called regional offices, districts, or boards). SRF data and file
reviews cover a state in its entirety regardless of whether it administers programs and
stores its facility files at a central office or at district offices.

An SRF file review in a state with districts may require selected facility files be sent to a
central location (if files are not already centrally located or available electronically). If that is
not possible, the EPA region should attempt to conduct file reviews at every district office,
in which case the review will follow the same rules as any other SRF review. Where it is not
possible for the EPA region to review files from every district, the EPA region should meet
the criteria described below and agreed upon with their HQ SRF Liaison.


Except for the steps below, EPA will conduct these SRF reviews in the same way as those in
any other state.

c.	Selecting a subset of state district offices for review

If reviewing a subset of districts, consider the criteria below to determine which and
how many districts to review. EPA regions may choose to review different districts in
CAA, CWA, and RCRA based on these factors:

• Size of state and number of district offices: Conduct file reviews at a minimum of
two state district offices per media program. Three or more is preferred. In a
large state with many offices, such as California, reviewing all or even most may
not be possible. Regardless, EPA should try to review more districts in a state with
many district offices than it would in a state with few district offices. It should also
review more facilities in a state with a larger universe than it would in a state with
a smaller universe.

•	Known problems in district offices: Once EPA has established the number of
district offices to visit, begin to decide which to review by:

•	Considering known compliance monitoring and enforcement problems in the
districts

•	Evaluating ADMA of the districts to determine selection priority

•	Asking the state about performance issues in each district

•	Breaking out SRF data metrics for each district, if possible

•	Districts visited during previous reviews: The state's prior SRF reports may
provide additional information on how districts reviewed during those rounds
were performing. If EPA did not review a district during the previous round, it
should receive additional consideration in the current round.

A "representative sample" of district offices:

Generally, EPA should evaluate the state holistically, even when performance
varies significantly across the districts reviewed. Unless there is clear evidence that
issues are isolated, EPA should not assume that the problem only exists in one or
more of the districts reviewed — the problem could also exist in districts not
reviewed. When drafting the report, EPA should write the finding and
recommendation to ensure adequacy of performance state-wide.

d.	Next Steps

The EPA region should communicate which districts it plans to review, and the rationale
(e.g., selection criteria, annual data metric analysis (ADMA), and information
considered) for selecting them, to its HQ liaison prior to developing the data metric
analysis and file selection list. Upon reaching agreement with HQ, the reviewer can
begin the file selection process. See Table 1 in the File Selection section below for
guidelines on how many facility files to pull.

C. Kickoff Letter / Conference

To mark the official start of the review process, regions typically send an initial communication
letter, or "kickoff" letter, to the state or local agency to notify them of the upcoming review,
provide details on logistics and contacts, and coordinate the review schedule. Depending on
needs and resources, regions may choose to also set up an in-person meeting or conference call.

1.	Kickoff Letter

Communication can be in the form of either a formal written letter from a Regional senior
manager or an informal email from a program manager to his/her state/local counterpart.
To fully inform the state and local agency of the purpose and details of the review and
ensure coordination goes smoothly, make sure to include the following:

•	The purpose of the review and expected process

•	A summary of discussions and agreements to date regarding the upcoming review

•	The date and time of the on-site review (if already scheduled), or the need to schedule it

•	Media-specific program leads with contact information

•	Explanation of next steps

If the region intends to hold a kick-off conference, the letter should also include the date,
time, and topics for discussion (see below).

A suggested kickoff letter template is attached in Appendix D.

2.	Kickoff Conference (optional)

a. Personnel and Scheduling

If scheduling a kickoff conference with the state or local agency, determine who should
attend the conference. For EPA, this would generally include the SRF regional
coordinator, the media program reviewers, and the appropriate senior managers. For
the state or local agency, it might be senior management and staff coordinating the
review. EPA and the state will need to determine how they will conduct the conference
— it can be in person, video, or phone. The Regional Coordinator can work with the state
to schedule the meeting.

b. Conducting the Conference


EPA should discuss the following topics during the conference:

•	Changes to the SRF process for Round 5, such as revisions to the metrics and guidance

•	Results of the Annual Data Metric Analyses (ADMA) from previous years and Data
Metric Analysis (DMA) for the SRF review year

•	The scope of the review

•	Status of performance issues and unresolved recommended actions from previous
SRF reviews

•	Expected timeline for the current review, consistent with national guidance and policies.

D. Regional Coordination with Headquarters During the SRF Review
Roles and Responsibilities

Regional Coordinator: Ensure effective and efficient communication and coordination occurs
between the Region (within the region, regional coordinator, SRF reviewers, relevant regional
management) and Headquarters. Coordinators are ultimately responsible for making sure all
relevant documents, including the draft and final report, are submitted on time and of high
quality.

Headquarters Liaison: Assist the Coordinator with training, technical assistance, and guidance
on SRF policy, process, and materials. Liaisons are responsible for working with HQ staff and the
Regions to ensure the completeness and accuracy of all review materials and their consistency
with national guidance and policies.

The SRF review process involves numerous process steps, documents, and people, as well as
managing a shared database, and therefore requires a considerable amount of coordination.
EPA regions may select from one of two process tracks for coordinating SRF reviews with
Headquarters that best suits their needs. The emphasis in Track 1 is on an initial comprehensive
scoping meeting, while Track 2 relies on check-ins throughout the review process. Regions are
encouraged to communicate with Headquarters whenever issues or questions arise.

Track 1: Scoping Meeting Emphasis

Initial communication and concurrence occurs between the region and HQ in the form of a
preliminary scoping meeting.

Track 2: Periodic Check-In Emphasis

Periodic communication and concurrence between the regional SRF coordinator and HQ SRF
liaison occurs at multiple steps in the process.


Appendix C provides a detailed description of each track.

Regions should let their SRF Liaison know which track they intend to use prior to
beginning the review.

E. Other Considerations for the Review

1. Environmental Justice and Climate Change

New EPA strategy directs EPA's Enforcement and Compliance Program to help tackle the
climate crisis and strengthen enforcement in communities with environmental justice
concerns as a shared goal and responsibility between EPA and other partner agencies. While
Environmental Justice and Climate Change are agency priorities, and EPA measures federal
inspections in these areas, at the time of publication in early 2024, there are no national
policies for states on the use of these considerations in compliance and enforcement.
However, if Environmental Justice and Climate Change are part of a state's alternative
compliance monitoring plan or strategy, regions may incorporate them as criteria under
"other considerations" associated with inspections, non-compliance, and enforcement
actions during the SRF review. EPA regions and states can use EJScreen, the Climate and
Economic Justice Screening Tool, or other suitable geographic information system and
mapping tools and data to identify communities with potential environmental justice and
climate change concerns and evaluate the delegated authority's enforcement and
compliance program activities that impact those areas. The region may integrate its review
of the state's performance related to environmental justice and climate change into the
discussion in the Executive Summary.

III. Conducting the Review

When all the preparatory steps above have been completed, reviewers can begin the
substantive portion of the review, starting with the analysis of compliance and enforcement data
generated from the national data systems followed by the review of facility files for a more
qualitative look at program activities.

A. Data Metric Analysis (DMA)

Roles and Responsibilities

Regional Coordinator: Submit DMA and CWA Inspection Table (60 days before the file
review)

HQ Liaison: Review material for completion and accuracy

A Data Metric Analysis (DMA) contains a set of metrics that provide information on program
compliance monitoring and enforcement activities by fiscal year. The metric data is sourced
from ECHO, which in turn pulls data from the national data systems for each media. The DMA
metric values will serve as one of two main sources - along with file metrics - of information used
to make findings on program performance.

Conducting a DMA represents the first analytical step in the review process. Based on the data
metric values, the reviewer will develop initial findings based on the recommended finding levels
shown below (Table 2 on p.29). These can be used to determine what program areas may need
additional focus for file selection, on-site reviews, or discussion with state and local agencies.
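Purely as an illustration of that logic, the sketch below maps a metric value against its national goal to an initial finding level. The finding labels follow terms used elsewhere in this guide, but the goal value and the 10 percent band are placeholder assumptions; the actual thresholds come from Table 2 and the media-specific Plain Language Guides.

# Hypothetical sketch of assigning an initial finding level to a DMA metric.
# The goal value and the "attention band" are placeholders; actual finding
# levels come from Table 2 and the media-specific Plain Language Guides.

def initial_finding(metric_value: float, national_goal: float,
                    attention_band: float = 0.10) -> str:
    """Compare a state's metric value to the national goal."""
    if metric_value >= national_goal:
        return "Meets or Exceeds Expectations"
    if metric_value >= national_goal * (1 - attention_band):
        return "Area for State Attention"
    return "Area for State Improvement"

# Example: 92% inspection coverage against a hypothetical 100% goal
print(initial_finding(0.92, 1.00))   # Area for State Attention
print(initial_finding(0.75, 1.00))   # Area for State Improvement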

**The instructions for completing a DMA and making preliminary findings are outlined in
Appendix E. Reviewers may also log in to ECHO to view the training video with step-by-step
instructions on conducting a DMA.

Once a DMA has been completed and reviewed by HQ, reviewers may wish to share the DMA
with the state or local agency as part of the kick-off letter or meeting, as mentioned in the section
above (Section II.C has more details on the kick-off letter and meeting coordination; Appendix D
has an example kick-off letter). This will allow agencies to provide any feedback or corrections
to the data before findings are drafted in a report, since findings should be based on data that
the region and state agree are accurate. All metrics in the data metric analysis are required to
appear in SRF reports, including goal metrics, CMS metrics, and review indicator metrics.

B. Overview of the Annual Review
Roles and Responsibilities

Regional Coordinator: Work with Regional program staff to develop ADMA and share with the
state. Upload final ADMA to SRF Manager.

HQ Liaison: Assist the Region on development of the ADMA and long-term trend data, if needed.

Annual Data Metric Analysis (ADMA)

Annual Review of SRF Data and Recommendations

The State Review Framework (SRF) uses a five-year cycle, with each program typically reviewed
once every five years. In the intervening years, EPA can conduct routine oversight. The Annual
Review, or Annual Data Metric Analysis (ADMA), outlined in this document supplements existing
oversight practices by providing additional structure and consistency. The purpose of the
Annual Review is to

a)	identify and address performance issues in a timely manner, without waiting for a full
SRF review; and

b)	ensure program performance improvement in areas identified as "area for
improvement" in previous reviews.


If the region chooses to use this process, the annual review should cover each media program in
SRF (Clean Air Act, Clean Water Act NPDES, and RCRA) and should be completed at least once
per year, except for the year of the formal comprehensive SRF review.

The annual review isn't intended to replace or supersede existing regional oversight tools;
rather, it's meant to supplement those efforts by providing a different approach.

The table below summarizes major steps in the annual review and the comprehensive SRF and
notes the differences, specifically in formality and public reporting/posting.

Annual Review:
•	Review Data Metric Analysis
•	Review Recommendation Status
•	Review End of Year Report
•	Compile Information in Standard Format
•	Discuss Results with State
•	Develop and Implement Action Plan, if Necessary
•	Upload Report to SRF Manager (non-public)
•	Meet with State, as Necessary

Comprehensive Review:
•	Formal In-brief
•	Review Data Metric Analysis
•	File Selection
•	File Review Metric Analysis
•	Formal Out-brief
•	Draft Report (findings and recommendations) in SRF Manager database
•	Opportunity for State Review and Comment
•	Post Final Report on Public Site
•	Track Recommendations Through Closure

C. Sources of Data and Information

Annual Data Metrics. The full suite of metrics providing information on universe size,
compliance monitoring, and enforcement activities derived from the national databases
(ICIS-Air, ICIS-NPDES, RCRAInfo). The metrics are available on EPA's Enforcement and
Compliance History Online (ECHO) (see instructions in Attachment #1 on how to compile
data and Attachment #2 for a list of included metrics). The comprehensive SRF and the
annual data metrics have some overlap in metrics; however, there are additional metrics
in the annual data metrics not included in the comprehensive SRF. The goals remain the
same for a metric if it's included in both reviews.

Data Trends (3-5 years). The Annual Data Metrics available on ECHO can be compiled to
include results over specific periods, for example 3 years. This data may highlight trends,
anomalies or outliers in the data, or other patterns to assess state performance or data
quality.
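For illustration only (the metric names and values below are fabricated, and real values would come from the annual data metric export on ECHO), a simple multi-year comparison can flag metrics that have dropped sharply relative to their recent baseline:

# Illustrative only: flag downward trends or outliers in multi-year metric data.
# Metric names and values are made up; real values would come from the annual
# data metrics exported from ECHO.
from statistics import mean

history = {
    "Inspection coverage, majors": [0.95, 0.93, 0.90, 0.78],      # FY21-FY24
    "Timely entry of compliance data": [0.88, 0.90, 0.91, 0.92],
}

for metric, values in history.items():
    baseline = mean(values[:-1])          # average of the earlier years
    latest = values[-1]
    change = latest - baseline
    if change <= -0.10:                   # 10-point drop is an arbitrary cutoff
        note = "possible decline - consider follow-up with the state"
    elif change >= 0.10:
        note = "notable improvement"
    else:
        note = "stable"
    print(f"{metric}: latest {latest:.0%}, baseline {baseline:.0%} ({note})")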

SRF recommendations. The SRF requires recommendations for metrics that fall below
national goals; these are labeled as an area for improvement. EPA tracks the status of
these recommendations in its internal tracking tool (the SRF Manager database). The
SRF Manager organizes details of the recommendation, such as required deliverables or
actions and lists the associated timeline for completion. The database itself is only
available to EPA employees, however the states or public can view the data on the
public facing State Review Framework - Results Table. EPA coordinates with states to
ensure progress in closing recommendations and assists as requested. When reviewing
recommendations, it's important to discuss challenges in resolving recommendations.
The annual review may create an opportunity for states to elevate awareness and
possibly gain additional support from the region or EPA HQ. Ensuring recommendations
remain on their assigned timeline and are properly addressed is one tool to support an
effective monitoring and enforcement program.

State End-of-Year (EOY) reports. States submit end-of-year reports to the EPA. These
reports detail performance of the state's grant work plan and may provide contextual
information such as organizational structure and resource allocation, and how this
impacts state performance. It is recommended that regions review the EOY report in
conjunction with other data sources in the annual review, as this may provide insight into
the state's priorities or possible constraints within their program.

Compliance Monitoring Strategy (CMS) or Alternative. The CMS defines program
priorities, implementation strategies, inspection commitments, and regional
performance measures.

Potential Benefits and Uses of the Annual Review

Planning and Targeting of Enforcement Resources: Regions can use ADMA information
to inform annual planning and targeting discussions with states. For example, violation
and SNC rates may suggest program "weakness" within a state and influence EPA's
inspection targeting and investment of inspection resources within a state. ADMAs can
also be helpful in grant workplan discussions with states by helping set annual
commitments for enforcement activities. Additionally, if the review notes positive
results, the state can use this data to further support compliance and enforcement
initiatives, especially if they're recently enacted.

Verifying or Checking the Quality of SRF Data: ADMAs can be used prior to and during
data verification as a cross-check to confirm numbers reported by the states for data
metrics are accurately reflected in the data systems. Regions can engage with states on
the quality of their data before the data is frozen, providing states an opportunity to
address during the verification process. This proactive step may provide immediate
impacts in data quality and may also lead to a successful comprehensive SRF if the
corrective action resolves the root cause. States can supply explanations where data
issues exist and may contact the region during the data verification process to discuss or
explain issues.


Identify Trends: Trends in data could show strong or weak performance, and the annual
review is an opportunity to recognize performance and take appropriate action in a
timely manner. Similar to timely enforcement of environmental regulations, timely
action on data metrics should result in a return to conformance or compliance, or
acknowledge effective programs. Annual Data Metric Analysis can be used during
routine oversight of states as a tool for reviewing the state performance and identifying
or getting ahead of any potential problems or issues that may require additional
oversight, monitoring and/or assistance. The ADMAs can provide valuable information
to be used during quarterly or annual discussions with the states on Performance
Partnership Agreement (PPA)/ Performance Partnership Grants (PPG) or other program
grants commitments.

For example, state data can be reviewed to determine whether minimum data entry
requirements are being met, inspection commitments are meeting CMS goals, and SNCs
are being addressed with enforcement action within designated timeframes. If data seem
to indicate an area of concern, Regions can request inspection reports or other
enforcement files for a more in-depth review.

Increased communication with states: EPA and states traditionally have multiple avenues
to effectively communicate, and the annual analysis provides another opportunity, either
by itself in a stand-alone meeting or as an additional agenda topic in an existing meeting.

Semi-Automated: The annual review is a semi-automated process that pulls data from
ECHO and presents data metrics for the specified media in an easy-to-follow table.
Regions can then review data and insert additional detail, such as trend analysis or
conclusions from conversations with the states, in predetermined sections. By being
semi-automated, the goal is to reduce the burden on regions and streamline the
process.
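The ADMA template performs this compilation itself; as a rough sketch of the kind of semi-automation described here, a region could read an exported metric file (the file name adma_export.csv and the column names metric, goal, and state_value are assumptions for illustration) and print a simple review table with room for reviewer notes:

# Rough sketch of compiling an exported data metric file into a review table.
# The file name and column names are assumptions for illustration only; the
# actual ADMA template defines the real format.
import csv

rows = []
with open("adma_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        rows.append(row)

print(f"{'Metric':<45}{'Goal':>8}{'State':>8}  Notes")
for row in rows:
    goal = row.get("goal", "")
    value = row.get("state_value", "")
    # Notes column left blank for the reviewer's trend analysis or conclusions
    print(f"{row['metric']:<45}{goal:>8}{value:>8}")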

Support to OECA's ELMS Bowling Chart Metric: OECA established a bowling chart to
track certain metrics, and one is the SRF metric on status of recommendations. Each
region established a quarterly target, which represents the number of Round 4
recommendations and subsequently Round 5 recommendations in FY24 closed in the
reporting period. ADMAs can be a useful means for tracking states' progress in
addressing problems identified in past SRF reviews. Although most SRF findings are
based on file metrics, the data metrics can be helpful in assessing state progress in fixing
problems and implementing recommendations from previous reviews. Furthermore,
annual data analysis in the form of trend data can help plan for upcoming SRF reviews
so there are fewer surprises when the reviews occur. Regions can also use the ADMAs
to determine which states might be candidates for conducting an SRF review in the
upcoming year. If the data indicates a drop-off in performance in one or more areas, the
Region may decide to prioritize the state for a review sooner rather than wait for the full
5-year cycle.


Opportunity to train new staff or to keep SRF principles at the front of mind: The annual
review can be completed by a single EPA employee; however, the team could also
expand to include new employees, or those with a specific area of expertise. This
presents an opportunity to train employees or gain a deeper understanding of data.
Since this is an annual process, reviewers keep SRF principles fresh in their mind.

D. Conducting the Annual Review (process and end products)

STEP #1: Determine date for the Annual Review

•	Option A: Prior to Data Verification (Oct - Feb)

o Data are from the latest full fiscal year, but production data can be used to improve
the data verification process if completed prior to the end of the data verification period.

o Trend data can still be analyzed.

•	Option B: After Data Verification (Feb - May)

o Latest data are from previous fiscal year, at minimum approximately five months
old.

o Data are verified by State

*It is not recommended to complete the ADMA between June and September as the data are
incomplete for the current FY.

STEP #2: Begin Annual Review by compiling Source Information

•	Download an ADMA (see Attachment #1 of Appendix L for instructions)

•	Download latest list of recommendations from the SRF Manager database
(recommendations are only for findings with an area for improvement)

•	Obtain Compliance Monitoring Strategy (CMS) Plan or Alternative CMS

•	Obtain End-of-Year Reports, if available

STEP #3: Use questions in attachment #2 of Appendix L and data metric analysis template
provided in attachment #3 of Appendix L to review data, identify potential issues, and
document follow-up.

STEP #4: Discuss results and findings with the state.


STEP #5: Upload the completed template to the SRF Manager database.

E. When Potential Issues are Identified (Follow-up)

•	Initial follow-up with the state to discuss results and correct data. Per the June 2023
Effective Partnerships Between EPA and the States in Civil Enforcement and Compliance
Assurance memo, regions should include discussions of the results of program audits in
their regular coordination meetings with states, including from the Annual Review.
These discussions should include the status of recommendations from the most recent
SRF review.

During these discussions the state should notify the Region of known data issues. Once
data quality is determined, the Region and the state should have a conversation on the
outcome of the regional assessment. Discussion may include trend analysis, outliers,
anomalies, or perceived strengths or weaknesses. If the region identifies an area with
room for improvement or an area of concern, it will coordinate with the state to
develop an action plan and track it through completion. This plan doesn't necessarily
have to be documented and there isn't a required format, but to aid in developing an
action plan, it's recommended that regions and states identify the root cause and use the
Specific, Measurable, Achievable, Relevant, and Time-bound (SMART) method when
establishing actions (an illustrative sketch of a SMART action record appears after this list).

•	Additional research and problem-solving. Based on the information in the Annual
Review as well as follow-up discussions, the Region may want to take additional steps to
investigate a potential issue, such as completing an abbreviated file review as part of
enhanced oversight:

o If a comprehensive review is scheduled in the next 18 months, the Region
may decide to select supplemental files based on areas of concern identified
in the Annual Review
o The Region could identify and review a subset of files normally done in the
comprehensive review. For example, a normal file review may include 35
files, whereas the subset for the annual analysis may be five files target for
the specific area of focus,
o If the state maintains an accessible online records inventory, or is willing to
provide records electronically, the Region has the option conducting the
review remotely.

o Recurring meetings with the state present an opportunity for scheduled
discussions to address concerns or to track progress.


•	Determine steps to resolve performance issues. The region and state should discuss,
and to the extent possible, agree on actions to resolve known performance issues. If
issues identified during the Annual Review appear to be persistent or widespread,
regional program managers may decide to elevate the issue to Regional senior
management, and if necessary, their HQ liaison, for attention and resolution.
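There is no required format for documenting such an action plan, but as an illustrative sketch, the SMART elements could be captured in a simple record like the one below; the field names and example values are hypothetical.

# Hypothetical structure for recording a SMART action item; the SRF does not
# require any particular format, and these field names and values are
# illustrative only.
from dataclasses import dataclass
from datetime import date

@dataclass
class SmartAction:
    specific: str        # what exactly will be done
    measurable: str      # how completion or progress will be measured
    achievable: str      # why the action is realistic for the state
    relevant: str        # how it addresses the identified root cause
    due_date: date       # time-bound: agreed completion date

example = SmartAction(
    specific="Enter all review-year stack test results into the national data system",
    measurable="100% of stack tests entered within 60 days of receipt",
    achievable="State has assigned staff to work the data entry backlog",
    relevant="Addresses the root cause of incomplete data identified in the ADMA",
    due_date=date(2025, 9, 30),
)
print(example)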

IV. OECA's Responsibility

OECA will review regional input to each annual review and provide feedback to ensure EPA
maintains a nationally consistent approach, and to also identify and communicate best
practices. In addition, OECA will periodically meet to discuss progress of the annual reviews in
addition to the overall status of the SRF program.

The Office of Compliance developed the SRF Recommendation Breakout dashboard in Qlik to
include annual results for data and files metrics. Regions can use the Qlik Sense tool to create
trend charts for data metrics in the ADMAs.

**A sample media-specific ADMA can be found in Appendix L.

A. File Selection

Roles and Responsibilities

Regional Coordinator: Submit File Selection List and, if relevant, state-specific Compliance
Monitoring Strategy plans, MOAs, or workplans used for inspection coverage (30 days before
on-site review)

HQ Liaison: Review material to ensure selection criteria are met: correct number of files,
categories of compliance and enforcement activity, and types of facilities (size, geographic
distribution, sector, coverage of the larger universe, coverage of areas in state-specific CMS
inspection coverage plans, etc.)

The objective of file selection is to obtain sufficient information to draw conclusions
regarding state performance under each SRF element. It is very important that reviewers
have an adequate number of files to develop supportable findings and recommendations
particularly where there is a potential concern (e.g., withdrawal petition, ADMA trends, or
previous SRF findings of performance issues).

1. File Selection Preparation

Before selecting facilities, EPA completes the DMA to identify potential problems. For
CWA reviews, EPA also completes the CWA inspection coverage table (see the CWA
Plain Language Guide for instructions). Reviewers should consider these sources of
information, combined with problems identified in previous SRF reviews and annual
DMAs, when determining what activities or sectors to focus on during the review. Areas
with no data appearing in the data metric analysis, such as zero violations or actions,
should be discussed with the state to determine whether there are any unreported
activities, available from information in the state data system, to supplement the review. In
these instances, the reviewer should ask for a list of all unreported activities and select
files randomly from the list.

After HQ reviews the DMA and file selection list, HQ recommends that EPA regions transmit
the file selection list to the state at the same time as the DMA. In addition, EPA should decide
if the state review will include any reviews of local agencies or district offices. Earlier
sections of this document deal with these types of reviews.

Reminder: DFRs change over time and are not frozen data. They are a tool to help assess
facility information by quarter. DFRs should be accessed and printed after the file
selection has been completed.

2. Determining Minimum Number of Facilities to Review

To determine the total number of files to select for your review, examine the "total
number of records returned" or activities returned found in the upper left-hand portion
of your screen in the ECHO file selection tool. For example, if the total number of
inspections, violations, enforcement actions, and penalties that occur in the review year
is 256, this would be within the range of 26-300 compliance monitoring and
enforcement activities reported at the top of the File Selection Tool "Total Number of
Records Returned." As a result, the reviewer would select 25-30 files as the table below
indicates.

For step-by-step instructions on creating a file selection list via the ECHO File Selection
Tool, see Appendix F, or visit the SRF training videos on ECHO.

Table 1: File Selection Guidelines

State-Wide Review

Number of Activities in File Selection Tool	Minimum # of Facilities or Files Selected
More than 1,000 activities reported	35 to 40 files selected
301 to 1,000 activities reported	30 to 35 files selected
26 to 300 activities reported	25 to 30 files selected
Fewer than 25 activities reported	All files selected

Review of Local Agencies & State District Offices (1)

Number of Local Agencies or State Districts	Minimum # of Facilities or Files Selected
1 agency or district	30 files selected
2 agencies or districts	30 files selected (15 per agency/district)
3 agencies or districts	30 files selected (10 per agency/district)
4 agencies or districts	30 files selected (7 per agency/district, plus 2 additional files)
5 or more agencies or districts	More than 30 files selected, with roughly even distribution across agencies/districts

(1) For these reviews, also refer to Section II, "Preparing for the SRF Review." If fewer than 30 files are available for review in the file selection tool, select all files available for review.

If data in the national data systems do not accurately reflect state activities, EPA
may need to work with the state to verify the number of activities taken by the
jurisdiction, then select a more representative group of facilities. This applies
primarily to CWA wet weather, pretreatment, and significant industrial user
universes, which may not be fully populated in ICIS-NPDES. (See Appendix C of the
CWA Plain Language Guide for additional information.)
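For illustration, the state-wide guideline in Table 1 can be read as a simple lookup from the number of activities returned by the File Selection Tool to the minimum number of files to select; the sketch below mirrors that table.

# Sketch of the Table 1 state-wide guideline: minimum files to select based on
# the number of activities returned by the ECHO File Selection Tool.
def minimum_files_statewide(activities_returned: int) -> str:
    if activities_returned > 1000:
        return "35 to 40 files"
    if activities_returned >= 301:
        return "30 to 35 files"
    if activities_returned >= 26:
        return "25 to 30 files"
    return "select all available files"

print(minimum_files_statewide(256))   # 25 to 30 files (the example in the text)
print(minimum_files_statewide(1200))  # 35 to 40 files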

3. Selecting a Representative Sample of Files
a. Basic Requirements

Important: EPA should select at least five facilities for each of the following
categories:

•	Inspections with enforcement

•	Inspections without enforcement

•	Non-SNC violations (CWA/RCRA), federally reportable violations (CAA), or secondary violations (RCRA)

•	SEVs (CWA) or stack tests failed (CAA)

•	SNCs (CWA/RCRA) or HPVs (CAA)

•	Informal enforcement actions

•	Formal enforcement actions

•	Penalties

A single facility can count toward multiple activities reviewed. For example, if a
facility has an inspection, a formal enforcement action, and a penalty, then that
facility addresses all three categories.

If there are fewer than five facilities in a category, select all available to include in the
file selection list and determine if the low number is indicative of a performance issue.

Important: Regions should then select files from a prior fiscal year(s) if fewer than
five activities are available to select in the review year, to ensure that performance
findings are based on a sufficient number of activities.

For example, if there are only four penalties available in the review year (e.g., FY24),
reviewers should examine the prior year (e.g., FY23) of file selection tool data to
select one additional penalty.
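As an illustrative sketch of this rule (the facility identifiers are made up, and the real lists would come from the File Selection Tool), a reviewer could take all review-year activities in a category and randomly backfill from the prior fiscal year when fewer than five are available:

# Illustrative sketch: ensure at least five facilities per activity category,
# backfilling randomly from the prior fiscal year when the review year is short.
# Facility identifiers are placeholders; real lists come from the File Selection Tool.
import random

MINIMUM_PER_CATEGORY = 5

def select_for_category(review_year: list[str], prior_year: list[str]) -> list[str]:
    selected = list(review_year)                     # take all review-year activities
    shortfall = MINIMUM_PER_CATEGORY - len(selected)
    if shortfall > 0 and prior_year:
        extras = random.sample(prior_year, min(shortfall, len(prior_year)))
        selected.extend(extras)                      # supplement from the prior FY
    return selected

fy24_penalties = ["FAC-001", "FAC-014", "FAC-022", "FAC-031"]   # only four available
fy23_penalties = ["FAC-044", "FAC-052", "FAC-063"]
print(select_for_category(fy24_penalties, fy23_penalties))      # five facilities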

b. Other Considerations

•	At least half of the facilities selected should have compliance monitoring
activity, and roughly half should have enforcement activity. (Enforcement
includes informal and formal actions, and penalties.) If enforcement
activities are limited, make sure you select all review year activities.

•	Selection should include a representative mix of facilities:
o With and without violations

o Different facility types based on size (major, minor, etc.), sector,

geographic location, and other factors
o With violations but no enforcement, particularly if the DMA indicated that the
state might not be taking appropriate enforcement

•	It is a good practice to include facilities with multiple inspections in a single
year, but no violations found.

•	The "Map Selected Facilities" feature allows File Selection Tool users to view
geographic distribution at a glance to determine whether the file selection provides
an even representation.

4. Supplemental File Selection

Representative file selection will usually provide a sufficient number of files to assess
performance across the necessary range of activities, facilities, and geographic areas.
However, there are a few circumstances where EPA may elect to select supplemental
files, including:

•	There is a sector that EPA is concerned about in the state — such as CAFOs or
POTWs in the NPDES program — that the representative selection did not
adequately cover.

•	A review of previous SRF reports, the review-year DMA, or the annual DMAs
show longstanding problems in a performance area not adequately covered
by the representative selection.

When selecting supplemental facilities, click their checkboxes to indicate that they

are part of the supplemental selection.

Other considerations:


•	Reviewers should generally select supplemental files randomly from the
list of facilities for the given category.

•	On rare occasions, the file review leads to new discoveries about problem areas
and the official file selection does not provide an adequate number of facilities
to make findings; in these cases, EPA may request additional files while on site.

•	Reviewers may also want to use the ECHO.gov SRF data metric facility
drilldown screen for the issue requiring additional file review. For
example, if you are interested in facilities with formal actions not taken
in a timely manner, find the relevant SRF metric in the ECHO.gov data
metric query and click on the metric number. This will bring up a list of
facilities. Then go back into the file selection tool and randomly select
some of these.

5. Transmit File Selection List

Upon completing file selection, download an Excel file listing selected facilities by
clicking the Download button and then clicking the Download Selected button. The
region should send the list to HQ for review in advance of the on-site file review.
This will allow the Liaison to provide valuable input on the quality of the list.
Following HQ review, the region should transmit the list to the state agency at
least two weeks before the file review to allow the state time to pull files.

At this time, the reviewers should also print the DFRs in landscape to ensure that all
quarters are viewable. It is much easier to pull them at the end of file selection than
later. The File Selection Tool has a Print Selected DFRs button for this purpose.
Reminder: The DFR is production data and can change as data updates are
entered. DFRs are a reference tool to view detailed activities by quarter. This tool is to
be used in conjunction with the frozen data, which is the main source for determining
the accuracy of data.

B. File Review

Roles and Responsibilities

Regional Coordinator: Submit File Review Worksheet (30 days after file review)

HQ Liaison: Review material to ensure completion and accuracy of metric calculations,
initial findings, and comments

1. File Review Preparation

After selecting files, the review team should continue preparing for the file review.


*See Appendix G for a checklist of all essential materials to have on hand during the

review.


a.	Print ECHO.gov Detailed Facility Reports (DFRs)

If you did not print DFRs for all the facilities on the file selection list during file
selection, pull them by entering facility ID numbers into the facility ID search on
ECHO.gov.

b.	Print File Review Checklists

Download the CAA, CWA, or RCRA file review checklists from the SRF
documentation and guidance page in ECHO.gov or the EPA Manager database. Fill
in the information requested on pp. 1-2 based upon information on your detailed
facility report to save time during the on-site file review. Print or save an electronic
copy for each facility to be reviewed and clip it to the facility's DFR.

2.	Coordination with State

For on-site reviews:

If you need access to the state's data system, or assistance navigating the data system,
ask the state for assistance.

Contact the state the week before the on-site review to confirm that:

•	The state has pulled all selected files; if the state was unable to find some files, select
additional files to ensure minimum file selection requirements are met

•	The state has reserved a room for EPA to review files

•	The files contain all documentation needed to complete the review

•	The state has designated a point-of-contact to offer assistance during the review

•	The appropriate managers and staff will be available for the entrance and exit meetings

Important: During the on-site file review, it is vital that reviewers take quality notes
or, if allowed, scan or copy key sections of files or documents, particularly where the
situation seems complex or unclear. This will ensure that the necessary information is
available to explain findings, support recommendation development when drafting
the report, or discuss the preliminary findings with the state or local agency.

3.	Conducting the File Review

a. Conducting Reviews Remotely

If a state has all files available electronically, regions may choose to conduct the file
review remotely. Inspection reports and formal enforcement actions are available

24


-------
SRF Reviewer's Guide - Round 5

on some state web sites. It is a good practice to determine whether compliance
determinations following inspections, informal enforcement actions, penalty
calculations, and justification for changing penalties are, or can be made available
electronically. If some or all these data are not available remotely, an on-site file
review will be necessary. Consider whether state public disclosure laws or internal
policies make it necessary to supplement electronic reviews with on-site file review
and discussion with state staff.

Appendix K has more details on organizing a successful remote review.

b.	Entrance Conference

Regions and states often find it helpful to hold an entrance conference. Appropriate
topics include:

•	A brief discussion of the new Round 5 process;

•	SRF DMA results and how those compare to past ADMAs, including CWA
CMS metrics, to indicate potential performance issues;

•	File review process;

•	Confirming availability of the state's point-of-contact during the review;

•	Expected timeline for completion of review and tentative date and time of exit
conference; and

•	Proposed topics to be covered at exit meeting, such as preliminary findings from
the review, requests for additional materials, and the process for drafting and
finalizing the report.

c.	File Review

Use file review checklists and DFRs to review facility files and refer to the Plain
Language Guides and underlying EPA policy and guidance for questions about specific
metrics.

There may be activity from a previous or subsequent year linked to activity in the
year reviewed. If so, EPA should review these activities. For example, Region 11 is
conducting a review of activity in FY 2018 in one of its states. One of the facilities
selected for file review had an enforcement action during FY 2016. This enforcement
action was in response to violations found during an inspection in FY 2015. Because
they are directly related, Region 11 would review the inspection, violation
determination, and enforcement action.

Another facility had an inspection in FY 2018 that resulted in a SNC determination
and formal enforcement in FY 2019. Again, Region 11 would review the inspection,
violation determination, and enforcement action.

If a facility has multiple inspections or enforcement actions during the review
period, review all activities that take place in the review year and record responses
for the same question on a separate row of the file review spreadsheet. The file
review checklists contain supplemental sections for multiple activities, and the file
review spreadsheet contains instructions for capturing each action.

Use the File Review Worksheet to calculate metrics and make initial findings the
recommended finding levels shown on p.29 The Worksheet automatically tabulates
metric values based on the "Y" and "N" responses entered for the facilities. For N/A
responses, you may leave them blank or enter N/A. (To prevent data entry and
calculation errors, the Worksheet only allows responses of Y, N, N/A, and blank.) Do
not adjust the formulas in the Worksheet. It is a good practice to enter checklist
responses in the file review spreadsheet daily to ensure that all appropriate
questions were answered while the review team still has access to the files. Use the
far-right hand column in the table on p.l of the file review checklist as a guide to the
specific questions that should be answered for each type of activity reviewed.
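To illustrate the tabulation (this is only a conceptual sketch in Python, not the Worksheet's
actual spreadsheet formulas), a file metric value can be thought of as the share of "Y"
responses among the applicable "Y"/"N" answers, with N/A and blank entries excluded:

    # Conceptual sketch only; the File Review Worksheet performs this
    # calculation with its own built-in formulas.
    def tabulate_metric(responses):
        """Percent of "Y" responses among the applicable Y/N answers."""
        answered = [r for r in responses if r in ("Y", "N")]
        if not answered:
            return None  # no applicable activities for this metric
        return 100 * answered.count("Y") / len(answered)

    # Responses entered for one metric across the selected facilities
    print(tabulate_metric(["Y", "Y", "N", "N/A", "", "Y"]))  # 75.0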

4.	Developing Preliminary Findings

Once you have entered all responses in the Worksheet, click on the Initial Findings
tab. Metric values will automatically populate in the Initial Findings tab based on
values entered in the file review Worksheet. Compare the state performance result
to the national goal for each metric to establish preliminary file review findings. You
may do this prior to the exit conference, time permitting.

Issues identified as State Attention or State Improvement in the DMA generally
represent a performance issue of one kind or another (see definitions of findings on
p.29). For example, if EPA made a State Improvement initial finding in the DMA for
not inspecting enough major facilities, but state data confirms that the agency
exceeded its inspection commitment, it would appear that the agency was not
entering all inspections in the national data system. In this case, the state would
receive findings of State Improvement under Element 1 (Data) and Meets or Exceeds
Expectations under Element 2 (Inspections).

Reviewers may revise these findings and recommendations later based on
additional research and analysis.

5.	Exit Conference

EPA should hold an exit conference with state agency personnel following the file
review. This conference may occur on site immediately following the review or at
a meeting or conference call as soon as possible after the review.

a. Discussing Preliminary Findings and Potential Recommendations

26


-------
SRF Reviewer's Guide - Round 5

EPA may begin the exit conference by telling the state that it has completed the
review and has developed preliminary findings and, if possible, recommendations.
EPA should stress that these are subject to change based on further analysis and
discussions with HQ. EPA should also discuss areas where state performance is
strong.

When discussing preliminary findings for Areas for Improvement, EPA should
provide reasons for these findings, and, if possible, potential recommendations to
improve performance. This should be an opportunity for dialogue, particularly
when EPA is unsure what is causing a particular problem, or how to improve it. The
state may have additional reasons for low performance, and it may have helpful
ideas for how to improve. EPA should note these and add them to the report as
appropriate.

When problems noted in prior SRF reviews recur, ask the state why prior
recommendations did not solve the problem, and what the state believes it can do
to improve performance. If an action was completed that did not solve the problem,
recommend a different action.

EPA may ask the state or local agency when they plan to begin correcting the issue,
and what they need in terms of assistance, so a realistic due date for a proposed
recommendation can be included in the report.

Finally, EPA should discuss the process for drafting the report, reaching agreement
with HQ on findings and recommendations, and sharing a draft with the state for
its comment.

V. Drafting and Finalizing the SRF Report
A. Developing the First Draft
Roles and responsibilities

Regional Coordinator: Develop and submit a draft report to HQ liaison
HQ Liaison: Review draft report for completeness, accuracy, and integrity

The draft report represents the main product of the review, which when finalized is made
available to the public on the SRF web site. Drafting of the report typically begins after the
file review, though some reviewers may wish to begin entering administrative information and
data metric values, along with preliminary findings, prior to that point.

Regions have the flexibility to decide who is responsible for drafting the report, or sections
of the report, whether that be the SRF coordinator, program reviewers or some

27


-------
SRF Reviewer's Guide - Round 5

combination. Typically, the coordinator is ultimately the one responsible for ensuring that
the report is completed properly and on time.

In drafting the report, reviewers will compile the data and file metrics, along with any other
relevant information gathered during the review, to make findings on a program's
performance under each element (i.e., data, inspections, etc.). To help ensure consistency, a
metric value range generally corresponds to one of three finding levels unless there are
justifiable reasons otherwise (See Table 2 on page 29). Wherever findings of area for
improvement are made, recommendations for corrective action must be included, which to
the degree possible, should be developed in coordination with the agency reviewed.

Draft reports are due to the HQ Liaison by the end of the federal fiscal year. If a Region
needs additional time to complete the draft report, reviewers or SRF Coordinators should
contact their liaison and provide them with an expected submission date.

Important: All Round 5 SRF reports will be drafted in the program's SRF Manager, an Oracle
APEX data system launched in January 2018. The database is a one-stop system that allows
coordinators, reviewers, and liaisons to access key guidance documents, draft and review
SRF reports, and track recommendations until completion.

For more information on how to use SRF Manager in developing a draft report, see the
User's Guide posted in the database.

1. Administrative Information

Before drafting the report, reviewers should provide the following information in the
Administrative Information view of the SRF Manager's Database:

•	Region

•	State

•	Agency Reviewed: The implementing agency (EPA, State, Local). If state district
offices are being reviewed, the state is the implementing agency. If a local agency is being
reviewed, the local agency is the implementing agency. All state district offices should be
combined into a single report, while separate reports should be created for each
local agency.

•	Round

•	Review Year: Federal Fiscal Year (FFY) during which the reviewed activities were
conducted.

•	Regional Coordinator

•	HQ Liaison

•	Report Version (final or draft)

•	Report Author

•	File Review: Dates that the file review was conducted and contact info of media

28


-------
SRF Reviewer's Guide - Round 5

program lead

2. Performance Findings

Findings are the reviewers' determinations on program performance that make up the
main content of the report. There should be at least one finding per element, though
there are typically multiple findings within an element.

a.	Finding Number (up to 3 findings per element: Meets or Exceeds Expectations, Area
for Attention, Area for Improvement)

•	Each element in the report (data, inspections, etc.) has metrics associated with it
and therefore will receive at least one finding. For each element, start with finding
1 and continue sequentially up to a maximum of five findings.

e.g., Element 1 = Data: Finding 1-1, Finding 1-2, and Finding 1-3

b.	Finding Level

•	Review the source information that will be used to make findings:
o Data metrics from the DMA.

o File metrics from the file review spreadsheet.
o Other information, such as ADMA performance trends.

Important: Reviewers should use the national goal of the metric, not the
national average, for determining a finding level. Averages should be used to
provide context to the findings.

•	Choose a final finding level. The table below provides a definition of each finding
level and offers suggested metric value ranges for help in deciding on a finding
level. These value ranges are simply a guide in selecting an appropriate finding
level. Other factors may be considered in choosing an appropriate level, such as
the universe size of the metric or whether the issue has recurred across several
SRF rounds.

Important: Reviewers must include all metrics, including review indicator metrics, but no
finding levels are to be made on review indicator metrics. They can be used in the
explanation section to communicate factors that were considered for the goal metrics or why
additional files were selected.

See Appendix J for other factors to consider when developing finding levels.

29


-------
SRF Reviewer's Guide - Round 5

Table 2: Finding Levels

Suggested Metric Value Ranges	Finding Level

~85-100%
Meets or Exceeds Expectations: The base level of performance is met, and no deficiencies are
identified, or the program is performing above national expectations.

~71-84%
Area for Attention: An activity, process, or policy that one or more SRF metrics show as a
minor problem. Where appropriate, the state should correct the issue without additional EPA
oversight. EPA may make suggestions to improve performance, but it will not monitor these
suggestions for completion between SRF reviews.

~70% and below
Area for Improvement: An activity, process, or policy that one or more SRF metrics under a
specific element show as a significant problem that the agency is required to address.
Recommended activities to correct the issues should be included in the report and must have
well-defined timelines and milestones for completion, and, if possible, should address root
causes. EPA will monitor recommendations for completion between SRF reviews and provide
any necessary updates in the SRF Manager database.

Important: Group metrics within an element under the same finding level. If metric values
within an element lead to the same finding level, create a single finding, and include all metrics
under that finding. If metrics within an element lead to different finding levels, create multiple
findings, grouping only those metrics that lead to the same finding level.
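The sketch below (a hypothetical Python illustration, not part of the SRF Manager or the
Worksheet) shows how the suggested ranges in Table 2 can be applied to metric values and how
metrics within an element would be grouped by the finding level they suggest; the metric IDs
and values are illustrative only, and other factors may justify a different level:

    # Hypothetical illustration of the suggested ranges; other factors
    # (universe size, recurrence across rounds) may justify a different level.
    from collections import defaultdict

    def suggested_level(value_pct):
        if value_pct >= 85:
            return "Meets or Exceeds Expectations"
        if value_pct >= 71:
            return "Area for Attention"
        return "Area for Improvement"

    def group_by_level(element_metrics):
        grouped = defaultdict(list)
        for metric_id, value in element_metrics.items():
            grouped[suggested_level(value)].append(metric_id)
        return dict(grouped)

    # Illustrative metric values for one element
    print(group_by_level({"10a": 13, "10b": 33, "9a": 92}))
    # {'Area for Improvement': ['10a', '10b'], 'Meets or Exceeds Expectations': ['9a']}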

c.	Summary

•	Provide 1-2 sentences describing the specific programmatic area(s) reviewed and
conclusions on performance. Reviewers should typically try to use the language
of the metric on which the finding is based as a guide in drafting the summary
statements.

For example:

o Compliance determinations are generally accurate in cases where there is

sufficient documentation (Meets or Exceeds);
o Inspection reports occasionally lack information sufficient to determine
compliance and are not consistently completed in a timely manner (Area for
Attention);

o Enforcement responses do not consistently address violations in an
appropriate manner (Area for Improvement)

d.	Explanation

•	Describe the program's performance in more detail, providing an explanation for


30


-------
SRF Reviewer's Guide - Round 5

how and why the finding level was chosen

•	If the finding is area for attention: Reviewers may wish to include a suggestion to
the state/local agency on how to improve the program or alleviate a concern at
the end of the explanation section, though this will not be tracked as an official
recommendation in the database.

•	If the finding is area for improvement: Define the scope of the issue and the cause(s), or
potential cause(s), to the best degree possible.

Important: Determine if the performance issue is recurring. Check to see if the same
issue was identified in previous SRF rounds. If so, explain as best as possible, why the
issue persists or resurfaced. Also, make sure to check the "recurring issue" box in the
findings section of the SRF Manager Database.

3. Recommendations

Recommendations are required whenever there is a finding of area for improvement. The
purpose of recommendations is to ensure that any significant performance issues
identified in the review receive a response that either resolves the issue or leads to
substantial and consistent progress towards a resolution of the issue (a determination
made using best professional judgement).

a. Writing Effective Recommendations

•	All recommendations must contain a description of the specific actions that will
be taken to address the issue identified, the responsible party, and well-defined
timelines or due dates for completion (e.g., 90 days from the completion of the
final report). To the greatest extent possible, recommendations should attempt
to address the full scope and underlying cause(s) of the performance issue.

•	When writing recommendations, reviewers may find it helpful to use the
following SMART checklist to ensure the recommendation includes the
required components.

SMART Checklist:

~	Specific - description of specific actions that will be taken and who will take
them.

~	Measurable - the actions can be measured either quantitatively or qualitatively
and should indicate what evidence is needed to measure completion.

~	Achievable - the actions are within the means of the implementing agency to
complete.

31


-------
SRF Reviewer's Guide - Round 5

~	Results-oriented - completion of the actions should result in improved
outcomes, i.e., the issue is addressed, or meaningful and consistent progress is
made towards that end.

~	Time-bound - actions include timelines or due dates that create a practical
sense of urgency.

Important: If the recommendation is addressing a recurring performance issue,
or one identified in the previous round, the recommendation should represent
an escalated response. If the issue was resolved but resurfaced, the EPA might
consider a longer period of monitoring. Examples of escalated action can be found
in the Agency's National Strategy for Improving Oversight of State Enforcement
Performance found on the ECHO SRF web page and in the SRF Manager database
guidance section.

b. Recommendations vs. Milestones (choose an option)

Important: In writing recommendations for a finding of area for improvement,
reviewers can develop one recommendation with multiple milestones/due
dates, or create several recommendations based on each milestone. There may
be no difference in deliverables or actions between recommendations and
milestones; the only difference is how regions would like to monitor and report out
on recommendations during the post review monitoring process. Here are the two
options:

•	Draft a single recommendation that has multiple milestones (deliverables or
actions) but a single due date. The due date will typically mark when the final
milestone is to be completed.

•	Draft multiple recommendations, each with its own due date, meaning there
would be multiple recommendations and multiple due dates associated with
that single finding.

For example, a recommendation may include the following deliverable or
action milestones: "1) The state should complete ICIS data entry training by
July 31, 2019. 2) The state should enter all SEVs into ICIS by Dec. 31, 2019. 3)
The state should complete an SOP for entering SEVs into ICIS by March 31,
2020." Each action or deliverable would be entered in the SRF Manager
database as a separate recommendation (no. 1, no. 2, no. 3) with a single due
date for each.

4. Executive Summary


-------
SRF Reviewer's Guide - Round 5

As you enter data in the SRF Manager, the strengths and priority issues buttons are
available to ease the drafting of the executive summary by automatically inserting that
text into it. The Summary should convey the main
findings from the review, namely the most notable performance successes and challenges
of a given program. In other words, readers, especially management, should be able to
turn to the Executive Summary to get a sense of what parts of a program are being well
implemented, and what parts require additional attention.

a.	Areas of Strong Performance (3-5 findings):

•	Review all Meets-or-Exceeds findings.

•	Identify up to five findings that reflect parts of the program that are being
implemented at a high or very high level.

•	Include the finding summary(s) as written or re-write to better encapsulate the
finding.

•	If no Areas of Strong Performance are identified, indicate this by writing "No Areas
of Strong Performance were identified."

b.	Priority Issues to Address (3-5 findings):

•	Review all Area for Improvement findings.

•	Identify up to five findings that reflect parts of the program that are being
implemented at a low or very low level.

•	Include the finding summary(s) as written or re-write it to better encapsulate the
finding.

•	If no Priority Issues to Address are identified, indicate this by writing "No
Priority Issues to Address were identified."

c.	Other considerations (optional)

•	Include discussions of environmental justice or climate change in the executive
summary

•	Other areas reviewed that are not captured in the report

d.	Summary Table

•	Following the highlights of the current review, the Executive Summary should
include a brief overview of performance issues from past reviews.

•	The SRF Manager has a recurring-issue checkbox that, once checked, will generate the
table.

•	The overview should indicate whether issues identified in previous reviews have
been resolved or continue to be a problem.

•	The SRF Manager will create a table that includes the finding levels for each issue
associated with an SRF metric, with columns for each round of review.

•	The table below is an example:

33


-------
SRF Reviewer's Guide - Round 5

Metric: 10b - Appropriate enforcement taken to address violations [GOAL]

Round 4 Finding Level: Area for Improvement

Round 5 Finding Level: Area for Improvement

B. Finalizing the Report

1.	HQ Review of Initial Draft Report

Once Regional reviewers have completed developing a draft report in the SRF
Manager database, the Regional Coordinator should notify the HQ Liaison that the
initial draft is complete. The Liaison will begin a completeness check to make sure all
the necessary information is in the draft and all the required documents are uploaded
to the database. If everything is complete, the Liaison and HQ program staff will begin
their review and provide their comments to the Regional Coordinator within 15
working days.

2.	HQ Review of Subsequent Draft SRF Reports

The process and criteria for substantive reviews of revised draft reports will be the
same as for first-draft reports unless the HQ Liaison elevates the revised draft to
management, in which case management will review and determine how to resolve
remaining issues.

3.	State Comment on Draft SRF Report

Important: The recommended approach for state review of the draft report is for
the EPA region and HQ to reach agreement on a draft report before the EPA region
shares the report with the state. This is an effort to reduce transaction costs and make
sure EPA speaks to outside parties with one voice. Experience has shown that reports
shared with the state first result in additional reviews by the state and HQ and take
longer to finalize.

Once the state receives the report, it has 30 calendar days to make comments. Once
the state has reviewed the report and the Region has made all the necessary revisions,
the EPA region should send the report back to the HQ Liaison. The EPA region must
notify the Liaison if it made any significant changes to the report based on state
comments.

4. Finalizing the Report in the SRF Manager Database and Posting to SRF Web Site

Once the state has reviewed the report and HQ and the EPA region reach
agreement on its content, the Region will make all final edits in the SRF Manager

34


-------
SRF Reviewer's Guide - Round 5

database and select the Final Report option in the Administrative Information view
of the draft report section. This will transfer the report into the Final Report view
and the document will appear in the table. The HQ Liaison will review the final
reports and will notify the EPA region in writing that the report is final. The report
is not final until the EPA region receives this written notification from HQ. Final
reports are typically due by the end of the calendar year. The Liaison will publish
the final report, along with the review recommendations, on the EPA SRF web
site and notify the Regional Coordinator when the document will be available to
the public. Contact the Liaison if a final report needs to be updated.

VI. Post-Review Recommendation Monitoring and Closeout

Roles and Responsibilities

Regional Coordinator: Monitor recommendation implementation to make sure progress is
being made, support is available where needed, and the completion of a recommendation is
verified.

HQ Liaison: Monitor the status of recommendations, ensure that completion verification meets all
appropriate criteria, and elevate issues that may require a national or upper-management
response.

Following the publication of the final report, EPA is responsible for ensuring that any
recommendations resulting from the review are fully implemented so that performance issues
are resolved, or meaningful and consistent progress is made towards that end.

The SRF Manager database is a key tool for monitoring recommendations. Once the report is
finalized in the system, all report recommendations can be viewed in the Findings and
Recommendations section, where reviewers can sort and filter recommendations by various
categories including round, region, state, finding number and level, summary, explanation,
recommendation text, due date, and status. Reviewers are encouraged to check on the status of
outstanding recommendations on at least a quarterly basis and coordinate with the
implementing program to complete them prior to the due date.

A. Annual Recommendation Inventory

At the beginning of each fiscal year, regional coordinators should conduct an inventory of
all recommendations in the SRF Manager to assess their status (completed, ongoing or
overdue) and which ones will be coming due in the upcoming year. For those that are
upcoming, and especially those that are overdue, review the content of the
recommendation and prepare to follow up with the agency to ensure they are completed.
Regions are encouraged to discuss the status of any ongoing or overdue recommendations
with their states as part of their communication of their annual data metric analysis
(ADMA).

35


-------
SRF Reviewer's Guide - Round 5

B. Monitoring Ongoing Recommendations (known as "working" in previous rounds)

For ongoing recommendations that have not reached their due dates, reviewers are advised
not to wait until a recommendation is due to check in with the responsible agency on the
status of its implementation.

For example, if a recommendation deliverable or action is due in 90 days from the report
publication date, the reviewer should contact the agency at least 60-90 days in advance to
inquire on what progress has been made in implementing the recommendation. As a
suggested best practice, timelines for inquiry are included in the table below.

Recommendation Due Date		Suggested Initial Check-In Date

90 days from publication		30 days from due date
180 days from publication		120 days from due date
240 days from publication		120 days from due date
365 days from publication		180 days from due date
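As a purely illustrative sketch (assuming the check-in offsets in the table are counted back
from the recommendation due date, and using hypothetical names), the suggested initial
check-in date could be computed as follows:

    # Illustrative only; assumes each suggested check-in offset is counted
    # back from the recommendation due date.
    from datetime import date, timedelta

    CHECK_IN_DAYS_BEFORE_DUE = {90: 30, 180: 120, 240: 120, 365: 180}

    def initial_check_in(publication: date, days_until_due: int) -> date:
        due_date = publication + timedelta(days=days_until_due)
        return due_date - timedelta(days=CHECK_IN_DAYS_BEFORE_DUE[days_until_due])

    # Report published January 1; recommendation due 180 days later
    print(initial_check_in(date(2025, 1, 1), 180))  # 2025-03-02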

During check-ins, reviewers should try to determine whether the responsible agency is on track or
having trouble implementing the recommendation deliverable or action. If EPA and the
responsible agency both determine that the agency will not be able to meet the due date,
they should try to determine the cause of the delay and what actions EPA can take to help
the state or local agency resolve the performance issue.

If it is unlikely that the issue can be resolved before the original due date, each party will try
to reach an agreement on a new due date. Once a new date is determined, the Regional
Coordinator should request a change in the due date in the SRF Manager Database. The HQ
Liaison will review the request and update the due date, if appropriate.

C. Prioritizing and Elevating Overdue Recommendations

Overdue recommendations are those that have not been completed by the due date
committed to in the final report. There might be many reasons why a recommendation
becomes overdue - staff turnover or a lack of staff, state unwillingness, the issue is
considered a low-priority, or it is simply a complex and intractable issue to resolve. The
expectation, however, is that all recommendations are to be completed, unless upon
elevation, senior management determines that the issue cannot be solved or is no longer
relevant.

36


-------
SRF Reviewer's Guide - Round 5

Reviewers should prioritize the monitoring of overdue recommendations and develop a
strategy for working with the appropriate agencies to resolve them. Most pressing to resolve
are the subset of overdue recommendations that address what reviewers determine to be
"significant and recurring issues" and have been unresolved for an extended period (e.g.,
greater than one year overdue). For these types of recommendations, Regions should
implement an elevation process for resolution by senior management either at the Regional
or HQ level in accordance with the Process for Elevation of Issues outlined in the Memo on
Enhancing Effective Partnerships Between the EPA and the States in Civil Enforcement and
Compliance Assurance.

D. Verifying Recommendation Completion

For a recommendation to be considered complete, EPA verifies that all parts of the
recommendation have been carried out in full and/or the underlying performance issue(s)
has been either resolved, or substantial and consistent progress has been made towards a
resolution.

Confirmation may require EPA to review data or a sample of inspection reports or
enforcement actions to determine that an issue has been resolved. This may or may not be
explicitly spelled out in the recommendation itself. For the most significant issues, EPA will
want to monitor implementation of a recommendation for a longer period and see sustained
improvement over several quarters before closing out the recommendation.

Documentation to demonstrate verification may differ depending on the type of
performance issue identified in the report. The list below includes some common practices
and documents for verifying specific performance issues:

1. Policies, Guidance, and Procedures

• Development or revision of a new or existing document, such as a response policy,
inspection report format, checklist or template, standard operating procedure,
penalty calculation spreadsheet, or data entry procedures.



Verification flow: Does the recommendation require development or revision of a document?
→ Review the document and provide feedback to the state/local agency, if necessary → Attach
the final approved document in the SRF Manager database.

2. Incomplete, Inaccurate, and Untimely Entry of Data

•	Entry of missing data, such as facility info, universe, inspections, violations,
enforcement actions, or penalty counts and amounts under file metric 2b

•	Resolving technical issues such as translating data from state to federal databases
(i.e., Electronic Data Transfers (EDT))

37


-------
SRF Reviewer's Guide - Round 5

• Revising incorrectly entered data such as inaccurate dates, SEV codes, enforcement
types, penalty dollar amounts

Verification flow: Attach the data download to the SRF Manager database.



3. Insufficient knowledge, skills, and abilities

• Providing training and technical assistance on how to accurately enter data, which
data are required to be entered, when data are required to be entered,
identification of violations and discerning SNC/HPV/FRV from other types of
violations, how to calculate penalties (e.g., economic benefit)

Verification flow: Does the recommendation require training or joint inspections? → Record
the number of training attendees, the date, and the agenda/syllabus, or the number of
inspections conducted/reports reviewed.

4. Inadequate Inspection Reports and Documentation of Penalty Calculations

•	Inspection report quality (e.g., facility information, dates, narratives, checklist,
documentation, violation identification)

•	Penalty documentation (e.g., economic benefit and gravity, changes to penalty
amounts, penalty collection)



Verification flow: Does the recommendation require review of inspection reports or penalty
documentation? → Review reports or documents from selected files → Include a file review
checklist indicating the number of reports or files reviewed that met requirements.



5. Inadequate SNC-HPV determination, Return to Compliance, and Appropriate and Timely
Enforcement Action

•	Making appropriate HPV-SNC determinations of violations

•	Taking appropriate and timely informal or formal action in response to violations.

38


-------
SRF Reviewer's Guide - Round 5



Verification flow: Does the recommendation require review of violations or enforcement
actions? → Review reports or documents from selected files → Include a file review checklist
indicating the number of reports or calculations reviewed that met requirements.

Regions should enter all the necessary verification information in the SRF Manager, after
which they will need to notify their HQ Liaison to request a closeout of the
recommendation. The Liaison will review the information in the SRF Manager and, if all the
verification criteria are met, approve the request and close out the
recommendation.

In cases where the verification lacks sufficient justification or documentation, the Liaison
will work with the Region to try to reach an agreement. If relevant documentation or
information cannot be obtained, an explanation should be provided. If both parties are
unable to reach agreement, the Liaison will elevate the issue to their management.

39


-------
SRF Reviewer's Guide - Round 5

Appendix A: SRF Key Information

•	Reviewer: EPA Office of Enforcement and Compliance Assurance and 10 Regional Offices

•	Reviewed: Local, state, and EPA direct implementation (DI) compliance monitoring and enforcement programs

•	Frequency: At least once every five years

•	Current Round: Round 5 (FY 2024-2028)

•	Statutes Covered:

o Clean Water Act (CWA) - National Pollutant Discharge Elimination System (NPDES)
o Clean Air Act (CAA) - Title V

o Resource Conservation and Recovery Act (RCRA) - Subtitle C

•	Source Information:

o Data Metrics -Verified compliance monitoring and enforcement data in the national data
systems

o File Metrics - Facility files that contain compliance monitoring and enforcement activity
o Other - Non-review year data or multi-year data trends; review of previous SRF reports;
Compliance Monitoring Strategies, MOUs, and performance agreements; follow-up
conversations with agency personnel; and additional information collected to determine
an issue's severity and root causes

•	Program Elements Covered:

o Data - completeness, accuracy, and timeliness of data entry into national data systems
o Inspections - meeting inspection and coverage commitments, inspection report quality,
and report timeliness

o Violations - identification of violations, accuracy of compliance determinations, and

determination of significant noncompliance (SNC) or high priority violators (HPV)
o Enforcement - timeliness, appropriateness, returning facilities to compliance
o Penalties - calculation including gravity and economic benefit components, assessment,
and collection

•	Finding Levels:

o Meets or Exceeds Expectations: This rating describes a situation where the base level is
met, and no performance deficiency is identified, or a state performs above base program
expectations

o Area for Attention: An activity, process, or policy that one or more SRF metrics show
as a minor problem. Where appropriate, the state should correct the issue without
additional EPA oversight.
o Area for Improvement: An activity, process, or policy that one or more SRF metrics under
a specific element show as a significant problem that the agency is required to address.
Recommended activities to correct the issues should be included in the report and must
have well-defined timelines and milestones for completion, and, if possible, should

40


-------
SRF Reviewer's Guide - Round 5

address root causes. EPA will monitor recommendations for completion between SRF
reviews in the SRF Manager database.

41


-------
SRF Reviewer's Guide - Round 5

Appendix B: Data Verification

Data Verification typically occurs every year from November to February. The following steps
should be taken by the data stewards for all state delegated and EPA Direct Implementation
programs:

•	Log into the government-only area of the Enforcement and Compliance History Online
(ECHO) website with your EPA web application ID and password.

•	Click the "Oversight" box and the "State Review Framework" link (direct link after log in is
https://echo.epa.gov/oversight/state-review-framework).

•	Click the ECHO.gov SRF Data Verification tab and submit a search for the state or local
agency.

•	Review the facility and activity counts on the search results screen to ensure their
accuracy. If a number appears inaccurate, click on it to view the list of
facilities or activities behind it.

•	Make any necessary corrections in the national data system of record. ECHO.gov will reflect
corrections after the next weekly data refresh. See the right-hand side of the ECHO.gov
Data Verification page for final refresh and anticipated freeze dates.

•	States and EPA should correct the national data systems before the final ECHO.gov
refresh. This allows for a final review prior to the data verification deadline, which is
typically in February. Click the Submit Verification button at the bottom of the results
page to complete verification.

•	When a state finds data inaccuracies that it cannot correct, it should consult with EPA
regional data stewards to develop caveats to explain why data are inaccurate. EPA will
post these caveats on ECHO.


-------
SRF Reviewer's Guide - Round 5

Appendix C: Regional and Headquarters Coordination During the SRF
Review

Track 1: Scoping Meeting Emphasis

Initial communication and concurrence occurs between the region and HQ in the form of a
preliminary scoping meeting.



Step 1: Initial Meeting

The initial meeting is the EPA region's presentation to HQ of its comprehensive plan for the
review. This also provides a forum for discussing how the SRF process can address state
performance issues. Regional managers and/or SRF coordinators and liaison should participate.

Before the meeting, regions should provide HQ with the following no less than two days in
advance, and be prepared to discuss:

•	If applicable, a proposal for reviewing selected district offices, or local agencies (see
Section II, "Preparing for the Review," above)

•	DMA results, the state-specific NPDES Compliance Monitoring Strategy (CMS) plan, and
the CWA inspection coverage table

•	Proposed file selection lists

•	An estimate of the draft report submission date

HQ and the EPA region may schedule follow-up discussions to address any outstanding issues
and finalize a review plan. HQ and the region should document final decisions.

As another recommended but optional step, regions should provide file review results to their
HQ SRF liaison for review after the on-site or electronic file review. HQ will provide comments
within five working days.

Step 2: Draft and Final Report

The regional SRF coordinator provides a completed draft report to the HQ SRF liaison with
all supporting SRF documents uploaded in the SRF Manager. HQ will provide comments
within 15 working days, if all SRF documents are provided. See the "Finalizing Report" section
below for additional information.

43


-------
SRF Reviewer's Guide - Round 5



Track 2: Periodic Check-In Emphasis

Periodic communication and concurrence between the regional SRF coordinator and HQ SRF
liaison occurs at multiple steps in the process.

Step 1: Determining Scope of Review

This step applies when EPA is reviewing local agencies or select district offices in a state.
Section II in the SRF Reviewer's Guide (starts on p. 8) provides relevant elaboration.

Step 2: SRF Data Metric Analyses

Forward copies of SRF data metric analyses to the HQ SRF liaison via SRF Manager database.
For CWA SRF reviews, include any CWA state specific CMS plans. The liaison will review and
provide feedback within five working days.

Step 3: File Selection Lists

Forward file selection lists to the HQ SRF liaison before sending them to the state. The liaison
will review to ensure that:

•	The region selected a sufficient number of facilities

•	The region selected the recommended number of facilities for each element (inspections,
violations, enforcement, etc.)

•	The facilities selected are sufficiently representative of the full universe in terms
of major/minor designation, geography, CMS commitments, and sector

Regions may wish to send file selection lists and data metric analyses at the same time. The
HQ SRF liaison will review and send feedback to the region within five working days.

Steps 4-5: File Review Results and Draft Report Preparation

Once the file review is completed, regions should forward copies of the file review worksheet
to their liaison via the SRF Manager. A complete tally of the file metrics and the region's initial
findings must be included (including the comments). The liaison will provide informal
comments to the region within five working days, which the region can incorporate into the
worksheets.

The Regional SRF Coordinator provides a completed draft report, file review spreadsheet, and all
relevant documents to the OC SRF liaison via the SRF Manager database. HQ will provide comments
within 15 working days.

Step 6: Finalizing Report

44


-------
SRF Reviewer's Guide - Round 5

The regional SRF coordinator provides a completed draft report to the HQ SRF liaison. See the
"Finalizing Report" section on page 25 of the Reviewer's Guide for additional guidance.

Optional Steps:

•	Review calendar: Develop milestones for completing each step in the review
process and forward them to HQ SRF liaison.

•	Kickoff letter: When sending a kickoff letter to the state, also send a copy to the HQ
SRF liaison.

45


-------
SRF Reviewer's Guide - Round 5

Appendix D: Kick-off Letter Template

Date

Name

Title

Agency

Address

City/State/Zip

Re: State Review Framework - Review of Regional Implementation of [Insert State] [Insert
Media] Act Enforcement Program

Dear [Insert Name],

As an integral part of our U.S. Environmental Protection Agency - [Insert State]
partnership, EPA [Insert number] will be conducting a State Review Framework (SRF)
review of the EPA Region [Insert number] [Insert Media] program for [Insert State] this
year. We will review inspection and enforcement activity from fiscal year [Insert Year] and
supplemental activities from previous years, as necessary.

This review will assess whether the region is implementing the program and meeting
minimum performance standards laid out in EPA policy and regulations. The overarching
goal of such reviews is to ensure consistency of program implementation and oversight,
and in so doing, ensure equal protection for the public and a level playing field for business.

[Method of review - Electronic]: Normally, an important part of the review process is visiting
the regional office, where we can converse face-to-face with compliance and enforcement
staff, examine regional data in ICIS, and review a sampling of facility files that contain
inspection and enforcement activity. For this review, however, we are adapting the planned
review into an electronic file review. We value face-to-face interactions with [regional or
state] offices during these reviews and hope to find an opportunity to visit the [region/state]
in the future; however, we expect to complete the review of [Insert State name] prior to such
a visit.

OR

[Method of review - On-site]: We will perform this review on-site, and logistical coordination
will be addressed during the kick-off meeting.

The [electronic] or [on-site] review of [Insert State Name] will be led by [Insert Lead POC]
along with a small team from [Insert Region X] staff. They will coordinate with the lead
regional counterpart [Name of POC]. The team will host a kickoff call to discuss logistics,
schedule, preliminary analysis of the region's SRF data metrics (DMA) and the file selection
list.

46


-------
SRF Reviewer's Guide - Round 5

Following the review, we will summarize findings and recommendations in a draft report.
Regional management and staff will have 30 days to review and comment on this draft. We
expect to complete these reviews, including the final draft report, by [Insert Date]. If EPA
identifies areas for improvement for the program, we will work with you and your team to
address such issues until they are either resolved or meaningful and consistent progress is
made.

Please contact me at XXX-XXX-XXXX or have your staff contact [Lead Reviewer insert name] at
XXX-XXX-XXXX with any questions regarding the review. We look forward to working with
you and furthering the goals of the SRF program.

Sincerely,

XXX, Director

Planning, Measures and Oversight Division
Office of Enforcement and Compliance Assurance

Cc:

POC, state
POC, region

47


-------
SRF Reviewer's Guide - Round 5

Appendix E: Data Metric Analysis (DMA) Procedures

DMA Step-by-step:

Step 1: Downloading the DMA from ECHO

1)	Log in to ECHO.gov

2)	Go to Search Options > Oversight > State Review Framework

3)	Click on the Data Metrics Analysis tab

o Select the Statute (CAA, CWA, RCRA)
o Select the Review Year

o Choose the State being reviewed, and if applicable, the Local Agency
o Click submit

4)	A new window with a table of metric values will appear; click the download button

5)	Save the document as: State-Local_Statute_ReviewYear_DocumentType
o e.g., AL_CWA_FY17_DMA or AL-Jefferson_CAA_FY17_DMA
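As a minimal illustration of the naming convention in step 5 (a hypothetical Python helper,
not an EPA tool), the file name can be assembled as follows:

    # Hypothetical helper illustrating the naming convention above.
    def srf_filename(state, statute, review_year, doc_type, local=None):
        agency = f"{state}-{local}" if local else state
        return f"{agency}_{statute}_{review_year}_{doc_type}"

    print(srf_filename("AL", "CWA", "FY17", "DMA"))                     # AL_CWA_FY17_DMA
    print(srf_filename("AL", "CAA", "FY17", "DMA", local="Jefferson"))  # AL-Jefferson_CAA_FY17_DMA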

Step 2: Making Initial Findings

6)	Open the downloaded copy and locate the columns Initial Finding and Explanation

7)	For all Goal Metrics:

o Evaluate each goal metric value and make an initial finding according to the general

ranges on page 29 or in Appendix J of the Reviewer's Guide,
o Provide a brief explanation to substantiate the finding

8)	For all Non-Goal Metrics:

o If metric values appear satisfactory, no finding is required
o If metric values suggest performance issues:

¦ Make an initial finding and provide a brief explanation, or flag the issue for follow-up
in file selection and review to obtain more information. To do this, enter Supplemental
Review in either of the newly created columns

Step 3: Using the Initial Findings

9)	When finished, submit the DMA with initial findings to your HQ liaison for review before
starting the file selection and review process

10)	Once the DMA is submitted and reviewed, focus attention on findings of area for attention,
area for improvement, or those flagged for Supplemental Review.

48


-------
SRF Reviewer's Guide - Round 5

o Check-in with the agency to make sure the values are accurate

o If so, in the file selection process make sure to select files pertaining to potential areas

of concern. (See File Selection section on pages 12-15)
o During the file review, or whenever possible, discuss the DMA results with the agency to
try to gather any additional information that could be helpful in making and
substantiating findings in the report

Reviewers may want to share the DMA with the state or local agency as part of the kick-off letter
or meeting. This will allow for agencies to provide any feedback or corrections of the data before
conducting the review.

Note: The DMA and initial findings along with the results from the file review will be used later
in the process to make findings in the report

49


-------
SRF Reviewer's Guide - Round 5

Appendix F: File Selection Procedures

Representative File Selection Step-By-Step

1	Enter the following EPA web site address in your Internet web browser (preferably
Google Chrome for full functionality of the file selection tool): http://echo.epa.gov

2	In the upper right-hand corner, click on the ECHO Gov Login link

3	Enter your LAN user id and password; this is the same user id and password that you
use to log into your computer. This is not the numeric PIN for your smartcard.

4	At the left side of the screen, select "Search Options", then at the bottom of your
screen, click on the blue icon at right called "Oversight". Next, click on the link at the
bottom of the page called "State Review Framework"

5	Scroll down and click on the File Selection tab in the gray box in the middle of the page

6	Select the media you are reviewing (CAA, CWA, or RCRA)

7	Select the fiscal year of frozen data.

8	Select State Only as the Agency to be reviewed

9	Select the state or local agency from the Jurisdiction drop down box.

10	Click on the Submit Without Flags button

11	Click the arrows below the Informal Action header twice to bring facilities with

informal actions to the top.

12	Select at least five facilities with informal actions at random by clicking on the

checkboxes on the left. Click the checkboxes twice to indicate that the facilities are
part of the representative selection. You will see a green checkmark next to all
selected files. (Beginning with enforcement actions is an efficient way to conduct
files selection. These facilities are the most likely to have inspections, violations and
penalties reported. To assist with random selection, the File Selection Tool only
identifies facilities by program ID number.)

13	Use the same methodology to select at least 5 formal actions, penalties, non-HPV/

non-SNC violations, SNC/HPV violations, and inspections.

14	Select at least 10 facilities with inspections. (Some of the facilities already selected

will have inspections. These count toward the 10 inspection files.) For CAA, click the

50


-------
SRF Reviewer's Guide - Round 5

up arrow in the FCE column; for CWA and RCRA files, choose the Inspection
column

15	Select additional facilities as needed so at least five are selected in each of the

violation categories.

16	Review the number of files required to be selected based on comparison of the total
number of records returned in the top left side of the file selection tool to the number
of files required to be reviewed in Table 1 [page 20]. If more files need to be selected
to meet minimum file selection requirements, identify activities in greatest need of
additional facilities to make a proper evaluation. Randomly select facilities for those
activities until you have selected at least the minimum number of total files. Review
the file selection criteria on pages 19-20 to ensure that all factors such as geographic
distribution and other criteria are met.
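The sketch below is a conceptual Python illustration (with hypothetical category names and the
minimums described in the steps above) of the representative selection logic; the actual
selection is performed at random in the ECHO File Selection Tool, and a facility already
selected counts toward every category it satisfies:

    # Conceptual sketch with hypothetical category names; the actual
    # selection is done in the ECHO File Selection Tool.
    import random

    MINIMUMS = {
        "informal_action": 5, "formal_action": 5, "penalty": 5,
        "non_snc_violation": 5, "snc_hpv_violation": 5, "inspection": 10,
    }

    def select_files(facilities, seed=None):
        """facilities: list of dicts like {"id": "AL0001", "categories": {"inspection", ...}}."""
        rng = random.Random(seed)
        selected = {}
        for category, minimum in MINIMUMS.items():
            already = sum(1 for f in selected.values() if category in f["categories"])
            candidates = [f for f in facilities
                          if category in f["categories"] and f["id"] not in selected]
            needed = max(0, minimum - already)
            for f in rng.sample(candidates, min(needed, len(candidates))):
                selected[f["id"]] = f
        return list(selected.values())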


-------
SRF Reviewer's Guide - Round 5

Appendix G: Checklist of Key Items for Conducting File Review

Hard copies:

List of selected facilities

Detailed Facility Reports (DFRs) for each facility
File review checklists for each facility

Contact information for point-of-contact and others at state agency
Copy of the DMA

Electronic copies:

File review worksheet

Completed CWA CMS metric spreadsheet (metrics 4al - 4all) - CWA only

Either hard or electronic copies:

Plain Language Guide

Previous SRF reports & recommendation status

Program MOA or any other relevant state-EPA agreement

This guidance document

CMS Plan?

End of year report?

Enforcement response policies
Penalty policies
Inspection Manual

State compliance monitoring or inspection policies

52


-------
SRF Reviewer's Guide - Round 5

Appendix H: SRF Draft Report Completeness Checklist

When creating a draft report, be advised that the DMA, File Selection List, CWA inspection
coverage table, File Review Worksheet, and any other documents used for the SRF review
process must be submitted to the HQ SRF Liaison for him/her to determine completeness and
perform an accurate review of the report. These should also be uploaded to the SRF Manager
database which serves as a central repository and official record for the review.

A draft report is complete if all required sections listed below are uploaded into the SRF
Manager database or emailed to the Liaison.

Report Components and Attachments (mark each item Complete: Yes or No)

Report Components for Each Element for Each Media Chapter (CWA, CAA, and/or RCRA) -
see the example in Appendix I:

•	Finding (number and level)
•	Summary
•	Explanation
•	Relevant metrics
•	Recommendations

Attachments:

•	Data Metric Analysis spreadsheet*
•	File Selection spreadsheet*
•	CWA inspection coverage table* and/or alternative CMS plans
•	File Review spreadsheet*

* These documents can be uploaded on the Administration Information page of the SRF
Manager. They will appear in the Attachments table when the report is finalized.





53


-------
SRF Reviewer's Guide - Round 5

Appendix I: Sample Finding and Recommendation

CWA Element 4 —

Enforcement

Finding 4-1

Area for State Improvement

Summary

SNC violations are not addressed in a timely or appropriate manner.

Explanation

Two of the eight SNC violations reviewed received appropriate follow-up action.
However, the other six violations received neither informal nor formal
enforcement action.

The state does not have a formal policy in place for taking enforcement against
SNC violators.

Metric 10a shows that the state was not consistently taking timely enforcement
action. This can be traced to the failure to complete inspection reports in a timely
manner.





Relevant metrics

Metric ID and Description	Natl Goal	Natl Avg	State N	State D	State % or #

10a - Major facilities with timely action	98%	-	1	8	13%

10b - Enforcement responses reviewed that address violations in an appropriate manner	100%	-	5	15	33%





State response

The state agrees that this is a problem and has agreed to work with EPA to resolve
it.

Recommendation

1) The state will develop a Standard Operating Procedure (SOP) for taking
enforcement action against SNC violators within 90 days of finalization of this
report and will send a copy to EPA for approval. 2) The state will immediately begin
taking enforcement action against SNC violators in accordance with the SOP
developed under item 1. 3) EPA will monitor performance via quarterly conference
calls and annual SRF data metric analyses. EPA will close this recommendation
after approving the state's SOP and observing three consecutive quarters of
performance that meets national goals.

54


-------
SRF Reviewer's Guide - Round 5

Appendix J: Establishing Finding Levels

The table below provides a definition of each finding level and offers suggested metric value
ranges for help in deciding on a finding level. These value ranges are simply a guide in selecting
an appropriate finding level. Other factors may be considered (e.g., universe size of metric) in
choosing an appropriate level.

Suggested Metric
Value Ranges

Finding Level

~85-100%

Meets or Exceeds Expectations: The base level is met, and no performance
deficiencies are identified, or the program is performing above national
expectations.

~71-84%

Area for Attention: An activity, process, or policy that one or more SRF
metrics show as a minor problem. Where appropriate, the state should
correct the issue without additional EPA oversight. EPA may make
suggestions to improve performance, but it will not monitor these
suggestions for completion between SRF reviews. These areas are
typically not highlighted as priority areas to address in an executive
summary.

~70% and below

Area for Improvement: An activity, process, or policy that one or more SRF
metrics under a specific element show as a significant problem that the
agency is required to address. Recommended activities to correct the issues
should be included in the report and must have well-defined timelines and
milestones for completion, and, if possible, should address root causes. EPA
will monitor recommendations for completion between SRF reviews and
provide any necessary updates in the SRF Manager database.

Additional Factors

Sample Size

In cases where there is a small universe for a metric or a low number of activities to review, the
small sample size means greater variability in metric values which can make it difficult to establish
a reliable finding on performance.

Though the review focuses on a one-year period of activity, the reviewer can select additional
files from prior years of activity to increase the sample size and have a more robust set of files.
Reviewers can also use multi-year trend data to decide when performance is on the edge of two
finding levels. Otherwise, follow the general ranges unless there is evidence to support a different
conclusion. If such evidence exists, include that information in the explanation section of the
finding which will be reviewed by HQ.
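A simple numeric illustration of this caution: one missed activity in a universe of three files
moves the metric value across finding levels, while the same miss in a universe of thirty
barely moves it.

    # One missed activity in a small universe vs. a larger one
    def metric_value(met, reviewed):
        return round(100 * met / reviewed, 1)

    print(metric_value(2, 3))    # 66.7 -> suggests Area for Improvement
    print(metric_value(29, 30))  # 96.7 -> suggests Meets or Exceeds Expectations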

55


-------
SRF Reviewer's Guide - Round 5

Appendix K: Tips for Conducting Electronic File Reviews Under the
State Review Framework

The purpose of this document is to assist SRF reviewers in conducting reviews using
electronic files. Whether it is appropriate to conduct an SRF review electronically will
depend upon a number of factors, most fundamentally the availability of electronic files and/or
the ability to put such documents into an electronic format. This document outlines strategies
for success specific to electronic file reviews. In addition, the document seeks to articulate the
benefits and constraints associated with electronic file reviews. Overall, the SRF program seeks
to take advantage of the benefits that electronic file reviews can provide without losing the
insight and benefits to the program that face-to-face interactions
provide.

Basic Steps:

1.)	Determine how many of the following file review materials are currently available
electronically:

•	Inspection reports

•	Alternate compliance monitoring strategy agreements

•	Violations

•	Compliance determinations

•	Force majeure claims, compliance extension and waiver requests

•	Correspondence sent to/from facilities and the state on response to violations

•	Informal enforcement actions

•	Formal enforcement actions

•	Supplemental environmental project (SEP) proposals associated with enforcement
actions

•	Certifications of completion for corrective action completion required by enforcement
actions

•	Penalty calculation spreadsheets that document economic benefit and gravity

•	Documentation of changes between the initial and final penalty

•	Documentation of penalty collection including copies of cancelled checks and/or
documentation from state financial accounting systems

2.)	Develop a single place to store all file review materials including:

•	Data metric analysis

•	Guidance: plain language guide, quick metric reference guide, SRF reviewer's guide

•	Past SRF reports

•	File selection list

•	File selection assignments among review team members

•	File review worksheet

56


-------
SRF Reviewer's Guide - Round 5

•	File review facility checklists

•	Draft report

•	Responses to EPA questions from the state

•	Database reports from ICIS or RCRAInfo and information received from state data
system

•	Inspection reports

•	Violations

•	Compliance determinations

•	Correspondence sent to/from facilities and the state on response to violations

•	Informal enforcement actions

•	Formal enforcement actions

•	Penalty calculation spreadsheets that document economic benefit and gravity

•	Documentation of changes between the initial and final penalty

•	Documentation of penalty collection including copies of cancelled checks and/or
documentation from state financial accounting systems

•	State end of year results reports and grant work plans

•	Alternative CMS commitments

•	CWA inspection coverage table (for water reviews only)

•	EPA and state inspection manuals, enforcement response policies, standard operating
procedures, and penalty policies

•	EPA inspection manuals, enforcement response policies, and penalty policies

•	Contact list for review team members and key state contacts, including their roles and
responsibilities throughout the process

•	Organization Charts including both an Agency and Program specific overview to see how
items such as formal enforcement move through the process (i.e., legal services vs. City
Council etc.)


•	SRF review and report development schedule

Tips for a Successful File Review

Pre-Review Planning

•	Determine the number of staff needed to complete the facility and programmatic
reviews within the target deadlines

•	Electronic file reviews may allow for different numbers of team members and
different review timeframes than an in-person review might allow because of travel
and other considerations.

•	Discuss the variety of ways that states document their compliance determinations
and assess whether sufficient information will be available electronically

57


-------
SRF Reviewer's Guide - Round 5

•	Evaluate what information is publicly available to reduce the burden of document
collection on state staff; many states publicize formal enforcement actions, and
some post inspection reports online

•	Establish a file naming convention with the state in advance or rename files upon
receipt with a record of their original name

o Experience indicates that some state IT departments or data staff save the
date of the file upload to a web site or shared drive rather than the date the
inspection/action took place

•	Be aware that file naming conventions/nomenclature can be misleading; plan on
extra time to inventory whether all requested information is provided before
starting the file review
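
The following is a minimal sketch of the renaming step mentioned above. The FacilityID_DocumentType_Date convention shown in the example is an assumption for illustration, not an EPA requirement; the script renames received files and logs each original name to a CSV so nothing is lost.

# Minimal sketch (illustrative only): apply a review-team naming convention to
# files received from the state while keeping a record of each original name.
import csv
from pathlib import Path

def rename_with_log(folder: str, new_names: dict[str, str], log_path: str) -> None:
    """Rename files per new_names {original: new} and log the mapping to a CSV file."""
    with open(log_path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["original_name", "new_name"])
        for original, new in new_names.items():
            src = Path(folder, original)
            if src.exists():
                src.rename(Path(folder, new))
                writer.writerow([original, new])

# Example usage with hypothetical file names:
# rename_with_log(
#     "06_facility_files",
#     {"scan_0042.pdf": "ALD000123456_InspectionReport_2021-05-12.pdf"},
#     "rename_log.csv",
# )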

Review Team Coordination

•	Block off and schedule time on a regular basis to review files, discuss findings with
review team members, and request additional information or ask further questions
of state counterparts to ensure the review team can meet its established file review
deadlines

•	Consider using OneNote, Microsoft Teams, SharePoint, or OneDrive, or obtaining direct
access to the state's electronic files, if possible, to share files with EPA staff on the
review team and to organize the file review materials listed above, as well as the
questions and answers exchanged during the review between the review team and the
primary agency being reviewed.

•	Familiarize team members with the electronic files and establish a mechanism for
sharing files with the review team

•	Conduct an inventory of all files received on the first day of a partial on-site file
review or, for a fully remote review, once all files have been provided and before
starting the file review

•	Network with other staff who have conducted full, or partial, remote file reviews to
learn from their experience

•	Search lengthier documents electronically by key word, using the Ctrl+F search
capability in Microsoft Word and/or PDF files, to make the review of files more
efficient (a keyword-search sketch follows this list)

•	Clarify process for the integration of results across team reviewers
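
As a complement to manual Ctrl+F searches, the sketch below shows one possible way to flag which PDFs in a shared folder mention a given key word. It assumes the third-party pypdf package is installed and that the PDFs contain extractable text (scanned, image-only files will return nothing); it is illustrative only.

# Minimal sketch (illustrative only): report which PDFs in a folder mention
# given key words. Requires the third-party "pypdf" package.
from pathlib import Path
from pypdf import PdfReader

def find_keywords(folder: str, keywords: list[str]) -> dict[str, list[str]]:
    """Return {file name: [keywords found]} for every PDF under the folder."""
    hits: dict[str, list[str]] = {}
    for pdf in Path(folder).rglob("*.pdf"):
        text = " ".join((page.extract_text() or "") for page in PdfReader(pdf).pages).lower()
        found = [kw for kw in keywords if kw.lower() in text]
        if found:
            hits[pdf.name] = found
    return hits

# Example usage with hypothetical search terms:
# print(find_keywords("06_facility_files", ["economic benefit", "penalty", "SNC"]))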


Appendix L: Annual Data Metric Analysis

ATTACHMENT #1
Instructions for Downloading the ADMA from ECHO

To download an ADMA, follow the steps below:

1)	Log in to ECHO.gov

2)	Go to Search Options > Oversight > State Review Framework

3)	Click on the Annual Data Metrics Analysis tab
o Select the Statute (CAA, CWA, RCRA)

o Select the Review Year

o Choose the State being reviewed, and if applicable, the Local Agency
o Click submit

4)	A new window with a table of metric values will appear; click the download button

5)	Save the document as: State-Local_Statute_Review Year_Document Type
o e.g., AL_CWA_FY17_ADMA or AL-Jefferson_CAA_FY17_ADMA
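
The helper below is a minimal sketch of the file naming convention in step 5. The function name and parameters are illustrative only and are not part of ECHO or the SRF.

# Minimal sketch (illustrative only): build an ADMA file name that follows the
# State-Local_Statute_Review Year_Document Type pattern shown in step 5.
def adma_filename(state, statute, review_year, local="", doc_type="ADMA"):
    """Return e.g. 'AL_CWA_FY17_ADMA' or 'AL-Jefferson_CAA_FY17_ADMA'."""
    state_part = f"{state}-{local}" if local else state
    return f"{state_part}_{statute}_FY{review_year % 100:02d}_{doc_type}"

# adma_filename("AL", "CWA", 2017)              -> "AL_CWA_FY17_ADMA"
# adma_filename("AL", "CAA", 2017, "Jefferson") -> "AL-Jefferson_CAA_FY17_ADMA"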

ATTACHMENT #2
SRF Data Metrics

RCRA

Element 1 - Data

1a1	Number of operating TSDFs
1a2	Number of active LQGs
1a5	Number of BR LQGs

Element 2 - Inspections

1b1	Number of sites with on-site inspections
5a	Two-year inspection coverage for operating TSDFs
5b	Annual inspection coverage for BR LQGs (review 5b or 5b1)
5b1	Annual inspection coverage for active LQGs (review 5b or 5b1)
5d1	Number of SQGs inspected
5e5	Number of VSQGs inspected
5e6	Number of transporters inspected
5e7	Number of other sites inspected

Element 3 - Violations

1c1	Number of sites with new violations during review year
1c2	Number of sites in violation at any time during the review year regardless of determination date
1e1	Number of sites with new SNC during year
2a	Long-standing secondary violators
7b	Violations found during CEI and FCI compliance evaluations
8a	SNC identification rate at sites with CEI and FCI compliance evaluations
8b	Timeliness of SNC determinations

Element 4 - Enforcement

1d1	Number of sites with informal enforcement actions
1d2	Number of informal enforcement actions
1f1	Number of sites with formal enforcement actions
1f2	Number of formal enforcement actions
10a	Number of SNC evaluations with timely enforcement

Element 5 - Penalties

1g	Total dollar amount of final penalties


The RCRA ADMA template has the following columns: Element | Metric ID | Metric description | Metric Type | National Goal | National Average (FY20) | FY20 | Data Trend | Related SRF Recommendation (Review Year, Finding #, Status) | Analysis | Questions for the State | Follow-Up. Only the first four columns are pre-filled; the remaining columns are blank in the template.

Element | Metric ID | Metric description | Metric Type
Element 1 - Data | 1a1 | Number of operating TSDFs | Data Verification
Element 1 - Data | 1a2 | Number of active LQGs | Data Verification
Element 1 - Data | 1a5 | Number of BR LQGs | Data Verification
Element 2 - Inspections | 1b1 | Number of sites with on-site inspections | Data Verification
Element 2 - Inspections | 5a | Two-year inspection coverage for operating TSDFs | Goal
Element 2 - Inspections | 5b | Annual inspection coverage for BR LQGs (review 5b or 5b1) | Goal
Element 2 - Inspections | 5b1 | Annual inspection coverage for active LQGs (review 5b or 5b1) | Goal
Element 2 - Inspections | 5d1 | Number of SQGs inspected | Informational Only
Element 2 - Inspections | 5e5 | Number of VSQGs inspected | Informational Only
Element 2 - Inspections | 5e6 | Number of transporters inspected | Informational Only
Element 2 - Inspections | 5e7 | Number of other sites inspected | Informational Only
Element 3 - Violations | 1c1 | Number of sites with new violations during review year | Data Verification
Element 3 - Violations | 1c2 | Number of sites in violation at any time during the review year regardless of determination date | Data Verification
Element 3 - Violations | 1e1 | Number of sites with new SNC during year | Data Verification
Element 3 - Violations | 2a | Long-standing secondary violators | Review Indicator
Element 3 - Violations | 7b | Violations found during CEI and FCI compliance evaluations | Review Indicator
Element 3 - Violations | 8a | SNC identification rate at sites with CEI and FCI compliance evaluations | Review Indicator
Element 3 - Violations | 8b | Timeliness of SNC determinations | Goal
Element 4 - Enforcement | 1d1 | Number of sites with informal enforcement actions | Data Verification
Element 4 - Enforcement | 1d2 | Number of informal enforcement actions | Data Verification
Element 4 - Enforcement | 1f1 | Number of sites with formal enforcement actions | Data Verification
Element 4 - Enforcement | 1f2 | Number of formal enforcement actions | Data Verification
Element 4 - Enforcement | 10a | Number of SNC evaluations with timely enforcement | Goal
Element 5 - Penalties | 1g | Total dollar amount of final penalties | Data Verification



CAA

Element 1 - Data

1a1	Number of Active Majors
1a2	Number of Active Synthetic Minors
1a3	Number of Active Minors Subject to NESHAP Part 61
1a4	Number of Other Active Facilities on CMS Plan
1a5	Number of HPV Minors
1a6	Number of Minors Subject to Formal Enforcement
1b4	Number of Active Title V Facilities
1b5	Number of CMS Majors
1b6	Number of CMS 80% Synthetic Minors
1b7	Number of Other CMS Minors
3a2	Timely reporting of HPV determinations into ICIS-Air
3b1	Timely reporting of compliance monitoring MDRs
3b2	Timely reporting of stack tests and stack test results
3b3	Timely reporting of enforcement MDRs

Element 2 - Inspections

1c1	Number of Facilities with an FCE (Facility Count)
1c2	Number of FCEs (Activity Count)
1i7	Number of Stack Tests that occurred
1j1	Number of Facilities with a Reviewed TVACC
1j2	Number of Facilities with TVACC Due
5a	FCE coverage: majors and mega-sites
5b	FCE coverage: SM-80s
5c	FCE coverage: minor and synthetic minor (non-SM80) sources that are part of a CMS Plan and Alternative CMS Facilities
5e	Reviews of Title V annual compliance certifications completed

Element 3 - Violations

1d1	Number of Facilities with an FRV Identified (Facility Count)
1d2	Number of Case Files with an FRV Identified (Activity Count)
1e1	Number of Informal Enforcement Actions (Activity Count)
1e2	Number of Facilities with an Informal Enforcement Action (Facility Count)
1f1	Number of Case Files with an HPV Identified (Activity Count)
1f2	Number of Facilities with an HPV Identified (Facility Count)
7a1	FRV "discovery rate" based on evaluations at active CMS sources
8a	Discovery rate of HPVs at majors
13	Timeliness of HPV Identification

Element 4 - Enforcement

1g1	Number of Formal Enforcement Actions (Activity Count)
1g2	Number of Facilities with a Formal Enforcement Action (Facility Count)
1h2	Number of Formal Enforcement Actions with an Assessed Penalty
10a1	Rate of Addressing HPVs within 180 days
10b1	Rate of managing HPVs with an NOV or NOW or no action

Element 5 - Penalties

1h1	Total Amount of Assessed Penalties

The filled-in CAA ADMA sample below is shown with the following columns: Metric ID | Metric Name | Metric Type | Agency | National Goal | National Average (FY20) | FY20 | FY21 | FY22 | Analysis. The Data Trend, Related SRF Recommendation (Review Year, Finding #, Status), Questions for the State, and Follow-Up columns are blank in this sample. Blank cells are shown as "-".

Metric ID | Metric Name | Metric Type | Agency | National Goal | National Average (FY20) | FY20 | FY21 | FY22 | Analysis
1a1 | Number of Active Majors | Data Verification | State | - | - | 274 | 272 | 253 | -
1a2 | Number of Active Synthetic Minors | Data Verification | State | - | - | 281 | 290 | 252 | -
1a3 | Number of Active Minors Subject to NESHAP Part 61 | Data Verification | State | - | - | 1 | 0 | 0 | -
1a4 | Number of Active CMS Minors | Data Verification | State | - | - | 1 | 1 | 6 | -
1a5 | Number of HPV Minors | Data Verification | State | - | - | 0 | 0 | 0 | -
1a6 | Number of Minors Subject to Formal Enforcement | Data Verification | State | - | - | 2 | 3 | 1 | -
1b4 | Number of Active Title V Facilities | Data Verification | State | - | - | 242 | 235 | 225 | -
1b5 | Number of CMS Majors | Data Verification | State | - | - | - | 190 | 140 | -
1b6 | Number of CMS 80% Synthetic Minors | Data Verification | State | - | - | - | 131 | 85 | -
1b7 | Number of Other CMS Minors | Data Verification | State | - | - | - | 0 | 0 | -
1c1 | Number of Facilities with an FCE (Facility Count) | Data Verification | State | - | - | 288 | 303 | 197 | Significant drop in FCEs in FY20, although data entry may not be complete.
1c2 | Number of FCEs (Activity Count) | Data Verification | State | - | - | 293 | 307 | 197 | -
1d1 | Number of Facilities with an FRV Identified (Facility Count) | Data Verification | State | - | - | 2 | 2 | 2 | -
1d2 | Number of Case Files with an FRV Identified (Activity Count) | Data Verification | State | - | - | 2 | 2 | 2 | -
1e1 | Number of Informal Enforcement Actions (Activity Count) | Data Verification | State | - | - | 93 | 63 | 41 | 3-year downward trend in informal enforcement actions, although the number of facilities with informal actions remained fairly steady.
1e2 | Number of Facilities with an Informal Enforcement Action (Facility Count) | Data Verification | State | - | - | 45 | 47 | 31 | -
1f1 | Number of Case Files with an HPV Identified (Activity Count) | Data Verification | State | - | - | 2 | 1 | 2 | -
1f2 | Number of Facilities with an HPV Identified (Facility Count) | Data Verification | State | - | - | 2 | 1 | 2 | -
1g1 | Number of Formal Enforcement Actions (Activity Count) | Data Verification | State | - | - | 7 | 9 | 6 | Less than 15% of informal actions progress to formal actions.
1g2 | Number of Facilities with a Formal Enforcement Action (Facility Count) | Data Verification | State | - | - | 7 | 9 | 6 | -
1h1 | Total Amount of Assessed Penalties | Data Verification | State | - | - | $258,500 | $247,500 | $92,000 | Significant drop in penalties in FY20, although data entry may not be complete.
1h2 | Number of Formal Enforcement Actions with an Assessed Penalty | Data Verification | State | - | - | 6 | 9 | 6 | -
1i7 | Number of Stack Tests that occurred | Data Verification | State | - | - | - | 146 | 48 | -
1j1 | Number of Facilities with a Reviewed TVACC | Data Verification | State | - | - | - | 424 | 364 | Metric value should not significantly exceed the number of Title V facilities (1b4); appears that non-Title V certifications are being entered as TVACCs.
1j2 | Number of Facilities with TVACC Due | Data Verification | State | - | - | - | 225 | 223 | -

Element 1 - Data
3a2 | Timely reporting of HPV determinations | Goal | State | - | 40.5% | 50.0% | 0.0% | 100.0% | The one HPV in FY21 was reported into ICIS-Air about 2 months late.
3b1 | Timely reporting of compliance monitoring MDRs | Goal | State | 100.0% | 82.3% | 83.1% | 46.8% | 95.5% | -
3b2 | Timely reporting of stack tests and stack test results | Goal | State | 100.0% | 67.1% | 45.4% | 43.2% | 66.7% | -
3b3 | Timely reporting of enforcement MDRs | Goal | State | 100.0% | 77.6% | 89.0% | 48.6% | 95.7% | -

Element 2 - Inspections
5a | FCE coverage: majors and mega-sites | Goal | State | 100.0% | 88.7% | 99.4% | 93.2% | 80.0% | -
5b | FCE coverage: SM-80s | Goal | State | 100.0% | 93.7% | 95.5% | 90.8% | 97.6% | -
5c | FCE coverage: minor and synthetic minor (non-SM80) sources that are part of a CMS Plan and Alternative CMS Facilities | Goal | State | 100.0% | 85.8% | NA | NA | NA | -
5e | Reviews of Title V annual compliance certifications completed | Goal | State | 100.0% | 76.7% | 85.1% | 90.7% | 78.0% | -

Element 3 - Violations
7a1 | FRV "discovery rate" based on evaluations at active CMS sources | Support | State | - | 6.2% | 0.4% | 0.4% | 0.4% | -
8a | Discovery rate of HPVs at majors | Support | State | - | 2.3% | 0.7% | 0.4% | 0.8% | -
13 | Timeliness of HPV Identification | Goal | State | 100.0% | 87.7% | 50.0% | 100.0% | 100.0% | -

Element 4 - Enforcement
10a1 | Rate of Addressing HPVs within 180 days | Support | State | - | 63.7% | 0.0% | 0.0% | NA | -
10b1 | Rate of managing HPVs with an NOV or NOW or No Action | Support | State | - | 12.9% | 100.0% | 50.0% | NA | -

CWA

Element 1 - Data

1a1	Number of active NPDES
1a2	Number of active NPDES
1a3	Number of active NPDES
1a4	Number of active NPDES
1b5	Permit limit data entry rate for major and non-major facilities
1b6	DMR data entry rate for major and non-major facilities
1b7	Number of active NPDES individual DMR filers
1b8	Number of active NPDES individual DMR filers with permit limits in ICIS
1e1	Facilities with Informal Actions
1f1	Facilities with Formal Actions
1g3	Facilities with Penalties

Element 2 - Inspections

5a1	Inspection coverage of NPDES majors
5a3	Number of inspected major facilities
5b1	Inspection coverage of NPDES non-majors with individual permits
5b2	Inspection coverage of NPDES non-majors with general permits
5b3	Number of inspected non-major individual or unpermitted facilities
5b4	Number of inspected non-major general permit covered facilities

Element 3 - Violations

7j1	Number of major and non-major facilities with single-event violations reported in the review year
7j2	Number of facilities with SNC/Category 1 noncompliance
7j3	Number of facilities with RNC/Category 2 noncompliance or effluent, single event, or schedule violations open during the year
7k1	Major and non-major facilities in noncompliance
8a3	Percentage of active major facilities in SNC and non-major facilities in Category 1 noncompliance during the reporting year
8a4	Percentage of active non-major general permit facilities in Category 1 noncompliance during the reporting year

Element 4 - Enforcement

10a1	Percentage of major NPDES facilities with formal enforcement action taken in a timely manner in response to SNC violations
10a2	Percentage of major individually permitted NPDES facilities with formal enforcement action taken in a timely manner in response to missing DMR SNC violations
10a3	Percentage of major individually permitted NPDES facilities with formal enforcement action taken in a timely manner in response to SNC effluent violations
10a4	Percentage of major individually permitted NPDES facilities with formal enforcement action taken in a timely manner in response to SNC compliance schedule violations

Element 5 - Penalties


CWA

The CWA annual review table below is shown with the following columns: Element | Metric ID | Metric Description | Metric Type | ADMA Question. The "Add ADMA Data from ECHO" and "Area of Concern" columns are blank in this sample, except that metric 1b5 is marked with an "X".

Data

1b5 | Completeness of data entry on major and non-major permit limits | Goal | Are the data entered complete?
1b6 | Completeness of data entry on major and non-major discharge monitoring reports | Goal | Are the data entered complete?

If an Area for Concern:

Provide the state with MDRs

Ensure that the state has a designated data steward, and that the state and regional data
stewards are effectively coordinating

Ensure that a protocol is in place for entering data

Ensure that the state database is accurately transferring data into the national database
(i.e., Electronic Data Transfer (EDT))

Provide training if necessary

Inspections

5a3 | Number of inspected major facilities [Data Verification] | Indicator | Are inspections increasing or decreasing in any significant way from previous year(s)?
5b3 | Number of inspected non-major individual or unpermitted facilities [Data Verification] | Indicator | Are inspections increasing or decreasing in any significant way from previous year(s)?
5b4 | Number of inspected non-major general permit facilities [Data Verification] | Indicator | Are inspections increasing or decreasing in any significant way from previous year(s)?
5a1 | Inspection coverage of NPDES majors | Goal | Is inspection coverage being met, either through traditional or alternative CMS?
5b1 | Inspection coverage of NPDES non-majors with individual permits | Goal | Is inspection coverage being met, either through traditional or alternative CMS?
5b2 | Inspection coverage of NPDES non-majors with general permits | Goal | Is inspection coverage being met, either through traditional or alternative CMS?

If an Area for Concern:

Provide the state with the NPDES CMS policy

Determine if inspection numbers match the End of Year (EOY) reports

If inspections are below commitments or decreasing, determine if the state is using a
traditional or alternative CMS

-	if using an alternative plan, determine why commitments are not being met and if
adjustments are required

-	if using a traditional plan, consider working with the state to create an alternative plan

Ensure that the state has an adequate targeting strategy in place in coordination with the
region

Violations

7j1 | Number of major and non-major facilities with single-event violations reported in the review year | Indicator | Are SEVs being entered? Are they increasing or decreasing in any significant way from previous year(s)?

If an Area for Concern:

Provide the SEV data entry guide
Provide training if necessary

7j2 | Number of active facilities with SNC/Category 1 noncompliance [Data Verification] | Indicator | Are facilities with significant violations increasing or decreasing in any significant way?
7j3 | Number of active facilities with RNC/Category 2 noncompliance [Data Verification] | Indicator | Are facilities with violations increasing or decreasing in any significant way?
8a1 | Number of major facilities in SNC [Data Verification] | Indicator | Are majors with significant violations increasing or decreasing in any significant way?
7k1 | Major and non-major facilities in noncompliance | Indicator | Is the percentage of facilities in noncompliance significantly above or below the annual national average? Is it increasing or decreasing in any significant way from previous year(s)?
8a3 | Percentage of major facilities in SNC and non-major facilities in Category 1 noncompliance during the reporting year | Indicator | Is the percentage of major and non-major facilities with more severe violations significantly above or below average? Is it increasing or decreasing in any significant way?
8a4 | Percentage of active non-major general permit facilities in Category 1 noncompliance during the reporting year | Indicator | Is the percentage of non-major facilities in Category 1 noncompliance significantly above or below average? Is it increasing or decreasing in any significant way?

If an Area for Concern:

Determine if the violation rates match state data

Provide the state with the relevant violation policies (FRV/HPV, SEV/SNC, SV/SNC)

If violation counts or rates appear notably low, review a selection of files containing
inspections to determine if the state is making accurate violation determinations

If violation counts or rates appear notably high, review a selection of files containing
inspections to determine if violations are being adequately addressed

Enforcement

1e1 | Number of facilities with informal actions [Data Verification] | Indicator | Are informal actions being entered (MDR requirement for minors as of FY17)? Are the numbers of informal actions increasing or decreasing in any significant way from previous year(s)? Does the number of informal actions suggest any potential issues in terms of taking appropriate enforcement or overall compliance rates?
1e2 | Number of informal actions [Data Verification] | - | -
1f1 | Number of facilities with formal actions [Data Verification] | Indicator | Are the numbers of formal actions increasing or decreasing in any significant way? Does the number of formal actions suggest any potential issues in terms of taking appropriate enforcement or overall compliance rates?
1f2 | Number of formal actions [Data Verification] | - | -
10a1 | Percentage of major NPDES facilities with formal enforcement action taken in a timely manner in response to SNC violations | Indicator | Is formal enforcement being taken timely in accordance with the response policy? Has timely formal enforcement increased or decreased in any significant way?
10a2 | Percentage of major individually permitted NPDES facilities with formal enforcement action taken in a timely manner in response to missing DMR SNC violations | Indicator | Is formal enforcement being taken timely in accordance with the response policy? Has timely formal enforcement increased or decreased in any significant way?
10a3 | Percentage of major individually permitted NPDES facilities with formal enforcement action taken in a timely manner in response to SNC effluent violations | Indicator | Is formal enforcement being taken timely in accordance with the response policy? Has timely formal enforcement increased or decreased in any significant way?
10a4 | Percentage of major individually permitted NPDES facilities with formal enforcement action taken in a timely manner in response to SNC compliance schedule violations | Indicator | Is formal enforcement being taken timely in accordance with the response policy? Has timely formal enforcement increased or decreased in any significant way?

If an Area for Concern:

Provide the state with the relevant enforcement response policies

If rates of timely formal enforcement appear notably low, review a selection of files
containing SNC violations to determine if enforcement eventually took place

General

Is there an SRF recommendation from a completed report related to a flagged area?

Y/N	Response

Yes	Prioritize recommendation for completion.

Yes	Contact the state or local agency to inquire what progress is being made to implement the recommendation.

No	Work with the state or local agency to identify potential causes and solutions. Track any action items through routine oversight.


ATTACHMENT #3
General Questions for the Annual Review

Verify or Check the Quality of SRF Public Data

•	Are Minimum Data Requirements (MDRs) met for data entry?

•	Are there data gaps or data quality concerns for specific metrics that may require special
attention during the Annual Data Verification process?

Determine Potential Annual Performance Issues

•	Did the program meet its annual inspection commitments, either through alternative or
traditional Compliance Monitoring Strategy (CMS)?

o If not, should commitments be revised in the alternative CMS or grant workplan?
o If an alternative CMS does not exist, should EPA work with the program to
develop one?

•	Are violation determinations for SNC and HPV timely?

•	Does the number of reported violations (SEVs, FRVs/HPVs, SV/SNC) seem realistic in
relation to inspections conducted?

•	Are the 'violation rate' metrics (e.g., % of major facilities in non-compliance, FRVs found
during inspections) notably above or below the annual national average?

•	Does the number of informal and formal actions seem realistic in relation to universe,
inspections, and violations?

•	Is formal enforcement taken timely against SNC/HPVs in accordance with the response
policies?

Utilize Data Trends

•	What is the trend direction, and is it gradual or sudden?

•	Are data metrics increasing or decreasing in any notable way from previous years?

•	Is the number of inspections higher or about the same while violation rates are lower
than in previous years? Might this indicate the program is not accurately identifying
or reporting violations?

•	Are violation rates or counts higher while the number of enforcement actions is lower
than in previous years? Might this indicate that the program is not taking timely and
appropriate enforcement? (A minimal trend-check sketch follows this list.)
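
The sketch below illustrates one simple way to screen a metric's year-to-year values for notable changes when answering the questions above. The 20 percent threshold is an assumption for illustration, not an SRF requirement.

# Minimal sketch (illustrative only): flag notable year-to-year changes in a
# metric series. The 20% threshold is an assumption, not an SRF requirement.
def flag_trend(values_by_year: dict[str, float], threshold: float = 0.20) -> list[str]:
    """Return notes for year-over-year changes larger than the threshold."""
    notes = []
    years = sorted(values_by_year)
    for prev, curr in zip(years, years[1:]):
        old, new = values_by_year[prev], values_by_year[curr]
        if old and abs(new - old) / old >= threshold:
            direction = "increase" if new > old else "decrease"
            notes.append(f"{prev}->{curr}: {direction} of {abs(new - old) / old:.0%}")
    return notes

# Example using the FCE counts shown in the CAA sample table above:
# flag_trend({"FY20": 288, "FY21": 303, "FY22": 197})
# -> ["FY21->FY22: decrease of 35%"]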

Track Areas for Improvement Identified in Previous SRF Reviews

•	Is there a recommendation(s) from a previous SRF review related to a flagged area in
the ADMA?

o If yes and the recommendation deadline is in the future (ongoing), check with
the state to ensure progress.
o If yes and the recommendation is overdue, prioritize the recommendation for
completion.
o If yes and completed, determine if the issue has resurfaced.


ATTACHMENT #4

The table below is a sample of the reporting tool used for the annual analysis. Each cell should be completed, and the level of detail may vary based on the status of a given metric. (A sketch of a blank CSV version of this table follows the sample.)

CWA Data Metrics

Metric # | Metric | ECHO Data Results + metric type | Data Quality Confidence (High, Medium, Low) | Goal Met (Y/N) | Data Trend (Up/Down) | Related SRF Recommendation? (finding number) | Analysis | Questions for the State | Follow-Up Required (e.g., training, file review)

1b5 | Completeness of data entry on major and non-major permit limits | (remaining cells blank in this sample)
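
The sketch below shows one way to generate a blank copy of this reporting table as a CSV so that every reviewer works from the same columns. The file name and helper function are illustrative only.

# Minimal sketch (illustrative only): write a blank copy of the annual-analysis
# reporting table as a CSV so each reviewer fills in the same columns.
import csv

COLUMNS = [
    "Metric #", "Metric", "ECHO Data Results + metric type",
    "Data Quality Confidence (High, Medium, Low)", "Goal Met (Y/N)",
    "Data Trend (Up/Down)", "Related SRF Recommendation? (finding number)",
    "Analysis", "Questions for the State",
    "Follow-Up Required (e.g., training, file review)",
]

def write_blank_template(path: str, metrics: dict[str, str]) -> None:
    """Write one empty row per metric ({metric #: description}) under the standard columns."""
    with open(path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(COLUMNS)
        for metric_id, description in metrics.items():
            writer.writerow([metric_id, description] + [""] * (len(COLUMNS) - 2))

# Example usage with a hypothetical file name:
# write_blank_template("cwa_adma_template.csv",
#                      {"1b5": "Completeness of data entry on major and non-major permit limits"})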
















