State Review Framework

Compliance and Enforcement Program

Oversight

SRF Reviewer's Guide
Round 4 (2018-2022)




U.S. Environmental Protection Agency
Office of Compliance, Office of Enforcement and
Compliance Assurance (OECA)



Table of Contents

I.	INTRODUCTION

A.	Background on the State Review Framework

B.	The Importance of a High-Quality Review

C.	Overview of the SRF Review Process

II.	PREPARING FOR THE SRF REVIEW

A.	Data Verification

B.	Selecting Local Agencies and State District Offices (if applicable)

1.	Reviewing Local Agencies

2.	Reviewing State District Offices

C.	Kickoff Letter / Conference

1.	Kickoff Letter

2.	Kickoff Conference (optional)

D.	Regional Coordination with Headquarters

III.	CONDUCTING THE REVIEW

A.	Data Metric Analysis (DMA)

B.	Annual Data Metric Analysis

C.	File Selection

1.	File Selection Preparation

2.	Determining Minimum Number of Facilities to Review

3.	Selecting a Representative Sample of Files

4.	Supplemental File Selection

5.	Transmit File Selection List

D.	File Review

1.	File Review Preparation

2.	Coordination with State

3.	Conducting the File Review

4.	Developing Preliminary Findings

5.	Exit Conference

IV.	DRAFTING AND FINALIZING THE SRF REPORT

A.	Developing the First Draft

1.	Administrative Information

2.	Performance Findings

3.	Recommendations

4.	Executive Summary

B.	Finalizing the Report

1.	HQ Review of Initial Draft Report

2.	HQ Review of Subsequent Draft SRF Reports

3.	State Comment on Draft SRF Report

4.	Finalizing the Report in the SRF Manager Database and Posting to SRF Web Site

V.	POST-REVIEW RECOMMENDATION MONITORING AND CLOSEOUT

A.	Annual Recommendation Inventory

B.	Monitoring Ongoing Recommendations (formerly known as "working")

C.	Prioritizing and Elevating Overdue Recommendations

D.	Verifying Recommendation Completion

Appendix A: SRF Key Information

Appendix B: Data Verification

Appendix C: Regional and Headquarters Coordination During the Review

Appendix D: Kick-off Letter Template

Appendix E: Data Metric Analysis (DMA) Procedures

Appendix F: File Selection Procedures

Appendix G: Checklist of Essential Items for Conducting File Review

Appendix H: SRF Draft Report Completeness Checklist

Appendix I: Sample Finding and Recommendation

Appendix J: Establishing Finding Levels


I. INTRODUCTION

This document, the "SRF Reviewer's Guide," serves as the comprehensive guidance for the State
Review Framework (SRF). It is meant to provide regional and headquarters personnel involved in an SRF
review — managers, liaisons, coordinators, and program leads — with a description of the procedures
for conducting the review, roles and responsibilities, and the essential content to include in the final report.

Though the Reviewer's Guide covers a considerable amount of material, it is not exhaustive and
therefore is intended to be used in conjunction with additional material when conducting a review, such
as:

•	Media-Specific Plain Language Guides (PLGs) - in-depth descriptions of the review elements
and metrics along with instructions on using the metrics to make appropriate performance
findings

•	Metric Quick Reference Guides - spreadsheets that contain all SRF metrics with descriptions

•	File Review Checklist - template used to document information during a file review

•	File Review Worksheet - template used to compile results and preliminary findings based on
the file review metrics

•	Training Videos - visual step-by-step explanations of key parts of a review (e.g., file selection)

•	SRF Manager Database: User Guide - instructions on how to use the database

The documents above can be found in the SRF Manager Database, while the training videos are
available on ECHO's SRF web page.

A. Background on the State Review Framework

The State Review Framework (SRF) is the primary means by which EPA conducts oversight of state
delegated and EPA directly implemented compliance and enforcement programs under three core federal
environmental statutes covering air, water, and land (the Clean Air Act, Clean Water Act, and Resource
Conservation and Recovery Act). The SRF was established in 2004, developed jointly by EPA and the
Environmental Council of the States (ECOS) in response to calls from both inside and outside the agency
for improved and more consistent oversight. The key goals agreed upon at its formation are:

1.	Ensure delegated programs and EPA Direct Implementation (DI) programs meet minimum
performance standards outlined in federal policies and guidance

2.	Promote fair and consistent enforcement necessary to protect human health and the environment

3.	Promote equitable treatment and a level interstate playing field for business

4.	Provide transparency with publicly available data and reports

Reviews are conducted on a five-year cycle, so each program is reviewed once every five years.
Programs are evaluated on a one-year period of performance, typically



the year prior to review. The review is based on a standardized set of metrics to make findings on
performance in five categories: data, inspections, violations, enforcement, and penalties. If the review
finds that program performance deviates significantly from federal policy, guidance, or standards, the
EPA issues recommendations for corrective action which are monitored by EPA until they are fully
implemented. Results of the review are organized into a final report which is published on EPA's public
web site.

B. The Importance of a High-Quality Review

Conducting a thorough, high-quality review is essential if the findings on program
performance are to be considered accurate and credible - an important factor in terms of
oversight and public transparency. Furthermore, a high-quality review increases the
likelihood that if or when performance issues are identified, EPA and the authorized
program will be able to effectively work together to improve performance so that it returns
back in line with federal policy and standards.

What does a high-quality review look like?

•	data in the national data systems are verified as accurate;

•	the selection of files is sufficient in number and, to the degree possible, representative of the full universe and program activity;

•	program reviewers are adequately trained;

•	findings on performance are accurate and substantiated;

•	the report is clear and concise;

•	recommendations are SMART (specific, measurable, achievable, results-oriented, time-bound);

•	recommendation implementation is monitored and completion verified.


C. Overview of the SRF Review Process

The review typically takes one year to complete, from verification of data in the national systems by state and
regional data stewards to publication of the final report on the public SRF web site. The outline below
summarizes the general stages in the review process along with a suggested schedule for completion. The
three fixed dates pertain to data verification, the draft report, and the final report.

Preparing for the Review (November to February, months 1-3)

•	Data Verification (Feb 15: data verification ends)

•	Selecting a State/Local to Review

•	Kickoff Letter/Meeting

Conducting the Review (March to August, months 4-10)

•	Data Metric Analysis (60 days before on-site review)

•	CWA Inspection Table (60 days before on-site review)

•	File Selection List (30 days before on-site review)

•	On-Site Review

•	File Review Worksheet (30 days after on-site review)

Important: Submit documents to HQ.

Drafting and Finalizing the Report (September to December, months 11-13)

•	Draft Report

•	HQ Review (15 calendar days)

•	State/Local Comment (45 calendar days)

•	Final Report (Sep 30: draft reports due; Dec 31: final reports due)

Recommendation Monitoring and Closeout

•	Tracking State/Local Implementation Progress

•	Completion Verification and Closeout


II. PREPARING FOR THE SRF REVIEW

Before reviewers can begin the substantive review - analyzing the compliance and enforcement data
metrics and reviewing facility files - a series of preparatory steps is required. These steps include: making
sure that the data in the national databases are accurate; if applicable, selecting the appropriate local
agencies or state district offices to review; and establishing communication with the agency being
reviewed to officially notify it of the upcoming review and coordinate the review process. These steps
are described in more detail below.

A.	Data Verification

Data verification is an annual process by which states, and regions responsible for Direct
Implementation (DI) programs, have the opportunity to review and correct data in the national data
systems (e.g., facility and activity counts) to ensure that they are complete and accurate, and in turn that
the SRF data metrics and ECHO dashboards rest on a solid foundation of quality data.

The steps undertaken by data stewards to verify data are listed in Appendix B.

Since the SRF review relies on verified data, all data stewards for state delegated programs and
EPA DI programs will need to complete the steps outlined in Appendix B. This will typically
occur during the November to February timeframe following the SRF review year.

Important: Regional SRF Coordinators should work with their regional data stewards to
ensure that either they or the state data stewards complete the data verification process.

Once the period of data verification concludes, "frozen data", namely the final SRF data metrics
based on verified numbers, should be made available soon after. At that time, reviewers can begin
the main portion of the review by conducting the data metric analyses and selecting facility files
for the file review.

Note: EPA's ECHO State Dashboards - Historically, the state dashboards that track state data on
key enforcement performance indicators relied on a verified or "frozen" dataset. Based on an
agreement between EPA and states in 2018, ECHO switched to using live or unverified data to
populate the dashboards. As a result, data updates can be made in the data systems and will be
reflected on the dashboard after the data verification period has ended.

Important: SRF reviews will be based only on data that has been verified or frozen through
the data verification process and not production or "live" data on the ECHO web site.

B.	Selecting Local Agencies and State District Offices (if applicable)

This section applies only to reviews of states with authorized local agencies or states with a
decentralized state district office structure.

1. Reviewing Local Agencies

In some states, local agencies and state districts play a significant role in implementing
compliance monitoring and enforcement programs. Therefore, as part of the State Review


Framework, EPA reviews local agencies and state districts to ensure they are implementing
their inspection and/or enforcement programs consistent with national policy and guidance.

Local agencies as described in this section are those that implement their programs in lieu of a
state agency or EPA. They are different from state district offices because local agencies are
not staffed by state employees and generally only have jurisdiction within a city, county, or
metropolitan area.

a.	Determining which local agencies to review

Generally, in each SRF round EPA should review local agencies that regulate moderately to
heavily populated cities, counties, or metropolitan areas, as well as those serving areas with a
smaller universe. If a state has several local agencies, it may be appropriate to review some
in one review cycle and the others during a later cycle, with the goal of covering each
jurisdiction over time to ensure oversight coverage across the entire state. This may depend
on the size, location, and responsibilities of the local agencies. If local agencies are to be
reviewed in a staggered fashion, regions should indicate their plans for selecting local
agencies, including the criteria and analysis involved in targeting selected local agencies and
state district offices, as part of the discussions with Headquarters at the beginning of the SRF
review cycle and as part of the annual ACS commitment process.

b.	Conducting a local agency review

EPA should review local agencies separately from the state agency.1 This means EPA must
include findings, recommendations, and metrics specific to the local agency and separate
from the state agency in the final report. Once EPA completes the state and local agency
reviews, both the state and local agency findings will be included in the state's final report.

2. Reviewing State District Offices

Many state agencies administer their compliance and enforcement programs out of district
offices (these may also be called regional offices, districts, or boards). SRF data and file reviews
cover a state in its entirety regardless of whether it administers programs and stores its facility
files at a central office or at district offices.

An SRF file review in a state with districts may require that all selected facility files be sent
to a central location (if files are not already centrally located or available electronically). If
that is not possible, the EPA region should attempt to conduct file reviews at every district
office, in which case the review will follow the same rules as any other SRF review. Where
it is not possible for the EPA region to review files from every district, the EPA region should
discuss options and come to an agreed upon approach with their HQ SRF Liaison.

1 The California CAA, CWA, and RCRA programs are implemented by local and regional boards, state district offices, and certified
unified program agencies. However, for California, the section below, "Reviewing State District Offices," provides a more
appropriate model. When reviewing California, follow the District Office approach and work with Region 9's SRF liaison to
develop a plan.


With the exception of the steps below, EPA will conduct these SRF reviews in the same way as
those in any other state.

c.	Selecting a subset of state district offices for review

If reviewing a subset of districts, consider the criteria below to determine which and how
many districts to review. EPA regions may choose to review different districts in CAA,
CWA, and RCRA based on these factors:

d.	Size of state and number of district offices

Conduct file reviews at a minimum of two state district offices per media program. Three or
more is preferred. In a large state with a large number of offices, such as California, reviewing
all or even most may not be possible. Regardless, EPA should try to review more districts in a
state with many district offices than it would in a state with few district offices. It should also
review more facilities in a state with a larger universe than it would in a state with a smaller
universe.

e.	Known problems in district offices

Once EPA has established the number of district offices to visit, begin to decide which to
review by:

•	Considering known compliance monitoring and enforcement problems in the districts.

•	Asking the state about performance issues in each district.

•	Breaking out SRF data metrics for each district, if possible.

f.	Districts visited during previous reviews

The state's prior SRF reports may provide additional information on how districts reviewed
during those rounds were performing. If EPA did not review a district during the previous
round, it should receive additional consideration in the current round.

g.	A "representative sample" of district offices

Reviewers should select a representative sample of all districts, to the degree possible. To
illustrate, if the state has five districts and EPA is reviewing three, determine which districts
are the strongest and weakest performers by comparing data for inspection frequency,
violations, enforcement frequency, and accuracy of data entry. Then, while also using the
above criteria to inform the decision, select one strong, one average, and one weak performer.

Generally, EPA should evaluate the state holistically, even when performance varies
significantly across the districts reviewed. Unless there is clear evidence that issues are
isolated, EPA should not assume that the problem exists only in one or more of the districts
reviewed — the problem could also exist in districts not reviewed. When drafting the report,
EPA should write the finding and recommendation to ensure adequacy of performance statewide.

In some cases, Regions may feel the need to develop findings in the report focused on
specific sectors (e.g., stormwater) or districts within a state. This is acceptable as long as
aggregate statewide data is also used to develop findings for the report.

h. Next Steps

The EPA region should communicate which districts it plans to review, and the rationale
(e.g., selection criteria, information considered) for selecting them, to its HQ liaison prior to
developing the data metric analysis and file selection list. Upon reaching agreement with HQ,
the reviewer can begin the file selection process. See Table 1 in the File Selection section
below for guidelines on how many facility files to pull.

C. Kickoff Letter / Conference

To mark the official start of the review process, regions typically send an initial communication
letter, or "kickoff' letter, to the state or local agency to notify them of the upcoming review, provide
details on logistics and contacts, and coordinate the review schedule. Depending on needs and
resources, regions may choose to also set up an in-person meeting or conference call.

1.	Kickoff Letter

Communication can be in the form of either a formal written letter from a Regional senior
manager or an informal email from a program manager to his/her state/local counterpart. In
order to fully inform the state and local agency of the purpose and details of the review and
ensure coordination goes smoothly, make sure to include the following:

•	The purpose of the review and expected outcomes.

•	A summary of discussions and agreements to date regarding the upcoming review.

•	The date and time of the on-site review (if already scheduled), or the need to schedule it.

•	Media-specific program leads with contact information.

•	Explanation of next-steps.

If the region intends to hold a kickoff conference, the letter should also include the date, time,
and topics for discussion (see below).

A suggested kickoff letter template is attached in Appendix D.

2.	Kickoff Conference (optional)

a. Personnel and Scheduling

If scheduling a kickoff conference with the state or local agency, determine who should attend
the conference. For EPA, this would generally include the SRF regional coordinator, the
media program reviewers, and the appropriate senior managers. For the state or local agency,
it might be senior management and staff coordinating the review. EPA and the state will need
to determine how they will conduct the conference — it can be in person, video or phone. The


Regional Coordinator can work with the state to schedule the meeting.

b. Conducting the Conference

EPA may wish to discuss the following topics during the conference:

•	Changes to the SRF process for Round 4, such as revisions to the metrics and guidance

•	Results of the Annual Data Metric Analyses (ADMA) from previous years and Data Metric
Analysis (DMA) for the SRF review year

•	The scope of the review

•	Status of performance issues and unresolved recommended actions from previous SRF
reviews

•	Expected timeline for the current review

D. Regional Coordination with Headquarters During the SRF Review
Roles and Responsibilities

Regional Coordinator: Ensure effective and efficient communication and coordination occurs
between the Region and Headquarters. Coordinators are ultimately responsible for making sure all
relevant documents, including the draft and final report, are submitted on time and of high quality.

Headquarters Liaison: Assist the Coordinator with training, technical assistance, and guidance on
SRF policy, process and materials. Liaisons are responsible for working with HQ staff and the
Regions to ensure the completeness and accuracy of all review materials and their consistency with
national guidance and policies.

The SRF review process involves numerous steps, documents, and people, as well as management
of a shared database, and therefore requires considerable coordination. EPA regions may
select whichever of two process tracks for coordinating SRF reviews with Headquarters best
suits their needs. The emphasis in Track 1 is on an initial comprehensive scoping meeting, while
Track 2 relies on check-ins throughout the review process. Regions are encouraged to communicate
with Headquarters whenever issues or questions arise.

Track 1: Scoping Meeting Emphasis

•	Initial communication and concurrence occur between the region and HQ in the form of a preliminary scoping meeting.

Track 2: Periodic Check-In Emphasis

•	Periodic communication and concurrence between the regional SRF coordinator and HQ SRF liaison occur at multiple steps in the process.



Appendix C provides a detailed description of each track. Regions should let their SRF
Liaison know which track they intend to use prior to beginning the review.

III. CONDUCTING THE REVIEW

When all the preparatory steps above have been completed, reviewers can begin the substantive portion
of the review, starting with the analysis of compliance and enforcement data generated from the national
data systems followed by the review of facility files for a more qualitative look at program activities.

A. Data Metric Analysis (DMA)

Roles and Responsibilities

Regional Coordinator: Submit DMA and CWA Inspection Table (60 days before the file
review)

HQ Liaison: Review material for completion and accuracy

A Data Metric Analysis (DMA) contains a set of metrics that provide information on program
compliance monitoring and enforcement activities by fiscal year. The metric data is sourced from
ECHO, which in turn pulls data from the national data systems for each media. The DMA metric
values will serve as one of two main sources - along with file metrics - of information used to make
findings on program performance.

Conducting a DMA represents the first analytical step in the review process. Based on the data metric
values, the reviewer will develop initial findings. These can be used to determine what program
areas may need additional focus for file selection, on-site reviews, or discussions with state and local agencies.

The instructions for completing a DMA and making preliminary findings are outlined in
Appendix E. Reviewers may also log in to ECHO to view the training video with step-by-step
instructions on conducting a DMA.

Once a DMA has been completed and reviewed by HQ, reviewers may wish to share the DMA with
the state or local agency as part of the kick-off letter or meeting, as mentioned in the section above.
This will allow for agencies to provide any feedback or corrections of the data before findings are
drafted in a report, since findings should be based on data that the region and state agree are accurate.

B. Annual Data Metric Analysis (ADMA)

Roles and Responsibilities

Regional Coordinator: Work with Regional program staff to develop ADMA and share with the
state. Upload final ADMA to SRF Manager.

HQ Liaison: Assist Region on development of ADMA and long-term trend data, if needed.


During Round 4, Regions are expected to conduct a data metric analysis for each state in their
region each year. The annual data metric analysis (ADMA) is the same as a DMA, but is conducted
on an annual basis for those state or local agencies that are not undergoing an SRF review during
the year.

The annual DMA uses frozen data verified by states during data verification. It allows facility
and activity counts to be reviewed annually and discrepancies resolved, and assists in managing
performance issues.

The ADMA is designed to be a simple, regular check-in, supporting annual performance reviews
that already occur under Performance Partnership Grants (PPG) and other annual state and EPA
planning and grant vehicles.

Additionally, the annual DMAs can be analyzed for trends during the SRF review. This can help
put data metric values during the SRF review year in context and help EPA determine appropriate
findings.
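To illustrate (with made-up numbers, since this trend analysis is informal), a reviewer might compare the review-year value of a metric against the average of prior ADMA values. The metric and values below are hypothetical:

    # A minimal sketch, using hypothetical values, of putting the SRF
    # review-year value of a metric in context with prior ADMA values.
    adma_history = {"FY15": 92.0, "FY16": 88.0, "FY17": 74.0}  # e.g., inspection coverage (%)
    review_year_value = 68.0  # same metric in the SRF review year

    baseline = sum(adma_history.values()) / len(adma_history)
    print(f"Prior-year average: {baseline:.1f}%; review year: {review_year_value:.1f}%")
    # A sustained downward trend supports treating a low review-year value as a
    # genuine performance issue rather than a one-year anomaly.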

The steps for conducting an ADMA are the same as for a DMA, which can be found in
Appendix E.

C. File Selection

Roles and Responsibilities

Regional Coordinator: Submit File Selection List and, if relevant, CMS/MOA/Workplans used
for inspection coverage (30 days before on-site review)

HQ Liaison: Review material to ensure selection criteria are met: correct number of files,
categories of compliance and enforcement activity, and types of facilities (size, geographic
distribution, sector, etc.)

The objective of file selection is to obtain sufficient information to draw conclusions regarding state
performance under each SRF element. It is very important that reviewers have an adequate number
of files to develop supportable findings and recommendations, particularly where there is a potential
concern (e.g., withdrawal petition, ADMA trends, or previous SRF findings of performance issues).

1. File Selection Preparation

Before selecting facilities, EPA completes the DMA to identify potential problems. For CWA
reviews, EPA also completes the CWA inspection coverage table (see the CWA Plain Language
Guide for instructions). Reviewers should consider these sources of information, combined with
problems identified in previous SRF reviews and annual DMAs, when determining what
activities or sectors to focus on during the review.

HQ recommends EPA regions transmit the file selection list to the state at the same time as the
DMA. In addition, EPA should decide if the state review will include any reviews of local
agencies or district offices. Earlier sections of this document deal with these types of reviews.


2. Determining Minimum Number of Facilities to Review

To determine the total number of files to select for your review, examine the number of records
or activities returned, shown in the upper left-hand portion of your screen in the ECHO File
Selection Tool ("Total Number of Records Returned"). For example, if the total number of
inspections, violations, enforcement actions, and penalties that occurred in the review year is 256,
this falls within the range of 26-300 compliance monitoring and enforcement activities, so the
reviewer would select 25-30 files, as the table below indicates. For step-by-step instructions on
creating a file selection list via the ECHO File Selection Tool, see Appendix F, or visit the SRF
training videos on ECHO.

Table 1: File Selection Guidelines

State-Wide Review

Number of Activities in File Selection Tool | Minimum # of Facilities or Files Selected

More than 1,000 activities reported | 35 to 40 files selected

301 to 1,000 activities reported | 30 to 35 files selected

26 to 300 activities reported | 25 to 30 files selected

Fewer than 25 activities reported | All files selected

Review of Local Agencies & State District Offices2

Number of Local Agencies or State Districts | Minimum # of Facilities or Files Selected

1 agency or district | 30 files selected

2 agencies or districts | 30 files selected (15 per agency/district)

3 agencies or districts | 30 files selected (10 per agency/district)

4 agencies or districts | 30 files selected (7 per agency/district, plus 2 additional files)

5 or more agencies or districts | 30 files selected with roughly even distribution across agencies/districts

If data in the national data systems do not accurately reflect state activities, EPA may need
to work with the state to select a more representative group of facilities. This applies
primarily to CWA wet weather, pretreatment, and significant industrial user universes,
which may not be fully populated in ICIS-NPDES. (See Appendix C of the CWA Plain
Language Guide for additional information.)
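For illustration only, the state-wide guidelines in Table 1 reduce to a simple threshold rule. The Python sketch below is not an SRF tool; it assumes the activity count has already been read from the File Selection Tool:

    def min_files_statewide(num_activities: int) -> str:
        """Map an activity count to Table 1's suggested state-wide file range."""
        if num_activities > 1000:
            return "35 to 40 files"
        if num_activities >= 301:
            return "30 to 35 files"
        if num_activities >= 26:
            return "25 to 30 files"
        return "all files"

    print(min_files_statewide(256))  # -> "25 to 30 files", matching the example above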

3. Selecting a Representative Sample of Files

a. Basic Requirements

Important: EPA should select at least five facilities for each of the following categories:

•	Inspections with enforcement

•	Inspections without enforcement

2 For these reviews, also refer to Section II, "Preparing for the SRF Review." If fewer than 30 files are available for review in the file
selection tool, select all available files.

14


-------
SRF Reviewer's Guide - Round 4

•	Non-SNC violations (CWA/RCRA), federally reportable violations (CAA), or secondary violations (RCRA)

•	SEVs (CWA) or stack tests failed (CAA)

•	SNCs (CWA/RCRA) or HPVs (CAA)

•	Informal enforcement actions

•	Formal enforcement actions

•	Penalties

A single facility can count toward multiple activities reviewed. For example, if a facility has
an inspection, a formal enforcement action, and a penalty, then that facility addresses all
three categories.

If there are fewer than five facilities in a category, select all that are available to include in the
file selection list and determine whether the low number is indicative of a performance issue.

Important: If fewer than five activities are available to select in the review year, regions
should then select files from a prior fiscal year(s) to ensure that performance findings are
based on a sufficient number of activities.

For example, if there are only four penalties available in the review year (e.g. FY18),
reviewers should examine the prior year (e.g. FY17) of file selection tool data to select one
additional penalty.
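A minimal sketch of this rule, assuming each category's candidate facilities are available as simple lists of IDs (hypothetical data; actual selection is done in the File Selection Tool):

    REQUIRED_PER_CATEGORY = 5

    def select_for_category(review_year_ids: list[str], prior_year_ids: list[str]) -> list[str]:
        if len(review_year_ids) >= REQUIRED_PER_CATEGORY:
            # At least five exist; real selection picks a representative five or more,
            # not simply the first five as shown here.
            return review_year_ids[:REQUIRED_PER_CATEGORY]
        selected = list(review_year_ids)              # fewer than five: select all available
        shortfall = REQUIRED_PER_CATEGORY - len(selected)
        return selected + prior_year_ids[:shortfall]  # top up from the prior fiscal year

    # Four penalties in FY18, so one is pulled from FY17:
    print(select_for_category(["P1", "P2", "P3", "P4"], ["P5", "P6"]))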

b. Other Considerations

•	At least half the facilities selected should have compliance monitoring activity, and
roughly half should have enforcement activity. (Enforcement includes informal
and formal actions, and penalties.)

•	Selection should include a representative mix of facilities:

o	With and without violations

o	Different facility types based on size (major, minor, etc.), sector, geographic location, and other factors

o	Violations but no enforcement, particularly if the DMA indicated that the state might not be taking appropriate enforcement

•	It is a good practice to include facilities with multiple inspections in a single year but
no violations found.

•	The Map Selected Facilities feature allows File Selection Tool users to view the
geographic distribution of selected facilities at a glance.

4. Supplemental File Selection

Representative file selection will usually provide a sufficient number of files to assess
performance across the necessary range of activities, facilities, and geographic areas. However,
there are a few circumstances where EPA may elect to select supplemental files, including:

• There is a sector that EPA is concerned about in the state — such as CAFOs or POTWs
in the NPDES program — that the representative selection did not adequately cover.


•	A review of previous SRF reports, the review-year DMA, or the annual DMAs shows
longstanding problems in a performance area not adequately covered by the
representative selection.

When selecting supplemental facilities, click their checkboxes to indicate that they are part of
the supplemental selection.

Other considerations:

•	Reviewers should generally select supplemental files at random from the list of
facilities for the given category (see the sketch following this list).

•	If the file review leads to new discoveries about problem areas, and the official file
selection does not provide an adequate number of facilities to make findings, EPA may
request additional files while on site.

•	Reviewers may also want to use the ECHO.gov SRF data metric facility
drilldown screen for the issue requiring additional file review. For example, if
you are interested in facilities with formal actions not taken in a timely manner,
find the relevant SRF metric in the ECHO.gov data metric query and click on
the metric number. This will bring up a list of facilities. Then go back into the
file selection tool and randomly select some of these.
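As a rough sketch of the random supplemental pull described above (facility IDs are hypothetical):

    import random

    def pick_supplemental(candidate_ids: list[str], n: int, seed: int | None = None) -> list[str]:
        """Draw n supplemental facilities at random from a category's candidate list."""
        rng = random.Random(seed)  # pass a seed to make the pull reproducible
        return rng.sample(candidate_ids, min(n, len(candidate_ids)))

    print(pick_supplemental(["TX0001", "TX0017", "TX0042", "TX0090"], 2))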

5. Transmit File Selection List

Upon completing file selection, download an Excel file listing selected facilities by clicking
the Download button and then clicking the Download Selected button. EPA should send
the list to HQ for review in advance of the on-site file review. This will allow the Liaison
to provide valuable input on the quality of the list. Following HQ review, the region should
transmit the list to the state agency at least two weeks before the file review to allow the
state time to pull files.

Reviewers should also print the detailed facility reports (DFRs) at this time. It is much easier
to pull them at the end of file selection than later. The File Selection Tool has a Print Selected
DFRs button for this purpose.

D. File Review

Roles and Responsibilities

Regional Coordinator: Submit File Review Worksheet (30 days after file review)

HQ Liaison: Review material to ensure completion and accuracy of metric calculations, initial
findings, and comments


1.	File Review Preparation

After selecting files, the review team should continue preparing for the file review. See
Appendix G for a checklist of all essential materials to have on hand during the review.

a.	Print ECHO.gov Detailed Facility Reports (DFRs)

If you did not print DFRs for all of the facilities on the file selection list during file selection,
pull them by entering facility ID numbers into the facility ID search on ECHO.gov.

b.	Print File Review Checklists

Download the CAA, CWA, or RCRA file review checklists from the SRF documentation
and guidance page on ECHO.gov or the SRF Manager database. Fill in the information
requested on pp. 1-2 based upon information in your detailed facility report to save time
during the on-site file review. Print or save an electronic copy for each facility to be reviewed
and clip it to the facility's DFR.

2.	Coordination with State

If you need access to the state's data system, or assistance navigating the data system, ask the
state for assistance.

Contact the state the week before the on-site review to confirm that:

•	The state has pulled all selected files; if the state was unable to find some files, select
additional files to ensure minimum file selection requirements are met

•	The state has reserved a room for EPA to review files

•	The files contain all documentation needed to complete the review

•	The state has designated a point-of-contact to offer assistance during the review

•	The appropriate managers and staff will be available for entrance and exit meetings

3.	Conducting the File Review

a.	Conducting Reviews Remotely

If a state has all files available electronically, regions may choose to conduct the file review
remotely. Inspection reports and formal enforcement actions are available on some state web
sites. It is a good practice to determine whether compliance determinations following
inspections, informal enforcement actions, penalty calculations, and justification for
changing penalties are, or can be made available electronically. If some or all of these data
are not available remotely, an on-site file review will be necessary. Consider whether state
public disclosure laws or internal policies make it necessary to supplement electronic reviews
with on-site file review and discussion with state staff.

b.	Entrance Conference


Regions and states often find it helpful to hold an entrance conference. Appropriate topics
include:

•	A brief discussion of the new Round 4 process.

•	SRF DMA results and how those compare to past ADMAs, including CWA CMS
metrics, to indicate potential performance issues.

•	File review process.

•	Confirming availability of the state's point-of-contact during the review.

•	Expected timeline for completion of review and tentative date and time of exit conference.

•	Proposed topics to be covered at exit meeting, such as preliminary findings from the
review, requests for additional materials, and the process for drafting and finalizing the
report.

c. File Review

Use file review checklists and DFRs to review facility files, and refer to the Plain Language
Guides and underlying EPA policy and guidance for questions about specific metrics.

There may be activity from a previous or subsequent year linked to activity in the year
reviewed. If so, EPA should review these activities. For example, Region 11 is conducting a
review of activity in FY 2018 in one of its states. One of the facilities selected for file review
had an enforcement action during FY 2016. This enforcement action was in response to
violations found during an inspection in FY 2015. Because they are directly related, Region
11 would review the inspection, violation determination, and enforcement action.

Another facility had an inspection in FY 2018 that resulted in a SNC determination and
formal enforcement in FY 2019. Again, Region 11 would review the inspection, violation
determination, and enforcement action.

If a facility has multiple inspections or enforcement actions during the review period, review
all activities that take place in the review year and record responses for the same question on
a separate row of the file review spreadsheet. The file review checklists contain supplemental
sections for multiple activities, and the file review spreadsheet contains instructions for
capturing each action.

Use the File Review Worksheet to calculate metrics and make initial findings. The Worksheet
automatically tabulates metric values based on the "Y" and "N" responses entered for the
facilities. For N/A responses, you may leave them blank or enter N/A. (To prevent data entry
and calculation errors, the Worksheet only allows responses of Y, N, N/A, and blank.) Do
not adjust the formulas in the Worksheet. It is a good practice to enter checklist responses in
the file review spreadsheet daily to ensure that all appropriate questions were answered while
the review team still has access to the files. Use the far right-hand column in the table on p. 1
of the file review checklist as a guide to the specific questions that should be answered for
each type of activity reviewed.
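For illustration only (the Worksheet's own formulas are authoritative), the tabulation amounts to taking the share of "Y" responses among the Y/N answers, with N/A and blank responses excluded:

    def metric_value(responses: list[str]) -> float | None:
        """Percentage of "Y" among Y/N responses; N/A and blank are excluded."""
        answered = [r.upper() for r in responses if r and r.upper() in ("Y", "N")]
        if not answered:
            return None  # metric not applicable to any reviewed activity
        return 100.0 * answered.count("Y") / len(answered)

    # 8 "Y" responses out of 10 applicable -> 80.0
    print(metric_value(["Y", "Y", "N", "Y", "N/A", "Y", "Y", "", "Y", "Y", "N", "Y"]))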


Important: During the on-site file review, it is vital that reviewers take quality notes or,
if allowed, scan or copy key sections of files or documents, particularly where the
situation seems complex or unclear. This will ensure that the necessary information is
available to explain findings, support recommendation development when drafting the report,
and inform discussion of the preliminary findings with the state or local agency.

4.	Developing Preliminary Findings

Once you have entered all responses in the Worksheet, click on the Initial Findings tab.
Metric values will automatically populate in the Initial Findings tab based on values entered
in the file review Worksheet. Compare the state performance result to the national goal for
each metric to establish preliminary file review findings. You may do this prior to the exit
conference, time permitting.

Issues identified as State Attention or State Improvement in the DMA generally represent a
performance issue of one kind or another (see the finding level definitions in Table 2). For
example, if EPA made a State Improvement initial finding in the DMA for not inspecting
enough major facilities, but state data confirm that the agency actually exceeded its
inspection commitment, it would appear that the agency was not entering all inspections in
the national data system. In this case, the state would receive findings of State Improvement
under Element 1 (Data), and Meets or Exceeds Expectations under Element 2 (Inspections).

Reviewers may revise these findings and recommendations later based on additional
research and analysis.

5.	Exit Conference

EPA should hold an exit conference with state agency personnel following the file review.
This conference may occur on site immediately following the review or at a meeting or
conference call as soon as possible after the review.

a. Discussing Preliminary Findings and Potential Recommendations

EPA may begin the exit conference by telling the state that it has completed the review and
has developed preliminary findings and, if possible, recommendations. EPA should stress
that these are subject to change based on further analysis and discussions with HQ. EPA
should also discuss areas where state performance is strong.

When discussing preliminary findings for Areas for Improvement, EPA should provide
reasons for these findings, and, if possible, potential recommendations to improve
performance. This should be an opportunity for dialogue, particularly when EPA is unsure
what is causing a particular problem, or how to improve it. The state may have additional
reasons for low performance, and it may have helpful ideas for how to improve. EPA should
note these and add them to the report as appropriate.

When problems noted in prior SRF reviews recur, ask the state why prior recommendations


did not solve the problem, and what the state believes it can do to improve performance. If an
action was completed that did not solve the problem, recommend a different action.

EPA may ask the state or local agency when they plan to begin correcting the issue, and what
they need in terms of assistance, so a realistic due date for a proposed recommendation can
be included in the report.

Finally, EPA should discuss the process for drafting the report, reaching agreement with HQ
on findings and recommendations, and sharing a draft with the state for its comment.

IV. DRAFTING AND FINALIZING THE SRF REPORT

A. Developing the First Draft
Roles and Responsibilities

Regional Coordinator: Develop and submit a draft report to HQ liaison
HQ Liaison: Review draft report for completeness, accuracy, and integrity

The draft report represents the main product of the review, which when finalized is made available to
the public on the SRF web site. Drafting of the report typically begins after the file review, though
some reviewers may wish to begin entering administrative information and data metric values, along
with preliminary findings, before that point.

Regions have the flexibility to decide who is responsible for drafting the report, or sections of the
report, whether that be the SRF coordinator, program reviewers or some combination. Typically, the
coordinator is ultimately the one responsible for ensuring that the report is completed properly and
on time.

In drafting the report, reviewers will compile the data and file metrics, along with any other relevant
information gathered during the review, to make findings on a program's performance under each
element (i.e., data, inspections, etc.). To help ensure consistency, a metric value range generally
corresponds to one of three finding levels unless there are justifiable reasons otherwise (see Table 2
below). Wherever findings of area for improvement are made, recommendations for corrective action
must be included, which, to the degree possible, should be developed in coordination with the agency
reviewed.

Draft reports are due to the HQ Liaison by the end of the federal fiscal year, as required by the Agency's
ACS commitment. If a Region needs additional time to complete the draft report, reviewers or SRF
Coordinators should contact their liaison and provide an expected submission date.

Important: All Round 4 SRF reports will be drafted in the "SRF Manager Database," the
program's new Oracle Apex data system launched in January 2018. The Database is a one-stop
system that allows coordinators, reviewers, and liaisons to access key guidance documents, draft and
review SRF reports, and track recommendations until completion. For more information on how to


use SRF Manager in developing a draft report, see the User's Guide posted in the database.

1.	Administrative Information

Before drafting the report, reviewers should provide the following information in the
Administrative Information view of the SRF Manager database:

•	Region

•	State

•	Agency Reviewed: The implementing agency (EPA, State, Local). If state district offices are
being reviewed, the state is the implementing agency. If a local is being reviewed, the local is
the implementing agency. All state district offices should be combined into a single
report, while separate reports should be created for each local.

•	Round

•	Review Year: Federal Fiscal Year (FFY) during which the reviewed activities were
conducted.

•	Regional Coordinator

•	HQ Liaison

•	Report Version (final or draft)

•	Report Author

•	File Review: Dates that the file review was conducted and contact info of media program
lead

2.	Performance Findings

Findings are the reviewers' determinations on program performance that make up the main content
of the report. There should be at least one finding per element, though there are typically multiple
findings within an element.

a.	Finding Number (up to 5 findings per element)

•	Each element in the report (data, inspections, etc.) has metrics associated with it and
therefore will receive at least one finding. For each element, start with Finding 1 and
continue sequentially up to a maximum of five findings.

e.g., Element 1 (Data): Finding 1-1, Finding 1-2, ... Finding 1-5

b.	Finding Level

•	Review the source information that will be used to make findings:

o	Data metrics from the DMA

o	File metrics from the file review spreadsheet

o	Other information, such as ADMA performance trends

Important: Reviewers should use the national goal of the metric, not the national
average, for determining a finding level. Averages should be used to provide context to
the findings.

•	Choose a final finding level. The table below provides a definition of each finding level
and offers suggested metric value ranges for help in deciding on a finding level. These


value ranges are simply a guide in selecting an appropriate finding level. Other factors
may be considered in choosing an appropriate level, such as the universe size of the metric
or whether the issue has recurred across several SRF rounds. See Appendix J to consider
other factors for developing finding levels.

Table 2

Suggested Metric Value Ranges | Finding Level

~85-100% | Meets or Exceeds Expectations: The base level of performance is met and no deficiencies are identified, or the program is performing above national expectations.

~71-84% | Area for Attention: An activity, process, or policy that one or more SRF metrics show as a minor problem. Where appropriate, the state should correct the issue without additional EPA oversight. EPA may make suggestions to improve performance, but it will not monitor these suggestions for completion between SRF reviews.

~70% and below | Area for Improvement: An activity, process, or policy that one or more SRF metrics under a specific element show as a significant problem that the agency is required to address. Recommended activities to correct the issues should be included in the report and must have well-defined timelines and milestones for completion, and, if possible, should address root causes. EPA will monitor recommendations for completion between SRF reviews and provide any necessary updates in the SRF Manager database.

Important: Group metrics within an element under the same finding level. If metric
values within an element lead to the same finding level, create a single finding and include all
metrics under that finding. If metrics within an element lead to different finding levels, create
multiple findings, grouping only those metrics that lead to the same finding level.
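As a rough illustration, the suggested ranges in Table 2 amount to the threshold rule sketched below; in practice, reviewers weigh other factors (universe size, recurrence across rounds; see Appendix J) before settling on a level:

    def suggested_finding_level(metric_value_pct: float) -> str:
        """Map a metric value to the suggested finding level from Table 2."""
        if metric_value_pct >= 85:
            return "Meets or Exceeds Expectations"
        if metric_value_pct >= 71:
            return "Area for Attention"
        return "Area for Improvement"

    print(suggested_finding_level(80.0))  # -> Area for Attention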

c. Summary

•	Provide 1-2 sentences describing the specific programmatic area(s) reviewed and
conclusions on performance. Reviewers should typically try to use the language of the
metric on which the finding is based as a guide in drafting the summary statements.

For example:

o	Compliance determinations are generally accurate in cases where there is sufficient documentation (Meets or Exceeds);

o	Inspection reports occasionally lack information sufficient to determine compliance and are not consistently completed in a timely manner (Area for Attention);

o	Enforcement responses do not consistently address violations in an appropriate manner (Area for Improvement).

d. Explanation

•	Describe the program's performance in more detail, providing an explanation of how and
why the finding level was chosen.

•	If the finding is area for attention: Reviewers may wish to include a suggestion to the
state/local agency on how to improve the program or alleviate a concern at the end of the
explanation section, though this will not be tracked as an official recommendation in the
database.

•	If the finding is area for improvement: Define the scope of the issue and the cause(s), or
potential cause(s), to the best degree possible.

Important: Determine if the performance issue is recurring. Check to see if the same
issue was identified in previous SRF rounds. If so, explain, as best as possible, why the
issue persists or resurfaced. Also, make sure to check the "recurring issue" box in the
findings section of the SRF Manager Database.

3. Recommendations

Recommendations are required whenever there is a finding of area for improvement. The purpose
of recommendations is to ensure that any significant performance issues identified in the review
receive a response that either resolves the issue, or leads to substantial and consistent progress
towards a resolution of the issue (a determination made using best professional judgement).

a. Writing Effective Recommendations

•	All recommendations must contain a description of the specific actions that will be taken
to address the issue identified, the responsible party, and well-defined timelines or due
dates for completion (e.g. 90 days from the completion of the final report). To the
greatest extent possible, recommendations should attempt to address the full scope and
underlying cause(s) of the performance issue.

•	Reviewers are encouraged to access the compilation of SRF Case Studies, which contains
real-world examples of successful approaches that Regions have used in previous
reviews. Those case studies can be found on the Community SharePoint site.

•	When writing recommendations, reviewers may find it helpful to use the following
SMART checklist to ensure the recommendation includes the required components.

SMART Checklist:

•	Specific - description of the specific actions that will be taken and who will take them.

•	Measurable - the actions can be measured either quantitatively or qualitatively and should indicate what evidence is needed to measure completion.

•	Achievable - the actions are within the means of the implementing agency to complete.

•	Results-oriented - completion of the actions should result in improved outcomes, i.e., the issue is addressed or meaningful and consistent progress is made towards that end.

•	Time-bound - actions include timelines or due dates that create a practical sense of urgency.

Important: If the recommendation is addressing a recurring performance issue, or one

identified in the previous round, the recommendation should represent an escalated


response. If the issue was resolved but resurfaced, the EPA might consider a longer period
of monitoring. Examples of escalated action can be found in the Agency's National Strategy
for Improving Oversight of State Enforcement Performance found on the ECHO SRF web
page and in the SRF Manager database guidance section.

b. Recommendations vs. Milestones (choose an option)

Important: In writing recommendations for a finding of area for improvement, reviewers
can develop one recommendation with multiple milestones/due dates, or create several
recommendations, one for each milestone. There may be no difference in deliverables
or actions between recommendations and milestones; the only difference is how regions
would like to monitor and report on recommendations during the post-review monitoring
process. Here are the two options:

•	Draft a single recommendation that has multiple milestones (deliverables or actions) but
a single due date. The due date will typically mark when the final milestone is to be
completed.

•	Draft multiple recommendations, each with its own due date; that is, there would
be multiple recommendations and multiple due dates associated with that single
finding.

For example, a recommendation may include the following deliverable or action
milestones: "1) The state should complete ICIS data entry training by July 31, 2019.
2) The state should enter all SEVs into ICIS by Dec. 31, 2019. 3) The state should
complete an SOP for entering SEVs into ICIS by March 31, 2020." Under the second
option, each action or deliverable would be entered in the SRF Manager database as a
separate recommendation (no. 1, no. 2, no. 3), each with its own due date.
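The sketch below (hypothetical field names, not the SRF Manager schema) contrasts how the same corrective actions would be structured under each option:

    # Option 1: one recommendation, multiple milestones, one due date
    # (typically the date of the final milestone).
    option_one = {
        "finding": "1-2",
        "milestones": [
            "Complete ICIS data entry training",
            "Enter all SEVs into ICIS",
            "Complete an SOP for entering SEVs into ICIS",
        ],
        "due_date": "2020-03-31",
    }

    # Option 2: one recommendation per milestone, each with its own due date.
    option_two = [
        {"finding": "1-2", "action": "Complete ICIS data entry training", "due_date": "2019-07-31"},
        {"finding": "1-2", "action": "Enter all SEVs into ICIS", "due_date": "2019-12-31"},
        {"finding": "1-2", "action": "Complete an SOP for entering SEVs into ICIS", "due_date": "2020-03-31"},
    ]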

4. Executive Summary

Once the draft report has been completed, reviewers can begin developing the Executive
Summary. The Summary should convey the main findings from the review, namely the most
notable performance successes and challenges of a given program. In other words, readers,
especially management, should be able to turn to the Executive Summary to get a sense of what
parts of a program are being implemented well, and what parts require additional attention.

a. Areas of Strong Performance (3-5 findings):

•	Review all Meets-or-Exceeds findings.

•	Identify up to five findings that reflect parts of the program that are being implemented at a
high or very high level.

•	Include the finding summary(s) as written, or re-write to better encapsulate the finding.

•	If no Areas of Strong Performance are identified, indicate this by writing "No Areas of
Strong Performance were identified."


b. Priority Issues to Address (3-5 findings):

•	Review all Area for Improvement and Area for Attention findings.

•	Identify up to five findings that reflect parts of the program that are being implemented at a
low or very low level.

•	Include the finding summary(s) as written, or re-write it to better encapsulate the finding.

•	If no Priority Issues to Address are identified, indicate this by writing "No Priority Issues
to Address were identified."

Following the highlights of the current review, the Executive Summary should include a brief
overview of performance issues from past reviews. The overview should indicate whether issues
identified in previous reviews have been resolved or continue to be a problem. One approach
for communicating this is to create a table that includes the finding levels for each issue
associated with an SRF metric, with columns for each round of review. The table below is an
example:

Metric | Round 3 Finding Level (FY 20__) | Round 4 Finding Level (FY 20__)

6b - Inspection reports were not completed in a timely manner | Meets or Exceeds Expectations | Area for State Improvement

9c - Percentage of enforcement responses that have returned or will return a source with non-SNC violations to compliance | Area for State Improvement | Meets or Exceeds Expectations

10b - Enforcement responses are not consistently addressing SNC violations in an appropriate manner | Area for State Attention | Area for State Improvement

10d - Percentage of enforcement responses reviewed that appropriately address non-SNC violations | Area for State Improvement | Area for State Improvement

B. Finalizing the Report

1.	HQ Review of Initial Draft Report

Once Regional reviewers have completed developing a draft report in the SRF Manager
database, the Regional Coordinator should notify the HQ Liaison that the initial draft is
complete. The Liaison will begin a completeness check to make sure all the necessary
information is in the draft and all the required documents are uploaded to the database. If
everything is complete, the Liaison and HQ program staff will begin their review and provide
their comments to the Regional Coordinator within 15 calendar days.

2.	HQ Review of Subsequent Draft SRF Reports


The process and criteria for substantive reviews of revised draft reports will be the same as
for first-draft reports unless the HQ Liaison elevates the revised draft to management, in
which case management will review and determine how to resolve remaining issues.

3.	State Comment on Draft SRF Report

Important: The recommended approach for state review of the draft report is for the
EPA region and HQ to reach agreement on a draft report before the EPA region shares
the report with the state. This is an effort to reduce transaction costs and make sure EPA
speaks to outside parties with one voice. Experience has shown that reports shared with the state
first result in additional reviews by the state and HQ, and take longer to finalize.

Once the state receives the report, it has 45 calendar days to make comments. Once the state
has reviewed the report and the Region has made all of the necessary revisions, the EPA region
should send the report back to the HQ Liaison. The EPA region must notify the Liaison if it
made any significant changes to the report based on state comments.

4.	Finalizing the Report in the SRF Manager Database and Posting to SRF Web Site

Once the state has reviewed the report and HQ and the EPA region reach agreement on its
content, the Region will make all final edits in the SRF Manager database and select the
Final Report option in the Administrative Information view of the draft report section. This
will transfer the report into the Final Report view and the document will appear in the table.
The HQ Liaison will review the final reports and will notify the EPA region in writing that
the report is final. The report is not final until the EPA region receives this written
notification from HQ. Final reports are typically due by the end of the calendar year
according to the Agency's ACS commitment. The Liaison will publish the final report, along with the review recommendations, on the EPA SRF web site and notify the Regional Coordinator when the document will be available to the public.

V. POST-REVIEW RECOMMENDATION MONITORING AND
CLOSEOUT

Roles and Responsibilities

Regional Coordinator: Monitor recommendation implementation to make sure progress is being made,
support is available where needed, and the completion of a recommendation is verified.

HQ Liaison: Monitor the status of recommendations, ensure that completion verification meets all appropriate criteria, and elevate issues that may require a national or upper management response.

Following the publication of the final report, EPA is responsible for ensuring that any recommendations
resulting from the review are fully implemented so that performance issues are resolved, or meaningful
and consistent progress is made towards that end.


The SRF Manager database is a key tool for monitoring recommendations. Once the report is finalized in
the system, all report recommendations can be viewed in the Findings and Recommendations section,
where reviewers can sort and filter recommendations by various categories including round, region, state,
finding number and level, summary, explanation, recommendation text, due date and status. Reviewers
are encouraged to check on the status of outstanding recommendations on at least a quarterly basis, and
coordinate with the implementing program to complete them prior to the due date.
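As an illustration of such a quarterly status check, the sketch below (Python) assumes a hypothetical CSV export from the SRF Manager database with the columns shown; the database's real export format may differ:

    import csv
    from datetime import date, timedelta

    today = date.today()
    horizon = today + timedelta(days=90)  # roughly one quarter ahead

    # Hypothetical export with columns: state, finding_no, status, due_date
    with open("srf_recommendations.csv", newline="") as f:
        for rec in csv.DictReader(f):
            if rec["status"].lower() == "completed":
                continue
            due = date.fromisoformat(rec["due_date"])
            if due < today:
                print(f"OVERDUE: {rec['state']} finding {rec['finding_no']} (due {due})")
            elif due <= horizon:
                print(f"Due this quarter: {rec['state']} finding {rec['finding_no']} (due {due})")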

A. Annual Recommendation Inventory

At the beginning of each fiscal year, regional coordinators should conduct an inventory of all
recommendations in the SRF Manager database to assess their status (completed, ongoing or
overdue) and which ones will be coming due in the upcoming year. For those that are upcoming,
and especially those that are overdue, review the content of each recommendation and prepare to follow up with the agency to ensure it is completed. Regions are encouraged to discuss the
status of any ongoing or overdue recommendations with their states as part of their communication
of their annual data metric analysis (ADMA).

B. Monitoring Ongoing Recommendations (formerly known as "working" in previous SRF rounds)

For ongoing recommendations that have not reached their due dates, reviewers are advised not to
wait until a recommendation is due to check in with the responsible agency on the status of its
implementation.

For example, if a recommendation deliverable or action is due 90 days from the report publication date, the reviewer should contact the agency well before that date to ask what progress has been made in implementing the recommendation. As a suggested best practice, timelines for initial check-ins are included in the table below.

Recommendation Due Date | Suggested Initial Check-In Date

90 days from publication | 30 days from due date

180 days from publication | 120 days from due date

240 days from publication | 120 days from due date

365 days from publication | 180 days from due date
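Translated into dates, the suggested timelines might be computed as in the sketch below (Python). Reading "X days from due date" as X days before the due date is an assumption on our part; the table does not state the direction explicitly:

    from datetime import date, timedelta

    # Days-to-due-date (from publication) -> suggested lead time for the
    # initial check-in, per the table above. Treating "from due date" as
    # "before the due date" is an assumption.
    LEAD_TIMES = {90: 30, 180: 120, 240: 120, 365: 180}

    def initial_checkin(published: date, days_to_due: int) -> date:
        due = published + timedelta(days=days_to_due)
        return due - timedelta(days=LEAD_TIMES[days_to_due])

    pub = date(2019, 1, 15)  # example publication date
    for days in sorted(LEAD_TIMES):
        print(f"Due in {days} days: check in by {initial_checkin(pub, days)}")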

During check-ins, reviewers should try to determine whether the responsible agency is on track or having trouble implementing the recommendation deliverable or action. If EPA and the responsible agency both determine that the agency will not be able to meet the due date, they should try to determine the cause of the delay and what actions EPA can take to help the state or local agency resolve the performance issue.

If it is unlikely that the issue can be resolved before the original due date, each party will try to reach
an agreement on a new due date. Once a new date is determined, the Regional Coordinator should
request a change in the due date in the SRF Manager Database. The HQ Liaison will review the
request and update the due date, if appropriate.


C.	Prioritizing and Elevating Overdue Recommendations

Overdue recommendations are those that have not been completed by the due date committed to in the final report. A recommendation may become overdue for many reasons - staff turnover or shortages, state unwillingness, low prioritization, or an issue that is simply complex and intractable to resolve. The expectation, however, is that all recommendations are to be completed unless, upon elevation, senior management determines that the issue cannot be resolved or is no longer relevant.

Reviewers should prioritize the monitoring of overdue recommendations and develop a strategy for
working with the appropriate agencies to resolve them. Most pressing to resolve are the subset of
overdue recommendations that address what reviewers determine to be "significant and recurring
issues" and have been unresolved for an extended period (e.g., greater than one year overdue). For
these types of recommendations, Regions should implement an elevation process for resolution by senior management at either the Regional or HQ level, depending on the cause of the delay in implementing the recommendation.

D.	Verifying Recommendation Completion

For a recommendation to be considered complete, EPA must verify that all parts of the recommendation have been carried out in full and/or that the underlying performance issue(s) has either been resolved or shown substantial and consistent progress towards resolution.

Confirmation may require EPA to review data or a sample of inspection reports or enforcement
actions to determine that an issue has been resolved. This may or may not be explicitly spelled out
in the recommendation itself. For the most significant issues, EPA will want to monitor
implementation of a recommendation for a longer period and see sustained improvement over
several quarters before closing out the recommendation.

Documentation to demonstrate verification may differ depending on the type of performance issue
identified in the report. The list below includes some common practices and documents for verifying
specific performance issues:

1. Policies, Guidance, and Procedures

• Development or revision of a new or existing document, such as a response policy, inspection
report format, checklist or template, standard operating procedure, penalty calculation
spreadsheet, or data entry procedures.

2. Incomplete, Inaccurate, and Untimely Entry of Data


•	Entry of missing data, such as facility info, universe, inspections, violations, enforcement actions, or penalty counts and amounts under file metric 2b

•	Resolving technical issues such as translating data from state to federal databases (i.e., Electronic Data Transfers (EDT))

•	Revising incorrectly entered data such as inaccurate dates, SEV codes, enforcement types, and penalty dollar amounts

Verification documentation: attach the relevant data download to the SRF Manager database.



3. Insufficient Knowledge, Skills, and Abilities

•	Providing training and technical assistance on how to accurately enter data, which data are required to be entered, when data are required to be entered, identification of violations and discerning SNC/HPV/FRV from other types of violations, how to calculate penalties (e.g., economic benefit)

Verification documentation: if the recommendation requires training or joint inspections, record the number of training attendees, the date, and the agenda/syllabus, or the number of inspections conducted/reports reviewed.

4. Inadequate Inspection Reports and Documentation of Penalty Calculations

•	Inspection report quality (e.g., facility information, dates, narratives, checklist,
documentation, violation identification)

•	Penalty documentation (e.g., economic benefit and gravity, changes to penalty amounts,
penalty collection)



Verification documentation: if the recommendation requires review of inspection reports or penalty documentation, review reports or documents from selected files and include a file review checklist indicating the number of reports or files reviewed that met requirements.



5. Inadequate SNC-HPV Determination, Return to Compliance, and Appropriate and Timely Enforcement Action

•	Making appropriate HPV-SNC determinations of violations

•	Taking appropriate and timely informal or formal action in response to violations.




Regions should enter all the necessary verification information in the SRF Manager, after which they will need to notify their HQ Liaison to request a close-out of the recommendation. The Liaison
will review the information in the SRF Manager and, if all the verification criteria are met, they will
approve the request and close out the recommendation.

In cases where the verification lacks sufficient justification or documentation, the Liaison will work
with the Region to try to reach an agreement. If relevant documentation or information cannot be
obtained, an explanation should be provided. If both parties are unable to reach agreement, the
Liaison will elevate the issue to their management.


Appendix A: SRF Key Information

•	Reviewer: EPA Office of Enforcement and Compliance Assurance and 10 Regional Offices

•	Reviewed: Local, state, and EPA Direct Implementation (DI) compliance monitoring and enforcement programs

•	Frequency: At least once every five years

•	Current Round: Round 4 (FY2018-2022)

•	Statutes Covered:

o Clean Water Act (CWA) - National Pollutant Discharge Elimination System (NPDES)
o Clean Air Act (CAA) - Title V

o Resource Conservation and Recovery Act (RCRA) - Subtitle C

•	Source Information:

o Data Metrics - Verified compliance monitoring and enforcement data in the national data systems
o File Metrics - Facility files that contain compliance monitoring and enforcement activity
o Other - Non-review year data or multi-year data trends; review of previous SRF reports; Compliance Monitoring Strategies, MOUs, and performance agreements; follow-up conversations with agency personnel; and additional information collected to determine an issue's severity and root causes

•	Program Elements Covered:

o Data - completeness, accuracy, and timeliness of data entry into national data systems
o Inspections - meeting inspection and coverage commitments, inspection report quality, and report
timeliness

o Violations - identification of violations, accuracy of compliance determinations, and determination

of significant noncompliance (SNC) or high priority violators (HPV)
o Enforcement - timeliness, appropriateness, returning facilities to compliance
o Penalties - calculation including gravity and economic benefit components, assessment, and
collection

•	Finding Levels:

o Meets or Exceeds Expectations: This rating describes a situation where the base level is met, and

no performance deficiency is identified, or a state performs above base program expectations
o Area for Attention: An activity, process, or policy that one or more SRF metrics show as a
minor problem. Where appropriate, the state should correct the issue without additional EPA
oversight.

o Area for Improvement: An activity, process, or policy that one or more SRF metrics under a
specific element show as a significant problem that the agency is required to address.
Recommended activities to correct the issues should be included in the report and must have well-
defined timelines and milestones for completion, and, if possible, should address root causes. EPA
will monitor recommendations for completion between SRF reviews in the SRF Manager database.


Appendix B: Data Verification

Data Verification typically occurs every year from November to February. The following steps should
be taken by the data stewards for all state delegated and EPA Direct Implementation programs:

•	Log into the government-only area of the Enforcement and Compliance History Online
(ECHO) website with your EPA web application ID and password.

•	Click the "Oversight" box and the "State Review Framework" link (direct link after login is https://echo.epa.gov/oversight/state-review-framework).

•	Click the ECHO.gov SRF Data Verification tab and submit a search for the state or local agency.

•	Review the facility and activity counts on the search results screen to ensure their accuracy. If a number appears inaccurate, click on it to view the list of facilities or activities behind it.

•	Make any necessary corrections in the national data system of record. ECHO.gov will reflect
corrections after the next weekly data refresh. See the right-hand side of the ECHO.gov Data
Verification page for final refresh and anticipated freeze dates.

•	States and EPA should correct the national data systems before the final ECHO.gov refresh. This allows for a final review prior to the data verification deadline, which is typically in February. Click the Submit Verification button at the bottom of the results page to complete verification.

•	When a state finds data inaccuracies that it cannot correct, it should consult with EPA regional
data stewards to develop caveats to explain why data are inaccurate. EPA will post these caveats
on ECHO.

Note: The timeframe for 2018 data verification is later than in past years due to the review of proposed revisions to the SRF metrics as part of planning for SRF Round 4 reviews.


Appendix C: Regional and Headquarters Coordination During the SRF
Review

Track 1: Scoping Meeting Emphasis

•	Initial communication and concurrence occur between the region and HQ in the form of a preliminary scoping meeting

Step 1: Initial Meeting

The initial meeting is the EPA region's presentation to HQ of its comprehensive plan for the review. This also provides a forum for discussing how the SRF process can address state performance issues. Regional managers and/or SRF coordinators and the HQ liaison should participate.

Before the meeting, regions should provide the following to HQ no less than 2 days in advance, and be prepared to discuss:

•	If applicable, a proposal for reviewing select district offices, or local agencies (see Section II,
"Preparing for the Review," above)

•	DMA results, the state-specific NPDES Compliance Monitoring Strategy (CMS) plan, and the CWA inspection coverage table

•	Proposed file selection lists

•	An estimate of the draft report submission date

HQ and the EPA region may schedule follow-up discussions to address any outstanding issues and
finalize a review plan. HQ and the region should document final decisions.

As another recommended but optional step, regions should provide file review results to their HQ SRF
liaison for review after the on-site file review. HQ will provide comments within five working days.

Step 2: Draft and Final Report

The regional SRF coordinator provides a completed draft report to the HQ SRF liaison with all
supporting SRF documents. HQ will provide comments within 15 working days, as long as all SRF
documents are provided. See the "Finalizing Report" section below for additional information.

Track 2: Periodic Check-In Emphasis

•	Periodic communication and concurrence between the regional SRF coordinator and HQ SRF liaison occur at multiple steps in the process

Step 1: Determining Scope of Review


This step applies when EPA is reviewing local agencies or select district offices in a state. Section II
in the SRF Reviewer's Guide (p. 7) provides relevant elaboration.

Step 2: SRF Data Metric Analyses

Forward copies of SRF data metric analyses to the HQ SRF liaison. For CWA SRF reviews, include
any CWA state-specific CMS plans. The liaison will review and provide feedback within five working
days.

Step 3: File Selection Lists

Forward file selection lists to the HQ SRF liaison before sending them to the state. The liaison will
review to ensure that:

•	The region selected a sufficient number of facilities

•	The region selected the recommended number of facilities for each element (inspections,
violations, enforcement, etc.)

•	The facilities selected are sufficiently representative of the full universe in terms of
major/minor designation, geography, and sector

Regions may wish to send file selection lists and data metric analyses at the same time. The HQ SRF
liaison will review and send feedback to the region within five working days.

Step 4: File Review Results

Once the file review is completed, regions should forward copies of the file review worksheet to their
liaison. A complete tally of the file metrics and the region's initial findings must be included (including
the comments). The liaison will provide informal comments to the region within five working days,
which the region can incorporate into the worksheets.

Step 5: Prepare Draft Report

The Regional SRF Coordinator provides a completed draft report, via the SRF Manager database, and the file review spreadsheet to the OC SRF liaison. HQ will provide comments within 15 working days.

Step 6: Finalizing Report

The regional SRF coordinator provides a completed draft report to the HQ SRF liaison. See the
"Finalizing Report" section on page 25 of the Reviewer's Guide for additional guidance.

Optional Steps:

•	Review calendar: Develop milestones for completing each step in the review process and
forward them to HQ SRF liaison.

•	Kickoff letter: When sending a kickoff letter to the state, also send a copy to the HQ SRF
liaison.


Appendix D: Kick-off Letter Template

Date

Name

Title

Agency

Address

City/State/Zip

Re: State Review Framework (SRF) - Upcoming Round 4 Review

As an integral part of our U.S. Environmental Protection Agency - [State] partnership, Region [ ] will be
conducting a State Review Framework (SRF) review of the [State] [Agency] this year. Specifically, the
EPA will be looking at the Resource Conservation and Recovery Act (RCRA) Subtitle C, Clean Water
Act (CWA) National Pollutant Discharge Elimination System (NPDES) and Clean Air Act (CAA)
Stationary Source enforcement programs. We will review inspection and enforcement activity from fiscal
year [review year].

The purpose of the review is to assess whether program implementation is taking place in accordance with
federal policy and meeting agreed upon minimum performance standards as established in EPA-state
agreements (i.e., PPA/PPG, MOAs). The overarching goal of SRF is to improve the consistency of
program implementation and oversight, and in doing so, ensure equal protection for the public and a level
playing field for business. A summary of the key components of the SRF program are included in the
attachment to this letter.

An important part of the review process is the visit to your state agency office. Through this visit, which
will likely take place in [month and/or day, if scheduled], the EPA will have face-to-face discussions with
enforcement staff and review their respective files to better understand the overall enforcement program.

State visits for these reviews will include:

•	discussions between Region [ ] and [state agency] program managers and staff;

•	examination of data in EPA and [state agency] data systems; and,

•	review of selected [state agency] inspection and enforcement files and policies.

To carry out the review, Region [ ] has established a cross-program team of managers and staff. The
regional SRF coordinator, [name], will be your primary point of contact and will coordinate overall
logistics for the EPA. We request that you also identify a primary contact person for the EPA to work with
and provide that name to [SRF Coordinator]. The full review team, including media program leads, is as
follows:

[ name ]	Regional Coordinator	(xxx) xxx-xxxx [ email ]

RCRA lead
CWA lead
CAA lead


These program leads will be contacting [state agency] enforcement managers and staff to schedule a
meeting to discuss expectations, lessons learned from previous reviews, procedures and scheduling for the
review. The EPA will also send its analysis of the SRF data metrics and list of selected facility files prior
to the on-site visit.

Following our visit to your office, the EPA will summarize findings and recommendations in a draft
report. Your management and staff will be provided 45 days to review and comment on this draft. The
EPA expects to complete the [state agency] review, including the final report, by [proposed date]. If any areas for improvement are identified in the SRF, we will work with you to address each issue until it is resolved or meaningful and consistent progress is made towards that end.

Please do not hesitate to contact me at [phone number], or have your staff contact [SRF Coordinator name]
at [phone number] with any questions about this review process. We look forward to working with you
on the 2018 SRF review, and furthering our critical EPA-State partnership.

Sincerely,


Appendix E: Data Metric Analysis (DMA) Procedures

DMA Step-by-step:

Step 1: Downloading the DMA from ECHO

1)	Log in to ECHO.gov

2)	Go to Oversight > State Review Framework

3)	Click on the Data Metrics Analysis tab

o Select the Statute (CAA, CWA, RCRA)
o Select the Review Year

o Choose the State being reviewed, and if applicable, the Local Agency
o Click submit

4)	A new window with a table of metric values will appear; click the download button

5)	Save document as: State-Local Statute Review Year Document Type
o e.g., AL CWA FY 17 DMA or AL-Jefferson CAA FY 17 DMA
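A small helper like the sketch below (Python; illustrative only) can keep downloaded files consistent with this naming convention:

    def dma_filename(state: str, statute: str, review_year: int,
                     doc_type: str = "DMA", local_agency: str = "") -> str:
        # Build "State-Local Statute Review Year Document Type" names.
        jurisdiction = f"{state}-{local_agency}" if local_agency else state
        return f"{jurisdiction} {statute} FY {review_year % 100} {doc_type}"

    print(dma_filename("AL", "CWA", 2017))                            # AL CWA FY 17 DMA
    print(dma_filename("AL", "CAA", 2017, local_agency="Jefferson"))  # AL-Jefferson CAA FY 17 DMA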

Step 2: Making Initial Findings

6)	Open the downloaded copy and locate the columns Initial Finding and Explanation

7)	For all Goal Metrics:

o Evaluate each goal metric value and make an initial finding according to the general ranges on page 21 or in Appendix J of the Reviewer's Guide
o Provide a brief explanation to substantiate the finding

8)	For all Non-Goal Metrics:

o If metric values appear satisfactory, no finding is required
o If metric values suggest performance issues:

- Determine an initial finding level as described above
- Or flag the issue for follow-up in file selection and review to obtain more information. To do this, enter Supplemental Review in either of the newly created columns

Step 3: Using the Initial Findings

9)	When finished, submit the DMA with initial findings to your HQ liaison for review before starting
the file selection and review process

10)	Once the DMA is submitted and reviewed, focus attention on findings of area for attention, area for
improvement, or those flagged for Supplemental Review.

o Check-in with the agency to make sure the values are accurate

o If so, make sure during the file selection process to select files pertaining to potential areas of concern (see the File Selection section on pages 12-15)


o During the file review, or whenever possible, discuss the DMA results with the agency to try to
gather any additional information that could be helpful in making and substantiating findings in
the report

Reviewers may want to share the DMA with the state or local agency as part of the kick-off letter or meeting. This will allow agencies to provide feedback or corrections to the data before the review is conducted.

Note: The DMA and initial findings, along with the results from the file review, will be used later in the process to make findings in the report.


Appendix F: File Selection Procedures

Representative File Selection Step-By-Step

1.)	Enter the following EPA web site address in your Internet web browser (preferably Google Chrome for full functionality of the file selection tool): http://echo.epa.gov

2.)	In the upper right-hand corner, click on the ECHO Gov Login link

3.)	Enter your LAN user id and password; this is the same user id and password that you use to log
into your computer. This is not the numeric PIN for your smartcard.

4.)	At the bottom of your screen, click on the blue icon at right called "Oversight"

5.)	Next, click on the link at the bottom of the page called "State Review Framework"

6.)	Scroll down and click on the File Selection tab in the gray box in the middle of the page

7.)	Select the media you are reviewing (CAA, CWA, or RCRA)

8.)	Select the fiscal year of frozen data (select the most recent fiscal year). Reviewers in the first year of Round 4 will select FY 2017 frozen data, for example.

9.)	Select State Only as the Agency to be reviewed

10.)	Select the state or local agency from the Jurisdiction drop-down box.

11.)	Click on the Submit Without Flags button.

12.)	Click the arrows below the Informal Action header twice to bring facilities with informal actions to the top.

13.)	Select at least five facilities with informal actions at random by clicking on the checkboxes on the left. Click the checkboxes twice to indicate that the facilities are part of the representative selection. You will see a green checkmark next to all selected files. (Beginning with enforcement actions is an efficient way to conduct file selection. These facilities are the most likely to have inspections, violations, and penalties reported. To assist with random selection, the File Selection Tool only identifies facilities by program ID number.)

14.)	Use the same methodology to select at least 5 formal actions, penalties, non-HPV/non-SNC violations, SNC/HPV violations, and inspections.

15.)	Select at least 10 facilities with inspections. (Some of the facilities already selected will have inspections; these count toward the 10 inspection files.) For CAA, click the up arrow in the FCE column; for CWA and RCRA files, choose the Inspection column.

16.)	Select additional facilities as needed so at least five are selected in each of the violation categories.

17.)	Compare the total number of records returned in the top left-hand portion of the file selection tool to the number of files required to be reviewed in Table 1 [page 13]. If more files need to be selected to meet minimum file selection requirements, identify the activities in greatest need of additional facilities to make a proper evaluation. Randomly select facilities for those activities until you have selected at least the minimum number of total files. Review the file selection criteria on pages 14-15 to ensure that all factors, such as geographic distribution and other criteria, are met.
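To picture the randomness requirement in steps 13-16, here is a minimal sketch (Python) with hypothetical facility records standing in for the File Selection Tool's columns; it also credits already-selected facilities toward each category minimum, as step 15 describes:

    import random

    # Hypothetical facilities keyed by program ID number (the only
    # identifier the File Selection Tool shows during selection).
    facilities = {f"ID-{n:04d}": {"informal_action": n % 7 == 0,
                                  "formal_action": n % 11 == 0,
                                  "inspection": n % 3 == 0}
                  for n in range(1, 301)}

    def pick(category: str, minimum: int, selected: set) -> set:
        # Facilities already selected count toward the minimum (step 15).
        have = sum(1 for fid in selected if facilities[fid][category])
        pool = [fid for fid, attrs in facilities.items()
                if attrs[category] and fid not in selected]
        need = max(0, minimum - have)
        return selected | set(random.sample(pool, min(need, len(pool))))

    selected = set()
    selected = pick("informal_action", 5, selected)  # step 13
    selected = pick("formal_action", 5, selected)    # step 14 (one of several categories)
    selected = pick("inspection", 10, selected)      # step 15
    print(f"{len(selected)} facilities selected")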


Appendix G: Checklist of Key Items for Conducting File Review

1.	Hardcopies:

List of selected facilities

Detailed Facility Reports (DFRs) for each facility reviewed
File review checklists for each facility reviewed
Contact information for point-of-contact and others at state agency
Copy of the DMA

2.	Electronic copies:

File review worksheet

Completed CWA CMS metric spreadsheet (metrics 4a1 - 4a11)

3.	Either hard or electronic copies:

Plain Language Guide

Previous SRF reports & recommendation status

NPDES Program MOA or any other relevant state-EPA agreement

This guidance document

Enforcement response policies
Penalty policies
Inspection Manual

State compliance monitoring or inspection policies


Appendix H: SRF Draft Report Completeness Checklist

When creating a draft report, be advised that the DMA, File Selection List, CWA inspection coverage
table, File Review Worksheet, and any other documents used for the SRF review process must be
submitted to the HQ SRF Liaison for him/her to determine completeness and perform an accurate
review of the report. These should also be uploaded to the SRF Manager database which serves as a
central repository and official record for the review.

A draft report is complete if all required sections listed below are uploaded into the SRF Manager
database or emailed to the Liaison.

Report Components and Attachments | Complete? (Yes / No)

Report Components for Each Element for Each Media Chapter (CWA, CAA, and/or RCRA) - See Example in Appendix I

•	Finding (number and level)

•	Summary

•	Explanation

•	Relevant metrics

•	Recommendations

Attachments

•	Data Metric Analysis spreadsheet*

•	File Selection spreadsheet*

•	CWA inspection coverage table* and/or alternative CMS plans

•	File Review spreadsheet*

* These documents can be uploaded on the Administrative Information page of the SRF Manager. They will appear in the Attachments table when the report is finalized.






Appendix I: Sample Finding and Recommendation

CWA Element 4 - Enforcement

Finding 4-1: Area for State Improvement

Summary

SNC violations are not addressed in a timely or appropriate manner.

Explanation

Two of the eight SNC violations reviewed received appropriate follow-up action. However, the remaining six received neither informal nor formal enforcement action.

The state does not have a formal policy in place for taking enforcement against
SNC violators.

Metric 10a shows that the state was not consistently taking timely enforcement
action. This can be traced to the failure to complete inspection reports in a timely
manner.





Relevant metrics

Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #

10a - Major facilities with timely action | 98% | - | 1 | 8 | 13%

10b - Enforcement responses reviewed that address violations in an appropriate manner | 100% | - | 5 | 15 | 33%

State response

The state agrees that this is a problem and has agreed to work with EPA to resolve
it.

Recommendation

1) The state will develop a Standard Operating Procedure (SOP) for taking
enforcement action against SNC violators within 90 days of finalization of this
report, and will send a copy to EPA for approval. 2) The state will immediately
begin taking enforcement action against SNC violators in accordance with the SOP
developed under item 1. 3) EPA will monitor performance via quarterly conference
calls and annual SRF data metric analyses. EPA will close this recommendation
after approving the state's SOP and observing three consecutive quarters of
performance that meets national goals.


Appendix J: Establishing Finding Levels

The table below provides a definition of each finding level and offers suggested metric value
ranges for help in deciding on a finding level. These value ranges are simply a guide in selecting
an appropriate finding level. Other factors may be considered (e.g., universe size of metric) in
choosing an appropriate level.

Suggested Metric Value Ranges | Finding Level

~85-100% | Meets or Exceeds Expectations: The base level is met, and no performance deficiencies are identified, or the program is performing above national expectations.

~71-84% | Area for Attention: An activity, process, or policy that one or more SRF metrics show as a minor problem. Where appropriate, the state should correct the issue without additional EPA oversight. EPA may make suggestions to improve performance, but it will not monitor these suggestions for completion between SRF reviews. These areas are typically not highlighted as priority areas to address in an executive summary.

~70% and below | Area for Improvement: An activity, process, or policy that one or more SRF metrics under a specific element show as a significant problem that the agency is required to address. Recommended activities to correct the issues should be included in the report and must have well-defined timelines and milestones for completion, and, if possible, should address root causes. EPA will monitor recommendations for completion between SRF reviews in the SRF Manager database and provide any necessary updates in the EPA Manager database.
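Read as a simple threshold rule, the suggested ranges could be sketched as follows (Python); these cutoffs are approximate guides only, and a reviewer may depart from them for the reasons discussed under Additional Factors below:

    def suggested_finding_level(metric_value_pct: float) -> str:
        # Map a metric value (in percent) to a suggested finding level
        # using the approximate ranges in the table above.
        if metric_value_pct >= 85:
            return "Meets or Exceeds Expectations"
        if metric_value_pct >= 71:
            return "Area for Attention"
        return "Area for Improvement"

    print(suggested_finding_level(92))  # Meets or Exceeds Expectations
    print(suggested_finding_level(75))  # Area for Attention
    print(suggested_finding_level(33))  # Area for Improvement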

Additional Factors

Sample Size

In cases where there is a small universe for a metric or a low number of activities to review, the small sample size means greater variability in metric values, which can make it difficult to establish a reliable finding on performance.

Though the review focuses on a one-year period of activity, the reviewer can select additional files from prior years of activity to increase the sample size and have a more robust set of files. Reviewers can also use multi-year trend data to help make a decision when performance is on the edge of two finding levels. Otherwise, follow the general range unless there is evidence to support a different conclusion. If such evidence exists, include that information in the explanation section of the finding, which will be reviewed by HQ.
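To see why a small universe undermines reliability, a rough normal-approximation sketch (Python; illustrative only, not an SRF procedure) shows the same observed rate carrying a much wider margin of error at n = 8 than at n = 80:

    import math

    def margin_of_error(successes: int, n: int, z: float = 1.96) -> float:
        # Approximate 95% margin of error for a proportion.
        p = successes / n
        return z * math.sqrt(p * (1 - p) / n)

    # Same observed rate (62.5%), very different precision.
    for k, n in [(5, 8), (50, 80)]:
        print(f"{k}/{n}: {k/n:.0%} +/- {margin_of_error(k, n):.0%}")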
