CWA Plain Language Guide

Clean Water Act Metrics Plain Language Guide

State Review Framework Round 4

This Plain Language Guide describes the elements and metrics EPA uses during a State Review
Framework (SRF) review of CWA compliance and enforcement programs and provides instructions
on how to use the metrics to make appropriate findings and recommendations. SRF reviews are
based on information from EPA data systems and file reviews. Reviewers should refer to the CWA
file review checklist and spreadsheet when developing review findings on performance.

Data used in SRF reviews fall into two primary categories — data metrics and file review metrics.
These metrics provide the basis for determining agency performance.

1.	Data metrics are derived from frozen, verified data in ICIS-NPDES. Reviewers download
data metrics from the Enforcement and Compliance History Online (ECHO) to get an initial
overview of a state or local agency's performance. All data metrics fall into one of the
following subcategories:

•	Goal metrics evaluate performance against a specific numeric goal and are used to
develop findings. The ECHO data also provides the national average for these metrics
expressed as a percentage. EPA evaluates agencies against goals, not national
averages. These metrics include averages only to provide a sense of where an agency
falls relative to others.

•	Review Indicator metrics use national averages to indicate when agencies diverge from
national norms. Review indicators are not used to develop findings. They are used to
identify areas for further analysis during the file review. When an indicator diverges
significantly from the average, EPA should ensure that it pulls a sufficient sample of files
to evaluate the issue during the file review (see the File Selection Protocol for additional
guidance). EPA and the state or local agency should discuss the issue to determine if a
problem exists. Indicators can also provide narrative context for findings from file
reviews.

•	Compliance Monitoring Strategy (CMS) metrics are only required to be included in
the review when an agency has an alternative compliance monitoring strategy (CMS)
that includes one or more inspection commitments that differ from traditional
commitments in the national CMS. Typically, under an alternative CMS an agency will
substitute a certain number of inspections at larger facilities for some at smaller
facilities. If a state does not have a CMS plan for a given CMS inspection area, regions
will evaluate the state against the national inspection coverage goals for all sectors
(majors and non-majors) set forth in the 2014 NPDES compliance monitoring strategy
under metrics 4a1-4a10.

2.	File review metrics are evaluated during the review of facility files (including information such
as inspection reports, evaluations, enforcement responses and actions, and penalty documentation).
File reviews provide a greater understanding of an agency's performance than data metrics alone.

All file review metrics have national goals; however, unlike data metrics with goals, file metrics will
not have a national average.

Guidance References and Acronyms

The SRF Documentation Page on ECHO provides a full list of links to SRF guidance and policies.

Year reviewed refers to the federal fiscal year of activities reviewed, not the year in which the
review is conducted. The year reviewed should generally be the year preceding the year the
SRF review is conducted. Agency refers to the state, local or federal agency which has the lead
for compliance monitoring and enforcement within the state or other jurisdiction undergoing the
SRF review.

A list of acronyms is provided as an attachment to this Plain Language Guide.

CWA SRF Review Process

1.	Annual data verification

2.	Annual data metric analysis

3.	File Selection

4.	Local agency or state district office inclusion (if applicable)

5.	Discussion with HQ on review process (or discussion on a step-by-step basis, as chosen
by the Region)

6.	Entrance conference

7.	File Review

8.	Exit conference

9.	Draft Report Submitted for internal agency review

10.	State Comment Period

11.	Revised report sent to agency for review and internet posting

12.	Final report and recommendations published on the SRF web site

13.	Track implementation status of Area for Improvement Recommendations in the SRF
Manager database on a periodic basis

Using Metrics to Determine Findings

Goal metrics always have numeric goals and stand alone as a sufficient basis for a finding. For
example, the goal for CWA metric 1b5 is 95% completion of permit limit data entry
requirements. To analyze performance under this metric, reviewers compare the percentage of
permit limit data entered by the state to the 95% goal.
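To make the arithmetic concrete, the comparison can be sketched as follows. The facility counts are hypothetical, and `percent_complete` is an illustrative helper, not part of any EPA tool:

```python
# Sketch of a goal-metric comparison (all counts hypothetical).
GOAL = 95.0  # national goal for metric 1b5, percent

def percent_complete(entered: int, required: int) -> float:
    """Percentage of required permit limit records entered into ICIS-NPDES."""
    return 100.0 * entered / required

value = percent_complete(entered=188, required=200)  # hypothetical state counts
print(f"Metric 1b5 value: {value:.1f}% (goal: {GOAL:.0f}%)")
print("Meets goal" if value >= GOAL else "Falls short of goal")
```

In this hypothetical case the state entered 188 of 200 required records (94.0%), so it falls short of the 95% goal.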

Based on this analysis, the reviewer would make a finding. All findings fall under one of these
categories:

Meets or Exceeds Expectations: The SRF was established to assess the base level or floor of
enforcement program performance. This rating describes a situation where the base level is met,
and no performance deficiency is identified, or a state performs above base program expectations.

Area for State Attention: An activity, process, or policy that one or more SRF metrics show as a
minor problem. Where appropriate, the state should correct the issue without additional EPA
oversight. EPA may make suggestions to improve performance, but it will not monitor these
suggestions for completion between SRF reviews. These areas are not highlighted as significant
in an executive summary.

Area for State Improvement: EPA will develop a finding of Area for State Improvement whenever
one or more SRF metrics under a specific element show that an activity, process, or policy is a
significant problem that the agency is required to address. A finding for improvement should be
developed regardless of other metric values pertaining to that element. Recommended activities to
correct the issues should be included in the report. Recommendations must have well-defined
timelines and milestones for completion, and, if possible, should address root causes. EPA will
monitor recommendations for completion between SRF reviews in the SRF Manager database. The
status of recommendations will be publicly available on EPA's SRF web site.

The National Strategy for Improving Oversight of State Enforcement Performance is a key
reference in identifying recommendations for Areas for Improvement. Where a performance
problem cannot be readily addressed, or where there are significant or recurring performance
issues, there are steps EPA can and should take to actively promote improved state performance.
For additional information: https://www.epa.gov/sites/production/files/2014-06/documents/state-
oversight-strategy.pdf.

Using Other Metrics

When metrics other than Goal metrics indicate problems, EPA should conduct the additional
research necessary to determine the nature of the issue. These metrics provide additional
information that is useful during file selection, and for gauging program health when compared to
other metrics.

For example, CWA metric 8a3 is a Review Indicator metric that covers the percentage of major
facilities in significant noncompliance (SNC) and non-major facilities in Category I noncompliance.
It is only with knowledge of the CWA universe information, deviations from a known national
average, knowledge of the accuracy of SNC and Category I determinations, and/or other contextual
information that a reviewer is able to judge whether the percent of facilities in SNC or Category I
noncompliance presents a performance issue.

Use of State Guidance and Regional-State Agreements as Basis for Findings in SRF Reviews

The State Review Framework evaluates enforcement program performance against established
OECA national program guidance. State program guidance or regional-state agreements are
applicable to the SRF review process under the following circumstances.

1.	It is acceptable to use the state's own guidance to evaluate state program performance if:
1) the region can demonstrate that the state's standard(s) is (are) equivalent to or more
stringent than OECA guidance; and 2) the state agrees to being evaluated against that
standard(s). In these cases, regions should inform OECA/OC in advance of the review that
they intend to use state guidance and should include a statement in the SRF report
indicating that the state guidance was determined to be equivalent or more stringent than
the applicable OECA policy and was used as the basis for the review.

2.	For certain metrics, clearly specified in this Plain Language Guide, it will be necessary to
refer to state policies or guidance, or to EPA-state agreements. For example:

a. If the state has an Alternative CMS, EPA will use these state-specific
commitments as the basis to evaluate compliance monitoring coverage.

b. The national guidance may require only that a state establish a standard but not
actually provide the standard. In such cases, the reviewer will need to ensure that
the state has developed the required standard, and once it has been reviewed and
approved by the region, use that standard to evaluate state performance.

3.	Where national guidance has been modified or updated, it is important to review the
corresponding state program implementation guidance to assess whether it has become
out of date or inaccurate. In such cases, the reviewer should make appropriate
recommendations for revision of the state guidance, review the revised version, and
approve it, if appropriate.

4.	Where state program guidance or regional-state agreements establish practices or
standards that are not consistent with or at least equivalent to national program guidance,
this may be an allowable flexibility under section A4 of the Revised Policy Framework
for State/EPA Enforcement Agreements (Barnes, August 1986, as revised). If so, the
region should inform OECA/OC prior to the review and note this flexibility in the
explanation of the SRF report. If the differences between the state guidance or regional-
state agreements and the national guidance are significant, or if it is unclear whether
flexibility from OECA policy is appropriate, the region should elevate the issue to OECA
for resolution (per Interim Guidance on Enhancing Regional-State Planning and
Communication on Compliance Assurance Work in Authorized States (Bodine, 2018))
prior to developing findings or a draft report.

Element and Metric Definitions
Element 1 — Data

EPA uses Element 1 to evaluate data accuracy and completeness. Review of this element is
conducted in the following two ways:

•	File review: EPA evaluates accuracy and completeness primarily through metric 2b, a file
review metric that compares data in the ECHO Detailed Facility Report or ICIS-NPDES to
information in facility files.

•	Evaluating data metrics: As the reviewer has discussions with the state/local agency and
conducts data metric analysis and the file reviews, he or she may find the value for a data
metric to be inaccurate or incomplete to a significant degree. In this case, the finding in the
report should be an Area for Improvement and should cite both the reported and, when
possible, the actual values for the relevant metric.

To provide an example, data metric 5a shows that State X inspected 5 of its 20 major facilities.
EPA believes that the state actually inspected all 20 but failed to enter the inspections into ICIS.
EPA will need to confirm this during the entrance conference and file reviews. If the state
inspected all 20 but failed to enter the inspections into ICIS, that would be an Area for State
Improvement under Element 1 (Data). If the metric is accurate and the state only inspected 5 of 20,
that would be an Area for State Improvement under Element 2 (Inspections).

Refer to NPDES Electronic Reporting E-rule (NPDES E-rule) for minimum data requirements.

Key metrics: 2b, 1b5, and 1b6. Also consider data entry and/or accuracy issues pertaining to
metrics 5a1, 5b1, 5b2, 7j1, 7k1, 8a3, and 10a1, if applicable. For example, if a reviewer finds that
a state has adequate inspection coverage of majors under metric 5a1, but some or all of those
inspections are not in EPA data systems, this should be noted as an Area for State Attention or Area for
State Improvement under Element 1. Conversely, if a state is not meeting minimum expectations
for inspection coverage and state performance is well below the national goal, this should be noted
in the report as an Area for State Improvement under Element 2 on inspections, not Element 1 on
data. The same guidance applies for data entry issues pertaining to metrics 7j1, 7k1, 8a3,
and 10a1.

Metric 2b — Files reviewed where data are accurately reflected in the national data system

Metric type: File, Goal

Goal: 100% of data are complete and accurate

What it measures: Percentage of files reviewed where mandatory data are accurately reflected in
the national data system. The numerator = number of files that accurately reflect mandatory data,
denominator = number of files reviewed.
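As a sketch, the Metric 2b ratio works out like this; the per-file results below are hypothetical:

```python
# Sketch of the Metric 2b calculation (file review results hypothetical).
# Each entry: True if mandatory data in the file were accurately reflected
# in ICIS-NPDES / the Detailed Facility Report, False otherwise.
file_results = [True, True, False, True, True, True, False, True]

accurate = sum(file_results)   # numerator: files that accurately reflect mandatory data
reviewed = len(file_results)   # denominator: files reviewed
metric_2b = 100.0 * accurate / reviewed
print(f"Metric 2b: {accurate}/{reviewed} files accurate = {metric_2b:.1f}%")
# The goal is 100%, so any incomplete or inaccurate file means the goal is not met.
print("Goal met" if metric_2b == 100.0 else "Goal not met")
```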

Guidance: Reviewers should compare data in the ECHO Detailed Facility Report (DFR) or ICIS-
NPDES with information in the facility files to check that they accurately reflect activities such as
inspection dates, inspection types, single event violations, significant noncompliance (SNC) status,
and enforcement responses. The detailed facility report lists the facility site name, rather than the
permittee name. If a reviewer questions the accuracy of the permittee name in ICIS-NPDES (the
database of record for SRF reviews of NPDES data), the permittee name should be reviewed in
the organizational formal name field in ICIS-NPDES. See the CWA File Review Facility
Checklist, Part II for complete instructions. The following are examples of data to examine for
accuracy and completeness under Metric 2b:

1.	Inspections: Compare the inspection date listed in the inspection report with information in
the DFR under "Compliance Monitoring History."

2.	Violations: Compare the information in the file to the facility's significant noncompliance
status, DMR violations, single event violations, permit schedule violations, and compliance
schedule violations in the "Compliance Summary Data" and "Three Year Compliance
Summary Data" sections of the DFR.

3.	Informal Enforcement Action: Check to ensure that all informal enforcement actions
found in the file for the review year are in the DFR and compare date(s) in the file with
information in the "Notice of Violation or Informal Enforcement" section of the DFR.

4.	Formal Enforcement Action: Check to ensure that all formal enforcement actions found in
the file for the review year are in the DFR and compare date(s) in the file with information
under the "Formal Enforcement Actions (05 Year History)" section of the DFR.

5.	Penalties: Compare any penalty amounts in the file with information in the DFR under
"Formal Enforcement Actions."

If information in the files is missing from, or inaccurately entered into, the national database ICIS-
NPDES, the data for that file are not complete or accurate.

Reviewers should also consider their knowledge of the agency's program when conducting this
analysis. For example, if the reviewer notices multiple compliance evaluation inspections
identified in the DFR for a facility within one week's time, it is unlikely that the agency has
conducted multiple CEIs in this timeframe. It is more likely that the later ones are follow-up
inspections. In addition, reviewers have the flexibility to differentiate between non-recurring,
clerical errors with little consequence to overall program implementation and management versus
those more significant errors or omissions, particularly those inaccuracies that recur across
multiple reviewed files. For example, a typo in a zip code in one or two files is a much less
significant issue than unreported single event violations at most facilities reviewed.

Per 40 CFR 127.16, general permit reports [Notices of Intent to discharge (NOIs); Notices of
Termination (NOTs); No Exposure Certifications (NOEs); Low Erosivity Waivers (LEWs) and other
Waivers] are not required to be submitted until December 21, 2020. In addition, per the information
in Section F on non-major facility inspection single event violation data, an authorized NPDES
program is only required to share with EPA SEV data from a construction storm water inspection
when the authorized NPDES program also issues a formal enforcement action against the inspected
construction site.

Applicable EPA policy/guidance: Permit Compliance System (PCS) Policy Statement, August 31,
1985, as amended in 2000; ICIS Addendum to the Appendix of the 1985 Permit Compliance
System Statement from Michael M. Stahl, Director, Office of Compliance, and James A. Hanlon,
Director, Office of Wastewater Management, December 28, 2007, and the ICIS Addendum Data
Elements Attachment; PCS Quality Assurance Guidance Manual, August 28, 1992; Final Single
Event Violation Data Entry Guide for the Permit Compliance System (ICIS-NPDES),
May 22, 2006; National Pollutant Discharge Elimination System (NPDES) Electronic Reporting
Rule, October 22, 2015.

Metrics 1b5 and 1b6 — Completeness of data entry on major and non-major permit limits
and discharge monitoring reports (DMRs)

Metric type: Data, Goal

Goal: ≥95%

What it measures: Completeness of information entered into the ICIS-NPDES database on
permit limits and discharge monitoring reports.

•	1b5: Permit limit data entry rates for major and non-major facilities

•	1b6: DMR data entry rate for major and non-major facilities

Guidance: The NPDES Electronic Reporting Rule states that for the purposes of
requirements regarding timeliness, accuracy, completeness, and national consistency, data
are complete when 95% or more of the submissions required for each NPDES data group
are available in EPA's national NPDES data system.

Applicable EPA policy/guidance: The Code of Federal Regulations, including 40 CFR
123.26(e)(1) and 40 CFR 123.26(e)(4); The Enforcement Management System, National Pollutant
Discharge Elimination System (Clean Water Act), 1989; ICIS Addendum to the Appendix of the
1985 Permit Compliance System Statement from Michael M. Stahl, Director, Office of Compliance,
and James A. Hanlon, Director, Office of Wastewater Management, December 28, 2007, and the
ICIS Addendum Data Elements Attachment; PCS Quality Assurance Guidance Manual, August 28,
1992; National Pollutant Discharge Elimination System (NPDES) Electronic Reporting Rule,
October 22, 2015.

Element 2 — Inspections

Element 2 evaluates the following at major and non-major facilities:

•	Inspection coverage compared to CMS commitments

•	Inspection report completeness and quality

•	Inspection report timeliness

For the Clean Water Act, the National Pollutant Discharge Elimination System Compliance
Monitoring Strategy (NPDES CMS, July 21, 2014) provides inspection frequency goals for the
core NPDES program and for wet weather sources, along with flexibilities that EPA and states
may use in negotiating inspection commitments. Under the NPDES
CMS, major facilities are generally to be inspected biennially. The CMS provides for triennial
inspections if the site/facility is consistently in compliance and not contributing to impairments. For
most sources other than majors, the CMS provides flexibility in how the goals are achieved (i.e.,
inspection type and selection of facilities), and generally calls for inspections every five years, with
some source types even less frequently.

The NPDES CMS provides flexibility to regions and state agencies to address unique mixes of
regulated entities and environmental conditions and to identify and document state-specific
NPDES inspection frequency goals that differ from the frequencies recommended in the CMS.
SRF reviews consider all of the flexibility and trade-offs built into the NPDES CMS plans for
each state to provide a clear and accurate picture of the broad set of inspections completed by
states.

Inspection coverage at major facilities is tracked under data metric 5a1. Non-major inspection
coverage at individually permitted facilities is analyzed under data metric 5b1, and non-major
general permit inspection coverage is reviewed under data metric 5b2. Metrics 5a1, 5b1, and 5b2
are evaluated against state commitments in their CMS plans. State progress in meeting inspection
commitments in CMS plans is also available under file metrics 4a1-4a10; these metrics primarily
track non-major pretreatment, significant industrial user, and wet weather facilities.

During the exit interview following the file review, regions should evaluate progress toward the
annual CMS commitments, along with other findings, and discuss the state's strategy for
meeting multi-year commitments. This should, in turn, inform annual planning discussions with
states to ensure CMS goals for all sources, including pretreatment and wet weather, are
appropriately considered in a manner that will lead states on a path to meet multi-year goals.
Regional offices have the flexibility to share copies of detailed facility reports with states before,
during or after file reviews.

Key metrics: 4a1, 4a2, 4a4, 4a5, 4a7, 4a8, 4a9, 4a10, 4a11, 5a, 5b1, 5b2, 6a, and 6b.

Applying the Appendix C Inspection Coverage Table Facility Data to the SRF Review

Data for the Inspection Coverage Table that appears in the Conducting a SRF Review guidance
document and in Appendix C at the end of this guide is part of the data metric analysis (DMA)
process (see the guidance on Conducting a SRF Review for additional details). Regions should
review information available from ICIS-NPDES and contact their state to obtain complete
information for the CMS Commitments Table. This information should be used to develop the
explanation narrative and finding level selected under SRF Element 2 on inspections, and,
where relevant, finding levels selected for Element 3 on violations, Element 4 on enforcement
actions, and Element 5 on penalties.

SRF reviewers will rely on the inspection coverage data at several stages during the review
process, including file selection, review of Element 2, and review of Elements 3-5. Review of the
non-major facility metrics, metrics 4a1-4a11, may also be relevant to the exit interview.

File Selection

EPA evaluates inspection and enforcement files where activity occurred during the review year as
part of the State Review Framework evaluation process. In preparing for the file review, regions
use the ECHO File Selection Tool, available on the ECHO web site, to randomly select a small set
of files representative of a broad spectrum of the state's compliance monitoring and enforcement
work during the review year. The SRF file review guidance describes the necessary steps,
including selecting an appropriate number of files with compliance monitoring and enforcement
activity and ensuring geographic distribution across the state.

Ensuring that the file selection list is representative of commitments made in the state's NPDES
CMS plan is a key consideration for SRF CWA file reviews. Regions should review some files
in the inspection commitment categories negotiated in the state specific CMS Plan. If the initial
file selection list provided by the ECHO File Selection Tool is not representative of priorities
indicated in the state's CMS plan for wet weather and pretreatment universe facilities, add or
substitute supplemental files to ensure adequate coverage of pretreatment, CSO, SSO, stormwater,
and CAFO facilities, using the established file selection protocol to randomly select files for
on-site review. The Inspection Coverage Data Table completed by reviewers for each state can be
used to facilitate this process.

Metric 4a — Percentage of planned inspections completed
Metric type: Compliance Monitoring Strategy Metrics

Goal: 100% of state specific CMS Plan commitments

What it measures:

•	4a1: Number of pretreatment compliance inspections and audits at approved local
pretreatment programs (Target: EPA's CMS goal is two pretreatment compliance
inspections that include >2 oversight inspections of industrial users (IUs) and one audit
at each approved local pretreatment program within five years. Reviewers should
compare the number of state inspections to the commitment in the state specific CMS
Plan for the review year, or against the goal in the NPDES CMS policy if there is no
state specific CMS plan for pretreatment facilities.)

•	4a2: EPA or state Significant Industrial User inspections for SIUs discharging to non-
Authorized POTWs (Target: EPA's CMS goal is one sampling inspection at each SIU
annually. Reviewers should compare the number of state inspections to the commitment
in the state specific CMS Plan for the review year, or against the goal in the NPDES
CMS policy if there is no state specific CMS plan for SIU facilities.)

•	4a4: Number of CSO inspections (Target: EPA's CMS goal is one inspection of each
major and non-major CSO every five years for states with combined sewer systems.
Reviewers should compare the number of state inspections to the commitment in the state
specific CMS Plan for the review year, or against the goal in the NPDES CMS policy if
there is no state specific CMS plan for CSO facilities.)

•	4a5: Number of SSO inspections. (Target: EPA's CMS goal is to inspect 5% of the
universe of permitted POTWs with SSS annually. Reviewers should compare the number
of state inspections to the commitment in the state specific CMS Plan for the review year,
or against the goal in the NPDES CMS policy if there is no state specific CMS plan for
SSO facilities.)

•	4a7: Number of Phase I and II MS4 audits or inspections (Target: EPA's CMS goal is
one audit, on-site inspection, or off-site desk audit* of each Phase I and II MS4 every five
years and one inspection or on-site audit of each Phase I and II MS4 every seven years.)
Reviewers should compare the number of state inspections to the commitment in the
state specific CMS Plan for the review year, or against the goal in the NPDES CMS
policy if there is no state specific CMS plan for Phase I and II MS4 facilities.

* Off-site desk audits include but are not limited to review of facility reports and records, review
of agency-gathered testing, sampling and ambient monitoring data, evaluation of responses to
CWA section 308 information requests, review of compliance deliverables submitted pursuant to
permits or enforcement actions, and analysis of aerial or satellite images. An off-site desk audit
conducted pursuant to a CMS plan will include the appropriate combination of these activities to
allow the region or the state to make a facility-level or program level compliance determination.
In order for an off-site desk audit or focused inspection to count toward CMS implementation for
the results in this table, the region or state must report the activity into ICIS-NPDES (either
through direct data entry or via the CDX National Environmental Information Exchange
Network). See Part 3 of the CWA NPDES CMS for additional details on focused inspections and
off-site desk audits.

•	4a8: Number of industrial stormwater inspections (Target: EPA's CMS goal is 10% of
the state universe each year, including inspections of unpermitted facilities and those
with and without "no exposure certification.") Reviewers should compare the number of
state inspections to the commitment in the state specific CMS Plan, or against the goal in
the NPDES CMS policy if there is no state specific CMS plan for industrial stormwater
facilities.

•	4a9: Number of Phase I and Phase II construction stormwater inspections (Target: EPA's
CMS goal is 10% of the state Phase I and II universe each year, including inspections of
unpermitted sites.) Reviewers should compare the number of state inspections to the
commitment in the state specific CMS Plan, or against the goal in the NPDES CMS
policy if there is no state specific CMS plan for Phase I and II construction stormwater
facilities.

•	4a10: Number of comprehensive inspections of large and medium NPDES-permitted
concentrated animal feeding operations (CAFOs). (Target: EPA's CMS goal is one
comprehensive inspection of each large and medium NPDES-permitted CAFO every
five years.) Reviewers should compare the number of state inspections to the
commitment in the state specific CMS Plan, or against the goal in the NPDES CMS
policy if there is no state specific CMS plan for large and medium CAFO facilities.

•	4a11: Number of sludge/biosolids inspections at each major POTW. (Target: EPA's
CMS goal is one inspection every 5 years for each major POTW in a state with biosolids
program authorization. Biosolids use and disposal operations, including incineration and
surface application, should receive at least one sludge/biosolids inspection every 5
years.)*

* States may substitute an off-site desk audit for sludge/biosolids generation, use, and disposal
sites that meet the following criteria: (1) are not currently subject to enforcement actions or
compliance schedules that are the result of concluded enforcement actions; (2) have not been
reported in Significant Noncompliance (SNC) within the previous four quarters; (3) have no
unresolved single event violation(s) identified in prior inspection(s); (4) do not discharge to
CWA section 303(d) listed waters for pollutant(s) contributing to the listing; and (5) have no
known potential to impact drinking water supplies. A CMS plan that utilizes this approach for
conducting off-site desk audits in lieu of sludge/biosolids inspections is still considered a
traditional CMS plan. In states where EPA is the permitting authority for biosolids, compliance
monitoring activities for biosolids facilities will be conducted in accordance with plans and
protocols established by the EPA Biosolids Center for Excellence.

Guidance: Metrics 4a1-4a11 track progress in meeting inspection commitments per the negotiated
state-specific Compliance Monitoring Strategy Plan (CMS Plan) in the review year based on the
NPDES Compliance Monitoring Strategy (NPDES CMS, July 21, 2014). The numerator =
number of inspections completed; denominator = number of inspections planned based on
information in the state CMS Plan.
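The per-metric calculation can be sketched as follows; the categories shown and all commitment numbers are hypothetical:

```python
# Sketch of the CMS commitments calculation for metrics 4a1-4a11
# (all categories and numbers hypothetical). completed = inspections done in
# the review year; planned = commitment in the state's CMS Plan.
commitments = {
    "4a1 pretreatment": (4, 5),
    "4a4 CSO": (10, 10),
    "4a8 industrial stormwater": (45, 60),
}

for metric, (completed, planned) in commitments.items():
    pct = 100.0 * completed / planned
    status = "meets" if pct >= 100.0 else "falls short of"
    print(f"{metric}: {completed}/{planned} = {pct:.0f}% ({status} commitment)")
```

A table built this way, one row per CMS inspection category, is what the Appendix C Inspection Coverage Data Table captures.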

The information in the completed NPDES CMS metrics table will form the basis for determining
whether the state meets, exceeds, or falls short of meeting commitments. Use the Inspection
Coverage Data Table in Appendix C to calculate these metrics. EPA will evaluate the percentage
of inspection commitments met based on the commitments in the state's CMS plan for the
review year. For each metric with an annual compliance monitoring goal, EPA review teams will
compare the number of inspections or audits committed to in the state's CMS plan against
information that appears in EPA data systems regarding inspections or audits conducted. Where
inspections covered by the CMS do not have data entered in ICIS-NPDES, reviewers should
gather and assess information from the state agency to review performance against the applicable
CMS commitments. (If the state fails to enter system required inspection data in ICIS-NPDES,
the reviewer should note this as a problem under Element 1 with a finding of Area for State
Attention or Improvement.) For commitments that span more than one year, regions should
consider whether the state met the commitment set forth in its CMS plan and how well this
prepares the state to meet the cumulative, or multi-year, commitment. If a state does not have a
state-specific CMS plan for a given CMS inspection area, regions will evaluate the state against
the national inspection coverage goals set forth in the 2014 NPDES Compliance Monitoring
Strategy under metrics 4a1-4a11.

As part of the file review process, the SRF review will then evaluate the violations identified
through those inspections, the enforcement actions taken, and the associated penalties in areas
where states commit to conduct pretreatment, SIU, and wet weather inspections, to ensure that
states take action to address violations found at non-major facilities covered under the NPDES
CMS policy. EPA selected these 9 metrics in order to look beyond major facilities and assess
performance in inspection frequency for pretreatment, SIU, and wet weather sources.

Metric 5a1 — Inspection coverage of NPDES majors
Metric type: Goal Metric

Goal: 100% of state specific CMS Plan commitment

What it measures: Percentage of major NPDES facilities inspected. The numerator = the
number of major NPDES facilities inspected; the denominator = the number of major NPDES
facilities scheduled for inspection in the state specific CMS Plan for the review year. Reviewers
are to compare the number of state inspections of major NPDES facilities listed in the data
metric analysis to the commitment in the state specific CMS Plan for the review year; the
denominator that automatically populates in the data metric analysis for Metric 5a1 is not likely
to reflect the state's annual inspection commitment, which varies from year to year. The
denominator for this metric is the state's inspection commitment listed in the state specific CMS
plan for the review year. It is also helpful to examine state end-of-year reports on inspection
results to assess inspection coverage and to determine whether all inspections are reported in the
ICIS database.

Guidance: EPA's CMS goal for inspections of major NPDES permittees is at least one
comprehensive inspection every two years. Where OECA's Inspection Targeting Model is
used to assist in screening and identifying inspection targets, the inspection frequency can be
adjusted to one comprehensive inspection every three years for major NPDES facilities that are in
compliance and not contributing to CWA §303(d) listings or §305(b) reporting, unless there is an
alternative CMS commitment. A state may have approval for an alternative CMS plan with
different frequencies than those listed above for that year. Reviewers should examine inspection
coverage holistically in the Inspection Coverage Data Table to determine findings on inspection
coverage in SRF reports.

Applicable EPA policy/guidance: Memo, Clean Water Act National Pollutant Discharge
Elimination System Compliance Monitoring Strategy, July 21, 2014; OECA National Program
Manager Guidance.

Metric 5b1 — Inspection coverage of NPDES non-majors with individual permits
Metric type: Goal Metric

Goal: 100% of the state specific CMS Plan commitment

What it measures: The percentage of NPDES individual non-major permittees inspected in the
review year. The numerator = the number of non-major individual permittees inspected; the
denominator = the number of non-major individual permittees scheduled for inspection in the
state specific CMS Plan for the review year. Reviewers are to compare the number of state
inspections of non-major individually permitted NPDES facilities against the commitment in
the state specific CMS Plan for the review year; the denominator that automatically populates
in the data metric analysis for Metric 5b1 is not likely to reflect the state's annual inspection
commitment, which varies from year to year. The denominator for this metric is the state's
inspection commitment listed in the state specific CMS plan for the review year. It is also helpful
to examine state end of year reports on inspection results to assess inspection coverage and to
determine whether all inspections are reported in the ICIS database.

Guidance: EPA's CMS goal for inspections of non-major facilities with individual NPDES
permits (traditional minor permittees) is an inspection at least once in each five-year permit
term.

Applicable EPA policy/guidance: Memo, Clean Water Act National Pollutant Discharge
Elimination System Compliance Monitoring Strategy, July 21, 2014; OECA National Program
Manager Guidance; Clean Water Act Action Plan (Prior to February 22, 2010 known as the Clean
Water Act Enforcement Action Plan), October 15, 2009.

Metric 5b2 — Inspection coverage of NPDES non-majors with general permits
Metric type: Goal Metric

Goal: 100% of the state specific CMS Plan commitment

What it measures: Percentage of non-major NPDES facilities with general permits inspected. The
numerator = the number of non-major facilities with general permits inspected; the denominator =
the number of non-major general permit facilities scheduled for inspection in the state specific
CMS Plan for the review year. Reviewers are to compare the number of state inspections of
non-major general permit NPDES facilities against the commitment in the state specific CMS
Plan for the review year; the denominator that automatically populates in the data metric analysis
for Metric 5b2 is not likely to reflect the state's annual inspection commitment, which varies
from year to year. The
denominator for this metric is the state's inspection commitment listed in the state specific CMS
plan for the review year. It is also helpful to examine state end of year reports on inspection
results to assess inspection coverage and to determine whether all inspections are reported in the
ICIS database.

Guidance: This metric is evaluated in the same manner as metric 5b1. The difference between
the two is that the universe for 5b2 consists of permittees covered by a general permit.

Applicable EPA policy/guidance: Memorandum, Clean Water Act National Pollutant Discharge
Elimination System Compliance Monitoring Strategy July 21, 2014; OECA National Program
Manager Guidance; Clean Water Act Action Plan (Prior to February 22, 2010 known as the Clean
Water Act Enforcement Action Plan), October 15, 2009.


Metric 6a — Inspection reports complete and sufficient to determine compliance at the facility
Metric type: File, Goal
Goal: 100%

What it measures: Percentage of inspection reports reviewed that provide sufficient documentation
to determine compliance. This metric describes the quality of inspection reports. Numerator =
number of inspection reports with sufficient documentation to determine compliance; denominator =
total number of inspection reports reviewed.

Guidance: Inspection reports should be reviewed to see if they provide the information requested
in the NPDES Compliance Inspection Manual, Chapter 2. Basic information that should be
collected in inspection reports is discussed in the NPDES Compliance Inspection Manual
including:

•	linking permit and/or regulatory requirements to observations made by the inspector
regarding noncompliance,

•	narrative describing the facility and its procedures,

•	documentation such as reports, records, photographs, maps, conditions observed,
statements by facility personnel, checklists.

See the CWA File Review Facility Checklist for additional details on inspection report quality and
completeness. For each inspection report found in reviewed files, reviewers should complete the
CWA Inspection Report Checklist in the "CWA Facility Checklist" on p. 3.

All essential report components should be present and properly documented. If certain components
are routinely missing, these should be mentioned in the SRF report. Reviewers have the flexibility
to consider a wide range of information sources beyond the inspection report, including state web
sites and permits.

Agencies will have their own methods for completing inspection reports. EPA should discuss this
with the agency at the beginning of the review to determine which parts of the agency's inspection
report (particularly for Compliance Evaluation Inspections (CEIs)) are consistent with EPA
expectations. EPA reviews the quality of the written inspection reports only under this metric; this
metric is not an evaluation of the quality of field inspections.

Applicable EPA Policy/Guidance: NPDES Compliance Inspection Manual EPA Report # 305-K-
17-001, Interim Revised Version, January 2017.

Metric 6b — Timeliness of inspection report completion

Metric type: File, Goal
Goal: 100%

What it measures: Percentage of inspection reports reviewed that are timely. The numerator =
number of inspection reports completed within the recommended timeframe; denominator = total
number of inspection reports reviewed.

Guidance: Reviewers should evaluate the timeliness of state inspection reports against the timeliness
goals in state inspection procedures. In the absence of state guidelines, reviewers should evaluate
timeliness against EPA guidelines. The Enforcement Management System for the National Pollutant
Discharge Elimination System, Chapter 5, Section A, provides guidance on timeliness of
inspection reports. Specifically, timely inspection reports are completed within 45 days of the date
of inspection for sampling inspections and within 30 days for non-sampling types of
inspections. The federal 30- and 45-day inspection report completion standard applies where state
standards are less stringent than the timeframes listed in the NPDES EMS.

EPA reviews the timeliness of the written inspection reports only under this metric; this metric is
not an evaluation of the quality of field inspection reports (see metric 6a). The number of
inspection reports reviewed is dependent upon the size of the regulated universe of facilities in the
state; see the File Selection Protocol for details on selecting the appropriate number of files to
evaluate under this metric.

Reviewers should record the length of time it took to complete each report in the File Review
Checklist so they can compute average timeframes.
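The 30/45-day timeliness test and the average-timeframe computation described above can be sketched as follows. This is a minimal illustration; the dates and inspection types shown are hypothetical file-review records, not real data.

```python
from datetime import date

# EPA EMS recommended completion timeframes, in days (Chapter 5, Section A).
DEADLINES = {"sampling": 45, "non-sampling": 30}

def report_is_timely(inspection_type: str, inspected: date, completed: date) -> bool:
    """True if the report was completed within the recommended timeframe."""
    return (completed - inspected).days <= DEADLINES[inspection_type]

# Hypothetical records: (type, inspection date, report completion date).
records = [
    ("sampling", date(2023, 3, 1), date(2023, 4, 10)),      # 40 days -> timely
    ("non-sampling", date(2023, 5, 2), date(2023, 6, 20)),  # 49 days -> late
    ("non-sampling", date(2023, 7, 10), date(2023, 8, 4)),  # 25 days -> timely
]
timely = sum(report_is_timely(t, i, c) for t, i, c in records)
avg_days = sum((c - i).days for _, i, c in records) / len(records)
print(f"Metric 6b: {100 * timely / len(records):.0f}% timely; average {avg_days:.0f} days")
```

Where a state has its own, more specific timeliness standard, the deadlines table would be replaced with the state's timeframes.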

If an agency does not have a timeliness standard, EPA should use the SRF as an opportunity to
encourage the agency to adopt one, particularly if it is not consistently completing reports within
30 to 45 days, and especially if delays affect other aspects of the program, such as violation
determination or enforcement.

Applicable EPA policy/guidance: The Enforcement Management System, National Pollutant
Discharge Elimination System (Clean Water Act); Clean Water Act National Pollutant Discharge
Elimination System Compliance Monitoring Strategy, July 21, 2014; Clean Water Act Action
Plan (Prior to February 22, 2010 known as the Clean Water Act Enforcement Action Plan),
October 15, 2009; NPDES Compliance Inspection Manual, EPA Report # 305-K-17-001, Interim
Revised Version, January 2017; Clean Water Act Inspector Training.

Element 3 — Violations

Under this element, EPA evaluates the accuracy of the agency's violation and compliance
determinations, and the accuracy and timeliness of its significant non-compliance determinations.

Reviewers will evaluate data metrics 7j1, 7k1, and 8a3 during the data metric analysis. If the
reviewer finds that violation or SNC rates are lower than the national average, he or she may want
to include additional inspections or violations in the file selection process in order to determine
the accuracy of violation and SNC determinations.

File metric 7e covers the accuracy of compliance determinations made from inspections. These
metrics will generally form the basis for findings under this element.

Key metrics: 7e, 7j1, 7k1, 8a3

Reports should factor in findings from the Inspection Coverage Data table listed in Appendix C
that affect violation identification in enforcement programs. Reviewers should request from the
state or local agency information on violations identified as a result of inspections of non-major
facilities when this information is not available through ICIS-NPDES. States are required to
provide to EPA any information requested on NPDES program implementation per 40 CFR
123.45. If the state or local agency does not provide this information, reviewers should note the
missing information as an issue that could not be fully evaluated in the final report, and that needs
to be addressed.

File Reviews

The SRF considers inspections, violations, enforcement actions; the timeliness and appropriateness
of enforcement action; and documentation of penalty calculation, assessment and collection (see
SRF Elements 3-5). As part of file reviews for Elements 3-5, regions should review files for wet
weather and pretreatment facilities that the state inspected in accordance with its NPDES CMS
plan to ensure that inspections and enforcement activities at these facilities are well implemented.
For non-major permittees, Category 1 violations should be considered as requiring enforcement
follow-up. Specific metrics and calculation methodology for measures for major facilities used
under Elements 3-5 are described in detail in this Clean Water Act Plain Language Guide and the
accompanying file review spreadsheets on the ECHO web site. As part of the review of
regional file selection lists, EPA will review the representativeness of files selected to ensure
NPDES CMS commitments are adequately factored into the review process.

Metric 7e — Accuracy of compliance determinations

Metric type: File, Goal

Goal: 100%

What it measures: Percentage of inspection reports reviewed with sufficient documentation
leading to an accurate compliance determination. The numerator = number of files containing
inspection reports reviewed with sufficient documentation leading to an accurate compliance
determination; denominator = total number of inspection reports reviewed.

Guidance: This metric assesses whether violations — either significant noncompliance or single
event violations — were accurately identified based on the documentation contained in facility
files. For example, violations identified in the enforcement action should be documented in facility
files as observations noted while on-site at the facility. This information may be in the inspection
report narrative or in the single event violation (SEV) section of the state's inspection report form.
Note that if the compliance determination is not made in the inspection report, then it should be
documented elsewhere in the file, including: SEV data in ICIS or a state data system, informal or
formal actions taken in response to deficiencies found during the inspection that clearly reference
the inspection, tracking systems that document violations discovered and actions taken in response,
and unsatisfactory ratings on inspection checklists. Reviewers should examine inspection
conclusion data sheet (ICDS) information in ICIS to determine whether compliance determinations
on deficiencies found are noted in ICIS, and should discuss with the state how it tracks violations.

Agencies will have their own methods for completing inspection reports. EPA should discuss this
with the agency at the beginning of the review to determine if the agency's inspection reports,
particularly for Compliance Evaluation Inspections (CEIs), are consistent with EPA expectations.

Applicable EPA policy/guidance: The Enforcement Management System, National Pollutant
Discharge Elimination System (Clean Water Act), 1989; Memorandum, Clarification of NPDES
EMS Guidance on Timely and Appropriate Response to Significant Noncompliance Violations,
from Mark Pollins, Director, Water Enforcement Division, and Betsy Smidinger, Acting Director,
Enforcement Targeting and Data Division, May 29, 2008; Data Entry Guide for the Permit
Compliance System (ICIS-NPDES); NPDES Compliance Inspection Manual, EPA Report # 305-K-
17-001, Interim Revised Version, January 2017.

Metric 7j1 — Number of major and non-major NPDES facilities with single-event violations
reported in the review year

Metric type: Review Indicator

What it measures: Assesses whether single-event violations (SEVs) determined by means other
than automated discharge-to-limits comparisons are reported and tracked in ICIS-NPDES.

• 7j1: Number of major and non-major NPDES facilities with single-event violations in the
review year

Guidance: Reviewers should carefully compare SEVs found during the on-site file review in
inspection reports, enforcement actions, SSO notifications, and other correspondence to the
drilldown data for metric 7j1. This metric is limited to SEVs that start within the federal fiscal
year reviewed under the SRF; SEVs that begin in prior years and continue into the review year are
not reported under this metric. Facilities with unreported SEVs not listed in the drilldown data for
this metric should be noted, along with any other unreported data accuracy issues, under Element 1
to group all data-related recommendations under the same element. SEVs are minimum data
requirements for both major and non-major facilities as of December 21, 2016 under the NPDES
Electronic Reporting Rule (eRule), excluding SEVs without formal enforcement at storm water
construction sites.

Applicable EPA policy/guidance: The Enforcement Management System, National Pollutant
Discharge Elimination System (Clean Water Act), 1989; Memorandum, Clarification of NPDES
EMS Guidance on Timely and Appropriate Response to Significant Noncompliance Violations,
from Mark Pollins, Director, Water Enforcement Division, and Betsy Smidinger, Acting Director,
Enforcement Targeting and Data Division, May 29, 2008; Data Entry Guide for the Permit
Compliance System (ICIS-NPDES); NPDES Compliance Inspection Manual, EPA Report # 305-X-
4-001, June 2004; National Pollutant Discharge Elimination System (NPDES) Electronic
Reporting Rule, October 22, 2015.

Metric 7k1 — Major and non-major facilities in noncompliance
Metric type: Review Indicator

What it measures: The percentage of major and non-major facilities with violations reported to the
national database. Violations factored into metric 7k1 include effluent, single event, compliance
schedule, and permit schedule violations for non-compliance codes D, E, N, S, T, X, and V.

Guidance: Review the percent of major and non-major facilities in noncompliance and compare
this percentage to the national average and prior year trends for the state. If noncompliance is
significantly higher, or is high and remains high, the reviewer should consider selecting additional
files with violations and enforcement actions to ensure that timely and appropriate enforcement
occurs in response to violations when evaluating file review metric 10b. If levels are well below
the national average, reviewers may also want to look into what is behind the lower numbers:
either higher levels of compliance or a failure to identify or report violations.


The number of non-major facilities in Category 1 noncompliance (i.e., more serious violations,
as defined in 40 CFR 123.45(a)(2)(G)(i)-(vi)) and the number of non-major facilities in Category 2
noncompliance (i.e., less serious violations, as defined by 40 CFR 123.45(a)(2)(G)(vii))
work in conjunction with the NNCR process, which is designed to obtain accurate counts of
facilities in noncompliance.

Reviewers may also wish to consult the national average as additional context in interpreting
noncompliance at facilities in a given state. If state noncompliance at majors or non-majors is
significantly above the national average, timely and appropriate action may not be promoting return
to compliance. Conversely, if the state noncompliance rate is low, compliance may be high or the
state may not be identifying or reporting violations accurately during inspections or in inspection
reports. Information about relative non-compliance at major and non-major facilities may help
inform the number of files selected with violations with and without enforcement.

Note: As previously addressed on p. 1 on metric types regarding review indicator metrics, reviewers
should not establish SRF report findings on the basis of review indicator metrics. Findings should
primarily be based on file review metrics for CWA timely and appropriate enforcement, using file
review metric 10b, as it is possible to factor in the specific date when the violation was discovered
and the date of the enforcement action for individual violations only during on-site file reviews.
The above metric provides an overall total number of SNC violations and formal actions taken in
the review year and quarter one of the following fiscal year, but does not calculate timely
enforcement based upon the start date for each violation.

Applicable EPA policy/guidance: The Enforcement Management System, National Pollutant
Discharge Elimination System (Clean Water Act), 1989; Memorandum, Clarification of NPDES
EMS Guidance on Timely and Appropriate Response to Significant Noncompliance Violations from
Mark Pollins, Director, Water Enforcement Division, and Betsy Smidinger, Acting Director,
Enforcement Targeting and Data Division, May 29, 2008; Data Entry Guide for the Permit
Compliance System (ICIS-NPDES); NPDES Compliance Inspection Manual, EPA Report # 305-X-
4-001, June 2004; Permit Compliance System (PCS) Policy Statement, August 31, 1985, as
amended in 2000; ICIS Addendum to the Appendix of the 1985 Permit Compliance System
Statement from Michael M. Stahl, Director, Office of Compliance and James A. Hanlon, Director,
Office of Wastewater Management, December 28, 2007 and the ICIS Addendum Data Elements
Attachment; National Pollutant Discharge Elimination System (NPDES) Electronic Reporting
Rule. October 22, 2015.

Metric 8a3 — Percentage of major facilities in SNC and non-major facilities in
Category I noncompliance during the reporting year

Metric type: Review Indicator

This metric is a key indicator of EPA's commitment to ensuring that agencies identify the most
significant violations in terms of environmental and human health impacts and target
enforcement actions toward the most important water pollution problems.

What it measures: Percentage of major and non-major NPDES facilities in significant non-
compliance or Category I noncompliance during the review year. The numerator = the number of
major facilities in SNC and the number of non-major facilities in Category I noncompliance during
review year; denominator = total number of major and non-major facilities.
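The metric 8a3 ratio described above can be sketched as follows. This is a minimal illustration only; the facility counts are hypothetical, not drawn from any state's data.

```python
def snc_rate(majors_in_snc: int, nonmajors_cat1: int,
             total_majors: int, total_nonmajors: int) -> float:
    """Metric 8a3: percentage of the combined major/non-major universe in SNC
    or Category I noncompliance during the review year."""
    universe = total_majors + total_nonmajors
    return 100.0 * (majors_in_snc + nonmajors_cat1) / universe

# Hypothetical review-year counts for illustration only.
state_rate = snc_rate(majors_in_snc=12, nonmajors_cat1=40,
                      total_majors=150, total_nonmajors=1150)
print(f"Metric 8a3: {state_rate:.1f}% of facilities in SNC/Category I")
```

As the guidance below explains, a rate well above or below the national average is a signal to pull additional files, not a finding in itself.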


Guidance: Review the percent of major facilities in significant noncompliance and non-major
facilities in Category I noncompliance and compare this percentage to the national average and
prior year trends for the state. If significant noncompliance is significantly higher or lower than
the national average, or is high and remains high, the reviewer should consider selecting
additional files with violations and enforcement actions to ensure that timely and appropriate
enforcement occurs in response to violations. If significant noncompliance at majors or non-major
facilities in Category 1 noncompliance is significantly above the national average, timely and
appropriate action may not be promoting return to compliance. If the percentage of major
facilities in SNC or non-major facilities in Category I noncompliance is significantly lower than
the national average, reviewers should carefully review files for inspected facilities without
violations, and those with non-SNC violations, to determine whether SNC or Category I violation
determinations are accurately identified in files reviewed. Reviewers will have the flexibility to
utilize drilldown data available on ECHO to view the proportion of major and non-major facilities
reported as in significant noncompliance.

Note: As previously addressed on p. 1 on metric types regarding review indicator metrics, reviewers
should not establish SRF report findings on the basis of review indicator metrics. Findings should
primarily be based on file review metrics for CWA timely and appropriate enforcement, using file
review metric 10b, as it is possible to factor in the specific date when the violation was discovered
and the date of the enforcement action for individual violations only during on-site file reviews.
The above metric provides an overall total number of SNC violations and formal actions taken in
the review year and quarter one of the following fiscal year, but does not calculate timely
enforcement based upon the start date for each violation.

The following guidance defines significant and other types of violations and minimum data
reporting requirements: Interim Significant Non-Compliance Policy for Clean Water Act
Violations Associated with CSOs, SSOs, CAFOs, and Storm Water Point Sources (Interim Wet
Weather SNC Policy), issued to EPA Regions only on October 23, 2007; Memo, ICIS Addendum
to the Appendix of the 1985 PCS Policy Statement, from Michael M. Stahl, Director, Office of
Compliance, and James A. Hanlon, Director, Office of Wastewater Management, December 7,
2007; PCS Quality Assurance Guidance Manual, August 28, 1992; The Enforcement Management
System, National Pollutant Discharge Elimination System (Clean Water Act), 1989;
Memorandum, Revision of NPDES Significant Noncompliance (SNC) Criteria to Address
Violations of Non-Monthly Average Limits, issued to Water Management Division Directors and
Regional Counsels from Steven A. Herman, 1995; National Pollutant Discharge Elimination
System (NPDES) Electronic Reporting Rule, October 22, 2015.

Element 4 — Enforcement

Reviewers will use Element 4 to determine the agency's effectiveness in taking timely and
appropriate enforcement (metrics 10a1 and 10b) and using enforcement to return facilities to
compliance (metric 9a). High noncompliance reported under metrics 7j1, 7k1, and 8a3 in Element
3 may indicate a lack of timely and appropriate enforcement. For example, if violation and SNC
rates are higher than the national average but the number of formal or informal enforcement
actions is low, reviewers may wish to select extra facilities with SNC and non-SNC violations to
determine why enforcement activity is low. If enforcement numbers are high, reviewers should
review facility files with enforcement to determine whether those actions were appropriate and
returned facilities to compliance. Adequate file selection is important to develop robust findings in
the report and can be based on SNC rate or violation rate trend data. Reviewers should also factor
in findings from the Inspection Coverage Data table that may affect timely and appropriate
enforcement.


Additional context: Reviewers should discuss whether compliance schedule milestones are in
place for any files selected for review to ensure the accuracy of responses for metrics 9a and 10b,
as compliance schedules may start before the SRF review year and, therefore, not be captured in
review year data for metric 10a1. Reviewers have the flexibility to examine information beyond
the DFR for documentation in the file that provides a rationale for use of informal action,
including but not limited to information from quarterly meetings and Pacesetter calls.

File Reviews

As part of file reviews, regions should review files for wet weather, significant industrial user, and
pretreatment facilities that the state inspected in accordance with its NPDES CMS plan to ensure
that enforcement activities at these facilities promote return to compliance under metric 9a and are
timely and appropriate under metric 10b. As part of the review of regional file selection lists, EPA
will review the representativeness of files selected to ensure NPDES CMS commitments at non-
major facilities, including pretreatment, SIU, and wet weather facilities, are adequately factored
into the review process.

Key metrics: 9a, 10a1, 10b

Metric 9a — Percentage of enforcement responses that returned, or will return, a source in
violation to compliance

Metric type: File, Goal
Goal: 100%

What it measures: Percentage of enforcement responses in reviewed files that returned, or will
return, a source in violation to compliance. Reviewers should evaluate all enforcement responses
found in selected files regardless of the type of violation. The violations addressed by reviewed
enforcement responses may be SNC or non-SNC violations. The numerator = number of
enforcement responses that returned or will return the source to compliance; denominator = total
number of enforcement responses in reviewed files.

Guidance: Actions that promote return to compliance generally include:

•	injunctive relief,

•	documentation of return to compliance, and

•	an enforceable requirement that compliance be achieved by a date certain for significant
noncompliance at major facilities.

Non-major facilities, and facilities with non-SNC violations, should also receive an enforcement
response (either informal or formal enforcement) that results in the violator returning to compliance,
particularly in areas where minor facilities have a major impact on water quality. Non-SNC
violations, and violations at non-major facilities should generally receive an enforcement response
in the range of options noted in the Enforcement Response Guide of the NPDES Enforcement
Management System Guidance; see especially Chapter 2, pp. 55-68, for the range of recommended
responses to potential violations. Administrative penalty orders (APOs) count as formal
enforcement actions, but return to compliance at a facility that has received an APO should be
documented in the file for the action to be deemed as returning the facility to compliance.


Applicable EPA policy/guidance: The Enforcement Management System, National
Pollutant Discharge Elimination System (Clean Water Act), 1989; "Clarification of NPDES
EMS Guidance on Timely and Appropriate Response to Significant Noncompliance
Violations" from Mark Pollins, Director, Water Enforcement Division, and Betsy Smidinger,
Acting Director, Enforcement Targeting and Data Division, May 29, 2008.

Metric 10a1 — Percentage of major NPDES facilities with formal enforcement action taken
in a timely manner in response to SNC violations

Metric type: Review Indicator

What it measures: The percentage of major NPDES facilities in SNC during the review year with
formal enforcement action taken during the review year or quarter 1 of the following fiscal year.

Numerator = the number of major NPDES facilities in the denominator having formal
enforcement action in the review year or quarter 1 of the following fiscal year

Denominator = the number of major facilities with:

•	two or more consecutive quarters of SNC non-effluent violations or SNC effluent violations
at the same pipe and parameter reported in the Quarterly Noncompliance Report (QNCR), or

•	significant effluent violations in 2 consecutive quarters for violations of the same pipe and
parameter in each quarter, or

•	discharge monitoring reports (DMRs) not submitted, as listed in the QNCR, in 2
consecutive quarters, or

•	compliance schedule violations in 2 consecutive quarters with open compliance schedule
violations at any time in the fiscal year
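One way to picture the consecutive-quarter test behind this denominator is the sketch below. It is deliberately simplified: real QNCR processing distinguishes violation types, pipes, and parameters, and the facility data shown is hypothetical.

```python
def in_denominator(quarters_with_snc: list[bool]) -> bool:
    """True if a facility shows SNC violations (same pipe and parameter)
    in two or more consecutive quarters of the fiscal year."""
    return any(a and b for a, b in zip(quarters_with_snc, quarters_with_snc[1:]))

# Hypothetical per-quarter SNC flags (Q1-Q4) for three facilities.
facilities = {
    "Facility A": [True, True, False, False],   # Q1+Q2 consecutive -> counted
    "Facility B": [True, False, True, False],   # never consecutive -> not counted
    "Facility C": [False, False, True, True],   # Q3+Q4 consecutive -> counted
}
denominator = sum(in_denominator(flags) for flags in facilities.values())
print(f"Facilities in the metric 10a1 denominator: {denominator}")
```

The numerator would then count which of those facilities had a formal enforcement action in the review year or quarter 1 of the following fiscal year.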

Guidance: Per the guidance in the NPDES EMS, formal enforcement should occur at facilities in
significant non-compliance prior to the second official QNCR unless there is supportable
justification for an alternative action, such as an informal enforcement action, a permit
modification, or the facility's return to compliance. This metric is a review indicator given the
complexity of assessing the interplay between actions taken in the review year and actions taken
over time under on-going compliance schedules with milestones in the review year.

This metric is a review indicator and is not designed to be used to establish SRF report findings.
Actions taken may not be directly linked to SNC violations reported in the review year if they
relate to prior-year compliance monitoring activities.

Note: As previously addressed on p. 1 regarding review indicator metrics, reviewers should not
establish SRF report findings on the basis of review indicator metrics. The above metric provides
an overall total of SNC violations and formal actions taken in the review year and quarter 1 of
the following fiscal year, but does not calculate timely enforcement based upon the start date of
each violation.

Applicable EPA policy/guidance: The Enforcement Management System, National Pollutant
Discharge Elimination System (Clean Water Act), 1989; "Clarification of NPDES EMS Guidance
on Timely and Appropriate Response to Significant Noncompliance Violations" from Mark
Pollins, Director, Water Enforcement Division, and Betsy Smidinger, Acting Director,
Enforcement Targeting and Data Division, May 29, 2008; Guidance for Preparation of
Quarterly and Semi-Annual Noncompliance Reports (Per Section 123.45, Code of Federal
Regulations, Title 40), March 13, 1986; Revision of NPDES Significant Noncompliance (SNC)
Criteria to Address Violations of Non-Monthly Average Limits, issued to Water Management
Division Directors and Regional Counsels from Steven A. Herman, 1995; National Pollutant
Discharge Elimination System (NPDES) Electronic Reporting Rule, October 22, 2015.

Metric 10b — Enforcement responses reviewed that address violations in an appropriate
manner

Metric type: File, Goal
Goal: 100%

What it measures: The percentage of enforcement actions taken in an appropriate manner. The
numerator = the number of appropriate enforcement responses in reviewed files taken to address
violations; denominator = the number of actions identified by the reviewer.

Note: The denominator for this metric should include all violations regardless of whether the
agency accurately identifies the violation.

Guidance: All SNC violations should be responded to in an appropriate manner with an
enforcement response that reflects the nature and severity of the violation. Unless there is
supportable justification, such as documentation in the file that the violation was returned to
compliance quickly within the review year with only an informal action, the enforcement
response should be a formal action that requires the permittee to return to compliance by a date
certain.

When formal enforcement action is not taken, there should be a written record that clearly justifies
why the alternative action (e.g., informal enforcement action) is more appropriate. As indicated in
the introduction section for Element 4, reviewers have the flexibility to consider a wide range of
information sources beyond that found in the ECHO Detailed Facility Report (DFR) to make
findings under Metric 10b, including Pacesetter meetings, state web sites, and documentation from
quarterly calls on progress in addressing violations.

Examining the appropriateness of enforcement taken includes examination of any compliance
schedule milestones due in the review year. Some files may contain an inspection or action that
takes place in the review year at facilities where long term consent decrees exist. If compliance
schedule milestones are due from prior year consent decrees in the SRF review year, reviewers
have the flexibility to factor this into their response under CWA metric 10b. For example, if a
facility is meeting the terms of a long-term consent decree but appears to be in SNC under data
metric 10a1, reviewers should give credit for meeting the terms of the consent decree. Conversely,
if there is no evidence that follow up is occurring to verify compliance schedule milestones,
especially those past due by 90 days or more (a SNC violation), appropriate enforcement is likely
not occurring and should be factored into the responses for Metric 10b.

Non-major facilities with Category 1 or 2 violations, and facilities with non-SNC violations, should
also receive an enforcement response (either informal or formal enforcement) within 12 months that
results in the violator returning to compliance. Non-SNC violations, and violations at non-major
facilities, should generally receive an enforcement response within 12 months from the range of
options noted in the Enforcement Response Guide of the NPDES Enforcement Management System
Guidance. See especially Chapter 2, pp. 55-68, for the range of recommended responses to potential
violations.

Reviewers should consider Administrative Penalty Orders (APOs) as formal enforcement actions
under SRF file review metric 10b. APOs, as formal enforcement actions, are generally an
appropriate response to non-SNC violations and violations at non-major facilities. Per the NPDES
EMS policy, APOs are not appropriate to address SNC violations at major facilities because APOs
generally do not contain injunctive relief provisions. An APO at a major facility may be
appropriate if the file reviewed shows documentation of return to compliance. In addition, there
are some types of violations that could occur at non-majors, such as reporting false information, for
which an APO is not sufficient. Refer to the Enforcement Response Guide in the EMS if you
have questions about whether the response is appropriate.

Applicable EPA policy/guidance: The Enforcement Management System, National Pollutant
Discharge Elimination System (Clean Water Act), 1989; "Clarification of NPDES EMS Guidance
on Timely and Appropriate Response to Significant Noncompliance Violations" from Mark
Pollins, Director, Water Enforcement Division and Betsy Smidinger, Acting Director,
Enforcement Targeting and Data Division, May 29, 2008; National Pollutant Discharge
Elimination System Enforcement Management System (NPDES EMS), Chapter 7, Quarterly
Noncompliance Report Guidance; Revision of NPDES Significant Noncompliance (SNC) Criteria to
Address Violations of Non-Monthly Average Limits issued to Water Management Division
Directors and Regional Counsels from Steven A. Herman, 1995; Interim Significant Non-
Compliance Policy for Clean Water Act Violations Associated with CSOs, SSOs, CAFOs, and
Storm Water Point Sources (Interim Wet Weather SNC Policy) issued to EPA Regions only on
October 23, 2007.

Element 5 — Penalties

Element 5 evaluates penalty documentation using three metrics — 11a for gravity and economic
benefit, 12a for the difference between initial and final penalty, and 12b for collection.

File Reviews

As part of file reviews, regions should review files for wet weather, significant industrial user, and
pretreatment facilities that the state inspected in accordance with its NPDES CMS plan, along with
those for NPDES major facilities, to ensure that penalties at these facilities are well documented.
As part of the review of regional file selection lists, EPA will review the representativeness of
files selected to ensure NPDES CMS commitments are adequately factored into the review
process.

Key metrics: 11a, 12a, and 12b.

Metric 11a — Penalty calculations reviewed that document and include gravity and economic
benefit

Metric type: File, Goal
Goal: 100%

What it measures: Percentage of penalty calculations reviewed that document and include, where
appropriate, gravity and economic benefit. The numerator = the number of penalties reviewed
where the penalty was appropriately calculated and documented; the denominator = the total
number of penalties reviewed.

Guidance: Agencies should document penalties sought, including the calculation of gravity and
economic benefit where appropriate. With regard to this documentation, the Revisions to the Policy
Framework for State/EPA Enforcement Agreements (1993) says the following:

EPA asks that a State or local agency make case records available to EPA upon request and
during an EPA audit of State performance. All recordkeeping and reporting should meet the
requirements of the quality assurance management policy and follow procedures
established by each national program consistent with the Agency's Monitoring Policy and
Quality Assurance Management System.

State and local recordkeeping should include documentation of the penalty
sought, including the calculation of economic benefit where appropriate. It is
important that accurate and complete documentation of economic benefit
calculations be maintained to support defensibility in court, enhance Agency's
negotiating posture, and lead to greater consistency.

Agencies may use their own penalty policies and either EPA's computerized model, known as
BEN, or their own method to calculate economic benefit consistent with national policy.

Review the files containing enforcement responses with penalties and examine whether the gravity
and economic benefit components were documented (sometimes found in a penalty calculation
worksheet). If the penalty does not include an economic benefit or gravity calculation, the reviewer
should determine if the file documents the reason for the absence, such as one of the mitigation
factors listed in the policy. Reviewers have the flexibility in the SRF report narrative to add context
regarding the number of files reviewed where ability to pay was a factor and penalty
documentation was not complete for economic benefit and/or gravity

Applicable EPA policy/guidance: Interim Clean Water Act Settlement Penalty Policy, March 1,
1995; Oversight of State and Local Penalty Assessments: Revisions to the Policy Framework for
State/EPA Enforcement Agreements (1993); Revised Policy Framework for State/EPA Enforcement
Agreements (1986).

Metric 12a — Documentation of rationale for difference between initial penalty calculation
and final penalty

Metric type: File Review, Goal

Goal: 100%

What it measures: Percentage of penalties reviewed that document the rationale for the final value
assessed when it is lower than the initial calculated value. The numerator = number of penalties
reviewed that document the rationale for the final value assessed compared to the initial value
calculated; denominator = number of penalties reviewed where final value assessed is lower than
initial value calculated.
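
The conditional denominator described above (only penalties whose final assessed value is lower than the initial calculation enter the count) can be illustrated with a short Python sketch. The record fields are hypothetical assumptions, not an actual agency data schema.

```python
# Hypothetical sketch of metric 12a: only penalties reduced from the
# initial calculation enter the denominator; the numerator counts those
# reduced penalties whose rationale is documented in the file.
# Record fields are illustrative assumptions.

def metric_12a(penalties):
    reduced = [p for p in penalties if p["final"] < p["initial"]]
    documented = [p for p in reduced if p["rationale_documented"]]
    if not reduced:
        return None  # no reduced penalties among the files reviewed
    return 100.0 * len(documented) / len(reduced)

penalties = [
    {"initial": 10000, "final": 6000, "rationale_documented": True},
    {"initial": 5000,  "final": 5000, "rationale_documented": False},  # not reduced
    {"initial": 8000,  "final": 2000, "rationale_documented": False},
]
print(metric_12a(penalties))  # 50.0
```

In this sketch the second penalty is excluded from the denominator because the final value equals the initial value, matching the metric definition.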

Guidance: According to the Revisions to the Policy Framework for State/EPA Enforcement
Agreements (1993), states should document any adjustments to the initial penalty including a
justification for any differences between the initial and final assessed penalty. Review penalty files
to identify their contents with respect to initial and final penalties. If only one of the two penalty
amounts is found in the file, ask the agency why the initial and final assessed penalties are not both
documented, along with the rationale for any differences.

Applicable EPA guidance/policy: Oversight of State and Local Penalty Assessments: Revisions to
the Policy Framework for State/EPA Enforcement Agreements (1993); Revised Policy Framework
for State/EPA Enforcement Agreements (1986); Interim Clean Water Act Settlement Penalty Policy,
March 1, 1995.

Metric 12b — Penalties collected
Metric type: File, Goal

Goal: 100% of files with documentation of penalty collection

What it measures: Percentage of penalty files reviewed that document collection of the penalty. The
numerator = the number of penalties with documentation of collection, or documentation of
measures to collect a delinquent penalty; denominator = the number of penalties reviewed for
which penalty payment was due by the time of the review.

Guidance: This metric assesses whether the final penalty was collected. Begin by looking in the
file for a cancelled check or other correspondence documenting transmittal of the check. If this
documentation is not in the file, ask the agency if they can provide proof of collection through the
data system of record. The dollar amount on the detailed facility report should list the final
penalty dollar value collected, not an initial proposed penalty value at the start of settlement
negotiation; address inaccuracies as data quality issues under Element 1. Findings in SRF reports
are not designed to address trends in penalty dollar amounts over time as there is no guidance on
assessing penalty dollar amounts against a national goal.

If the penalty has not been collected, there should be documentation either in the file or in the data
system of record that the agency has taken appropriate follow-up measures.

Applicable EPA policy/guidance: Oversight of State and Local Penalty Assessments: Revisions to
the Policy Framework for State/EPA Enforcement Agreements (1993); Revised Policy Framework
for State/EPA Enforcement Agreements (1986); Interim Clean Water Act Settlement Penalty Policy,
March 1, 1995.

Appendix A: Acronyms

Note: This is not a complete list of acronyms used in this document. It includes only those
acronyms that are not frequently used in the Agency lexicon, or which have multiple meanings in
the Agency lexicon.

Agency	The state agency or the EPA regional directly implemented program under review.

CMS	Compliance Monitoring Strategy. When the reference is to the National CMS, the
reference is to Source 9, below.

EMS	Enforcement Management System. In this document, EMS ALWAYS means
Enforcement Management System. Elsewhere in the Agency, the acronym refers to
an Environmental Management System; however, that term is not used in this
document or the State Review Framework.

FFY	Federal Fiscal Year (October 1 through September 30)

SRF	State Review Framework. In this document, SRF ALWAYS refers to the State
Review Framework.

SRF Tracker The Tracker is an on-line database that contains records of individual agency
reviews and includes a system to track agency progress in completing
recommendations stemming from the SRF reviews.

Appendix B: Information Sources

The following documents referenced in the metric discussions above are available electronically at:

http://echo.epa.gov

1.	The Enforcement Management System, National Pollutant Discharge Elimination
System (Clean Water Act), 1989

2.	Memo, Clarification of NPDES EMS Guidance on Timely and Appropriate Response to
Significant Noncompliance Violations, from Mark Pollins, Director, Water Enforcement
Division, and Betsy Smidinger, Acting Director, Enforcement Targeting and Data Division,
May 29, 2008

3.	Policy Framework for State/EPA Agreements, August 1986, as revised

4.	Permit Compliance System (PCS) Policy Statement, August 31, 1985, as amended in
2000.

5.	Memo, ICIS Addendum to the Appendix of the 1985 PCS Policy Statement, from Michael
M. Stahl, Director, Office of Compliance, and James A. Hanlon, Director, Office of
Wastewater Management, December 7, 2007

6.	Chapter 7 of the Enforcement Management System, Quarterly Noncompliance Report
Guidance; Guidance for Preparation of Quarterly and Semi-Annual Noncompliance Reports
(40 CFR 123.45) (this document is also included as an attachment to Source 1)

7.	Revised Interim Clean Water Act Settlement Penalty Policy, March 1, 1995.

8.	Memorandum, Clean Water Act National Pollutant Discharge Elimination System Compliance
Monitoring Strategy, July 21, 2014.

9.	Memorandum, The Office of Enforcement and Compliance Assurance's Agency Response to
the Evaluation Report: Better Enforcement Oversight Needed for Major Facilities with Water
Discharge Permits in Long-term Significant Noncompliance (Report No. 2007-P-00023), from
Granta Y. Nakayama, Assistant Administrator, August 14, 2007.

10.	Memorandum, Oversight of State and Local Penalty Assessments: Revisions to the Policy
Framework for State/EPA Enforcement Agreements, from Steven A. Herman, Assistant
Administrator, June 23, 1993 (this document contains an amendment to Source 3)

11.	PCS Quality Assurance Guidance Manual, August 28, 1992


12.	The Code of Federal Regulations, including 40 CFR 123.26(e), 40 CFR 123.26(e)(5), and
40 CFR 123.45(c).

13.	Clean Water Act Action Plan (Prior to February 22, 2010 known as the Clean Water Act
Enforcement Action Plan), October 15, 2009.

14.	Interim Significant Non-compliance Policy for Clean Water Act Violations Associated with
CSOs, SSOs, CAFOs, and Storm Water Point Sources (WW SNC Policy), issued to EPA
Regions only on October 23, 2007.

References (also see SRF Compendium of Guidance and Policy Documents)

•	Clean Water Act Civil Enforcement Policy and Guidance site:
http://www2.epa.gov/enforcement/water-enforcement-policyguidance-and-publications

VI. Key Contacts

State Review Framework Round 4 Implementation Process & Guidance:

•	Michael Mason, State and Tribal Performance Branch Chief: 202-564-0572,
Mason.Michael@epa.gov

Development & Use of NPDES CMS Data in Specific State Reports by Region:

•	Region 1, 10 SRF liaison: Fran Jonesi, 202-564-7043, Jonesi.Fran@epa.gov

•	Region 2, 4, 7 SRF liaison: Andrew Moiseff, 202-564-3007, Moiseff.Andrew@epa.gov

•	Region 3 SRF liaison: Arlene Anderson, 202-564-0658, Anderson.Arlene@epa.gov

•	Region 5, 6, 8, 9 SRF liaison: Elizabeth Walsh, 202-564-0115, Walsh.Elizabeth@epa.gov

Appendix C: Inspection Coverage Data Table

State: [insert state]	FY: [insert FY]

Inspection Coverage Data Table

Percent of planned inspections completed: Planned inspections per the negotiated CMS Plan completed in the Year Reviewed. Calculate as a
percentage by category where the numerator = number of inspections completed; denominator = number of inspections planned. Compliance
monitoring activities counted for the metrics below should use the inspection type codes listed in the NPDES CMS policy in Attachment 2, Part 4,
pp. 25-28. See http://www2.epa.gov/compliance/clean-water-act-national-pollutant-discharge-elimination-system-compliance-monitoring for additional details.
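
The by-category percentage calculation described above can be sketched in Python as follows; the category names and counts are hypothetical illustrations, not CMS plan data.

```python
# Hypothetical sketch of the Appendix C coverage calculation:
# percent of planned inspections completed, computed per CMS category.
# Category names and counts are illustrative assumptions.

def coverage_by_category(planned, completed):
    """Return percent of planned inspections completed for each category."""
    coverage = {}
    for category, n_planned in planned.items():
        n_done = completed.get(category, 0)
        # Coverage is undefined when nothing was planned in a category.
        coverage[category] = (100.0 * n_done / n_planned
                              if n_planned else None)
    return coverage

planned = {"majors": 50, "cso": 10}
completed = {"majors": 45, "cso": 10}
print(coverage_by_category(planned, completed))  # {'majors': 90.0, 'cso': 100.0}
```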

Where inspections covered by the CMS do not have data entered in ICIS- NPDES, reviewers should gather and assess information from the agency to
review performance against the applicable CMS commitments and note this as a problem with a finding of Area for State Attention or Improvement. If
a state does not have a state-specific CMS plan for a given CMS inspection area, regions will evaluate the state against the national inspection
coverage goals set forth in the 2014 NPDES compliance monitoring strategy under metrics 4a1-4a10.

Table columns: Metric | Description (based on NPDES CMS target) | Universe | CMS Commitment/Performance Goal | Inspections Conducted |
Violations Found | Enforcement Actions Taken | Penalties Assessed. The entries below list the Metric, Description, and (where specified)
Universe; the remaining columns are left blank for the reviewer to complete.
5a1 — Inspection coverage of NPDES majors. CMS target: one comprehensive inspection every 2 years; alternative: one comprehensive
inspection every 3 years, based upon the Inspection Targeting Model (ITM) or comparable targeting methodology, for facilities in compliance,
not subject to any credible citizen tips or complaints, and not contributing to section 303(d) impaired waters.
5b1 — Inspection coverage of NPDES non-majors with individual permits. CMS target: one focused, reconnaissance, enforcement follow-up,
oversight, or sludge/biosolids inspection every 5 years for facilities not contributing to 303(d) impairment; for facilities contributing to 303(d)
impairment, one comprehensive inspection at least every 5 years.


5b2 — Inspection coverage of NPDES non-majors with individual permits. CMS target: one focused, reconnaissance, enforcement follow-up,
oversight, or sludge/biosolids inspection every 5 years for facilities not contributing to 303(d) impairment; for facilities contributing to 303(d)
impairment, one comprehensive inspection at least every 5 years.
4a1 — Pretreatment compliance inspections and audits. CMS target: every five years, two pretreatment compliance inspections and one audit
at each approved local pretreatment program, including >2 oversight inspections of industrial users (IUs).
4a2 — Significant industrial user (SIU) inspections for SIUs discharging to non-authorized POTWs. CMS target: one sampling inspection at
each SIU annually.
4a4 — CSO inspections. CMS target: one inspection of each major and non-major CSO every five years (for states with combined sewer
systems).
4a5 — SSO inspections. CMS target: inspections of 5% of the universe of permitted POTWs with sanitary sewer systems (SSS) annually.
Universe: number of POTW permits with >1 sanitary sewer collection system.


4a7 — Phase I and II MS4 audits or inspections. CMS target: one on-site audit, on-site inspection, or off-site desk audit* of each Phase I & II
MS4 every five years, and one inspection or on-site audit of each Phase I & II MS4 every seven years thereafter.
4a8 — Industrial stormwater inspections. CMS target: inspections of 10% of the industrial stormwater universe each year (includes inspections
of unpermitted facilities with and without "no exposure certification"). Universe: permitted industrial SW facilities.
4a9 — Phase I and II construction stormwater inspections. CMS target: inspections of 10% of the Phase I and Phase II construction stormwater
universe each year, including inspections of unpermitted sites. Universe: permitted construction SW sites.
*Off-site desk audits include but are not limited to review of facility reports and records, review of agency-gathered testing, sampling,
and ambient monitoring data, evaluation of responses to CWA section 308 information requests, review of compliance deliverables
submitted pursuant to permits or enforcement actions, and analysis of aerial or satellite images. An off-site desk audit conducted
pursuant to a CMS plan will include the appropriate combination of these activities to allow the region or the state to make a facility-
level or program-level compliance determination. In order for an off-site desk audit or focused inspection to count toward CMS
implementation for the results in this table, the region or state must report the activity into ICIS-NPDES (either through direct data
entry or via the CDX National Environmental Information Exchange Network). See Part 3 of the CWA NPDES CMS for additional details on
focused inspections and off-site desk audits.


4a10 — Inspections of large and medium NPDES-permitted CAFOs. CMS target: one comprehensive inspection of each large and medium
NPDES-permitted CAFO every five years.

4a11 — Number of sludge/biosolids inspections at each major POTW. CMS target: one inspection every 5 years for each major POTW in a
state with biosolids program authorization (use and disposal operations, including incineration and surface application). Off-site desk audit
substitutions are included if sites: (1) are not subject to enforcement actions or compliance schedules from concluded enforcement actions;
(2) were not in SNC in the previous four quarters; (3) have no unresolved SEVs in prior inspection(s); (4) do not discharge to CWA section
303(d) listed waters for pollutant(s) contributing to the listing; and (5) have no known potential to impact drinking water supplies.

