RCRA Metrics Plain Language Guide

State Review Framework Round 4

The Plain Language Guide describes the elements and metrics EPA uses during a State Review
Framework (SRF) review of RCRA Subtitle C compliance and enforcement programs and
provides instructions on how to use the metrics to make appropriate findings and
recommendations. SRF reviews are based on information from EPA data systems and file
reviews. Reviewers should refer to the RCRA file review checklist and spreadsheet when
developing review findings on performance.

Data used in SRF reviews fall into two primary categories — data metrics and file review
metrics. These metrics provide the basis for determining agency performance.

1. Data metrics are derived from frozen, verified data in RCRAInfo. Reviewers
download data metrics from Enforcement and Compliance History Online (ECHO) to
get an initial overview of a state or local agency's performance. All data metrics fall into
one of the following subcategories:

•	Goal metrics evaluate performance against a specific numeric goal and are used to
develop findings. The ECHO data also provides the national average for these metrics
expressed as a percentage. EPA evaluates agencies against goals, not national
averages. These metrics include averages only to provide a sense of where an agency
falls relative to others.

•	Review Indicator metrics use national averages to indicate when agencies diverge from
national norms. Review indicators are not used to develop findings. They are used to
identify areas for further analysis during the file review. When an indicator diverges
significantly from the average, EPA should ensure that it pulls a sufficient sample of files
to evaluate the issue during the file review (see the File Selection Protocol for additional
guidance). EPA and the state or local agency should discuss the issue to determine if a
problem exists. Indicators can also provide narrative context for findings from file
reviews.

•	Alternative CMS metrics are only required to be included in the review when an agency
has a compliance monitoring strategy (CMS) that includes one or more alternative
inspection commitments. Typically, under an alternative CMS an agency will substitute a
certain number of inspections at smaller facilities for some at larger facilities. When the
agency does not have an alternative CMS, EPA will evaluate the state against the national
inspection coverage goals via metrics 5a and 5b.

2. File review metrics are evaluated during the review of facility files (including information
such as inspection reports, evaluations, enforcement responses and actions, and penalty
documentation). All file review metrics evaluate performance against a national goal. (File
metrics will not have national averages.)



Guidance References and Acronyms

The SRF Documentation Page on ECHO Gov provides a full list of links to SRF guidance and
policies.

Year reviewed refers to the federal fiscal year of activities reviewed, not the year in which the
review is conducted. The year reviewed should generally be the year preceding the year in which
the SRF review is conducted. Agency refers to the state, local, or federal agency that has the lead
for compliance monitoring and enforcement within the state or other jurisdiction undergoing the
SRF review.

A list of acronyms is provided as an attachment to this Plain Language Guide.

RCRA SRF Review Process

1.	Annual data verification

2.	Annual data metric analysis

3.	File Selection

4.	Local agency or state district office inclusion (if applicable)

5.	Discussion with HQ on review process (or discussion on a step-by-step basis, as
chosen by the Region)

6.	Entrance conference

7.	File Review

8.	Exit conference

9.	Draft Report Submitted for internal agency review

10.	State Comment Period

11.	Revised report sent to agency for review and internet posting

12.	Final report and recommendations published on the SRF web site

13.	Track implementation status of Area for Improvement Recommendations in the SRF
Manager database on a periodic basis

Using Metrics to Determine Findings

Goal metrics always have numeric goals and stand alone as sufficient basis for a finding. For
example, the goal for RCRA metric 2b is for agencies to accurately enter 100 percent of
minimum data requirements (MDRs) into RCRAInfo. To analyze performance under this metric,
compare the percentage of MDR actions accurately entered to the goal of 100 percent.
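As a minimal sketch (with hypothetical counts, not SRF data), the comparison for a goal metric
reduces to computing a percentage and checking it against the goal:

```python
# Minimal sketch of a goal-metric comparison. The counts below are hypothetical.

def metric_value(numerator, denominator):
    """Return the metric as a percentage of the denominator."""
    return 100.0 * numerator / denominator if denominator else 0.0

files_reviewed = 25     # denominator: files reviewed
files_accurate = 23     # numerator: files with complete, accurate MDR entry
goal = 100.0            # metric 2b goal

value = metric_value(files_accurate, files_reviewed)
print(f"Metric 2b value: {value:.1f}% (goal: {goal:.0f}%)")
print("Meets goal" if value >= goal else "Below goal; consider a finding")
```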

Based on this analysis, the reviewer would make a finding. All findings will fall under one of
these categories:

Meets or Exceeds Expectations: The SRF was established to assess the base level or floor of
enforcement program performance. This rating describes a situation where the base level is met
and no performance deficiency is identified, or a state performs above base program
expectations.

Area for State Attention: An activity, process, or policy that one or more SRF metrics show as
a minor problem. The state should correct the issue without additional EPA oversight. EPA may
make suggestions to improve performance, but it will not monitor these suggestions for
completion between SRF reviews. These areas are not highlighted as significant in an executive
summary.

Area for State Improvement: EPA will develop a finding of Area for State Improvement
whenever one or more SRF metrics under a specific element show an activity, process, or policy
to be a significant problem that the agency is required to address. A finding for improvement
should be developed regardless of other metric values pertaining to that element. Recommended
activities to correct the issues should be included in the report. Recommendations must have
well-defined timelines and milestones for completion, and, if possible, should address root
causes. EPA will monitor recommendations for completion between SRF reviews in the SRF
Manager database. The status of recommendations will be publicly available on EPA's SRF web
site.

The National Strategy for Improving Oversight of State Enforcement Performance is a key
reference in identifying recommendations for Areas for Improvement. Where a performance
problem cannot be readily addressed, or where there are significant or recurring performance
issues, there are steps EPA can and should take to actively promote improved state performance.
For additional information: https://www.epa.gov/sites/production/files/2014-06/documents/state-
oversight-strategy.pdf.

Using Other Metrics

When metrics other than Goal metrics indicate potential problems, EPA should conduct the
additional research necessary to determine the nature of the issue. These metrics provide
additional information that is useful during file selection, and for gauging program health when
compared to other metrics.

For example, RCRA metric 7b is a Review Indicator for violations found during inspections, and
State X's rate is 15 percent (the national average is 36 percent in this particular year). EPA can
only determine whether this lower-than-average rate represents a performance issue through a
file review of inspection reports and violation determinations.
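One way a reviewer might screen for such divergence is sketched below; the 50 percent
relative-difference threshold is purely illustrative and not an SRF requirement.

```python
# Illustrative screen for a Review Indicator that diverges from the national
# average. The 50% relative-difference threshold is an assumption, not policy.

def diverges(state_rate, national_avg, rel_threshold=0.5):
    """True when the state rate differs from the national average by more than
    rel_threshold, expressed as a fraction of the national average."""
    if national_avg == 0:
        return state_rate != 0
    return abs(state_rate - national_avg) / national_avg > rel_threshold

# Metric 7b example from the text: State X at 15%, national average 36%.
if diverges(15.0, 36.0):
    print("Divergent indicator: pull additional inspection files for the review")
else:
    print("Within the expected range of the national average")
```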

Use of State Guidance and Regional-State Agreements as Basis for Findings in SRF
Reviews

The State Review Framework evaluates enforcement program performance against established
OECA national program guidance. State program guidance or regional-state agreements are
applicable to the SRF review process under the following circumstances.



1.	It is acceptable to use the state's own guidance to evaluate state program performance if:
1) the region can demonstrate that the state's standard(s) is (are) equivalent to or more
stringent than OECA guidance; and 2) the state agrees to be evaluated against that
standard(s). In these cases, regions should inform OECA/OC in advance of the review
that they intend to use state guidance and should include a statement in the SRF report
indicating that the state guidance was determined to be equivalent to or more stringent
than the applicable OECA policy and was used as the basis for the review.

2.	For certain metrics, clearly specified in this Plain Language Guide, it will be necessary to
refer to state policies or guidance, or to EPA-state agreements. For example:

a.	If the state has an Alternative CMS, EPA will use these state-specific
commitments as the basis to evaluate compliance monitoring coverage.

b.	The national guidance may require only that a state establish a standard but
not actually provide the standard. In such cases, the reviewer will need to
ensure that the state has developed the required standard, and once it has been
reviewed and approved by the region, use that standard to evaluate state
performance.

3.	Where national guidance has been modified or updated, it is important to review the
corresponding state program implementation guidance to assess whether it has become
out of date or inaccurate. In such cases, the reviewer should make appropriate
recommendations for revision of the state guidance, review the revised version, and
approve it, if appropriate.

4.	Where state program guidance or regional-state agreements establish practices or
standards that are not consistent with or at least equivalent to national program guidance,
this may be an allowable flexibility under section A4 of the Revised Policy Framework
for State/EPA Enforcement Agreements (Barnes, August 1986, as revised). If so, the
region should inform OECA/OC prior to the review and note this flexibility in the
explanation of the SRF report. If the differences between the state guidance or regional-
state agreements and the national guidance are significant, or if it is unclear whether
flexibility from OECA policy is appropriate, the region should elevate the issue to OECA
for resolution (per Interim Guidance on Enhancing Regional-State Planning and
Communication on Compliance Assurance Work in Authorized States (Bodine, 2018))
prior to developing findings or a draft report.

Element and Metric Definitions

Element 1 — Data



EPA uses Element 1 to evaluate data accuracy and completeness. This review is conducted in the
following two ways:

•	File review: EPA evaluates data accuracy and completeness under metric 2b, which is a
file review metric that compares data in the ECHO Detailed Facility Report (DFR) or
RCRAInfo to information in facility files.

•	Evaluating data metrics: In addition, as the reviewer has discussions with the state and
conducts the data metric analysis and file review, he or she may find an SRF data metric
to be inaccurate to a significant degree.

To provide an example, data metric 5a shows that State X inspected 5 of its 20 TSDFs.
However, the state provides its own data showing that it inspected all 20 TSDFs but
failed to enter inspections for 15 of them into RCRAInfo. This failure to enter
inspections into RCRAInfo would be an Area for State Improvement under Element 1.
Conversely, if the value for metric 5a were accurate and the state had only inspected 5 of
20 TSDFs, this would be an Area for State Improvement under Element 2 (Inspections)
for failure to inspect the required number of TSDFs.

In the case of a data metric being inaccurate, the finding should cite both the reported and, when
possible, the actual values.

Refer to ECHO Data Entry Requirements for minimum data requirements.

Key metrics: 2a, 2b, 5a, 5b, 7b, 8a, and 10a. Also consider 5d and 5e when they are included in
the review.

Metric 2b — Complete and accurate entry of mandatory data

Metric type: File Review, Goal

Goal: 100% of data are complete and accurate

What it measures: Percentage of files reviewed where mandatory data are accurately reflected
in the national data system. The numerator = number of files reviewed that accurately reflect
mandatory data, denominator = number of files reviewed.

Guidance: Reviewers should compare data in the ECHO Detailed Facility Report (DFR) or
RCRAInfo with information in the facility files to check that the DFR accurately reflects
activities such as inspection dates, inspection types, significant noncompliance (SNC) status, and
enforcement responses. See the File Review Checklist for complete instructions.

Also, check to see if there is file information that is missing in the DFR or RCRAInfo. If
information in the files is missing from or inaccurately entered into the national database, the
data for that file is not complete or accurate. This should be noted under Element 1.



Reviewers should also consider their knowledge of the agency's program when conducting this
analysis. For example, if the reviewer notices multiple compliance evaluation inspections
identified in the DFR or RCRAInfo for a facility within one week's time, it is unlikely that the
agency has actually conducted multiple CEIs in this timeframe. It is more likely that the later
ones, if they are separate actions, are follow-up inspections.

Applicable EPA policy/guidance: Hazardous Waste Civil Enforcement Response Policy (2003),
current OECA National Program Manager Guidance; Note: RCRAInfo mandatory data
elements are those that the system requires to be entered in order to save a record and may be
broader than the scope of SRF. General data areas to review are listed on ECHO Data Entry
Requirements.

Element 2 — Inspections

Element 2 evaluates:

1.	Inspection coverage compared to CMS commitments

2.	Inspection report completeness and sufficiency to determine compliance

3.	Inspection report timeliness

EPA is only required to evaluate metrics 5d and 5e when the agency has exercised flexibility
under an alternative CMS commitment for its inspection frequencies. When the agency does not
exercise this flexibility, EPA can choose whether to include these metrics.

Key metrics: 5a, 5b, 6a, and 6b. Also include metrics 5d and 5e when the state has an alternative
CMS for inspection coverage commitments.

Metric 5a — Two-year inspection coverage of operating TSDFs

Metric type: Data, Goal

Goal: 100%

What it measures: Of those operating at the time of the data freeze, the percentage of the
treatment, storage, and disposal facility (TSDF) universe that had a CEI inspection during the
two-year period of review. The numerator = number of TSDFs operating at the time of the data
freeze that had a CEI inspection during the two-year period of review; denominator = number of
operating TSDFs at the time of the data freeze.

Guidance: According to Section 3007 of RCRA, all non-governmental TSDFs should be
inspected every two years.

EPA should conduct a further review when lead agencies do not meet this goal. In accordance
with the RCRA Compliance Monitoring Strategy, states that are lead agencies are to cover at
least 50 percent of non-government TSDFs every year.
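A minimal sketch of the metric 5a calculation follows; the counts are hypothetical, and in
practice they come from the frozen RCRAInfo data in ECHO.

```python
# Minimal sketch of the metric 5a calculation with hypothetical counts.

tsdfs_operating_at_freeze = 20   # denominator: operating TSDFs at the data freeze
tsdfs_with_cei = 17              # numerator: those with a CEI in the two-year period

coverage = 100.0 * tsdfs_with_cei / tsdfs_operating_at_freeze
print(f"Metric 5a two-year TSDF CEI coverage: {coverage:.0f}% (goal: 100%)")
if coverage < 100.0:
    print("Goal not met: conduct further review of TSDF inspection coverage")
```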



Applicable EPA policy/guidance: Current OECA National Program Manager Guidance,
Compliance Monitoring Strategy for the Resource Conservation and Recovery Act (RCRA)
Subtitle C Program

Metric 5b — Annual inspection coverage of LQGs

This metric uses either the LQG universe from the most recent BR published before the review
year (Metric 5b) or the LQG universe from RCRAInfo (Metric 5bi). Select either Metric 5b or
5bi depending on the source of the LQG universe.

Metric type: Data, Goal

Goal: 20%, or 100% of alternative commitment

What it measures: The percentage of the large quantity generator (LQG) universe that had a
compliance evaluation inspection (CEI) during the year reviewed as indicated from the Biennial
Report (BR) published before the review year. The numerator = number of LQGs that had a CEI
during the year reviewed; denominator = number of LQGs.

Guidance: Based on the RCRA Compliance Monitoring Strategy (CMS), EPA only counts CEIs
under this metric. The National Program Manager Guidance (NPM Guidance) states that 20
percent of LQGs should have a CEI each year.

In accordance with the CMS, states that are lead agencies are to inspect at least 20 percent of the
BR LQG universe annually; however, EPA inspections can contribute up to 10 percent of the
required inspections toward meeting these goals. The CMS also states that an appropriate portion
of the Region's ACS commitment of six (6) required LQG inspections may be counted toward
the state's 20 percent coverage obligation. The Region's contribution should constitute only a
small portion of the state's 20 percent obligation (e.g., less than ten percent).1

Lead agencies with approved alternative CMS plans may substitute other facility inspections for
LQGs per the Guidance for RCRA Core LQG Pilot Projects. Whether or not the agency has an
alternative plan, when lead agencies do not meet this metric's goal, EPA should conduct a
further review:

•	Talk to the state during the file review about why the goal or commitment was not met;

•	Make sure that enough LQGs with inspections are selected for the file review;

•	Confirm that the alternative agreement was met.

1 For example, given a universe of 100 LQGs, the state annually must conduct 20 LQG
inspections (usually CEIs). EPA's contribution to the state's coverage requirement should not
exceed two (2) inspections (i.e., 10 percent of the required 20 inspections). EPA, however, can
do more inspections, but such additional inspections will not count toward the state's coverage
requirement (16).
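The footnote's arithmetic can be sketched as follows; the state and EPA inspection counts are
hypothetical.

```python
# Sketch of the footnote example: a 20% LQG coverage goal, with EPA inspections
# counting toward no more than 10% of the required inspections. Counts are hypothetical.

lqg_universe = 100
required = round(0.20 * lqg_universe)   # 20 CEIs required annually
epa_cap = round(0.10 * required)        # at most 2 EPA CEIs may count

state_ceis = 17
epa_ceis = 4                            # only 2 of these can count toward coverage

countable = state_ceis + min(epa_ceis, epa_cap)
print(f"Countable CEIs: {countable} of {required} required "
      f"({100.0 * countable / lqg_universe:.0f}% of the LQG universe; goal: 20%)")
```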

Applicable EPA policy/guidance: Current OECA National Program Manager Guidance,
Guidance for RCRA Core LQG Pilot Projects (2007), Compliance Monitoring Strategy for the
Resource Conservation and Recovery Act (RCRA) Subtitle C Program

Metric 5bi (LQG universe from RCRAInfo)

Metric type: Data, Goal

Goal: 20%, or 100% of alternative commitment

What it measures: The percentage of the large quantity generator (LQG) universe that had a
compliance evaluation inspection (CEI) during the year reviewed as indicated from RCRAInfo.
The numerator = number of LQGs that had a CEI during the year reviewed; denominator =
number of LQGs.

Guidance: Based on the RCRA Compliance Monitoring Strategy (CMS), EPA only counts CEIs
under this metric. The National Program Manager Guidance (NPM Guidance) states that 20
percent of LQGs should have a CEI each year.

In accordance with the CMS, states that are lead agencies are to inspect at least 20 percent of the
LQG universe annually; however, EPA inspections can contribute up to 10 percent of the
required inspections toward meeting these goals. The CMS also states that an appropriate portion
of the Region's ACS commitment of six (6) required LQG inspections may be counted toward
the state's 20 percent coverage obligation. The Region's contribution should constitute only a
small portion of the state's 20 percent obligation (e.g., less than ten percent).2

Lead agencies with approved alternative CMS plans may substitute other facility inspections for
LQGs per the Guidance for RCRA Core LQG Pilot Projects. Whether or not the agency has an
alternative plan, when lead agencies do not meet this metric's goal, EPA should conduct a
further review:

•	Talk to the state during the file review about why the goal or commitment was not met

•	Make sure that enough LQGs with inspections are selected for the file review

•	Confirm that the alternative agreement was met

2 For example, given a universe of 100 LQGs, the state annually must conduct 20 LQG
inspections (usually CEIs). EPA's contribution to the state's coverage requirement should not
exceed two (2) inspections (i.e., 10 percent of the required 20 inspections). EPA, however, can do
more inspections, but such additional inspections will not count toward the state's coverage
requirement (16).



Applicable EPA policy/guidance: Current OECA National Program Manager Guidance,
Guidance for RCRA Core LQG Pilot Projects (2007), Compliance Monitoring Strategy for the
Resource Conservation and Recovery Act (RCRA) Subtitle C Program

Metric 5d — One-year inspection count at SQGs

Metric type: Alternative CMS

What it measures: The number of small quantity generators (SQGs) that had an inspection
during the one-year review period.

Guidance: This metric is only required when evaluating agencies with alternative CMS plans for
inspection coverage, and optional otherwise.

EPA considers RCRA evaluation types CAC, CDI, CEI, CSE, FCI, GME, and OAM as on-site
inspections under this metric.

This metric may provide important information for the review, particularly in cases where SQG
inspections are being substituted for large quantity generator (LQG) inspections per the
Guidance for RCRA Core LQG Pilot Projects. In alternative inspection plans, lead agencies may
trade off LQG inspections for increased inspection coverage of SQGs. When submitting an
alternate CMS plan that proposes the substitution of SQGs for LQGs, Regions can require that
only CEIs may serve as the substitution for the LQGs. In these cases, EPA will hold the agency
accountable under SRF for meeting its SQG inspection target.

Applicable EPA policy/guidance: RCRA Compliance Monitoring Strategy (2010)

Metric 5e — One-year inspection count at other sites

Metric type: Alternative CMS

What it measures: Number of inspections in the year of review for the following universes:

•	5e5: Very small quantity generators (VSQGs)

•	5e6: Transporters

•	5e7: Sites not covered by metrics 5a through 5e2

Guidance: This metric is only required when evaluating agencies with alternative CMS plans for
inspection coverage, and optional otherwise.

EPA counts RCRA evaluation types CAC, CDI, CEI, CSE, FCI, GME, and OAM as on-site
inspections under this metric.

This metric may provide important information for the review, particularly in cases where
agencies are substituting inspections at other sites for LQG inspections. In alternative CMS
plans, lead agencies may trade off LQG inspections for increased inspection coverage of other
facility types. When submitting an alternate CMS plan that proposes the substitution of VSQGs
and transporters for LQGs, Regions can require that only CEIs may serve as the substitution for
the LQGs. In these cases, EPA will hold the agency accountable under SRF for meeting its
VSQG and transporter inspection targets.

Metric 6a — Inspection reports complete and sufficient to determine compliance
Metric type: File Review, Goal
Goal: 100%

What it measures: The percentage of on-site inspection reports reviewed that are complete and
provide sufficient documentation to determine compliance. The numerator = number of
inspection reports reviewed with complete and sufficient documentation; denominator = number
of inspection reports reviewed.

Guidance: The focus should be primarily on compliance evaluation inspections (CEIs) since
they are required for treatment, storage, and disposal facilities (TSDFs) and large quantity
generators (LQGs). At its discretion, EPA may review a limited number of other types of on-site
inspections, such as FCIs or OAMs.

EPA should use the inspection report completeness assessment at the end of the RCRA File
Review Checklist to assess and summarize each inspection. The checklist describes what needs
to be included in a complete inspection report. In general, this includes:

•	A narrative describing the facility, its RCRA-regulated activities, potential violations
observed, etc.

•	A checklist

•	Any documentary support, such as photographs, maps, sampling results, etc.

If certain components listed in the checklist are routinely missing, EPA should mention these in
the SRF report.

Agencies will likely have their own methods for completing inspection reports. EPA should
discuss this with the agency at the beginning of the review to determine if the agency's
inspection report documentation (particularly for CEI inspections) is consistent with EPA
requirements for a complete report.

EPA should also review inspection reports for sufficient documentation to determine compliance
at the facility. When an inspection report is complete as determined by the completeness
assessment in the checklist, it provides sufficient information to document compliance at the
facility. If a report is not complete, it may nonetheless contain sufficient documentation to
determine compliance.

Applicable EPA policy/guidance: RCRA Inspection Manual (1998), Review of RCRA
Inspection Report Practices



Metric 6b — Timeliness of inspection report completion
Metric type: File Review, Goal
Goal: 100%

What it measures: Percentage of inspection reports reviewed that are completed in a timely
manner per the national standard described below (numerator = number of inspection reports
reviewed that were completed in a timely manner; denominator = number of inspection reports
reviewed).

Guidance:

The Hazardous Waste Civil Enforcement Response Policy (2003) states that agencies should
make a violation determination within 150 days of Day Zero. EPA should use this 150-day
standard for inspection report timeliness. Reviewers can also consider the average length of time
that it took to complete each report in the File Review Checklist when determining the finding.

If an Agency has its own inspection report timeline defined in a policy, grant workplan, or
PPA/PPG, EPA should compare the Agency's average report completion time to that timeline in
the SRF report findings. The comparison can highlight where the Agency has performed better
than the national standard for timely inspection reports; EPA generally will not penalize the
agency for missing its own more stringent timeline. For example, if a state agency has a policy of
90 days for report completion, reviewers should generally not penalize the agency for exceeding
90 days if it completes the report in under 150 days.
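A minimal sketch of the 150-day timeliness check, using hypothetical dates:

```python
# Minimal sketch of the 150-day inspection report timeliness check.
# Dates are hypothetical; Day Zero is the first day of the inspection.

from datetime import date

def report_timely(day_zero, completed, standard_days=150):
    """True when the report was completed within the national 150-day standard."""
    return (completed - day_zero).days <= standard_days

reports = [
    (date(2022, 10, 12), date(2023, 1, 20)),   # 100 days: timely
    (date(2022, 11, 3), date(2023, 5, 15)),    # 193 days: not timely
]

timely = sum(report_timely(d0, done) for d0, done in reports)
print(f"Metric 6b: {100.0 * timely / len(reports):.0f}% of reports timely (goal: 100%)")
```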

Applicable EPA policy/guidance: Hazardous Waste Civil Enforcement Response Policy (2003),
current OECA National Program Managers' Guidance

Element 3 — Violations

Under this element, EPA evaluates the accuracy of the agency's violation and compliance
determinations, and the accuracy and timeliness of its significant non-compliance
determinations.

Reviewers will evaluate metrics 2a, 7b, 8a, and 8b during the data metric analysis. If the
reviewer finds that violation or SNC rates are lower than the national average, he or she may
want to include additional inspections or violations in the file review to determine whether
violations and SNCs are being determined accurately.

Metric 7a covers the accuracy of compliance determinations made from inspections, and metric
8c covers the appropriateness of SNC determinations. These metrics along with metric 8b
(timeliness of SNC determinations) will generally form the basis for findings under this element.

Key metrics: 2a, 7a, 7b, 8a, 8b, and 8c



Metric 2a — Long-standing secondary violators
Metric type: Review Indicator

What it measures: The number of secondary violators (SVs) with violations open for more than
240 days that have not returned to compliance or have not been designated as significant
noncompliers (SNCs).

Guidance: If there is a high number of SVs relative to the total universe of facilities in the state,
select additional files with SVs for the file review to determine the nature of the problem. The
file review, conversations with agency personnel, and other research can help you gauge:

•	Whether the agency is designating long-standing SVs as SNCs. The 2003 Hazardous
Waste Civil Enforcement Response Policy states that agencies should consider re-
designating SVs as SNC if the violator does not return to compliance in 240 days. EPA
should review the list of violators that do not return to compliance in 240 days to
determine whether data entry problems, SNC designation issues, or SVs unaddressed by
enforcement exist. EPA should address data entry problems under Element 1, SNC
designation issues under Element 3, and unaddressed SVs under Element 4.

•	Whether enforcement is returning SVs to compliance. If there is a significant
percentage of enforcement responses for SVs that do not return sites to compliance,
discuss this with the agency and prepare a recommendation under Element 4.

•	The timeliness of enforcement for SVs. The 2003 Hazardous Waste Civil Enforcement
Response Policy states that warning letters or other appropriate notification of violations
should be made by Day 150. By Day 240, EPA requires SVs to return to compliance. By
Day 360, the implementing agency should make a referral to the Department of Justice or
the state Attorney General, or enter into a final order with the violator. If the agency is
failing to do this, address under Element 4.
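The ERP milestones described in the bullets above can be summarized in a short sketch; the
facility ages are hypothetical.

```python
# Sketch of the ERP milestones for a secondary violator still in violation,
# keyed to days elapsed since Day Zero. Facility ages are hypothetical.

MILESTONES = [
    (150, "warning letter or other notification of violations"),
    (240, "return to compliance or re-designation as SNC"),
    (360, "referral to DOJ/state Attorney General or a final order"),
]

def overdue_milestones(days_open):
    """Return the ERP milestones already passed for an SV that remains in violation."""
    return [action for day, action in MILESTONES if days_open > day]

for facility, days_open in [("Facility A", 200), ("Facility B", 400)]:
    print(facility, "->", overdue_milestones(days_open) or "no milestone passed yet")
```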

Applicable EPA policy/guidance: Hazardous Waste Civil Enforcement Response Policy (2003)

Metric 7a — Accurate compliance determinations

Metric type: File Review, Goal

Goal: 100%

What it measures: Percentage of inspection reports reviewed that led to accurate compliance
determinations. The numerator = number of inspection reports reviewed that led to accurate
compliance determinations; denominator = number of inspection reports reviewed.

Guidance: EPA reviews inspection reports to determine accuracy of resulting compliance
determinations. Inspection reports lead to inaccurate compliance determinations when:



•	There are potential violations documented in the report but there is no documentation of a
compliance determination.

•	Based on evidence in the inspection report, the agency mischaracterized a violation in the
compliance determination. For example, the inspection report indicates violations but the
compliance determination says the facility is in compliance.

Applicable EPA policy/guidance: Hazardous Waste Civil Enforcement Response Policy (2003),
current OECA National Program Manager Guidance

Metric 7b — Violations found during CEI and FCI compliance evaluations
Metric type: Review Indicator

What it measures: The percentage of sites with a CEI or FCI inspection during the year
reviewed in which one or more violations was found. The numerator = number of sites with a
CEI or FCI during the review year in which one or more violations was found, denominator =
number of sites with a CEI or FCI during the review year.

Guidance: This metric provides information about the identification of violations for both
significant non-compliers and secondary violators.

When the value for this metric is low, further investigation and/or supplemental file review may
be necessary. Because this metric is a Review Indicator, EPA should use it to provide additional
context for file selection, and it should not be used alone to create a finding in an SRF report.

Applicable EPA policy/guidance: Hazardous Waste Civil Enforcement Response Policy (2003)

Metric 8a — SNC identification rate at sites with CEI and FCI compliance evaluations
Metric type: Review Indicator

What it measures: The percentage of sites with a CEI or FCI inspection during the year
reviewed or the preceding year that received a significant noncomplier (SNC) designation during
the year of review. The numerator = number of sites with a CEI or FCI during the year reviewed
or preceding year that received an SNC designation during the year reviewed; denominator =
number of sites with a CEI or FCI inspection in the year of review or preceding year.

Guidance: When the percentage deviates greatly from the national average, EPA may conduct a
supplemental file review. Reviewers would pull a sufficient number of facility files to evaluate
whether SNC determinations were appropriate. The metric includes a two-year inspection period
to include sites that were inspected in the previous fiscal year but identified as SNC in
RCRAInfo in the year of review. The RCRA ERP allows 150 days from the first day of
inspection (Day Zero) to identify an SNC, so some facilities may "straddle" two fiscal years.
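A short sketch of a straddle case, with hypothetical dates:

```python
# Sketch of a "straddle" case for metric 8a: an inspection late in the prior
# fiscal year whose 150-day SNC window crosses October 1 into the review year.
# Dates are hypothetical.

from datetime import date, timedelta

day_zero = date(2021, 8, 16)                   # inspection conducted in FY 2021
snc_deadline = day_zero + timedelta(days=150)  # 2022-01-13, which falls in FY 2022
fy_start = date(2021, 10, 1)                   # first day of FY 2022

print(f"Day Zero: {day_zero}; SNC determination due by: {snc_deadline}")
print("SNC designation may appear in the year under review"
      if snc_deadline >= fy_start else "Window closes in the prior fiscal year")
```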



Because this metric is a Review Indicator, EPA should use it to guide file selection, and should
not use it alone to create a finding in an SRF report.

This file review should encompass previous enforcement actions and cases in the pipeline to
determine whether SNC did occur but went unreported, and whether violations reported as non-
SNC appear to warrant SNC status according to the Hazardous Waste Civil Enforcement
Response Policy.

Applicable EPA policy/guidance: Hazardous Waste Civil Enforcement Response Policy (2003)

Metric 8b — Timeliness of SNC determinations
Metric type: Data, Goal
Goal: 100%

What it measures: The percentage of significant noncompliance (SNC) determinations made
within 150 days of the first day of the inspection (Day Zero). The numerator = number of SNC
determinations made within 150 days of Day Zero; denominator = number of SNC
determinations.

Guidance: The December 2003 Hazardous Waste Civil Enforcement Response Policy states that
agencies should make and report SNC designations by Day 150. On-time SNC designation
ensures that agencies address significant problems in a timely manner.

The policy also states that agencies should re-designate SVs as SNC if the violator does not
return to compliance in 240 days. EPA should review the list of SNCs that exceed 150 days with
the Agency to determine if any were originally SVs and reclassified at a later date.

EPA may need to conduct a supplemental file review when agencies do not meet the 100 percent
goal to determine the seriousness of the issue.

For secondary violators that are reclassified as significant non-compliers: when entering the SNY
(Significant Non-Complier) evaluation in RCRAInfo, the field Reclassified SV is available to the
right of the Day Zero field. The Reclassified SV field may be chosen in place of the Day Zero
field. In this case, the Notes field must include "Reclassified SV." For translators, include
"Reclassified SV" in the Notes field, and set Day Zero to the date of the reclassification.

Applicable EPA policy/guidance: Hazardous Waste Civil Enforcement Response Policy (2003)

Metric 8c — Appropriate SNC determinations

Metric type: File Review, Goal

Goal: 100%



What it measures: Percentage of files reviewed in which significant noncompliance (SNC)
status was appropriately determined during the year reviewed. The numerator = number of
facilities reviewed with violations correctly determined to be SNC or secondary violator;
denominator = number of facilities with violations reviewed.

Guidance: Review all selected files in which the agency determined there was a violation.
Specifically, look at inspection reports that identify potential violations and whether the facility
was subsequently designated SNC. Here is an example for how to conduct such a review:

•	The agency determined that 10 of the facilities EPA selected for file review had
violations. The agency determined that five of these facilities were SNC and five were
non-SNC.

•	When EPA reviews these 10 facility files, it determines that one of the agency's non-
SNC determinations was actually a SNC. The other nine facilities were accurately
determined to be either SNC or non-SNC.

•	The value for this metric is 9/10 = 90 percent.
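The same example, expressed as a minimal sketch (the determinations are hypothetical):

```python
# Minimal sketch mirroring the worked example above: 10 files with violations,
# one non-SNC determination that the reviewer concludes should have been SNC.

facilities = [
    # (agency determination, reviewer determination) -- hypothetical values
    *[("SNC", "SNC")] * 5,
    *[("non-SNC", "non-SNC")] * 4,
    ("non-SNC", "SNC"),          # the one inappropriate determination
]

appropriate = sum(agency == reviewer for agency, reviewer in facilities)
print(f"Metric 8c: {appropriate}/{len(facilities)} = "
      f"{100.0 * appropriate / len(facilities):.0f}% (goal: 100%)")
```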

For this metric, it may be necessary to review inspections and other activity from the previous
year to determine whether they should have resulted in an SNC determination during the year
reviewed.

The 2003 Hazardous Waste Civil Enforcement Response Policy defines SNC as "those violators
that have caused actual exposure or a substantial likelihood of exposure to hazardous waste or
hazardous waste constituents; are chronic or recalcitrant violators; or deviate substantially from
the terms of a permit, order, agreement or from RCRA statutory or regulatory requirements."

Applicable EPA policy/guidance: Hazardous Waste Civil Enforcement Response Policy (2003)

Element 4 — Enforcement

Reviewers will use Element 4 to determine the agency's effectiveness in taking timely and
appropriate enforcement and using enforcement to return facilities to compliance.

Data verification metrics can provide counts for informal and formal actions, and the number of
actions with penalties. When comparing these counts to the violation and SNC metrics in
Element 3, reviewers get a preliminary sense of the degree to which the state is taking
appropriate enforcement. This information is helpful when selecting facility files to review. If
violation and SNC rates are high but the number of enforcement actions is low, reviewers may
wish to select extra facilities with violations and SNCs to determine why enforcement activity
was low. If enforcement numbers are high, reviewers may wish to select extra facilities with
enforcement to determine if those actions were appropriate and returned facilities to compliance.



Reviewers should focus on metrics 9a (enforcement that returns sites to compliance), 10a
(timeliness of enforcement), and 10b (appropriate enforcement) when writing findings under this
element.

Key metrics: 9a, 10a, and 10b. Additional context: 2a, 7b, and 8a.

Metric 9a — Enforcement that returns violators to compliance
Metric type: File Review, Goal
Goal: 100%

What it measures: Percentage of enforcement responses that have returned or will return sites in
significant noncompliance (SNC) or secondary violation (SV) to compliance. The numerator =
number of enforcement responses reviewed for SNC and SV that document that the site is in
compliance or is on schedule to return to compliance; denominator = number of enforcement
responses against SNC and SV reviewed.

Guidance: The 2003 Hazardous Waste Civil Enforcement Response Policy (ERP) states that an
agency should address SNC with formal enforcement action and SV with at least an informal
action. The formal action should result in an enforceable agreement that seeks injunctive relief to
ensure the violator returns to compliance. Documentation of the return to compliance for both
SNC and SV should be included in the file.

Review files where the agency took enforcement during the year reviewed in response to SNC or
SV to determine if those sites have returned to compliance or are on a schedule to return to
compliance.

Applicable EPA policy/guidance: Hazardous Waste Civil Enforcement Response Policy (2003)

Metric 10a — Timely enforcement taken to address SNC
Metric type: Data, Goal
Goal: 80%

What it measures: The percentage of year-reviewed and previous-year significant
noncompliance (SNC) violations addressed with a formal enforcement action or referral during
the year reviewed and within 360 days of Day Zero. The numerator = year-reviewed and
previous-year SNCs that were addressed by a formal enforcement action in the year reviewed
and within 360 days of Day Zero; denominator = year-reviewed and previous-year SNCs
addressed by a formal enforcement action in the year reviewed.

Guidance: When a facility is determined to be in SNC, agencies should resolve SNC in a timely
manner so problems do not linger. For SNCs, the 2003 Hazardous Waste Civil Enforcement
Response Policy (ERP) allows 360 days from the first day of inspection (Day Zero) for final
formal enforcement action or referral to EPA, the state Attorney General, or the Department of
Justice.

The ERP recognizes that 20 percent of SNCs may exceed this timeline. Therefore, this metric's
goal is for 80 percent of SNCs to receive enforcement within 360 days. Supplemental file review
is necessary for lead agencies below 80 percent to ascertain whether the data metrics indicate a
problem with timely action.
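A minimal sketch of the calculation, with hypothetical day counts:

```python
# Minimal sketch of the metric 10a calculation: SNCs addressed by formal action
# in the year reviewed, checked against the 360-day ERP timeline. Day counts
# from Day Zero are hypothetical.

days_to_formal_action = [120, 250, 300, 340, 410]

within_360 = sum(d <= 360 for d in days_to_formal_action)
value = 100.0 * within_360 / len(days_to_formal_action)
print(f"Metric 10a: {value:.0f}% of addressed SNCs within 360 days (goal: 80%)")
if value < 80.0:
    print("Below goal: supplemental file review of enforcement timeliness")
```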

For secondary violators that are reclassified as significant non-compliers: when entering the SNY
(Significant Non-Complier) evaluation in RCRAInfo, the field Reclassified SV is available to the
right of the Day Zero field. The Reclassified SV field may be chosen in place of the Day Zero
field. In this case, the Notes field must include "Reclassified SV". For translators, include
"Reclassified SV" in the Notes field, and set Day Zero to the date of the reclassification.

Applicable EPA policy/guidance: Hazardous Waste Civil Enforcement Response Policy (2003)

Metric 10b — Appropriate enforcement taken to address violations

Metric type: File Review, Goal

Goal: 100%

What it measures: The percentage of files with enforcement responses that are appropriate to
the violations. The numerator = number of enforcement responses reviewed that are appropriate
to the violations; denominator = number of facilities reviewed with significant noncompliance
(SNC) or secondary violation (SV). The denominator should include all violations regardless of
whether the agency accurately identifies the violation.

Guidance: The 2003 Hazardous Waste Civil Enforcement Response Policy (ERP) states that
agencies should address SNC through a formal enforcement action. This should initiate an
administrative or civil action that results in an enforceable agreement or order and imposes
sanctions. The order should seek injunctive relief that ensures an expedient return to compliance.

For SVs, the ERP states that informal enforcement is the minimally appropriate response.
Informal enforcement notifies the violator of its violations. If the violator does not come into
compliance within 240 days of Day Zero, then the implementing agency should re-classify the
site as an SNC.

Enforcement actions for both SVs and SNCs should mandate compliance.

Applicable EPA policy/guidance: Hazardous Waste Civil Enforcement Response Policy (2003)

Element 5 — Penalties

Element 5 evaluates penalty documentation using three metrics — 1 la for gravity and economic
benefit, 12a for difference between initial and final penalty, and 12b for collection.



Reviewers can gauge the level of penalty activity in the year reviewed through the RCRA
Dashboard, which provides information on the number of penalties and their dollar values.

Key metrics: 11a, 12a, and 12b.

Metric 11a — Gravity and economic benefit

Metric type: File Review, Goal

Goal: 100%

What it measures: Percentage of penalty calculations reviewed that document, where
appropriate, gravity and economic benefit. The numerator = number of penalties reviewed where
the penalty was appropriately calculated and documented; denominator = the number of penalties
reviewed.

Guidance: Lead agencies should document penalties sought, including, whenever appropriate,
the calculation of gravity and economic benefit. With regard to this documentation, Oversight of
State and Local Penalty Assessments: Revisions to the Policy Framework for State/EPA
Enforcement Agreements states the following:

EPA asks that a State or local agency make case records available to EPA upon request and during an EPA
audit of State performance. All recordkeeping and reporting should meet the requirements of the quality
assurance management policy and follow procedures established by each national program consistent with
the Agency's Monitoring Policy and Quality Assurance Management System. . .

State and local recordkeeping should include documentation of the penalty sought, including the
calculation of economic benefit where appropriate. It is important that accurate and complete
documentation of economic benefit calculations be maintained to support defensibility in court, enhance
Agency's negotiating posture, and lead to greater consistency.

Applicable EPA policy/guidance: RCRA Civil Penalty Policy (2003), Oversight of State and
Local Penalty Assessments: Revisions to the Policy Framework for State/EPA Enforcement
Agreements (1993), Revised Policy Framework for State/EPA Enforcement Agreements (1986)

Metric 12a — Documentation of rationale for difference between initial penalty calculation
and final penalty

Metric type: File Review, Goal

Goal: 100%

What it measures: Percentage of penalties reviewed that document the rationale for the final
value assessed when it is lower than the initial calculated value. The numerator = number of
penalties reviewed that document the rationale for the final value assessed compared to the initial
calculated value; denominator = number of penalties reviewed where final value assessed is
lower than initial calculated value.



Guidance: According to the Revisions to the Policy Framework for State/EPA Enforcement
Agreements (1993), states should document any adjustments to the initial penalty including a
justification for any differences between the initial and final assessed penalty.

Review penalty files to identify their contents with respect to initial and final penalties. If only
one of the two penalty amounts is found in the file, ask the agency why the initial and final
assessed penalties are not both documented, along with the rationale for any differences.

Applicable EPA policy/guidance: RCRA Civil Penalty Policy (2003), Oversight of State and
Local Penalty Assessments: Revisions to the Policy Framework for State/EPA Enforcement
Agreements (1993), Revised Policy Framework for State/EPA Enforcement Agreements (1986)

Metric 12b — Penalty collection

Metric type: File Review, Goal

Goal: 100%

What it measures: Percentage of enforcement files reviewed that document collection of
penalty. The numerator = number of penalties reviewed with documentation of collection or
measures to collect a delinquent penalty; denominator = number of penalties reviewed.

Guidance: This metric assesses whether the agency has collected the final penalty. Begin by
looking in the file for a cancelled check or other correspondence documenting transmittal of the
check. If this documentation is not in the file, ask the agency if they can provide proof of
collection through the data system of record.

If the agency has not collected the final penalty, there should be documentation either in the file
or in the data system of record that the agency has taken appropriate follow-up measures. The
finding can take into consideration the reasons a penalty is difficult to collect, such as
bankruptcy, litigation, etc.

Applicable EPA policy/guidance: RCRA Civil Penalty Policy (2003), Oversight of State and
Local Penalty Assessments: Revisions to the Policy Framework for State/EPA Enforcement
Agreements (1993), Revised Policy Framework for State/EPA Enforcement Agreements (1986)



Appendix: Acronyms

VSQG	Very small quantity generator
CMS	Compliance Monitoring Strategy
ECHO	Enforcement and Compliance History Online
EPA	U.S. Environmental Protection Agency
ERP	December 2003 Hazardous Waste Civil Enforcement Response Policy
FY	Federal fiscal year (Oct. 1 - Sept. 30)
LQG	Large quantity generator
MOA	Memorandum of Agreement
NPM Guidance	FY 2011 National Program Manager Guidance
ECHO Gov	Government-only area of ECHO
PPA	Performance Partnership Agreement
PPG	Performance Partnership Grant
RCRA	Resource Conservation and Recovery Act
RCRAInfo	RCRA national data system
TSDF	Treatment, storage, and disposal facility
SRF	State Review Framework
SNC	Significant noncomplier
SQG	Small quantity generator
SV	Secondary violator

RCRA Evaluation Types

CAC	Corrective Action Compliance Evaluation
CDI	Case Development Inspection
CEI	Compliance Evaluation Inspection
CSE	Compliance Schedule Evaluation
FCI	Focused Compliance Inspection
GME	Groundwater Monitoring Evaluation
OAM	Operation and Maintenance Inspection

