United States Environmental Protection Agency
Office of Policy (1807T)
July 2013
EPA-100-K-13-012
OSWER Risk Management Program Evaluation Scoping Project:
Scoping Assessment and Performance Data Improvement Plan

Final Report

July 12, 2013
TABLE OF CONTENTS
I. INTRODUCTION 1
II. LOGIC MODEL 3
III. EVALUATION QUESTIONS 6
IV. POTENTIAL APPROACHES AND EVALUATION DATA NEEDS 8
V. DATA SOURCES 11
RMP Database 11
ICIS Inspection Data 13
ICIS Enforcement Data 13
Database Connections 14
Internal ICIS Connections: Linking Inspection and Enforcement Data 14
Linking between ICIS and RMP Info 15
Linking Across ICIS Inspection Data, ICIS Enforcement Data, and RMP Info 15
Inspection Reports 16
List of High Risk Facilities 17
State Data 18
Qualitative Data 19
VI. EVALUABILITY ASSESSMENT 20
1. What effect, if any, do RMP inspections have on facility behavior? 20
2. What effect, if any, do RMP inspections have on the incidence and severity of
chemical accidents at RMP facilities? 21
3. What indications exist, if any, that RMP inspections have a deterrent effect on RMP
facilities that have not been inspected? 21
4. What effect, if any, has the change in RMP inspection strategy (to designate some
facilities as "high risk" and to devote more inspection resources to high-risk facilities)
had on facility behavior and the incidence and severity of accidents? 21
5. Based on the results of question 4, should EPA consider refining its approach to
defining "high-risk" facilities? Should EPA reconsider the current allocation of
inspection resources between high-risk facilities vs. facilities that are not judged to be
high risk? 22
VII. PERFORMANCE DATA IMPROVEMENT PLAN 23
Action Plan to Improve Data Quality and Accessibility 24
Risk Management Plans 24
Inspections and Enforcement 25
Accident History 28
Risk Status 29
General Deterrence 30
Pilot Study to Address Data Issues Relating to Program Implementation 30
Questions and Considerations for a Statistically-Valid Pilot Study 31
APPENDIX A. CROSSWALK OF EVALUATION QUESTIONS, DATA SOURCES,
AND POTENTIAL METHODS 34
APPENDIX B. DATA IMPROVEMENT ACTION PLAN 43
I. INTRODUCTION
The Risk Management Program is implemented by the Office of Emergency
Management (OEM) in EPA's Office of Solid Waste and Emergency Response
(OSWER). The program aims to reduce accidents at RMP facilities, which is a strategic
measure under Goal 3 of EPA's 2011-2015 Strategic Plan. EPA and state and local
implementing agencies conduct inspections at RMP facilities.1 Inspections are intended to
determine compliance with RMP regulatory requirements. Because resources for
conducting inspections are limited, within the past few years EPA has begun focusing
more (but not exclusively) on high risk facilities for RMP inspections.2
The objectives of this evaluation scoping project were twofold: 1) assess the readiness of
the RMP Program for an outcome evaluation focusing on the role of inspections, and 2)
identify any additional data collection that would be required to answer outcome
questions. The project was conducted in two phases. Phase 1 examined the strengths and
limitations of the program's existing data. The results of the scoping phase indicated that
the existing data would not support an evaluation at the present time. Phase 2 focused on
the development of a Performance Data Improvement Plan for improving the
accessibility, quality, and usefulness of the program's data to make it viable for
evaluation in the future. The current volume consolidates the results of the scoping
assessment with the Performance Data Improvement Plan.
This project was funded by EPA's Program Evaluation Competition, through which
EPA's Evaluation Support Division (ESD) encourages the use of program evaluation as a
management tool throughout the Agency. The project was funded with resources from
ESD, OEM, and OSWER's Center for Program Analysis. EPA contracted with Industrial
Economics, Incorporated (IEc) to provide contractor support for this project.3 The project
team included representatives from ESD, OEM, OSWER's Center for Program Analysis,
the Office of Enforcement and Compliance Assurance (OECA), and IEc. In addition, the
project team shared the evaluation questions (presented in Section III) with the ten EPA
Regional Offices. Three Regions provided written comments, and the team further
revised the questions based on their feedback.
1 Throughout this report, the term "RMP facilities" refers to facilities that are required to submit a
Risk Management Plan; the term "RMP inspections" refers to inspections that are conducted at
these facilities. For convenience, we use the phrase "RMP Program" to refer to the Risk
Management Program.
2 The 2013 National Program Manager Guidance indicates that Regions should inspect at least
four percent of the total number of regulated facilities in the Region during FY 2013; of these
inspections, at least 30 percent should be conducted at high risk RMP facilities. Excerpt from 2013
National Program Manager Guidance: Supporting Chemical Accident Prevention, Preparedness
and Response at the Local and State Levels.
3 This report is written from IEc's perspective. Terms such as "we" and "our" refer to IEc.
Following this introduction, Section II presents a logic model for the RMP Program
focusing on the role of RMP inspections. Section III presents the revised evaluation
questions and describes their connection to the logic model. Section IV identifies
potential approaches to answer the evaluation questions and the data needed. Section V
describes the data sources we investigated and examines the strengths and limitations of the
existing data. Section VI presents our overall evaluability assessment. The Performance
Data Improvement Plan is described in Section VII.
Appendix A summarizes the key findings of the scoping assessment in a crosswalk table.
A summary of the Data Improvement Action Plan is included in Appendix B.
II. LOGIC MODEL
To illustrate the role of RMP inspections and to inform the development of specific
evaluation questions, EPA and IEc refined the RMP logic model focusing on RMP
inspections.4 A logic model is a graphical representation of the relationships between
program resources, activities, and outputs, and intended changes in awareness, behavior,
and conditions. As shown in Exhibit 1, the key components of the model include:
• Resources — program managers, staff, and funds dedicated to the program.
Resources also include databases with which the program collects information about
RMP facilities and records inspection results.
• Activities — the specific procedures or processes used to achieve program goals.
For example, the RMP Program issues RMP inspection guidance, conducts
inspections, and records inspection results.
• Outputs — the immediate products that result from activities. For example, RMP
outputs include RMP inspection guidance documents, inspection plans, and
inspections conducted.
• Audiences — groups and individuals targeted by RMP activities and outputs.
Audiences for RMP outputs include EPA Regional managers, delegated states,
inspectors, inspected facilities, other facilities that are not inspected, and the general
public. The model distinguishes between inspected high risk facilities, inspected
facilities not judged to be high risk, and RMP facilities not inspected. The program
aims to have a direct influence on inspected facilities, and an indirect influence on
uninspected facilities through informal business networking, media, and other means.
• Awareness — changes in awareness resulting from program outputs that are
causally linked to the RMP Program. For example, RMP outputs are intended to
make facilities more aware of the presence of inspectors, RMP requirements, and
safety risks.
• Behavior — changes in behavior resulting from changes in awareness. For
example, the RMP Program's activities are designed to first lead to increased
awareness of RMP requirements by facilities, and then lead to behavioral changes
including improved accident prevention and emergency response.
• Resulting Conditions — the overarching goals of the program, which in RMP's
case include reduced incidence and severity of chemical accidents, and improved
human and environmental health.
4 OEM undertakes a variety of activities beyond inspections that are not shown in the logic model.
EXHIBIT 1. RMP PROGRAM LOGIC MODEL FOCUSING ON THE ROLE OF INSPECTIONS

[Figure: a flow diagram linking Resources, Activities, Outputs, Audiences, Awareness, Behavior, and Resulting Conditions. Activities (issuing RMP inspection guidance; selecting facilities for inspection, using EPA's high-risk list and state risk criteria; conducting inspections; recording results; offering compliance assistance; initiating enforcement; communicating with industry and the public; and conducting training) produce outputs (guidance documents, inspection plans, inspections conducted, inspection results and reports, compliance assistance and enforcement actions, press releases and presentations, and training sessions) directed at audiences (EPA regional managers, delegated states, inspected high-risk facilities, inspected lower-risk facilities, facilities not inspected, industry, and the general public). These outputs lead to greater awareness of inspector presence, RMP requirements, and safety risks; to behavioral changes (strengthened RMPs; updated safety/maintenance/management procedures and hazard analysis; improved public and worker safety and accident prevention/response training; adoption of safer technologies; faster/more effective emergency response; and reduced quantities of regulated chemicals); and to resulting conditions (reduced number and severity of chemical accidents, and improved human and environmental health). Dotted lines denote the program's indirect influence on facilities not inspected.]
As depicted in the logic model, the Risk Management Program makes several
assumptions in its design, which may or may not be true. Part of the purpose of an
evaluation could be to test these assumptions:
• Facilities respond to inspections by taking actions to improve safety and compliance.
• The frequency and intensity of inspections matters: More frequent and more intensive
inspections are more effective than less frequent, less intensive inspections.
• Some high risk facilities are aware of their status and respond to the possibility of
enhanced scrutiny from inspectors. Although disclosure of the overall list of "high
risk" facilities is restricted by law, inspectors sometimes inform individual high risk
facilities of their status, on a case by case basis. In addition, some facilities may be
able to guess their status based on publicly available inspection guidelines for high
risk facilities.
• High risk facilities respond to enhanced inspector presence. Even if high risk
facilities do not know their status, they may respond to more frequent inspections or
more thorough inspections resulting from EPA's greater focus on high risk facilities.
• Inspections can have a deterrent effect on facilities that are not inspected - i.e.,
facilities may take "preemptive" corrective measures in response to: communications
from EPA/states, information from business associations, media reports (e.g.,
enforcement cases), public scrutiny resulting from inspections, and/or their peers.
In addition to the factors shown in the logic model, RMP inspection activities and
outcomes may also be influenced by a variety of external factors, including:
• Resource constraints;
• Other inspections (not RMP), e.g. OSHA;
• Quality of inspectors/inspections;
• Level of sophistication of the facility (e.g., facility management and size);
• Regulatory requirements; and
• Political and economic factors.
The evaluation team developed a set of evaluation questions to test the program theory,
and to understand the extent to which the program is achieving its intended outcomes.
The following section presents the evaluation questions and describes their connection to
the logic model.
III. EVALUATION QUESTIONS
Building on the original evaluation questions contained in OEM's proposal for PEC
funding, the evaluation team developed the following set of revised evaluation questions:
1. What effect, if any, do RMP inspections have on facility behavior?
a. Do facilities re-submit (update) their Risk Management Plan following an RMP
inspection? If yes, what changes do they make?
b. How, if at all, do facilities change their safety practices and procedures following
an RMP inspection?
c. Do facilities adopt safer technologies and/or reduce the quantity of regulated
chemicals held on-site following an RMP inspection? If yes, what changes do
they make?
2. What effect, if any, do RMP inspections have on the incidence and severity of
chemical accidents at RMP facilities?
a. What portion of facilities that have received an RMP inspection report a chemical
accident within two years following the inspection?
b. How do the incidence and severity of reported accidents at inspected facilities
(post-inspection) compare to the accident history at RMP facilities that have
never been inspected?
3. What indications exist, if any, that RMP inspections have a deterrent effect on RMP
facilities that have not been inspected?
4. What effect, if any, has the change in RMP inspection strategy (to designate some
facilities as "high risk" and to devote more inspection resources to high-risk
facilities) had on facility behavior and the incidence and severity of accidents?
a. How, if at all, do inspection results (i.e., the number and type of violations, and
enforcement actions) differ between high-risk facilities compared to facilities that
are not judged to be high risk?
b. How, if at all, have inspection results changed overall and by type of facility
(high-risk facilities vs. facilities that are not judged to be high risk) since the
strategy was adopted?
c. How, if at all, do the incidence and severity of chemical accidents (post-
inspection) vary between high-risk facilities compared to facilities that are not
judged to be high risk?
d. How, if at all, have the incidence and severity of accidents changed (overall and
by facility type) since the strategy was adopted?
5. Based on the results of Question 4, should EPA consider refining its approach to
defining "high-risk" facilities? Should EPA reconsider the current allocation of
inspection resources between high-risk facilities vs. facilities that are not judged to be
high risk?
Exhibit 2 presents the evaluation questions and indicates the components in the logic
model shown in Exhibit 1 that the questions are meant to inform.
EXHIBIT 2. EVALUATION QUESTIONS AND CONNECTION TO THE LOGIC MODEL

1. What effect, if any, do RMP inspections have on facility behavior?
   a. Do facilities re-submit (update) their Risk Management Plan following an RMP inspection? If yes, what changes do they make?
   b. How, if at all, do facilities change their safety practices and procedures following an RMP inspection?
   c. Do facilities adopt safer technologies and/or reduce the quantity of regulated chemicals held on-site following an RMP inspection? If yes, what changes do they make?
   Connection to the logic model: A

2. What effect, if any, do RMP inspections have on the incidence and severity of chemical accidents at RMP facilities?
   a. What portion of facilities that have received an RMP inspection report a chemical accident within two years following the inspection?
   b. How do the incidence and severity of reported accidents at inspected facilities (post-inspection) compare to the accident history at RMP facilities that have never been inspected?
   Connection to the logic model: B, C, D, J, K

3. What indications exist, if any, that RMP inspections have a deterrent effect on RMP facilities that have not been inspected?
   Connection to the logic model: D

4. What effect, if any, has the change in RMP inspection strategy (to designate some facilities as "high risk" and to devote more inspection resources to high-risk facilities) had on facility behavior and the incidence and severity of accidents?
   a. How, if at all, do inspection results (i.e., the number and type of violations, and enforcement actions) differ between high-risk facilities compared to facilities that are not judged to be high risk?
   b. How, if at all, have inspection results changed overall and by type of facility (high-risk facilities vs. facilities that are not judged to be high risk) since the strategy was adopted?
   c. How, if at all, do the incidence and severity of chemical accidents (post-inspection) vary between high-risk facilities compared to facilities that are not judged to be high risk?
   d. How, if at all, have the incidence and severity of accidents changed (overall and by facility type) since the strategy was adopted?
   Connection to the logic model: A, B, C, H, I, J, K, L

5. Based on the results of Question 4, should EPA consider refining its approach to defining "high-risk" facilities? Should EPA reconsider the current allocation of inspection resources between high-risk facilities vs. facilities that are not judged to be high risk?
   Connection to the logic model: H, I
IV. POTENTIAL APPROACHES AND EVALUATION DATA NEEDS
Having finalized the evaluation questions, the project team identified potential evaluation
approaches. As shown in Exhibit 3, the two main approaches to answering the evaluation
questions are (1) quantitative and (2) qualitative. In both cases, the goal is to understand
the effect of RMP inspections on facility behavior and accidents.
Quantitative methods aim to identify the role of RMP inspections by making comparisons
- over time (before/after an inspection), and/or across facilities (with/without
inspections). Such comparisons can tell us whether RMP inspections are correlated with
accident history. Ideally, we would go beyond correlation and show whether RMP
inspections cause fewer, or less severe, chemical accidents. However, demonstrating
causality requires the ability to control for confounding factors, such as other types of
inspections (non-RMP inspections) and inspector/inspection quality.5 To detect
differences across groups and strengthen confidence in results, quantitative studies
generally aim to collect a large number of observations across standard and reliable
metrics. This approach allows researchers to extrapolate findings from our sample to the
general population; however, we may overlook important variables that are difficult to
quantify. Moreover, it does not tell us how and why a program is (or is not) working.
EXHIBIT 3. POSSIBLE APPROACHES TO ANSWERING THE EVALUATION QUESTIONS

QUANTITATIVE
• Before-and-after comparison: compare pre- and post-inspection behavior and accident history at inspected facilities.
• With-and-without comparison: compare the behavior and accident history of facilities with an RMP inspection to facilities without an RMP inspection; compare the behavior and accident history of high risk facilities and facilities not judged to be high risk.

QUALITATIVE
• Thematic analysis of data gathered through interviews, focus groups, document reviews, and literature search.
• In-depth case studies of the program's pathways of influence and factors that mediate success.
• Understand how and why the program is or is not effective.
Qualitative approaches typically focus on a smaller number of observations and aim to
gain greater insight into the dynamics of the program, for example, through in-depth case
studies. Given the limited number of observations, case study findings may not be
generalizable to the population as a whole. However, case studies can go beyond
standard, easily codified metrics to explore the program's pathways of influence and
factors that mediate success. For example, while quantitative analysis may show a
difference between facilities that have and have not been inspected, case studies can shed
light on why inspections were or were not effective in certain situations.
5 We may be able to extend the analysis beyond simple correlations and control for some
confounding factors. For example, we could test the strength of the correlations between
inspections and accident history, and we could control for facility characteristics that are captured
in existing datasets (e.g., NAICS code). However, some of the major confounding factors are not
captured in existing datasets - e.g., inspector/inspection quality.
Quantitative and qualitative approaches need not be used in isolation. In fact, using a
combination of quantitative and qualitative approaches often generates a more
comprehensive understanding of the program, and enables us to cross-check our findings
from across multiple data sources, thereby strengthening confidence in the results.
Having identified general approaches to answering the evaluation questions, the team
turned its attention to the data needed to implement each approach. The rest of this
section identifies the major types of data needed; Section V describes the data sources
that we investigated.
• Inspection data. This is a basic data requirement for assessing the effectiveness of
RMP inspections. Simply put, we need to know if and when an RMP inspection was
conducted at a facility. It would be optimal to have more information about the
inspections - for example, what triggered the inspection, the type and intensity of the
inspection, number and type of violations discovered, and resulting enforcement
actions. At a minimum, though, we need to be able to distinguish between facilities
that have and have not received an RMP inspection.
• Behavioral indicators. The idea that facilities change their behavior in response to
inspections is a key element of the program theory, as reflected in the logic model.
Therefore, Evaluation Question 1 addresses changes in facility behavior. Behavioral
change is the crucial link between program activities (i.e., conducting RMP
inspections) and the ultimate goal of the program (reduced chemical accidents). As
we move from left to right along the logic model (from program resources and
activities, to resulting conditions), other factors beyond the control of the RMP
Program come into play. For example, if an inspected facility is found to be in
violation of RMP regulations and later returns to compliance (change in behavior),
we can be reasonably confident that the inspection caused the facility to change its
behavior. On the other hand, if an inspected facility never has a chemical accident
(resulting condition), we cannot state with any certainty that the inspection caused the
facility not to have an accident.6 After all, many facilities without an RMP inspection
never have an accident. Therefore, showing that inspections cause facilities to change
their behavior is a key intermediate step for assessing the effectiveness of inspections.
Accordingly, we need information showing if and how facilities changed their behavior
following an RMP inspection. We are interested in the behaviors reflected in the
logic model, including: strengthened Risk Management Plans (RMPs); updated
safety/maintenance/management procedures and hazard analysis; improved public
and worker safety and accident prevention/response training; adoption of safer
technologies; faster/more effective emergency response; and reduction in quantities
of regulated chemicals held on-site. Optimally, we would also like to understand
if/how RMP inspections contributed to changes in behavior. Qualitative methods,
including reviews of RMPs and inspection reports and interviews with inspectors and
facility managers, may be helpful for understanding how RMP inspections influenced
facility behavior.
6 This is a key challenge for prevention programs in general, where success is defined as
something not happening (e.g., a chemical accident).
• Accident data. The ultimate goal of the program is to reduce the number and severity
of chemical accidents. Evaluation Question 2 aims to assess the role of inspections in
achieving this goal. As the question implies, we could calculate the portion of
inspected facilities that have a chemical accident; we could also compare the accident
history of RMP facilities that did and did not receive an RMP inspection. This would
not, on its own, prove that inspections are or are not effective; as discussed
throughout this document, other confounding factors can influence accident history.
However, when combined with changes in facility behavior, accident history
provides a useful (and necessary) indicator of the effectiveness of RMP inspections.
• Information about uninspected facilities. Evaluation Question 3 aims to understand
the deterrent effect of RMP inspections on facilities that have not been inspected.
This effect - known in the literature as "general deterrence" - could result from
uninspected facilities observing or interacting with facilities that have been inspected,
reading newspaper articles about facilities that were inspected, etc. By definition,
there are no inspection results for facilities that have never been inspected. However,
it may be possible to obtain anecdotal information about how uninspected facilities
changed their behavior in response to inspections at other facilities. The literature
also includes examples of general deterrence.
• Risk status of RMP facilities. Evaluation Question 4 seeks to compare outcomes for
facilities that were targeted as "high risk" to facilities that were not judged to be high
risk. At a minimum, answering this question requires knowing which facilities were
specifically designated as high risk facilities, and which were not.
• Qualitative data. As indicated in Exhibit 3 above, qualitative research may
encompass a variety of methods including interviews, focus groups, document
reviews, literature search, and case studies. Each method has corresponding data
needs. For example, access to key informants (interview data), inspection reports or
other documents, and relevant academic literature may be required.
V. DATA SOURCES
This section describes potential data sources for answering the evaluation questions.
While most of the team's effort was focused on identifying and assessing existing data
sources, we also considered new data that could be collected. Therefore, while most of
this section focuses on existing data sources, the last part of this section discusses new
data collection. The table in Appendix A presents a crosswalk of evaluation questions,
approaches, and data sources.
We investigated several data sources to assess the readiness of the RMP Program for an
outcome evaluation focusing on the role of inspections. These include: the RMP database,
inspection data in the Integrated Compliance Information System (ICIS) database, and
ICIS enforcement data. In addition, we investigated the connections among these three
data sources. Below, we describe each data source and the connections across data
sources.
RMP DATABASE
RMP-registered facilities are required to submit a Risk Management Plan (RMP) to RMP
Info. OEM provided the RMP Info data in a Microsoft Access database: RMP Review.7
The data are stored in numerous tables in the database. Working closely with OEM, we
determined that two tables - Accident History (tblS6) and Facilities (tblS1) - are the
primary sources of data relevant to the evaluation questions. The Facilities table contains
information about facilities that have submitted a Risk Management Plan, including:
facility name, address, and contact information for the representative in charge of the
facility's RMP. The Accident History table contains information about the release of
RMP-regulated substances at facilities, including: accident date, accident time, type of
release, reported cause of the release, and the type of damage caused (e.g. worker deaths,
worker injuries, property damage, etc.).
• Strengths. RMP Info is a relatively comprehensive database for all facilities that
have submitted an RMP to EPA. The Accident History table contains rich
information both on the number of accidents and their severity (e.g., number of
deaths). Notably, the database includes accident history for facilities that have been
inspected and facilities that have not been inspected. Therefore, we can compare
outcome data for facilities that have and have not received an RMP inspection.
• Limitations. We have several concerns about the quality and usability of the data.
First, the data contained in RMP Info are self-reported by regulated facilities, and
may not be accurate, given a facility's potential bias toward reporting information that
presents the facility in a positive light. Reporting facilities may also misinterpret RMP
data fields and provide information that is not relevant or accurate.
7 We use the terms RMP Info and RMP Review interchangeably throughout this report.
Second, the database may not include information for all facilities whose behavior is
affected by the program. Some facilities that are required to submit an RMP may not
comply with this requirement. Moreover, facilities that would be required to submit
an RMP may reduce the quantity of chemicals held on-site below the regulatory
threshold, and thus no longer be required to submit an RMP. Even if we examined all
of the data in RMP Info, we would not have information for non-reporting facilities.
Third, EPA is not able to correct mistakes that it identifies in RMP Info. If EPA
recognizes an error in a facility's RMP, the Agency contacts the facility and asks it to
resubmit its RMP with corrected information. This creates a delay between the time
that errors are discovered and the time they are corrected in the database.
Fourth, attributing revisions in a facility's Risk Management Plan to an RMP
inspection would be challenging and may not be possible. RMP submissions do not
indicate whether changes in the Plan, if any, were due to an RMP inspection. In
addition, a significant amount of time may elapse between when a facility receives an
RMP inspection and when it resubmits its RMP (facilities are only required to submit
an RMP once every five years or within six months following an accident8). The time
lag between receiving an inspection and resubmitting an RMP makes it difficult to tie
changes in the RMP directly to an inspection. In fact, facilities that were inspected
within the past five years may not yet have resubmitted their RMP. Moreover, the
number of RMP submissions for an individual facility may be limited: The RMP
Program has been in place since 1999; if facilities submitted an RMP once every five
years, we would expect to find two RMPs per facility as of 2013. However, a
facility's requirements may have changed over that time period (e.g., due to an
increase or decrease in the volume of chemicals held on-site) such that they have only
submitted one RMP.
Finally, RMP Info does not include reliable indicators of changes in facility behavior;
it also does not contain any information about whether a facility has received an RMP
inspection or the results of the inspection. Inspection data are contained in the
separate ICIS database; as discussed below, this requires linking the facility data in
RMP Info to the inspection data in ICIS.
8 There are several complicating factors related to accident history: First, inspections may increase
the likelihood that facilities report an accident; we may not be able to differentiate between actual
changes in the incidence of accidents vs. changes in reporting behavior. Second, because facilities
have up to six months to report an accident, accidents reported within six months of an inspection
need to be scrutinized to see if the accident occurred before or after the inspection was conducted.
Third, facilities can report information related to the same accident more than once. If we want to
know when the facility first reported the accident, we need to look at the earliest date when the
accident was reported. In this regard, RMP Info sometimes has duplicate entries for the same
accident as facilities update minor details about the incident. Identifying and removing duplicate
records would require additional time and resources.
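To illustrate the duplicate-handling step mentioned in the footnote above, the following Python (pandas) sketch collapses repeated records for the same accident to the earliest report date. The file and column names are hypothetical stand-ins for the actual RMP Info fields:

    import pandas as pd

    # Hypothetical file and column names for the RMP Info Accident History table.
    accidents = pd.read_csv("rmp_accident_history.csv",
                            parse_dates=["ACCIDENT_DATE", "REPORT_DATE"])

    # Treat rows with the same facility and accident date as one accident,
    # keeping the earliest report. Real duplicates may differ slightly in
    # minor details, so a production version would need fuzzier matching rules.
    deduped = (accidents
               .sort_values("REPORT_DATE")
               .drop_duplicates(subset=["EPA_FACILITY_ID", "ACCIDENT_DATE"],
                                keep="first"))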
ICIS INSPECTION DATA
The ICIS database provides inspection data for RMP inspections and other types of
inspections (e.g., OSHA). Some of the ICIS data are publicly available through EPA's
Enforcement and Compliance History Online (ECHO) website. However, ECHO does
not provide access to all of the information in ICIS; moreover, accessing information
from ICIS in a user-friendly format can be challenging. Therefore, IEc relied on OECA to
provide the data. OECA exported the data into an Excel file, including RMP inspections
conducted in fiscal years 2007 through 2012. The data include the following information:
date of inspection, compliance monitoring activity name, section code, compliance
monitoring activity reason, FRS facility name, FRS ID, compliance monitoring type, and
activity ID.
• Strengths. The ICIS inspection data indicate if and when an RMP inspection was
conducted at a facility, along with some basic information about the inspection type
and reason. This information is fundamental for identifying and comparing facilities
that have and have not received an RMP inspection.
• Limitations. The ICIS data only go back to FY 2007, effectively limiting our study
period to about six years if we conduct an evaluation in 2013. Moreover, ICIS does
not provide the actual inspection reports; therefore, it is not possible to determine the
outcomes and other specific details of an inspection by reviewing the database alone.
A detailed analysis of inspection outcomes would require a time-consuming manual
review of paper copies of the inspection reports that are stored at EPA's Regional
Offices. The data also do not indicate whether a facility was targeted as "high risk,"
and do not provide information on the intensity of the inspection. Furthermore,
because the RMP Program aims to inspect about five percent of RMP facilities per
year - and given the relatively young age of the program - many facilities have never
been inspected, and very few facilities have been inspected more than once. The lack
of longitudinal inspection data is a significant limitation, because the best way to
identify the results of previous inspections is to conduct follow-up inspections.9
Given these limitations, the inspection data will tell us if and when an inspection was
conducted, but will not tell us how inspections affected facility behavior.
ICIS ENFORCEMENT DATA
The ICIS database provides information about enforcement actions taken against facilities
due to RMP-related (and other) regulatory violations. OECA provided an export of the
ICIS enforcement data for fiscal years 2007 through 2012 in Excel. The enforcement data
provided contained the following information: enforcement action ID, facility name, the
Region in which the facility is located, primary law violated, section code violated, final
order date, federal penalty assessed, compliance action value total, assessed cost
recovery, and final order type.
9 Longitudinal data tracks the same type of information (e.g., inspection results) for the same
subjects (e.g., facilities) at multiple points in time. Longitudinal studies yield multiple or repeated
measurements on each subject. Longitudinal data would enable us to measure changes in
inspection outcomes at individual RMP facilities over time.
• Strengths. The ICIS enforcement data indicate if and when facilities settled RMP-
related enforcement actions and the settlement value. The settlement value can serve
as a rough proxy for the severity of the violation settled in the enforcement case.
• Limitations. Again, the ICIS data only go back to FY 2007. In addition, the ICIS
enforcement dataset that we received does not indicate the specific violation that
triggered the enforcement action, and the data also do not indicate how facilities
changed their behavior in response to the enforcement action. Also, the enforcement
data are only available for enforcement cases that have been settled. Enforcement
cases that are still pending are not represented in the data. However, we understand
that unsettled enforcement cases represent a small fraction of the total.
DATABASE CONNECTIONS
In addition to reviewing the above datasets, we investigated the ability to link facilities
across data sources. The ability to link across data sources is necessary for analyzing the
relationships among inspections, enforcement actions, and Risk Management Plans
(including accident history). First, we discuss linking ICIS inspection data with ICIS
enforcement data. Second, we describe the process for linking from ICIS to RMP Info.
Third, we summarize the information available after we link across datasets.
INTERNAL ICIS CONNECTIONS: LINKING INSPECTION AND ENFORCEMENT DATA
The ICIS inspection and enforcement data are stored in separate databases. There are two
methods for linking inspection and enforcement data; both methods have advantages and
disadvantages. The first method uses the Enforcement Action ID in the inspections
database to link to related enforcement actions. However, this approach requires the ICIS
user who enters the inspection data to manually set the link between the inspection and
the corresponding enforcement action. OECA provided data linked in this manner for FY
2012.10 IEc verified that the existing linkages were accurate. However, the ICIS data
specialists subsequently realized that the data were not complete, due to some missing
linkages in the database.
We subsequently received an updated data pull using a different method: linking
inspection and enforcement data on the FRS ID. That data pull includes comprehensive
inspection and enforcement data for FFY 2007 through FFY 2012. However, because it
links on FRS ID (the facility identifier) rather than Enforcement Action ID (which
specifically links inspections to their related enforcement action), the results include some
enforcement actions that were not triggered by RMP inspections. For example,
enforcement actions at some facilities occurred long before an RMP inspection was
conducted at those same facilities. Removing entries where the enforcement date
precedes the inspection date is relatively straightforward. However, this still does not
guarantee that all remaining enforcement actions were the direct result of RMP
inspections. Overall, this option is more comprehensive than linking on the Enforcement
Action ID and is therefore the preferred alternative of the ICIS data specialists. However,
neither method is ideal.
10 For this evaluability assessment, we used FY 2012 data to test whether it was possible to link
across ICIS inspection data, enforcement data, and RMP Info. If we conducted a full evaluation,
we would use all available years of data.
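To make the two linking methods concrete, the sketch below shows how the joins might look in Python (pandas). The file and column names (e.g., ENF_ACTION_ID, FRS_ID) are hypothetical stand-ins for the actual ICIS export fields:

    import pandas as pd

    # Hypothetical file and column names; the actual ICIS export fields differ.
    inspections = pd.read_excel("icis_rmp_inspections_fy07_fy12.xlsx")
    enforcements = pd.read_excel("icis_rmp_enforcements_fy07_fy12.xlsx")
    inspections["INSPECTION_DATE"] = pd.to_datetime(inspections["INSPECTION_DATE"])
    enforcements["ENFORCEMENT_DATE"] = pd.to_datetime(enforcements["ENFORCEMENT_DATE"])

    # Method 1: link on the Enforcement Action ID. Precise, but incomplete
    # wherever the ICIS user never set the link manually.
    by_action_id = inspections.merge(
        enforcements, on="ENF_ACTION_ID", how="inner", suffixes=("_insp", "_enf"))

    # Method 2: link on the FRS ID (facility identifier). Comprehensive, but can
    # pair an inspection with unrelated enforcement actions at the same facility.
    by_frs_id = inspections.merge(
        enforcements, on="FRS_ID", how="inner", suffixes=("_insp", "_enf"))

    # Drop pairs where the enforcement predates the inspection; the remaining
    # pairs are still not guaranteed to be causally related.
    by_frs_id = by_frs_id[
        by_frs_id["ENFORCEMENT_DATE"] >= by_frs_id["INSPECTION_DATE"]]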
LINKING BETWEEN ICIS AND RMP INFO
To answer the evaluation questions, we need to link a facility's inspection and
enforcement data in ICIS with the facility's accident history in RMP Info. However,
RMP Info and ICIS use different facility identifiers: RMP Info uses the EPA Facility ID
to identify unique facilities, whereas ICIS uses the FRS ID. Linking the two datasets
requires a "bridge table" that associates each EPA Facility ID with the corresponding
FRS ID. EPA has created a bridge table for roughly 80 percent of facilities in RMP Info.
An analysis that uses the current bridge table would include 80 percent of facilities in
RMP Info, but would exclude the other 20 percent of facilities that are not in the bridge
table. Most of the time and effort in linking the two datasets involves building out the
bridge table, an effort that EPA is leading. Once the bridge table has been developed, it is
relatively straightforward to link between ICIS and RMP Info using the bridge table.
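A minimal sketch of the bridge-table join, again with hypothetical file and column names:

    import pandas as pd

    # The bridge table maps each EPA Facility ID used in RMP Info to the
    # corresponding FRS ID used in ICIS.
    bridge = pd.read_csv("icis_rmp_bridge.csv")             # EPA_FACILITY_ID, FRS_ID
    rmp_facilities = pd.read_csv("rmp_info_facilities.csv")  # EPA_FACILITY_ID, ...

    # An inner join keeps only the facilities covered by the bridge table
    # (currently about 80 percent of RMP Info); the rest drop out.
    linked = rmp_facilities.merge(bridge, on="EPA_FACILITY_ID", how="inner")
    print(f"Bridge covers {len(linked) / len(rmp_facilities):.0%} "
          "of RMP Info facilities")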
LINKING ACROSS ICIS INSPECTION DATA, ICIS ENFORCEMENT DATA, AND RMP INFO
Exhibit 4 summarizes the process for linking between the ICIS inspection data and ICIS
enforcement data, and for linking between the ICIS data and RMP Info. Linking the data
in the manner shown in Exhibit 4 allows us to analyze the relationships among RMP
inspections, enforcement actions, and accident history. Most importantly, by linking the
data sources, we can compare accident histories at facilities that have and have not been
inspected, and identify whether a facility reported a chemical accident within the two
years following an RMP inspection.
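Once the datasets are linked, flagging accidents within the two-year post-inspection window is straightforward. The sketch below illustrates one way to do it in pandas, assuming hypothetical file and column names:

    import pandas as pd

    # 'inspections' holds one row per RMP inspection, already mapped to
    # EPA_FACILITY_ID via the bridge table; 'accidents' is the RMP Info
    # Accident History table.
    inspections = pd.read_csv("linked_inspections.csv",
                              parse_dates=["INSPECTION_DATE"])
    accidents = pd.read_csv("rmp_accident_history.csv",
                            parse_dates=["ACCIDENT_DATE"])

    # Pair every inspection with the facility's reported accidents (if any)
    # and flag accidents in the two years after the inspection.
    pairs = inspections.merge(accidents, on="EPA_FACILITY_ID", how="left")
    window = pd.Timedelta(days=2 * 365)
    pairs["accident_within_2yr"] = (
        (pairs["ACCIDENT_DATE"] > pairs["INSPECTION_DATE"]) &
        (pairs["ACCIDENT_DATE"] <= pairs["INSPECTION_DATE"] + window))

    # Portion of inspected facilities reporting an accident within two years.
    share = pairs.groupby("EPA_FACILITY_ID")["accident_within_2yr"].any().mean()
    print(f"{share:.1%} of inspected facilities reported an accident "
          "within two years of an RMP inspection")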
EXHIBIT 4. LINKING INSPECTION, ENFORCEMENT, AND RMP DATA

[Figure: a diagram of the linked tables. The ICIS Inspections table (Enforcement Action ID*, FRS ID, facility name, inspection date, inspection type) links to the ICIS Enforcements table (Enforcement Action ID, FRS ID, facility name, enforcement date, penalty assessed) and, through the ICIS-RMP bridge table, to the RMP Review Facilities table (EPA Facility ID, facility name) and Accident History table (Facility ID, accident date, accident details).]

Notes: The diagram shows selected fields only. It does not show every field included in the data tables.
* The Enforcement Action ID is not always available for inspections that triggered an enforcement action.
Exhibit 5 summarizes the key information available in ICIS and RMP Info that would
support answering our evaluation questions. Note that while Exhibit 4 shows the linkages
across datasets, Exhibit 5 summarizes the content of the information contained within the
databases that would help us answer the evaluation questions. By linking across
databases, we may obtain a basic understanding of a facility's inspection, enforcement,
and accident history.
EXHIBIT 5. KEY INFORMATION IN ICIS AND RMP INFO SUPPORTING THE EVALUATION QUESTIONS

[Figure: a flow diagram showing how data move from source reports into databases. Inspection reports feed ICIS (facility name/FRS ID, inspection date, compliance monitoring reason, inspection type); enforcement reports feed ICIS (facility name/FRS ID, enforcement date, enforcement type, enforcement status, section code); and accident reports feed RMP Info (facility name/EPA Facility ID, accident date, accident type, accident result - injuries, deaths, property damage). Note: Not all data contained in these reports are transferred into a database. Some data about inspections, enforcements, and RMPs are contained solely in hard-copy reports.]

Notes: The diagram shows selected fields only. It does not show every field included in the databases.
As shown in Exhibit 5, some information is only contained in paper files and does not get
entered into an electronic database. For example, as discussed in the following section,
some inspection data are contained solely in hard-copy reports, and are not entered into
ICIS. To obtain the data that are not entered into ICIS, such as detailed inspection results,
we would need to obtain the paper files from EPA's Regional Offices.
INSPECTION REPORTS
As discussed in the previous section, ICIS contains some - but not all - of the data found
in the inspection reports. The inspection reports are stored in paper copy at the EPA
Regional Offices. The format of the reports differs across Regions. While the inspection
guidance issued by EPA Headquarters includes a report template,11 none of the Regions
currently uses a report format that exactly matches what is in the guidance. For this
evaluability assessment, EPA provided IEc with a sample of four inspection reports - one
each from Regions 3, 5, 7, and 9. The reports provide detailed descriptions of inspection
activities and findings, including potential violations. The reports do not describe
resulting enforcement actions, or changes in facility behavior following the inspection.
11 Guidance for Conducting Risk Management Program Inspections under Clean Air Act Section
112(r).
Presumably, enforcement information is contained in separate reports, while information
on changes in behavior would be documented during follow-up inspections.12
A recent report by EPA's Office of Inspector General (OIG), titled Improvements Needed
in EPA Training and Oversight for Risk Management Program Inspections, calls into
question the consistency and quality of the existing inspection reports. Based on their
review of 29 inspection reports for high risk facilities identified by OEM, OIG concluded
that, "Generally, inspection reports did not explain the extent to which the inspectors
reviewed specific elements of a covered process to determine compliance."13 According
to the OIG report, in June 2011, OEM started a review of inspection reports and
identified problems consistent with what OIG identified, including lack of supporting
facts and documentation, and basing the inspection findings on the existence or
completeness of required documentation rather than actual facility status or conditions.
OIG recommended that EPA develop minimum inspection reporting requirements
and a monitoring program to assess the quality of inspections. EPA generally concurred
with the recommendations, and has already initiated corrective actions in some cases.14
LIST OF HIGH RISK FACILITIES
Neither RMP Info nor any other publicly available database indicates whether a facility
was designated as "high risk." This information is considered confidential and is not
shared with the public for security reasons. The RMP high risk list is derived from
RMP Offsite Consequence Analysis (OCA) information, and as such, is restricted by law
(Public Law 106-40) to "covered persons," which essentially means government
employees and contractors with a need for the information. IEc does not have access to
the list, and we have not examined its contents.15 However, EPA provided information
about the list, which we describe below.
• Strengths. The list of high risk facilities is critical for assessing the effectiveness of
EPA's inspection strategy. Knowing which facilities have been designated as "high
risk" would allow us to compare inspections and accident histories for high risk and
non-high risk facilities. According to OEM, facilities on the high priority list can be
linked back to RMP Info data using the EPA Facility ID. OEM indicated that linking
to ICIS is more challenging, but is done every year when tallying up inspections.
• Limitations. EPA has only been implementing the strategy of designating "high risk"
facilities for a few years, which may not be enough time to establish comparisons
between high risk and non-high risk inspections and accidents. During this time, the
policy has evolved (and continues to evolve). The strategy has been adopted at
different times by different EPA Regions and in different sectors. To assess the
strategy, we would need to know the date when each Region adopted the policy and
for which sectors, and how the policy has been applied in practice in each context.
Given the new and evolving nature of the policy, the small number of facilities
inspected each year, and the diversity of EPA Regions and industrial sectors, it would
be difficult to make a robust assessment of the strategy at the present time.
12 As discussed throughout this document, very few facilities have received more than one RMP
inspection.
13 U.S. EPA Office of Inspector General, Improvements Needed in EPA Training and Oversight
for Risk Management Program Inspections, Report No. 13-P-0178, March 21, 2013.
14 Ibid.
15 As an EPA contractor, IEc could go through the process of obtaining the list. This would require
us to complete EPA's process for OCA access, which includes study materials and an exam.
STATE DATA
While the RMP Program is administered by EPA, a small number of states have received
delegated authority for the program. Programs in delegated states must be at least as
stringent as EPA's, but they can use their own data systems. Region 4 in particular has
several active delegated states; at least two states in the Region maintain their own
databases. We explored whether these state databases capture data not stored in ICIS or
RMP Info that would help us answer our evaluation questions. The project team held a
discussion with OEM's point of contact in Region 4 and representatives from two states
in the Region: North Carolina and Florida. Both states provided data to the evaluation
team via their Region 4 contact. A brief summary of the state data follows:
• North Carolina. IEc reviewed data provided by North Carolina related to chemical
accidents and RMP inspections. The data were provided in three Excel files and two
summary reports (PDF files). The data show: basic facility information (facility
name, location, etc.), whether each facility is subject to RMP regulation, the date an
accident was reported (if any), chemicals released, impact of the release (e.g., injuries
and evacuations), whether a Notice of Violation and/or penalties were issued as a
result of the accident, and trends in chemical accidents from FY 2010 onwards. In
addition, North Carolina provided a list of RMP regulated facilities in the state and a
list of the facilities inspected under the state's RMP program starting in FY 2010. The
North Carolina data identify high risk facilities, in contrast to ICIS and RMP Info,
which do not identify high risk facilities. However, in other respects, the North
Carolina data face many of the same limitations as the national RMP Program data.
For example, most facilities have only received one inspection, and the data do not
provide clear indicators of changes in facility behavior.
In addition to the information described above, the North Carolina dataset includes
the ratio of accidents by NAICS code for FY 2011, calculated as the number of
accidents divided by the total number of RMP facilities in the specified NAICS code.
This analysis could potentially be used to control for the effects of industry sector on
accident history.
• Florida. The State of Florida provided annual inspection schedules (Excel files), an
inspection map (PDF file), annual reports, and inspection and facility data (Microsoft
Access database). Of these, the database appears the most relevant for addressing our
evaluation questions. The database contains RMP inspection data and information
about regulated facilities (e.g., chemicals stored on site). The inspection data include:
facility name, inspection date, inspection reason, and inspection results. Additionally,
the inspection data indicate whether the inspected facility is a high risk facility. The
database was created in 2010 and contains all of the inspection data from that point
onward. Some archived inspection information has been added to the database, but
the records for years prior to 2010 are incomplete. However, the data provided do not
include facilities' accident history.
In addition to the information in the database, Florida provided historical inspection
reports and follow-up reports for nine facilities that were inspected prior to 2010. The
reports contain information on the issues uncovered during the inspection, and
describe corrective actions taken by the facility to address the problems. This
information could be useful for conducting case studies of changes in facility
behavior following RMP inspections.
If EPA chooses to conduct an evaluation of the RMP Program, we could draw on
analytical approaches used by the states (e.g., analysis of accident history by NAICS
code), and/or supplement the national data with state-level data. However, we would not
be able to extrapolate from the North Carolina and Florida data to the national RMP
Program, due to differences across states and between the EPA-administered RMP
Program and state-delegated programs. Therefore, we would adopt a "case study"
approach, looking for insights for those specific states.
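As an illustration, North Carolina's accident ratio by NAICS code could be reproduced on the national data along the following lines. This is a pandas sketch with hypothetical file and column names:

    import pandas as pd

    # Mirrors the North Carolina FY 2011 calculation: accidents divided by
    # the number of RMP facilities within each NAICS code.
    facilities = pd.read_csv("rmp_facilities.csv")  # one row per facility, NAICS
    accidents = pd.read_csv("rmp_accidents.csv")    # one row per accident, NAICS

    facility_counts = facilities.groupby("NAICS").size()
    accident_counts = (accidents.groupby("NAICS").size()
                       .reindex(facility_counts.index, fill_value=0))
    accident_ratio = accident_counts / facility_counts
    print(accident_ratio.sort_values(ascending=False).head(10))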
QUALITATIVE DATA
Qualitative data could play an important role in supplementing quantitative data and
providing additional insights for the RMP Program. Interviews and document reviews, in
particular, may provide insight into when and how RMP inspections influence facility
behavior. The Paperwork Reduction Act (PRA) requires obtaining an Information
Collection Request (ICR) from the Office of Management and Budget (OMB) when
asking the same question of more than nine non-federal entities; therefore, EPA would
either need to obtain an ICR or limit the number of interviews. In addition, reviewing a
representative sample of inspection reports from across a wide range of facilities and
Regions would require significant time and resources. It would also require relative
consistency in the way that different Regions record inspection results. As discussed
above, the OIG report identified areas for improvement regarding the consistency and
comprehensiveness of Regional inspection reports.
An alternative to reviewing a representative sample of inspection reports would be to
conduct in-depth case studies of a smaller number of reports, supplemented with
interview data for the same facilities. This approach could provide useful and valuable
insights into the role of RMP inspections; however, we would not be able to generalize
the results of the case studies to the RMP Program as a whole.
VI. EVALUABILITY ASSESSMENT
Section VI presents our evaluability assessment for Evaluation Questions 1 - 5.16 This
section is organized by evaluation question; for questions where we contemplate a mixed-
methods approach, the section is further divided into an evaluability assessment of
quantitative and qualitative approaches. To improve the flow of the document, this
section presents our high-level findings for the topic-level evaluation questions. The
crosswalk table in Appendix A breaks out our detailed findings by sub-question.
1. WHAT EFFECT, IF ANY, DO RMP INSPECTIONS HAVE ON FACILITY BEHAVIOR?
• Quantitative: Mostly not evaluable. Neither RMP Info nor ICIS provides reliable
indicators of changes in facility behavior as a result of inspections. RMP Info
presents changes in Risk Management Plans, but it is not possible to attribute these
changes to RMP inspections. The RMP database does not provide reliable or detailed
information about changes in facilities' safety policies and procedures, or the
adoption of safer technologies. We also have concerns about data quality, including
the self-reported nature of the data, potential underreporting, and EPA's inability to
directly correct errors in RMP submissions. Also, at present it is possible to link
RMP Info and ICIS data for only 80 percent of facilities in RMP Info; we would not
be able to conduct the analysis for the remaining 20 percent of RMP facilities.
ICIS contains inspection data and enforcement data, but in separate databases. In
some cases, inspections are linked directly to related enforcement actions, but the
linkages are not comprehensive; linking all of the inspection and enforcement data is
less direct and more prone to error. Importantly, even if we could determine that an
enforcement action resulted from an inspection, ICIS would not provide the detailed
inspection results or indicators of changes in facility behavior following the
inspection. Detailed inspection results are contained in paper files at EPA's Regional
Offices, but the quality of the reports is inconsistent, and obtaining and reviewing
these documents on a large scale would require significant time and resources.
The best way to identify changes in facility behavior is to conduct a follow-up
inspection. In general, this is not yet possible for the RMP Program; many facilities
have never been inspected, and very few facilities have received more than one RMP
inspection.
16 Our evaluability assessment focuses on the national RMP Program. Most of the issues raised in
this section also apply to the data provided by North Carolina and Florida, but with two notable
exceptions: (1) Both states identify "high risk" facilities in their databases, and (2) Florida has rich
descriptive information on behavioral changes at facilities that have been inspected more than
once. We would not be able to extrapolate findings from North Carolina and Florida to the
national RMP Program, but we could potentially conduct case studies with the state-level data to
supplement national-level data.
• Qualitative: Evaluable. This question is evaluable with a case study approach.
Specifically, we may be able to interview inspectors and facility managers for
selected cases, and review the corresponding inspection reports for the selected
facilities, to develop an understanding of the role of inspections on facility behavior.
Case studies would provide insight into whether the program worked as intended in
those specific cases, why or why not, and the mediating factors that influenced
outcomes. However, we would not be able to generalize our case study findings to
the total population of RMP facilities. This analysis could be conducted for up to nine
non-federal facilities without obtaining an ICR; more than nine non-federal facilities
would require an ICR and additional resources.
2. WHAT EFFECT, IF ANY, DO RMP INSPECTIONS HAVE ON THE INCIDENCE AND
SEVERITY OF CHEMICAL ACCIDENTS AT RMP FACILITIES?
• Evaluable. RMP Info includes outcome data (accident histories) for facilities that
have and have not been inspected. By linking from the inspection data in ICIS to
RMP Info, we can separate out facilities that received an RMP inspection from those
that have not, and compare accident histories across the two groups. We can also
examine the accident histories at inspected facilities before and after an RMP
inspection was conducted. Although we would not be able to control for all important
confounding factors - and therefore would not be able to assert causality - this
approach would tell us whether a relationship exists between RMP inspections and
chemical accidents. A limitation is that the ICIS data only go back to FY 2007.
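The mechanics of this comparison are straightforward once the linkage exists. The following is a minimal sketch of the approach in Python (pandas), using hypothetical file and column names as stand-ins for the RMP Info, ICIS, and bridge-table extracts; it covers only the roughly 80 percent of facilities that can currently be linked.

```python
import pandas as pd

# Hypothetical extracts; actual table and field names in RMP Info and ICIS differ.
rmp = pd.read_csv("rmp_facilities.csv")             # one row per facility: epa_facility_id, ...
accidents = pd.read_csv("rmp_accidents.csv",        # epa_facility_id, accident_date
                        parse_dates=["accident_date"])
inspections = pd.read_csv("icis_inspections.csv",   # frs_id, inspection_date
                          parse_dates=["inspection_date"])
bridge = pd.read_csv("bridge_table.csv")            # epa_facility_id <-> frs_id (~80% coverage)

# Carry each facility's first RMP inspection date (if any) across the bridge table.
# Facilities missing from the bridge table (~20%) drop out of the analysis here.
first_insp = inspections.groupby("frs_id", as_index=False)["inspection_date"].min()
linked = rmp.merge(bridge, on="epa_facility_id").merge(first_insp, on="frs_id", how="left")

# Flag facilities with a post-inspection accident vs. any reported accident at all.
acc = accidents.merge(linked[["epa_facility_id", "inspection_date"]], on="epa_facility_id")
post_ids = set(acc.loc[acc["accident_date"] > acc["inspection_date"], "epa_facility_id"])
any_ids = set(acc["epa_facility_id"])

inspected = linked["inspection_date"].notna()
rate_inspected = linked.loc[inspected, "epa_facility_id"].isin(post_ids).mean()
rate_uninspected = linked.loc[~inspected, "epa_facility_id"].isin(any_ids).mean()
print(f"share of inspected facilities with a post-inspection accident: {rate_inspected:.3f}")
print(f"share of never-inspected facilities with any reported accident: {rate_uninspected:.3f}")
```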
3. WHAT INDICATIONS EXIST, IF ANY, THAT RMP INSPECTIONS HAVE A DETERRENT
EFFECT ON RMP FACILITIES THAT HAVE NOT BEEN INSPECTED?
• Evaluable with qualitative methods. We could prepare case studies to: a) understand
the conditions under which inspections may affect behavior at uninspected facilities;
b) verify and document examples of general deterrence related to RMP inspections;
and c) assess channels of influence. We could ground truth our interview results in
the literature on general deterrence. This approach would be subject to the caveats
listed for Question 1 above - namely, a small sample size and lack of
generalizability.
4. WHAT EFFECT, IF ANY, HAS THE CHANGE IN RMP INSPECTION STRATEGY (TO
DESIGNATE SOME FACILITIES AS "HIGH RISK" AND TO DEVOTE MORE
INSPECTION RESOURCES TO HIGH-RISK FACILITIES) HAD ON FACILITY
BEHAVIOR AND THE INCIDENCE AND SEVERITY OF ACCIDENTS?
• Quantitative: Not evaluable. Assessing the effect of the RMP inspection strategy on
facility behavior is not possible, because RMP Info and ICIS do not provide reliable
indicators of changes in facility behavior. Moreover, longitudinal data on the effects
of the strategy on accident history are limited, given the short time frame that has
passed since the strategy was implemented. The policy of targeting "high risk"
facilities began a few years ago, and EPA guidance in this area continues to evolve.
The situation is further complicated by differences in how the Regions have
implemented the policy as well as differences across sectors.
• Qualitative: Partly evaluable. If we could obtain the list of "high risk" facilities, we
might be able to employ a case study approach. However, lack of generalizability
may be even more of a limiting factor for Question 4 than for previous evaluation
questions. We would want to separate out "high risk" facilities from other facilities,
which would reduce the number of facilities in each category. In addition, the new
and evolving nature of the inspection strategy - and differences across Regions and
sectors - would further confound efforts to extrapolate lessons from the case studies
to the RMP Program as a whole.
5. BASED ON THE RESULTS OF QUESTION 4, SHOULD EPA CONSIDER REFINING ITS
APPROACH TO DEFINING "HIGH-RISK" FACILITIES? SHOULD EPA RECONSIDER
THE CURRENT ALLOCATION OF INSPECTION RESOURCES BETWEEN HIGH-RISK
FACILITIES VS. FACILITIES THAT ARE NOT JUDGED TO BE HIGH RISK?
• To be determined by EPA management. Question 5 flows directly from Question 4.
Because we determined that Question 4 is mostly not evaluable at the present time,
we would draw the same conclusion for Question 5. Ultimately, however, the
allocation of inspection resources across different types of facilities is an EPA
management decision. While a future evaluation might be able to inform
management's thinking on this matter, other factors - such as senior management
buy-in, resource availability, and potential sensitivities about specific risk-based
criteria - are also likely to play a role in the allocation of inspection resources.
VII. PERFORMANCE DATA IMPROVEMENT PLAN
Having identified the data needs and the strengths and limitations of the existing data in the previous sections, we now present the Performance Data Improvement Plan. The plan includes steps that EPA can take to improve the accessibility, quality, and usefulness of outcome data relating to RMP inspections.
The results of the data scoping assessment suggest two broad categories of limitations in
the existing data. Each category has different causes and different potential solutions:
• Limitations based on data quality and accessibility. The first type of limitation
concerns the quality and accessibility of the data that EPA is already collecting. The
RMP Program is currently generating outcome data that could be used in an
evaluation, but the data are not fully accessible because of how the data are collected
or stored. For example, RMP Info contains the accident history of all RMP facilities
that have submitted a Risk Management Plan. In theory, EPA should be able to link
100 percent of the facilities in RMP Info to ICIS; separate out the facilities that have
received an RMP inspection from those that have not; and compare accident histories
across the two groups. However, the bridge table that links RMP Info and ICIS has
only been completed for 80 percent of the facilities in RMP Info. Adding more
facilities to the bridge table would allow for a more comprehensive analysis, by fully
using the data that EPA has already collected for these facilities. Similarly, it should
be possible to link all of the inspection records and enforcement records in ICIS, but
the structure of the ICIS database makes this difficult. Although there is no "quick
fix" for either of these issues, there are steps that EPA can take now to address these
challenges, without making any changes in the way it implements the RMP Program.
• Data limitations based on the characteristics of the program. The other type of
limitation stems from the nature of the RMP Program itself. For example, the absence
of follow-up inspection data poses a serious challenge, because the best indication of
changes in facility behavior after an inspection is the results of a follow-up
inspection. The lack of follow-up inspection data is a result of how the program is
implemented - namely, the designated inspection frequency. Given available
resources and guidance, EPA aims to inspect approximately five percent of RMP
facilities each year. At this rate, it would take about 20 years to inspect 100 percent of
facilities once - and even longer to inspect a significant portion of facilities twice.
We refer to this as a programmatic limitation because it stems from how the program
is implemented, not from the way the data are collected. In other words, even if EPA overhauled its data systems and addressed all of the data quality issues noted throughout this report, it would not change the fact that very few facilities have
received more than one RMP inspection. Later in this section, we propose a pilot
study wherein EPA would inspect a relatively small (but statistically robust) subset of
RMP facilities at more frequent intervals. Such a study would begin to provide useful
longitudinal data within one to two years after implementation commenced.
The rest of Section VII is organized according to the two categories defined above: First,
we propose actions that EPA can take to address issues of data quality and accessibility.
Then, we propose a statistically valid pilot study to address data issues related to how the
program is implemented.
ACTION PLAN TO IMPROVE DATA QUALITY AND ACCESSIBILITY
This section addresses actions that EPA can take to improve data quality and
accessibility. We organize the discussion by indicator - Risk Management Plans,
inspections and enforcement, accident history, risk status, and general deterrence. For
each indicator, we briefly summarize the data needs and the limitations of the existing
data, and offer proposed action items to help EPA address the limitations.
A number of the action items would require coordinated action between OEM and OECA
- e.g., including more facilities in the bridge table between RMP Info and ICIS, tracking
behavioral indicators in the ICIS inspection database, and evaluating the results of EPA's
risk-based inspection strategy for RMP facilities. Other action items would require
ongoing coordination between EPA Headquarters and Regional Offices - e.g., ensuring
cross-Regional consistency in inspection reports. With this in mind, we indicate the
primary responsible parties for addressing each action item.
Appendix B summarizes the action items in a crosswalk table.
RISK MANAGEMENT PLANS
Initially, the project team thought that changes in facility behavior would be reflected in
new or updated Risk Management Plans submitted after an RMP inspection. However,
our review of RMP Info raised a number of concerns, including: potential reporting
errors; potential underreporting or reporting bias; EPA's inability to correct mistakes that
it identifies in the data; attribution challenges relating to the amount of time that may
elapse between when an inspection takes place and when the facility (re)submits its Plan;
and lack of specificity about how the plans changed. Although some of these challenges
are beyond the scope of this Data Improvement Plan,17 we suggest that OEM consider taking action in two areas: data quality assurance and indicators of behavioral change.
17 For example, addressing some of these issues would require changes in program requirements or policies - e.g., increasing required filing frequencies (to shorten the time that elapses between an inspection and the submission of an RMP) or increasing EPA's targeting of non-filers (to address potential underreporting). These are policy decisions beyond the purview of this Data Improvement Plan.
• Systematically review the RMP data, notify facilities about potential errors, and track and verify that facilities make needed corrections. OEM is not able to correct errors that it identifies in RMP Info; rather, the Office must notify the reporting facilities and request that they make the changes. This occurs mostly on an ad hoc basis - e.g., when OEM personnel happen upon a suspicious data point.
Significant time may elapse between when OEM notices errors and notifies facilities,
and when facilities submit their corrected data. OEM should conduct a
comprehensive data quality review, assess the overall quality of the data, and make
note of any specific data points that appear to be wrong or questionable. The Office
should follow up with the facilities that reported the problematic data and record the
date when facilities submit corrected data. This would enhance the overall quality of
the data and separate out facilities whose data should not be used in an evaluation
given data quality concerns. OEM should repeat this exercise at regular intervals to
ensure that previously identified errors have been corrected, and to assure the
accuracy of newly submitted data.
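One way to make this review repeatable is to script the screening step and log the results. The following is a minimal sketch, assuming a flat RMP Info extract with hypothetical field names and purely illustrative validity rules; OEM's actual data dictionary would define the real checks.

```python
import pandas as pd

# Illustrative QA pass over an RMP Info extract; the field names and the
# validity rules below are hypothetical stand-ins.
rmp = pd.read_csv("rmp_submissions.csv")

issues = []
for _, row in rmp.iterrows():
    if row["quantity_lbs"] <= 0:
        issues.append((row["epa_facility_id"], "nonpositive chemical quantity"))
    if pd.isna(row["naics_code"]):
        issues.append((row["epa_facility_id"], "missing NAICS code"))

# Persist the flags so notification dates and correction dates can be tracked
# over time, and so flagged facilities can be set aside in an evaluation.
log = pd.DataFrame(issues, columns=["epa_facility_id", "issue"])
log["date_flagged"] = pd.Timestamp.today().normalize()
log["date_corrected"] = pd.NaT   # filled in when the facility resubmits
log.to_csv("rmp_qa_log.csv", index=False)
```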
• Include new data fields in RMP Info that can reliably capture changes in
behavior as reflected in Risk Management Plans. The scoping assessment
confirmed that RMP Info does not track reliable indicators of changes in facility
behavior. In addition to the data quality concerns noted in the previous bullet, we
found that most of the data fields in RMP Info (except accident history) are fairly
general and are expressed as dates rather than specific behaviors. For example, the
database includes the fields "Change Completion Date," "Most Recent Change
Date," and "Change Management Date."18 While all three fields appear to relate to
changes in facility behavior, none of them indicates what changes actually took place.
OEM should consider adding new data fields to RMP Info to understand what
actually changed at facilities. For answering the evaluation questions, it would be
useful to include fields relating to the behaviors specified in the logic model in
Section II, including: updated safety/maintenance/management procedures and
hazard analysis; improved public and worker safety and accident prevention/response
training; adoption of safer technologies; faster/more effective emergency response;
and reduction in quantities of regulated chemicals held on-site. These items could be
presented in a drop-down menu. Using a drop-down menu would reduce data entry
burden and error, and would facilitate aggregation and analysis of the data. Although
this would not prove that inspections caused the changes, it would give OEM a
clearer indication about changes that facilities have implemented.
18 These fields are defined, respectively, as the expected or actual date of completion of all changes resulting from a compliance audit; the date of the most recent change that triggered review or revision of safety information, hazard review, operating or maintenance procedures, or training; and the date of the most recent change that triggered management of change procedures.
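For illustration, the proposed drop-down categories could be managed as a controlled vocabulary. The following sketch uses hypothetical labels drawn from the logic model behaviors listed above:

```python
from enum import Enum

# Hypothetical controlled vocabulary for the proposed drop-down field; the
# categories mirror the behaviors named in the logic model in Section II.
class BehavioralChange(Enum):
    UPDATED_SAFETY_PROCEDURES = "Updated safety/maintenance/management procedures"
    UPDATED_HAZARD_ANALYSIS = "Updated hazard analysis"
    IMPROVED_TRAINING = "Improved safety and accident prevention/response training"
    SAFER_TECHNOLOGY = "Adoption of safer technologies"
    FASTER_EMERGENCY_RESPONSE = "Faster/more effective emergency response"
    REDUCED_CHEMICAL_QUANTITY = "Reduction in quantities of regulated chemicals held on-site"
```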
INSPECTIONS AND ENFORCEMENT
The project team initially thought that inspection results would provide information about
the type and severity of regulatory violations, if any, discovered during an RMP
inspection. We hoped that for inspections resulting in an enforcement action, the
enforcement data would indicate how facilities changed their behavior to settle the
enforcement case. Linking inspection findings with enforcement data was therefore
expected to show how non-compliant facilities changed their behavior in response to an
RMP inspection. However, we encountered a number of limitations with the data,
including: lack of consistency in how inspections are conducted and recorded; lack of
specific information about the nature and severity of violations; lack of information on
how facilities changed their behavior to settle an enforcement case; limited longitudinal
data in ICIS (only goes back to FY 2007); difficulties linking between ICIS inspection
and enforcement data; and the fact that very few facilities have any follow-up inspection
data. The action items in this section address all but the latter, which is addressed in the
section below on conducting a pilot study.
EPA Headquarters can take a number of steps to improve the quality and accessibility of
the inspection and enforcement data:
• Continue to work with the Regions to ensure the quality and consistency of inspection data. A recent OIG report found inconsistencies in the quality of inspections and inspection reports.19 OEM confirmed that although Headquarters has a standard template for RMP inspection reports, no Region is currently following the template; each Region records inspection results somewhat differently. At the same time, IEc's review of a small number of inspection reports provided by OEM confirmed that inspection reports certainly have the potential to convey useful information about the type and severity of violations uncovered during RMP inspections. OEM and OECA should continue to work with the Regional Offices to ensure that inspection results are recorded consistently and comprehensively.
19 U.S. EPA Office of Inspector General, Improvements Needed in EPA Training and Oversight for Risk Management Program Inspections, Report No. 13-P-0178, March 21, 2013.
• Add new fields to the ICIS inspections database to capture additional
information about inspection results. As EPA works to improve the consistency
and quality of the hard-copy inspection reports, it should include additional data
fields in ICIS to ensure the information is captured systematically. Fields should
include: inspection activities conducted, specific violations discovered, severity level
of each violation, whether compliance assistance was provided, and any action taken
by the facility to come back into compliance during the on-site visit. In addition, EPA
should consider including selected items from the Risk Management Program
Inspection Checklist, including items in the following areas:20 hazard assessment,
Program 2 Prevention Program, Program 3 Prevention Program, and emergency
response program. Following the Inspection Checklist, the data fields should allow
one of the following responses for each item: "yes," "no," "partial," or "not
applicable." Capturing the data electronically in this way would greatly improve the
accessibility of the data by storing all pertinent information in a central database
instead of in paper form at the Regional Offices.
20 Guidance for Conducting Risk Management Program Inspections under Clean Air Act Section 112(r), Annex D: Inspection Checklist.
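As one possible shape for these new fields - the names below are hypothetical - a record per inspection might look like the following sketch, where the four allowed responses follow the Inspection Checklist convention:

```python
from dataclasses import dataclass
from typing import Literal

# Hypothetical record layout for the proposed ICIS inspection fields; the
# checklist areas come from Annex D of the RMP inspection guidance.
Response = Literal["yes", "no", "partial", "not applicable"]

@dataclass
class InspectionChecklistEntry:
    frs_id: str
    inspection_id: str
    hazard_assessment: Response
    program_2_prevention: Response
    program_3_prevention: Response
    emergency_response_program: Response
    compliance_assistance_provided: bool
    violations_found: int
```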
• Add new fields to the ICIS enforcement database to describe actions (beyond
settlement value) that facilities take to settle enforcement cases. The ICIS
enforcement database currently includes dates and settlement values, but does not
explain what specific actions (beyond paying a penalty) the facilities took to settle the
case. If we had more detailed information on how facilities settled their enforcement
cases, we would have a better understanding of how these facilities changed their
behavior in response to alleged RMP violations and enforcement actions. This could
be accomplished by adding a drop-down menu with categories of behavioral changes,
such as: new/improved technology; updated policies/procedures; updated training;
etc. This would provide useful information for understanding behavioral changes at
the facilities without disclosing proprietary or sensitive information.
• Take steps to ensure that inspections are properly linked to resulting
enforcement actions. We want to use enforcement actions (and the resulting
settlement) as a proxy for changes in behavior resulting from RMP inspections.
However, inspection and enforcement data exist in two different universes within the
ICIS database. As a result, inspections are not automatically associated with the
enforcement cases they triggered. The ICIS inspection database includes an
Enforcement Action ID that associates inspections with their resulting enforcement
actions. However, this field is not populated for all inspections that found violations
and presumably triggered enforcement actions. As a result, using the Enforcement
Action ID to link inspection and enforcement records yields results that are accurate
but incomplete. The alternative is to link on the FRS ID, which yields a larger
number of records, but some subset of the linked records includes inspections and
enforcements that should not be linked for purposes of our analysis.21
21 For example, some facilities have an enforcement action date before their RMP inspection date. Because our analysis aims to use enforcement actions (and the resulting settlement) as a proxy for changes in behavior resulting from RMP inspections, we would want to exclude this type of match from our analysis.
EPA should conduct a data quality review of the existing records. Starting with the
dataset that was linked on the FRS ID (the more comprehensive method of linking
the data), EPA should review the data and screen out matches that should not be
included. Each linked pair of inspection and enforcement actions should fall into one
of three categories: (1) the match includes an underlying Enforcement Action ID and
should be included; (2) the match does not include an Enforcement Action ID, and
the enforcement action precedes the inspection, and therefore the match should be
excluded; or (3) the match does not include an Enforcement Action ID, but the
inspection comes before the enforcement action, and therefore it is plausible (but not
certain) that the match should be included. EPA should focus on facilities in category
(3) and look for any indicators that confirm or refute the hypothesis that the
inspection triggered the enforcement action. If EPA determines that the inspection
did trigger the enforcement action, it should ensure that the Enforcement Action ID is
populated.
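A minimal sketch of this three-category screen, assuming a pre-built dataset of FRS-linked inspection-enforcement pairs with hypothetical column names:

```python
import pandas as pd

# One row per candidate inspection-enforcement pair linked on the FRS ID.
pairs = pd.read_csv("icis_frs_linked_pairs.csv",
                    parse_dates=["inspection_date", "enforcement_date"])

def categorize(row) -> int:
    """Assign each linked pair to one of the three review categories."""
    if pd.notna(row["enforcement_action_id"]):
        return 1  # explicit link exists: include
    if row["enforcement_date"] < row["inspection_date"]:
        return 2  # enforcement precedes inspection: exclude
    return 3      # plausible but unconfirmed: flag for manual review

pairs["category"] = pairs.apply(categorize, axis=1)
for_review = pairs[pairs["category"] == 3]
print(f"{len(for_review)} pairs need manual confirmation")
```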
To avoid similar confusion in the future, EPA should enhance ICIS to increase the
likelihood that users will populate the Enforcement Action ID. We suspect one reason the field is not populated more consistently is that it is available only in the inspections database, not in the enforcement database. Therefore, if an
inspection triggers an enforcement action a year later, the user not only needs to enter
the data in the enforcement database, but must also remember to go back to the
inspection database to populate the Enforcement Action ID. Including the
Enforcement Action ID (or an alert/reminder to fill it in) in the enforcement database
would help ensure that users remember to populate the Enforcement Action ID.
• Backfill the ICIS database using pre-FY 2007 inspection reports. Currently, the
ICIS data only go back to FY 2007, which limits us to about six years of longitudinal
inspection data. However, EPA has hard-copy inspection reports that precede FY
2007. After making any desired changes to the ICIS database (see previous bullets),
EPA should consider backfilling the ICIS inspection data using the information in the
hard-copy reports. This would be similar to Florida's efforts to backfill the data in its
state RMP database. Although inconsistencies in the written record may not allow for
a comprehensive backfilling of all pre-FY 2007 data, this exercise would provide at
least some additional data points for conducting a longitudinal analysis.
ACCIDENT HISTORY
EPA would like to understand the effects of inspections on the incidence and severity of
chemical accidents. This type of analysis requires reliable accident history data, and the
ability to associate a facility's inspection history with its accident history. Both of these
requirements can be met to a large extent with the existing data; however, EPA could take
further steps to enhance the reliability of the data and the connections across databases:
• Expand the bridge table between RMP Info and ICIS to include more facilities.
Accident history and inspection history are stored in separate databases (RMP Info
and ICIS, respectively). Each database assigns a unique identifier to a facility, but the
identifiers are not the same across the two databases. Therefore, EPA needs to link
each facility's accident history ID to the facility's inspection ID. To date, EPA has
created a "bridge table" that links about 80 percent of facilities in RMP Info to ICIS.
OEM and OECA should continue their efforts to expand the bridge table, to add in
the 20 percent of facilities that cannot currently be linked. Consider using Global
Positioning System (GPS) technology to identify facilities in both databases.22 Once
the bridge table is complete (or as complete as possible), consider adding the FRS ID
- the facility identifier used in ICIS - directly to the RMP database. This would
accelerate the process of linking facilities across databases, and would spare EPA
from having to update the bridge table when new facilities are added to RMP Info.
22 RMP Info contains facilities' longitude and latitude. ICIS contains the street address, city, state, and zip code. Using GIS software would enable EPA to link based on geography. As a first step, EPA should verify the accuracy of the longitude and latitude coordinates in RMP Info.
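For illustration, a geographic match could be prototyped as follows. The sketch assumes ICIS addresses have already been geocoded to coordinates (hypothetical inputs shown inline) and uses a simple distance threshold; the threshold value is arbitrary and would need validation against known matches.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

# Hypothetical inputs: RMP Info coordinates, and pre-geocoded ICIS addresses.
rmp_sites = [("1000001", 29.76, -95.37), ("1000002", 35.22, -80.84)]
icis_sites = [("FRS-A", 29.761, -95.369), ("FRS-B", 44.98, -93.27)]

MAX_KM = 0.5  # tolerance for treating two records as the same facility
for epa_id, lat, lon in rmp_sites:
    dist, frs = min((haversine_km(lat, lon, la, lo), f) for f, la, lo in icis_sites)
    if dist <= MAX_KM:
        print(f"{epa_id} -> {frs} ({dist:.2f} km)")  # candidate bridge-table entry
```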
• Review accident history and inspection data to establish the chronology of
accidents and inspections and to account for duplicate accident records. Several
factors complicate the analysis of accident history. First, inspections may increase the
likelihood of facilities reporting an accident without changing the underlying
probability of having an accident. Second, because facilities have up to six months to
report an accident, accidents reported shortly after an inspection may have occurred
before the inspection took place. Third, facilities can make minor changes to their
accident reports (e.g., time of the accident) without resubmitting their entire RMP;
each change results in a new accident record, even though it relates to the same
accident. The first limitation (likelihood of reporting an accident versus having an
accident) should be acknowledged, but cannot be "fixed" using the existing data.
However, OEM can take steps to address the second and third issues. Regarding
whether an accident occurred before or after an RMP inspection, EPA should check
the date when the accident occurred (not the date when the accident was reported),
and compare the date of the accident (in RMP Info) to the date of the inspection (in
ICIS). As a second check, EPA should review the "Compliance Monitoring Action
Reason" given for the inspection in ICIS; this should indicate whether the inspection
resulted from a chemical spill or was conducted as part of the standard inspection
schedule. To account for duplicate records, OEM should carefully review the
accident history for all facilities and flag the duplicates; the duplicate entries should
be set aside when analyzing the total number of accidents and consequences of those
accidents (e.g., number of workers injured).
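These two checks lend themselves to a scripted pass over the linked data. A minimal sketch, with hypothetical column names:

```python
import pandas as pd

# Linked accident/inspection extract; column names are hypothetical.
acc = pd.read_csv("linked_accidents.csv",
                  parse_dates=["accident_date", "inspection_date"])

# Use the date the accident occurred (not the reporting date) to decide
# whether it preceded or followed the RMP inspection.
acc["post_inspection"] = acc["accident_date"] > acc["inspection_date"]

# Treat records for the same facility and accident date as one accident,
# keeping the most recently revised entry.
deduped = (acc.sort_values("record_updated")
              .drop_duplicates(subset=["epa_facility_id", "accident_date"],
                               keep="last"))
print(deduped["post_inspection"].value_counts())
```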
RISK STATUS
To assess the effects of the inspection approach that designates some facilities as "high
risk," EPA should compare high risk facilities to facilities that were not designated as
high risk. This requires EPA to know each facility's risk status, and whether/how high
risk facilities were targeted for inspection. A facility might appear on the list of high risk
facilities, but not have been inspected in a given year. Moreover, the nature and intensity
of inspections conducted at high risk facilities differ based on the Region, sector, and date
of the inspection. For example, some facilities have been informed by EPA inspectors
that they have been designated as "high risk" facilities, while other facilities have not
been informed of their status. As discussed in Section II, the program logic assumes that
high risk facilities respond differently if they know their status. Therefore, EPA needs to
understand how different Regions have applied the risk targeting strategy, which facilities
have been inspected under the strategy, whether or not they were informed of their status,
and what type of inspection they received.
• Develop a clearer understanding of how different Regions have implemented the
inspection strategy, by sector, and how their approach has evolved. The
inspection strategy was adopted by different Regions at different times, and it
continues to evolve. EPA Headquarters should collect and analyze information from
each of the ten Regional Offices on when and how each Region adopted the strategy,
and any changes the Region has made since first adopting the strategy. Consider
asking each Regional Office to create a timeline with the date when the strategy was
adopted for each sector, dates when the strategy changed, and a brief narrative of how
the strategy has evolved. This would enhance EPA Headquarters' understanding of
how Regions have implemented the national guidance, and would provide a baseline
for assessing the effects of the strategy.
• Verify that the list of "high risk" facilities can be linked to RMP Info and ICIS.
IEc did not review the list of high risk facilities, which is considered confidential and
is not released to the public for security reasons. According to OEM, facilities on the
high priority list can be linked back to RMP Info using the EPA Facility ID. OEM
indicated that linking to ICIS is more challenging, but is done every year when
tallying up inspections. OEM should verify that it can link from the list of high risk
facilities to their associated accident history and inspection data. In addition, EPA
should consider adding a new data field to RMP Info to designate the risk status of
each facility. This could be a "hidden" data field, which would only be available to
EPA users with internal access to the system. At least two states - North Carolina and
Florida - designate high risk facilities in their databases.
GENERAL DETERRENCE
As posited in the logic model in Section II, the program theory assumes that RMP
inspections have an effect both on inspected facilities (referred to in the literature as
"specific deterrence") and facilities that are not inspected ("general deterrence"). In the
latter case, uninspected facilities may strengthen their knowledge and behavior pertaining
to RMP requirements in response to the credible threat of being inspected in the future,
and/or based on knowledge gleaned from media reports, neighboring facilities, their
"parent" company, or other industry contacts. While EPA has anecdotal reports of general
deterrence at RMP facilities, it has not studied the effects of general deterrence in the
RMP Program in a systematic way. Such a study would be qualitative, and would allow
OSWER and OECA to document tangible examples of general deterrence, thereby
demonstrating the program's influence beyond the relatively small number of facilities
that are inspected each year. Furthermore, the study would help EPA understand the
mediating factors that determine whether and how general deterrence operates in various
sectors and for different types of facilities. This knowledge may be useful for refining
EPA's inspection strategies and maximizing the impact of limited inspection resources.
• Conduct a study on general deterrence for the RMP Program. The study should
include a literature review and interviews with inspectors and facilities. The literature
review should build on analysis that OECA has conducted in recent years,23 but
tailored to the circumstances of the RMP Program. The interviews should solicit
expert opinion on when general deterrence is most effective, and should explore
actual examples of general deterrence at RMP facilities. Consider preparing short
case studies for a variety of facilities that demonstrated general deterrence, to better
understand and illustrate the conditions under which general deterrence occurs in the
RMP Program. EPA should "ground truth" the case study findings in the academic
literature, and vice versa. Note that interviewing more than nine non-federal entities
would require an ICR.
23 OECA's Research Literature website compiles existing research on this topic under the heading "Understanding and Measuring Specific and General Deterrence." http://epa.gov/oecaerth/resources/reports/compliance/research/
PILOT STUDY TO ADDRESS DATA ISSUES RELATING TO PROGRAM IMPLEMENTATION
As discussed at the beginning of Section VII, some of the most challenging data issues
for the RMP Program are the result of how the program was implemented, rather than
how data were collected. In particular, the lack of follow-up inspections and the relatively
small percentage of facilities that are inspected every year (five percent of the universe)
create a serious evaluability limitation, because the best way to assess the results of an
inspection is to examine data from a follow-up inspection. At this point, too few facilities
have received follow-up inspections to employ this strategy in a meaningful way.
Similarly, assessing the effects of the strategy of designating some facilities as "high risk"
is hindered by the new and evolving nature of the strategy, and the even smaller number
of high risk facilities that are inspected each year. Under the current inspection
frequencies, it would take many years before EPA would have sufficient longitudinal data
- including follow-up inspection data and post-inspection accident history data for high
risk facilities - to conduct these key analyses. However, EPA may not want to wait years
or even decades to evaluate the program.
In this section, we raise the idea of conducting a statistically valid pilot study that would
generate longitudinal data in a shorter timeframe by targeting a subset of facilities for
more frequent inspections. Repeating inspections at these facilities over a period of two or three years would generate longitudinal inspection data showing how facilities change their behavior after receiving two or three RMP inspections.
The strategy of targeting a subset of facilities for repeated inspections is different from
the inspection strategy currently used by the RMP Program. By increasing the frequency
at which facilities in this targeted group are inspected, the context under which
inspections are conducted for those facilities would be different than the general RMP
facility population (i.e., those facilities would face different threats of inspection and
possibly behave differently than under the "usual" RMP inspection program). It would
therefore be difficult to extrapolate findings from the pilot study to the general facility population. However, the approach suggested below would
generate data to help answer the question of how inspections affect facility behavior and
chemical safety.
Designing the methodology for a statistically valid pilot study is well beyond the scope of
this Data Improvement Plan; however, below we raise some topics and questions for
EPA's consideration.
QUESTIONS AND CONSIDERATIONS FOR A STATISTICALLY-VALID PILOT STUDY
Conducting a statistically valid pilot study with a subset of facilities would generate data
on the results of follow-up inspections and would facilitate comparing targeted facilities
to non-targeted facilities. However, such a study would require significant effort and
resources. Before deciding whether to undertake this type of study, EPA should consider
the following issues:
• What does EPA hope to learn from this study? The answer to this question would
inform the study design. For example, if EPA is mostly interested in collecting
longitudinal data, it may want to randomly assign RMP facilities to the target group.
However, if EPA's primary goal is to test the effects of its risk-based inspection
strategy, it should be sure to include high risk facilities in the target group.
Furthermore, simply knowing that they are in the target group may change facility
behavior, compared to a "typical" facility that is not anticipating a follow-up
inspection in the foreseeable future. Therefore, EPA would need to decide whether it
should disclose each facility's status to the facility. Defining the purpose of the study
and developing a clear hypothesis upfront would help answer the remaining questions
in this section.
• How should facilities be assigned to the target group? As discussed in the previous bullet, EPA could either select facilities at random (high risk and other facilities), or it could assign only high risk facilities to the target group. Another option would be a hybrid approach that stratifies the target group by high risk and other facilities, or sub-divides the group into two sub-groups: targeted high risk facilities and other targeted facilities. Related questions include the following (see the sketch after this list):
o Should EPA select facilities for the target group that have already had at least one
RMP inspection to fully leverage existing inspection data or should it start fresh
with facilities that have never been inspected?
o Should selected facilities be informed of their status, or even be told the fact that
they have been selected to participate in a pilot study?
o Should the sample be stratified by Region, state, sector, facility size, and/or
accident history?
o Should EPA focus on a single Region for this pilot study - and, if so, which
Region?
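As referenced above, a stratified random draw is one way to operationalize the hybrid approach. A minimal sketch, assuming a facility list with hypothetical risk-status and sector fields:

```python
import pandas as pd

# Hypothetical facility universe with risk_status and two-digit NAICS fields.
facilities = pd.read_csv("rmp_facility_universe.csv")
facilities["stratum"] = (facilities["risk_status"] + "|"
                         + facilities["naics2"].astype(str))

# Sample the same fraction from every stratum so high-risk facilities and each
# sector are represented in proportion to the universe; the 5% fraction and
# random seed are illustrative only.
target = (facilities.groupby("stratum", group_keys=False)
                    .sample(frac=0.05, random_state=2013))
facilities["in_target_group"] = facilities["epa_facility_id"].isin(
    target["epa_facility_id"])
```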
• What is the right sample size? The sample size should be manageable given
available resources, but large enough to detect differences between groups. Whether
the sample is "large enough" depends, in part, on the magnitude of the difference
between groups. However, this would be difficult to predict in advance. It would also
depend on how many sub-groups (e.g., sector, state, etc.) EPA wants to include when
drawing the sample. After determining the basic study objectives, EPA should work
closely with its evaluation experts (e.g., CPA) and statistical experts to design the
sample frame and determine the appropriate sample size.
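For a sense of the arithmetic, a two-proportion power calculation can bound the sample size. The sketch below uses statsmodels with purely illustrative accident rates; the real inputs would come from the study design work described above.

```python
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

# Illustrative assumption: detect a drop in accident rates from 10% to 5%
# between the comparison and target groups at standard alpha/power levels.
effect = proportion_effectsize(0.10, 0.05)
n_per_group = NormalIndPower().solve_power(effect_size=effect, alpha=0.05,
                                           power=0.80, ratio=1.0)
print(f"approx. facilities needed per group: {n_per_group:.0f}")
```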
• How would EPA collect information for facilities outside of the target group?
This group would include facilities that receive a regular inspection frequency. By
definition, their inspection frequencies would be limited, and longitudinal data may
not be available for these facilities. EPA should decide if it is willing to accept this
limitation, or if it would try to obtain data for facilities outside the target group. One
option would be to send Information Request Letters (IRLs) to all RMP-registered
facilities asking them a series of questions that aim to determine their compliance
with RMP requirements. Note that an ICR would be required to ask the same
questions of more than nine non-federal entities.
• How would EPA control for confounding factors? As discussed throughout this
report, myriad factors beyond RMP inspections may affect compliance at RMP
facilities. EPA should identify what it considers to be the most serious confounding
factors, and should take steps to control for these factors to the extent possible and
feasible. Although EPA does not have data for all confounding factors, it does have
data for industry sector (NAICS) and other facility information that may influence a
facility's compliance with RMP regulations. Consider running a regression analysis
using these factors as control variables.
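A minimal sketch of such a regression, using statsmodels with hypothetical variable names for the outcome, treatment indicator, and controls:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis dataset: one row per facility, with an indicator for
# target-group membership and observable controls from RMP Info/ICIS.
df = pd.read_csv("pilot_analysis_dataset.csv")

# Probability of a post-period accident as a function of target-group status,
# with fixed effects for two-digit NAICS sector and facility size class.
model = smf.logit("accident_post ~ in_target_group + C(naics2) + C(size_class)",
                  data=df).fit()
print(model.summary())
```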
• Would EPA supplement the statistical analysis with qualitative data? EPA
should consider supplementing the results by conducting interviews with selected
facilities and inspectors. Interview data would help EPA interpret the findings and tell
the story behind the numbers. Conducting interviews with more than nine non-federal
facilities would require an ICR. Another option, which would not require an ICR,
would be to choose fewer than nine facilities in a very selective way - for example,
talk to facilities that appear to be "skewing" the results of the analysis to gain insight
into what is happening at those facilities.
APPENDIX A. CROSSWALK OF EVALUATION QUESTIONS, DATA SOURCES, AND POTENTIAL METHODS

1. WHAT EFFECT, IF ANY, DO RMP INSPECTIONS HAVE ON FACILITY BEHAVIOR?

A. Do facilities re-submit (update) their Risk Management Plan following an RMP inspection? If yes, what changes do they make?

POSSIBLE APPROACHES TO ANSWERING QUESTION:
• Option 1: For facilities that submitted an RMP before and after an inspection, document changes in the RMP
• Option 2: Conduct interviews with facility owners and/or inspectors to understand what changes facilities make, and why

DATA NEEDED TO ANSWER THE QUESTION:
• Option 1: Date of RMP inspection; dates of any/all RMP submissions by the inspected facility; content of each RMP
• Option 2: Interview data

POSSIBLE DATA SOURCES:
• Option 1: RMP Info; ICIS
• Option 2: Interviews

DATA USES:
• Option 1: Link inspection and RMP Info data using the "bridge table"/FRS numbers (this has been done for ~80% of RMP facilities) to identify the date of RMP inspection; compare the inspection date to the date of the most recently submitted RMP; review RMPs ("before" and "after" the RMP inspection) to identify changes in content
• Option 2: Summarize/synthesize interview data; develop case studies on the effects of inspections on facility behavior

DATA LIMITATIONS AND CONFOUNDING FACTORS:
• Option 1: The standardized data fields in RMP Info are not reliable indicators of changes in facility behavior. The fields do not need to be updated following an inspection; conversely, facilities may update the fields even if they are not inspected. The data fields reveal the date when an event occurred, but do not describe the specific changes that were made. More information may be available in the RMP document itself, but would require resource-intensive review; sampling would be required. Not possible to link inspection data and RMP data for ~20% of facilities.
• Option 2: Unable to generalize case study results to the full population. ICR required to speak with more than 9 non-federal facilities or inspectors.
• General limitations: The best indicator of changes in facility behavior is reports from follow-up inspections; however, most RMP facilities that have been inspected have only been inspected once - therefore, no follow-up inspection results. Unable to control for the effects of other types of inspections at RMP facilities (e.g., OSHA). Unable to control for inspector/inspection quality.

EVALUABILITY ASSESSMENT:
• Option 1: Mostly not evaluable. Can compare the inspection date to the date of the most recent RMP, but this would not tell us anything about changes in behavior. RMP Info does not provide reliable information on changes in facility behavior following an inspection. More detailed information may be available in the text of the RMP itself, but this would require resource-intensive document review and would likely need to be case-based rather than a representative sample.
• Option 2: Evaluable. Would need to obtain buy-in from Regions and inspectors, and identify inspectors and facility managers who are willing to speak with us. Given the small sample and potential selection bias (e.g., only "good actors" may want to talk to us), results would not be generalizable. However, this approach could provide meaningful insight into program dynamics and pathways of influence.
• Note: Because we cannot control for other types of inspections or the quality of inspections and inspectors, we would not be able to prove causality (i.e., we could not say that the RMP Program caused these changes to occur). Whether this is an acceptable limitation depends on the purpose of the evaluation and the required threshold of evidence.
B. How, if at all, do facilities change their safety practices and procedures following an RMP inspection?

POSSIBLE APPROACHES TO ANSWERING QUESTION:
• Option 1: For facilities that submitted an RMP before and after an inspection, review the pre-inspection and post-inspection RMP to identify changes in procedures and practices
• Option 2: Review inspection results and enforcement actions to understand what issues were identified and resolved as a result of RMP inspections
• Option 3: Conduct interviews with facility owners and/or inspectors to understand what changes facilities make, and why

DATA NEEDED TO ANSWER THE QUESTION:
• Option 1: Date of RMP inspection; safety practices/procedures as documented in the RMP before the inspection was conducted; safety practices/procedures as documented in the RMP after the inspection was conducted
• Option 2: Issues identified during RMP inspections; settlements of enforcement actions
• Option 3: Interview data

POSSIBLE DATA SOURCES:
• Option 1: RMP Info; ICIS
• Option 2: Inspection reports; enforcement reports; state databases
• Option 3: Interviews

DATA USES:
• Option 1: Link inspection and RMP Info data using the "bridge table"; review changes in policies and procedures in RMPs submitted before and after inspection
• Option 2: Review findings of violations (number and type); analyze enforcement data to understand the nature and severity of violations that were settled (may serve as a rough proxy for changes in behavior)
• Option 3: Summarize/synthesize interview data; develop case studies on the effects of inspections on facility behavior

DATA LIMITATIONS AND CONFOUNDING FACTORS:
• Option 1: RMP Info is not a reliable data source for understanding changes in facility behavior following inspection (see above)
• Option 2: Lack of follow-up RMP inspection data (see above). ICIS data only go back to FY 2007, effectively limiting our study period to 6 years if we conduct an evaluation in 2013. Limitations linking ICIS inspection and enforcement data. Enforcement data provide some sense of the magnitude of the violation (e.g., settlement amount), but do not provide detailed information on changes in facility behavior. Inspection reports provide additional details, but are in hard copy only and would need to be obtained from the Regions; quality of the reports is inconsistent; resource and logistical constraints would require sampling. Unable to generalize findings from delegated states to the national RMP Program.
• Option 3: See above for interview-related caveats (non-generalizable, resource-intensive) and confounding factors

EVALUABILITY ASSESSMENT:
• Option 1: Not evaluable (see above)
• Option 2: Mostly not evaluable. Enforcement data (e.g., settlement date and amount) may be a rough proxy for changes in behavior. However, the ICIS data that IEc has received do not include information about the specific nature of the violation or specific actions taken by the facility to address the violation. Inspection reports contain additional details, but quality of reports is uneven, and resource and logistical constraints would require sampling.
• Option 3: Evaluable.
• Note: See above for general limitations.
C. Do facilities adopt safer technologies and/or reduce the quantity of regulated chemicals held on-site following an RMP inspection? If yes, what changes do they make?

POSSIBLE APPROACHES TO ANSWERING QUESTION:
• Option 1: For facilities that submitted an RMP before and after an inspection, review the pre-inspection and post-inspection RMP to identify adoption of new technologies and/or changes in the quantity of regulated chemicals held on-site
• Option 2: Review inspection results and enforcement actions
• Option 3: Conduct interviews with facility owners and/or inspectors to understand what changes facilities make, and why

DATA NEEDED TO ANSWER THE QUESTION:
• Option 1: Date of RMP inspection; quantity of chemicals held on-site before the inspection was conducted; quantity of chemicals held on-site after the inspection was conducted; technologies used before the inspection was conducted; technologies used after the inspection was conducted
• Option 2: May be able to glean information from inspection and enforcement data
• Option 3: Interview data

POSSIBLE DATA SOURCES:
• Option 1: RMP Info; ICIS
• Option 2: Inspection data; enforcement data; state databases?
• Option 3: Interviews

DATA USES:
• See above

DATA LIMITATIONS AND CONFOUNDING FACTORS:
• See above. May be difficult to characterize technologies used by a facility before and after inspections were conducted.

EVALUABILITY ASSESSMENT:
• Option 1: Not evaluable - see above.
• Option 2: Mostly not evaluable - see above.
• Option 3: Evaluable - see above.
2. WHAT EFFECT, IF ANY, DO RMP INSPECTIONS HAVE ON THE INCIDENCE AND SEVERITY OF CHEMICAL ACCIDENTS AT RMP FACILITIES?

A. What portion of facilities that have received an RMP inspection report a chemical accident within two years following the inspection?

POSSIBLE APPROACHES TO ANSWERING QUESTION:
• Calculate the portion of inspected facilities that report an accident following an RMP inspection

DATA NEEDED TO ANSWER THE QUESTION:
• Total number of inspected facilities; number of inspected facilities reporting an accident; date of RMP inspection; accident date

POSSIBLE DATA SOURCES:
• RMP Info (accident history); ICIS inspection data; state databases

DATA USES:
• Identify all facilities that received an RMP inspection (ICIS); link inspection and RMP Info data using the "bridge table"; query the accident history of inspected facilities (RMP Info); verify that the date of the accident was after the RMP inspection; divide the number of inspected facilities reporting an accident (post-inspection) by the total number of inspected facilities

DATA LIMITATIONS AND CONFOUNDING FACTORS:
• Not possible to link inspection data and RMP data for ~20% of facilities
• Inspections may increase the likelihood of facilities reporting an accident. May not be able to discern if trends in accident data are due to actual accidents vs. changes in reporting
• Facilities have up to 6 months to report an accident. If an inspection falls within that 6-month window, an accident may be reported after the inspection even if it happened before the inspection
• RMP Info may record multiple entries for an "accident" each time facilities update minor details about the accident (e.g., the time of day the accident occurred). Need to manually review the records to ensure that we are not double counting
• Unable to control for confounding factors, e.g., other types of inspections, and inspector/inspection quality
• Unable to generalize findings from delegated states to the national RMP Program

EVALUABILITY ASSESSMENT:
• Evaluable, but the need to correct for reporting anomalies and potential double counting (see limitations above) may limit the number of facilities that we could review.
• Note: confounding factors - see above; also, would not be able to control for the effects of inspections on reporting behavior (as opposed to actual accidents)

B. How do the incidence and severity of reported accidents at inspected facilities (post-inspection) compare to the accident history at RMP facilities that have never been inspected?

POSSIBLE APPROACHES TO ANSWERING QUESTION:
• Compare the portion of inspected facilities reporting an accident (post-inspection) to the portion of uninspected facilities reporting an accident
• Weight results by severity

DATA NEEDED TO ANSWER THE QUESTION:
• Date of RMP inspection (for inspected facilities only); incidence and severity of chemical accidents at inspected facilities (post-inspection); incidence and severity of chemical accidents at uninspected facilities

POSSIBLE DATA SOURCES:
• RMP Info (accident history); ICIS inspection data; state databases

DATA USES:
• Link RMP inspection and accident history data as in 2A; divide the number of inspected facilities reporting an accident (post-inspection) by the total number of inspected facilities; divide the number of uninspected facilities reporting an accident by the total number of uninspected facilities

DATA LIMITATIONS AND CONFOUNDING FACTORS:
• Same as 2A above

EVALUABILITY ASSESSMENT:
• Evaluable, given the caveats noted above. May want to select a cutoff date for our analysis (e.g., limit the "look-back period" for uninspected facilities to the past 2 years).
3. WHAT INDICATIONS EXIST, IF ANY, THAT RMP INSPECTIONS HAVE A DETERRENT EFFECT ON RMP FACILITIES THAT HAVE NOT BEEN INSPECTED?

POSSIBLE APPROACHES TO ANSWERING QUESTION:
• Option 1: Interview facilities, trade association representatives, and/or inspectors to obtain insights into the effect of inspections on uninspected facilities
• Option 2: Review the literature on general deterrence, and attempt to apply lessons to RMP inspections

DATA NEEDED TO ANSWER THE QUESTION:
• Option 1: Interview data
• Option 2: Journal articles, previous evaluations, etc.

POSSIBLE DATA SOURCES:
• Option 1: Interviews
• Option 2: OECA and other literature on general deterrence; previous evaluations of the RMP Program and/or other inspection programs

DATA USES:
• Option 1: Prepare case studies of selected facilities to: a) understand the conditions under which inspections may affect uninspected facilities; b) verify/document examples of general deterrence related to RMP inspections; and c) assess channels of influence
• Option 2: Summarize the current literature on inspections and general deterrence, and its relevance to the RMP Program

DATA LIMITATIONS AND CONFOUNDING FACTORS:
• Option 1: Qualitative, case-based approach means limited sample size; cannot generalize findings. Resource and logistical constraints for scheduling interviews. ICR requirements (number of interviews) - would need to obtain an ICR or limit the number of interviews with non-federal facilities to 9 or fewer.
• Option 2: May not be any literature on general deterrence specific to the RMP Program; may be difficult to tailor findings to the program

EVALUABILITY ASSESSMENT:
• Option 1: Evaluable.
• Option 2: Evaluable if we can "ground truth" findings with RMP facilities; suggest combining with Option 1.
• Note: Both options are qualitative; this question is not evaluable using a quantitative approach.
4. WHAT EFFECT, IF ANY, HAS THE CHANGE IN RMP INSPECTION STRATEGY (TO DESIGNATE SOME FACILITIES AS "HIGH RISK" AND TO DEVOTE MORE INSPECTION RESOURCES TO HIGH-RISK FACILITIES) HAD ON FACILITY BEHAVIOR AND THE INCIDENCE AND SEVERITY OF ACCIDENTS?

A. How, if at all, do inspection results (i.e., the number and type of violations, and enforcement actions) differ between high-risk facilities compared to facilities that are not judged to be high risk?

POSSIBLE APPROACHES TO ANSWERING QUESTION:
• Option 1: Analyze inspection and enforcement data, comparing high-risk facilities to facilities not judged to be high risk
• Option 2: Conduct interviews with inspectors

DATA NEEDED TO ANSWER THE QUESTION:
• Option 1: Facility risk classification; inspection data for high-risk facilities; inspection data for facilities not judged to be high risk; enforcement data for high-risk facilities; enforcement data for facilities not judged to be high risk
• Option 2: Interview data

POSSIBLE DATA SOURCES:
• Option 1: List of high-risk facilities; ICIS; inspection reports; state databases
• Option 2: Interviews

DATA USES:
• Option 1: Link inspection and enforcement data in ICIS; analyze enforcement data to understand the nature and severity of violations that were settled; review inspection reports and describe inspection results; compare results for high-risk facilities and facilities not judged to be high risk
• Option 2: Summarize interview data

DATA LIMITATIONS AND CONFOUNDING FACTORS:
• Option 1: RMP Info and ICIS do not track risk status. The list of high-risk facilities is considered sensitive and is not publicly available, but could be shared with an EPA contractor after the contractor obtains OCA clearance. Difficult to link inspection and enforcement data in ICIS. Difficult to control for differences between high-risk and other facilities that may be correlated with inspection results (e.g., accident history). Enforcement data provide some sense of the magnitude of violations (e.g., settlement amount), but limited detail. Inspection reports are in hard copy only and would need to be obtained from the Regions; uneven quality; would require sampling. One state database specifies high-risk facilities and facilities not judged to be high risk; however, not generalizable.
• Option 2: Interview-related limitations (not generalizable, response bias, resource constraints, ICR requirements)

EVALUABILITY ASSESSMENT:
• Option 1: Not evaluable. The "high risk" designation was adopted by different Regions/sectors at different times, starting a few years ago, and the strategy continues to evolve. The list of high-risk facilities is not public, but could be shared with an EPA contractor after the contractor obtains OCA clearance. One state database (FL) specifies risk designation; may be able to conduct limited analysis for this state (case study approach).
• Option 2: Partly evaluable, if we know which facilities were targeted as "high risk." Small sample size may be even more of a limiting factor for this question than for other questions, since we are trying to compare two different types of facilities (high risk vs. other), and cannot hold other factors constant due to differences in how and when Regions adopted the strategy.
EVALUATION QUESTION:
B. How, if at all, have inspection results changed overall and by type of facility (high-risk facilities vs. facilities that are not judged to be high risk) since the strategy was adopted?

POSSIBLE APPROACHES TO ANSWERING QUESTION:
• Same as Option 1 above, plus look at changes over time

DATA NEEDED TO ANSWER THE QUESTION:
• Same as Option 1 above, plus the date the strategy was adopted (may differ across Regions and sectors) and the date of inspection (i.e., was the inspection carried out before or after the strategy was adopted?)

POSSIBLE DATA SOURCES:
• Same as Option 1 above, plus Regions would need to verify the dates the strategy was adopted

DATA USES:
• Same as Option 1 above, plus analyze changes over time, overall and for each type of facility

DATA LIMITATIONS AND CONFOUNDING FACTORS:
• Same as Option 1 above, plus it may be difficult to adjust for differences in when the strategy was adopted for different Regions and sectors
• The strategy continues to evolve; it may be difficult to select a firm cutoff date for the analysis

EVALUABILITY ASSESSMENT:
Not evaluable. The "high risk" designation was adopted by different Regions/sectors at different times, starting a few years ago, and the strategy continues to evolve. Furthermore, limited longitudinal data (post-adoption of the strategy) exist at present.

EVALUATION QUESTION:
C. How, if at all, do the incidence and severity of chemical accidents (post-inspection) vary between high-risk facilities and facilities that are not judged to be high risk?

POSSIBLE APPROACHES TO ANSWERING QUESTION:
Option 1:
• Compare accident history for high-risk facilities to facilities not judged to be high risk
• Attempt to weight accident history by severity
Option 2:
• Conduct interviews with inspectors

DATA NEEDED TO ANSWER THE QUESTION:
Option 1:
• Facility risk classification
• Incidence and severity of chemical accidents at high-risk facilities
• Incidence and severity of chemical accidents at facilities not judged to be high risk
Option 2:
• Interview data

POSSIBLE DATA SOURCES:
Option 1:
• List of high-risk facilities
• RMP Info (accident history)
• State databases
Option 2:
• Interviews

DATA USES:
Option 1:
• Divide the number of high-risk facilities reporting an accident by the total number of high-risk facilities
• Divide the number of non-high-risk facilities reporting an accident by the total number of facilities that are not judged to be high risk (a sketch of this incidence comparison follows this row)
Option 2:
• Summarize interview results

DATA LIMITATIONS AND CONFOUNDING FACTORS:
Option 1:
• RMP Info and ICIS do not track risk status. The list of high-risk facilities is considered sensitive and is not publicly available, but could be shared with an EPA contractor after the contractor obtains OCA clearance.
• Difficult to control for differences between high-risk and other facilities that may be correlated with inspection results (e.g., accident history)
• Not possible to link inspection data and RMP data for ~20% of facilities
• Inspections may increase the likelihood of facilities reporting an accident. It may not be possible to discern whether trends in accident data are due to actual accidents vs. changes in reporting.
• Facilities have up to 6 months to report an accident. If an inspection falls within that 6-month window, an accident may be reported after the inspection even if it happened before the inspection.
• RMP Info may record multiple entries for an "accident" each time a facility updates minor details about the accident (e.g., the time of day the accident occurred). Records would need to be manually reviewed to ensure that accidents are not double counted.
• Unable to control for confounding factors, e.g., other types of inspections and inspector/inspection quality
• Unable to generalize findings from delegated states to the national RMP Program
• One state database specifies high-risk facilities and facilities not judged to be high risk; however, it is not generalizable
Option 2:
• Interview-related limitations (not generalizable, response bias, resource constraints, ICR requirements)

EVALUABILITY ASSESSMENT:
Option 1: Not evaluable. The "high risk" designation was adopted by different Regions/sectors at different times, starting a few years ago, and the strategy continues to evolve. Also, it would not be possible to control for the confounding factors listed in the previous column.
Option 2: Partly evaluable, if we know which facilities were targeted as "high risk." Small sample size may be even more of a limiting factor for this question than for other questions, since we are trying to compare two different types of facilities (high risk vs. other) and cannot hold other factors constant due to differences in how and when Regions adopted the strategy.
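If the non-public list of high-risk facilities could be joined to RMP Info accident history, the incidence calculation in the Data Uses column reduces to a grouped proportion. A minimal sketch under that assumption; all file and field names (high_risk, accident_count) are hypothetical:

    # Sketch: compare accident incidence for high-risk vs. other facilities.
    # Assumes a facility-level table with a boolean high_risk flag (from the
    # non-public list) and an accident count from RMP Info; names hypothetical.
    import pandas as pd

    facilities = pd.read_csv("facilities_with_risk_status.csv")
    facilities["reported_accident"] = facilities["accident_count"] > 0

    # Share of facilities reporting at least one accident, by risk status.
    rates = facilities.groupby("high_risk")["reported_accident"].mean()
    print(rates)  # one row each for high_risk=True and high_risk=False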
EVALUATION QUESTION:
D. How, if at all, have the incidence and severity of accidents changed (overall and by facility type) since the strategy was adopted?

POSSIBLE APPROACHES TO ANSWERING QUESTION:
• Same as Option 1 above, plus look at changes within groups over time

DATA NEEDED TO ANSWER THE QUESTION:
• Same as Option 1 above, plus the date the strategy was adopted (may differ across Regions and sectors) and the date of accident

POSSIBLE DATA SOURCES:
• Same as Option 1 above, plus Regions would need to verify the date the strategy was adopted

DATA USES:
• Same as Option 1 above, plus analyze changes over time, overall and for each type of facility (a sketch of a pre-/post-adoption comparison follows this row)

DATA LIMITATIONS AND CONFOUNDING FACTORS:
• Same as Option 1 above, plus it may be difficult to adjust for differences in when the strategy was adopted for different Regions and sectors
• The strategy continues to evolve; it may be difficult to select a firm cutoff date for the analysis

EVALUABILITY ASSESSMENT:
Not evaluable; see question C (Option 1) above.
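If Regions verify their adoption dates, the pre-/post-adoption comparison could be organized as below. This is a sketch only; the region, high_risk, and date fields are assumed, and the caveats above about the evolving strategy and the difficulty of selecting a firm cutoff date still apply:

    # Sketch: tag each accident as pre- or post-adoption of the high-risk
    # strategy, using Region-specific adoption dates that the Regions would
    # first need to verify. All field names are hypothetical.
    import pandas as pd

    accidents = pd.read_csv("rmp_accidents.csv", parse_dates=["accident_date"])
    adoption = pd.read_csv("strategy_adoption_dates.csv", parse_dates=["adoption_date"])

    merged = accidents.merge(adoption, on="region", how="left")
    merged["post_adoption"] = merged["accident_date"] >= merged["adoption_date"]

    # Accident counts before vs. after adoption, by facility type.
    summary = merged.groupby(["high_risk", "post_adoption"]).size()
    print(summary)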
5. BASED ON THE RESULTS OF QUESTION 4, SHOULD EPA CONSIDER REFINING ITS APPROACH TO DEFINING "HIGH-RISK" FACILITIES? SHOULD EPA RECONSIDER THE CURRENT ALLOCATION OF INSPECTION RESOURCES BETWEEN HIGH-RISK FACILITIES VS. FACILITIES THAT ARE NOT JUDGED TO BE HIGH RISK?

POSSIBLE APPROACHES TO ANSWERING QUESTION:
• Opinion/judgment based on answers to question 4

DATA NEEDED TO ANSWER THE QUESTION:
• See above (question 4)

POSSIBLE DATA SOURCES:
• See above (question 4)

DATA USES:
• See above (question 4)

DATA LIMITATIONS AND CONFOUNDING FACTORS:
• See above (question 4)
• Potential lack of senior management buy-in for the recommendations
• Resource availability
• Political sensitivities regarding the ways in which risk criteria are defined and applied

EVALUABILITY ASSESSMENT:
To be determined by EPA management.
-------
APPENDIX B. DATA IMPROVEMENT ACTION PLAN
NO. 1
INDICATOR: Changes in Risk Management Plans
DATA NEEDS: Changes in safety procedures, policies, training, or technologies
DATA SOURCES: RMP Info
DATA GAPS: Data quality is questionable: facilities self-report their data, and EPA cannot correct errors but must ask the facility to make corrections
ACTION ITEM: Systematically review the RMP data, notify facilities about potential errors, and track and verify that facilities make needed corrections
RESPONSIBLE PARTIES: OEM

NO. 2
INDICATOR: Changes in Risk Management Plans
DATA NEEDS: Changes in safety procedures, policies, training, or technologies
DATA SOURCES: RMP Info
DATA GAPS: Most of the relevant fields include the date a change was made, but do not describe what changed in any detail
ACTION ITEM: Include new data fields in RMP Info that can reliably capture changes in behavior as reflected in Risk Management Plans
RESPONSIBLE PARTIES: OEM

NO. 3
INDICATOR: Changes in facility behavior following an RMP inspection
DATA NEEDS: Inspection results (necessary but not sufficient for understanding changes in behavior)
DATA SOURCES: Inspection reports
DATA GAPS: Lack of consistency in how inspections are conducted and recorded
ACTION ITEM: Continue to work with EPA's Regional Offices to ensure the quality and consistency of inspection data
RESPONSIBLE PARTIES: OECA and OEM, working with EPA's Regional Offices

NO. 4
INDICATOR: Changes in facility behavior following an RMP inspection
DATA NEEDS: Inspection results (necessary but not sufficient for understanding changes in behavior)
DATA SOURCES: ICIS inspection database
DATA GAPS: Lack of specificity about the nature and severity of the violations
ACTION ITEM: Add new fields to the ICIS inspection database to capture additional information about inspection results
RESPONSIBLE PARTIES: OECA, working with OSWER and other EPA offices that report into ICIS

NO. 5
INDICATOR: Changes in facility behavior following an RMP inspection
DATA NEEDS: Corrective measures taken to settle enforcement actions that resulted from an RMP inspection
DATA SOURCES: ICIS enforcement database
DATA GAPS: Lack of information on how facilities changed their behavior to settle an enforcement case
ACTION ITEM: Add new fields to the ICIS enforcement database to describe actions (beyond settlement value) that facilities take to settle enforcement cases
RESPONSIBLE PARTIES: OECA, working with OSWER and other EPA offices that report into ICIS

NO. 6
INDICATOR: Changes in facility behavior following an RMP inspection
DATA NEEDS: Linking ICIS inspection data to ICIS enforcement data
DATA SOURCES: ICIS inspection database; ICIS enforcement database
DATA GAPS: Difficulties linking ICIS inspection data and ICIS enforcement data due to inconsistent use of the Enforcement Action ID
ACTION ITEM: Take steps to ensure that inspections are properly linked to resulting enforcement actions
RESPONSIBLE PARTIES: OECA

NO. 7
INDICATOR: Changes in facility behavior following an RMP inspection
DATA NEEDS: Follow-up inspection data
DATA SOURCES: ICIS inspection database
DATA GAPS: ICIS only goes back to FY 2007; data for inspections prior to FY 2007 are not included in the database
ACTION ITEM: Backfill the ICIS database using pre-FY 2007 inspection reports
RESPONSIBLE PARTIES: OECA, in coordination with EPA's Regional Offices

NO. 8
INDICATOR: Accident history at facilities with and without an RMP inspection
DATA NEEDS: Linking facility accident history to inspection data
DATA SOURCES: RMP Info; ICIS
DATA GAPS: The RMP Info-ICIS bridge table is incomplete for 20% of facilities in RMP Info
ACTION ITEM: Expand the bridge table between RMP Info and ICIS to include more facilities (a coverage-check sketch follows this part of the table)
RESPONSIBLE PARTIES: OECA; OEM
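Action item 8 implies a simple coverage check once the bridge table is in hand. A minimal sketch, assuming CSV exports and hypothetical key names (rmp_id, icis_id); the real table and field names would need to be confirmed:

    # Sketch: measure how much of RMP Info can currently be linked to ICIS
    # through the bridge table, and list the facilities that would need to
    # be added to close the ~20% gap. File and column names are hypothetical.
    import pandas as pd

    rmp = pd.read_csv("rmp_info_facilities.csv")    # one row per RMP facility
    bridge = pd.read_csv("rmp_icis_bridge.csv")     # rmp_id <-> icis_id pairs

    linked = rmp.merge(bridge, on="rmp_id", how="left")
    coverage = linked["icis_id"].notna().mean()
    print(f"Bridge table covers {coverage:.0%} of RMP Info facilities")

    # Facilities that cannot yet be linked to ICIS.
    missing = linked.loc[linked["icis_id"].isna(), "rmp_id"]
    missing.to_csv("facilities_missing_from_bridge.csv", index=False)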
NO. 9
INDICATOR: Accident history at facilities with and without an RMP inspection
DATA NEEDS: Reliable accident history record
DATA SOURCES: RMP Info
DATA GAPS: A facility may have multiple records for the same accident (e.g., if the facility updates the date)
ACTION ITEM: Review accident history and inspection data to establish the chronology of inspections and accidents and to account for duplicate accident records (a sketch of this review follows the table)
RESPONSIBLE PARTIES: OEM

NO. 10
INDICATOR: Comparison of high-risk facilities to other facilities
DATA NEEDS: Facility status (high risk or not)
DATA SOURCES: List of high-risk facilities; interviews/online questionnaire
DATA GAPS: The inspection strategy has been implemented at different times, and in different ways, in different Regions and sectors
ACTION ITEM: Develop a clearer understanding of how different Regions have implemented the targeting strategy, by sector, and how their approach has evolved
RESPONSIBLE PARTIES: OECA, with cooperation from EPA's Regional Offices

NO. 11
INDICATOR: Comparison of high-risk facilities to other facilities
DATA NEEDS: Inspection history and accident history, by facility type (high risk or not high risk)
DATA SOURCES: List of high-risk facilities; RMP Info; ICIS inspection database
DATA GAPS: Neither ICIS nor RMP Info indicates whether a facility is "high risk"; the list of high-risk facilities is not publicly available
ACTION ITEM: Verify that the list of "high-risk" facilities can be linked to RMP Info and ICIS
RESPONSIBLE PARTIES: OEM; OECA

NO. 12
INDICATOR: Effects of RMP inspections on facilities that are not inspected
DATA NEEDS: Anecdotal examples, supplemented with a review of the literature on general deterrence
DATA SOURCES: Interviews; literature review
DATA GAPS: EPA has not undertaken a systematic study of general deterrence specifically for RMP-regulated facilities
ACTION ITEM: Conduct a study on general deterrence for the RMP Program
RESPONSIBLE PARTIES: OEM; OECA
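Action item 9 combines de-duplication of accident records with a chronology check against the 6-month reporting window discussed in Appendix A. A minimal sketch under the same hypothetical-naming caveat as the sketches above; borderline duplicates would still need manual review:

    # Sketch: collapse duplicate accident records (multiple RMP Info entries
    # for the same event) and flag accidents reported after an inspection but
    # occurring before it, given the 6-month reporting window. Names hypothetical.
    import pandas as pd

    acc = pd.read_csv("rmp_accidents.csv", parse_dates=["accident_date", "report_date"])

    # Treat records with the same facility and accident date as one event,
    # keeping the most recent submission.
    acc = acc.sort_values("report_date").drop_duplicates(
        subset=["facility_id", "accident_date"], keep="last"
    )

    insp = pd.read_csv("icis_inspections.csv", parse_dates=["inspection_date"])

    # Pair each accident with every inspection at the same facility.
    merged = acc.merge(insp, on="facility_id", how="inner")

    # An accident that predates the inspection but was reported afterward
    # should not be counted as a post-inspection accident.
    merged["misordered"] = (
        (merged["accident_date"] < merged["inspection_date"])
        & (merged["report_date"] >= merged["inspection_date"])
    )
    print(merged["misordered"].sum(), "accident reports need chronology review")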
-------