EPA
              United States
              Environmental Protection
              Agency
           Office of
           Solid Waste and
           Emergency Response
DIRECTIVE NUMBER:  9200.2-2
TITLE: Superfund Evaluation Handbook, Fiscal Year 1987
              APPROVAL DATE:   December 29, 1986
              EFFECTIVE DATE:  December 29, 1986
              ORIGINATING OFFICE:  OERR/OPM/BIM
              STATUS:  FINAL
              REFERENCE (other documents):

-------
EPA
                         United States Environmental Protection Agency
                               Washington, DC 20460
                   OSWER Directive Initiation Request
                                           1. Directive Number
                                        9200.2-2
                                 2. Originator Information
   Name of Contact Person
     Terry Ouverson
   Mail Code
     UH-548D
   Office
     OERR/OPM/BIM
   Telephone Number
   3. Title
     The Superfund Program Evaluation Handbook
     Fiscal Year 1987
   4. Summary of Directive (Include brief statement of purpose)
     The manual outlines procedures and measures for use in evaluating program
     performance in FY 1987.
-------
EPA
            United States
            Environmental Protection
            Agency
              Office of Emergency and
              Remedial Response
              Washington DC 20460
OSWER Directive 9200.2-2
November 1986
            Superfund
Superfund Program
Evaluation Handbook
            Fiscal Year  1987
            Final

-------
                                                       OSWER Directive 9200.2-2
                     SUPERFUND PROGRAM EVALUATION HANDBOOK


                               Fiscal Year 1987
Office of Solid Waste and Emergency Response
U.S. Environmental Protection Agency
Washington, DC  20460

-------
                                                       OSWER Directive 9200.2-2
                                 CONTENTS
                                                              Page
I.   OVERVIEW OF THE EVALUATION FRAMEWORK                       1
     The Evaluation Framework and the Management Process        3
II.  THE EVALUATION MEASURES AND THEMES                         4

     1.  FY 87 Accountability Measures                          5
     2.  FY 87 Qualitative Themes                               6
III. THE EVALUATION PROCESS                                     6

     1.  Quarterly Reporting                                    7

         a.  Development of Accountability Measures             7
         b.  Tracking and Assessment                             8
         c.  Follow-up Action                                   8

     2.  Regional Self-Evaluation Reports                       9

     3.  Annual Regional Reviews                               10

     4.  Focused Studies                                       11


APPENDIX A.  FY 1987 ACCOUNTABILITY MEASURES

APPENDIX B.  FY 1987 QUALITATIVE THEMES AND QUESTIONS

-------
                                                       OSWER Directive 9200.2-2
I.  OVERVIEW OF THE EVALUATION FRAMEWORK

     The Superfund Evaluation Handbook offers a comprehensive framework for
reviewing program performance.  It is designed to consolidate the program's
current evaluation activities while opening new avenues of program dialogue.
Both Fund-financed and enforcement activities are included under this broad
umbrella.

     The evaluation process involves four basic tools:
     •  Quarterly reporting and assessment of accomplishments;

     •  Annual self-evaluation reports prepared by the Regions;

     •  Follow-up Regional review visits conducted by the Office
        of Solid Waste and Emergency Response (OSWER); and

     •  Focused studies on issues requiring further analysis.
     For the most part, this structure is already in place.  Only the
Regional self-evaluation report is a completely new requirement.

     The next page provides a summary of these core elements.  It is
followed by a diagram of the evaluation timeframe and a matrix showing
the roles of key program offices.

-------
                                                   OSWER Directive 9200.2-2
           Core Elements of the Evaluation Framework
Quarterly Reporting

     Senior OSWER management reviews each Region's accomplishments
against a set of accountability measures established at the
beginning of the fiscal year.  The accountability measures include
all SPMS targets and most activities reported in the SCAP.
Regional Self-Evaluation Reports

     Once a year, each Region prepares a written self-evaluation
report addressing selected areas of key importance.  These areas,
or "qualitative themes," are specified in the Handbook published
at the start of the FY.  The report is reviewed at Headquarters
prior to the Regional review visit, which examines any strengths
or weaknesses that may be apparent.
Regional Review Visits

     The Office of Solid Waste and Emergency Response (OSWER)
administers the annual Regional review visits that follow the
self-evaluation.  Each review includes a series of management-
level meetings and discussions on subjects identified through
the reports or by other means (separate sessions are held for
the RCRA program).  In the concluding senior management session,
agreement is sought on further steps to be taken by HQ or the
Region.  File reviews may be performed to verify information
supplied in the self-evaluation report.
Focused Studies

     The program may initiate studies of broader scale on issues
requiring extensive, in-depth analysis (e.g., the quality and
timeliness of RI/FS).  Focused studies may be an outgrowth of
Regional review visits, quarterly data analysis, or any other
interaction between Regions and HQ.  Staff from a variety of
Regional and HQ offices may be involved.

-------
                                  EVALUATION PROCESS TIMELINE

     [Timeline chart spanning two fiscal years (198X and 198X+1, October through
     September), showing the process planning phase, the process implementation
     phase, and the Regional self-evaluations.  Milestones shown: the draft
     Evaluation Handbook, the final Evaluation Handbook, and the quarterly
     reporting process (quantitative measures).]

     1.  Develop SCAP/accountability measures and performance expectations for the upcoming FY
     2.  Negotiate and establish targets for SCAP/accountability measures
     3.  Complete revisions to targets, measures, and expectations.
-------
                                    ROLES AND RESPONSIBILITIES MATRIX

 OSWER
    - Coordinate quarterly review of program performance on accountability measures
    - Present significant findings and issues to the Administrator and/or Deputy Administrator
    - Administer annual Regional review process

 OERR/OPM and OWPE
    - Coordinate the development of accountability measures, self-evaluation themes and related
      questions
    - Assemble quarterly evaluation data from HQ program divisions and present findings to the
      Directors of OERR and OWPE, and the Assistant Administrator
    - Coordinate HQ review of Regional self-evaluations and HQ involvement in OSWER annual
      reviews
    - Monitor follow-up actions and focused studies

 HQ PROGRAM DIVISIONS
    - Collect and review quarterly evaluation data reported by Regions
    - Review and analyze Regional self-evaluations and participate in annual Regional reviews
    - Initiate or participate in follow-up actions and focused studies

 REGIONS
    - Provide quarterly evaluation data to Headquarters
    - Prepare annual self-evaluations and participate in annual Regional reviews
    - Provide input to the development of accountability measures, self-evaluation themes and
      related questions
    - Assist in follow-up actions and focused studies
-------
                                                        OSWER Directive 9200.2-2
The Evaluation Framework and the Management Process


     In a broad context, evaluation may be viewed as one major component
of the Superfund management and planning cycle.  The management framework
also includes the:  (1) Agency Operating Guidance, which spells out the
program's broad goals for the planning fiscal year; (2) program budget,
which identifies resource needs for the program; (3) workload model,
which allocates staff among the Regions; (4) Strategic Planning and
Management System (SPMS), which records key accomplishments for the
attention of the AA and Administrator; and (5) Superfund Comprehensive
Accomplishments Plan (SCAP), which identifies site-specific actions
needed to support the budget, workload model, SPMS and other program
management decisions.

     The planning process begins early in the operating fiscal year with
development of the Operating Guidance, which highlights major policy and
program initiatives and provides Regions and States with directions for
carrying out environmental programs.  The Guidance closely reflects the
Administrator's management plan and the priorities of each Assistant
Administrator.

     These guidelines are refined as the program budget is developed.
Budget preparation involves the efforts of Headquarters and Regional
staff in revising cost estimates to reflect new program initiatives.  In
Superfund, a large part of the budget is based on site schedules contained
in the SCAP.

     From January through May, the Operating Guidance is revised, the
budget adjusted to reflect Congressional action and other changes (such
as new site schedules and cost estimates), and measures developed for
the SPMS.  During the same period, the SCAP is also revised to focus on
site-specific resource needs and schedules for the planning fiscal year.
The preliminary SCAP is used in allocating FTE's among the Regions
and setting preliminary SPMS commitments.

     As the new fiscal year approaches, the SCAP is again updated to
reflect changes to site schedules and cost estimates.  Final SPMS commit-
ments are based on this revised document.  During the year, the SCAP is
updated quarterly to reflect actual accomplishments, anticipated emergency
actions for the upcoming quarter, and schedule changes for long-term
remedial work.

     Evaluation measures are developed in parallel with those of the
SCAP and SPMS.  Beginning in January, SCAP and SPMS measures are revised
to reflect the new Operating Guidance and budget.  At the same time,
the Handbook's accountability measures are assessed for consistency
with the SCAP as work begins on a new set of qualitative themes.  This
task is completed in September so that HQ and Regions have a common
understanding of their expected roles and accomplishments for the new
fiscal year.

-------
                                                       OSWER Directive 9200.2-2
      As the year unfolds, the evaluation framework is used to monitor
program performance, guide management decisions on resource allocation,
and identify needed program and policy initiatives.  Conclusions are
factored into the Operating Guidance, budget and SCAP planning for later
fiscal years.
II.  THE EVALUATION MEASURES AND THEMES

     Program evaluation serves two distinct, but related, purposes: to
measure specific outputs against established goals, and to assess the
overall effectiveness of program operations.  Accordingly, the Handbook
contains:
     •  Accountability measures which establish performance expectations
        for key program activities, such as RI/FS starts, RA completions,
        and NPL sites with a completed PRP search;

     •  Qualitative themes which focus on program effectiveness in areas
        of policy implementation, program management, and cost and
        schedule control.
     The accountability measures are carefully selected to give a
balanced picture of the program.  They  include all SPMS measures, which
are reviewed by senior Agency management, and some measures based on SCAP
activities, which are monitored by OSWER.  Each evaluation measure has
a performance expectation based on the  associated SPMS/SCAP target
(e.g., 100% of planned RI/FS starts).   If the program is to be successful,
the Regions must have a strong record of accomplishment against these
goals.  Following each quarter, the Regional Administrator will receive
an assessment of the Region's performance.

     In some situations, evaluation targets may be missed for reasons that
are difficult to control or foresee.  The evaluation will take into account
any circumstances that may have impeded performance.  Thus, the failure to
meet a particular target will not necessarily result in a negative
assessment.

     The qualitative themes provide a general framework for Regional/HQ
dialogue on program analysis and improvement.  They serve to focus the
self-evaluation exercise and Regional reviews around a well-defined set
of key program issues.  As the themes are not output measures, they are
not accompanied by quantified performance expectations.

-------
                                                       OSWER Directive 9200.2-2
     1.  FY 87 Accountability Measures

     For FY 87, accountability measures have been developed in seven
major program areas: Pre-Remedial, Remedial, Enforcement (Pre-Enforcement,
Judicial Enforcement and Cost Recovery), Removal, Financial Management,
Analytical Support, and Community Relations.  Appendix A contains the full
list of measures.  Some examples appear below:
                                                        Performance
              Measure                                   Expectation

     - Number of site inspections completed             100% of SCAP/
       (Pre-Remedial #2)                                SPMS target

     - Number of subsequent remedial design             100% of SCAP
       starts at NPL sites (Remedial #9)                target

     - Number of Fund-financed first removal            +/- 20% of
       starts at NPL sites (Removal #1a)                Fund-financed
                                                        SPMS subtarget

     - NPL sites with completed PRP search              100% of Final
       (Enforcement #1)                                 SCAP NPL Update
                                                        target


     For some key program activities (e.g., RI/FS starts), the Handbook
incorporates the SPMS/SCAP combined measure for Fund-financed and PRP-
financed activity.  These measures may be accompanied by subtargets,
established through SCAP, for Program (or Fund-financed), Enforcement,
and PRP response, with a percentage range or "window" as the performance
expectation.  For example, while Regions are asked to achieve 100% of the
combined target for RI/FS starts, they are only expected to come within
+ or - 20% of the Program-lead subtarget (also, there is an extra margin
of one site for all "window" subtargets).  This approach allows for lead
shifts, such as "PRP takeovers," or site substitutions that are not
anticipated when SCAP/SPMS commitments are set.  Combined measures are
subject to stricter accountability than are subtargets.
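
     To illustrate the arithmetic of a "window" expectation, the sketch below
(written in Python, using hypothetical numbers) checks whether a Region's actual
count falls inside a subtarget window of plus or minus 20 percent, widened by
the one-site margin noted above.  It is offered only as an illustration of the
calculation, not as part of the Handbook's formal reporting procedures.

     def within_window(actual, subtarget, pct=0.20, site_margin=1):
         """Check whether an accomplishment falls inside a subtarget "window".

         Illustrative only: the window is read here as the subtarget plus or
         minus 20 percent, widened by an extra margin of one site on each
         side, as described in the Handbook text above.
         """
         allowance = subtarget * pct + site_margin
         return (subtarget - allowance) <= actual <= (subtarget + allowance)

     # Hypothetical example: a Program-lead RI/FS subtarget of 10 starts.
     # 8 starts falls inside the window (lower bound 10 - (2 + 1) = 7), so it
     # would meet the expectation even though the point target was missed.
     print(within_window(8, 10))   # True
     print(within_window(6, 10))   # False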

    Quantitative measures are reviewed and updated annually to reflect
changes in program direction, emphasis and commitments.  This occurs in
conjunction with established procedures for revising SCAP and SPMS
measures.

-------
                                                         OSWER Directive 9200.2-2
     2.  FY 87 Qualitative Themes

     The qualitative dimension of the review is based on the set of key
themes and issues outlined in Appendix B.  Taken together,  the themes
touch on most major components of the Superfund program.
                           FY 87 Qualitative Themes

                 Implementing Superfund Reauthorization
                 Conducting Pre-Enforcement Activity
                 Negotiating PRP Settlements
                 Determining Health-based Cleanup Approaches
                 Selecting Permanent, Cost-Effective Solutions
                 Improving PRP and Fund-financed Project Management
                 Coordinating EPA/State Enforcement Efforts
                 Delivering Analytic Services
                 Improving Communication with the Public
     In the written self-evaluation exercise, described in Section
III, Regions are asked to assess their performance in these thematic
areas.  The Handbook specifies major points that should be addressed
and provides detailed questions that Regions may wish to refer to in
developing the discussion (Appendix B).  However, Regions are not
required to use the questions.  Some of the questions are also used to
guide discussions in the annual review.

     Qualitative themes are updated each year.  In the third quarter of
the operating fiscal year, the Office of Emergency and Remedial Response
(OERR) and the Office of Waste Programs Enforcement (OWPE) work together
to develop draft questions for the following fiscal year.  The draft is
circulated among Regions and at HQ for review and comment.  Assistance
is provided by HQ program offices and the Management Advisory Committee
(MAC) in defining program priorities and framing individual themes and
questions.  The MAC serves as a HQ/Regional advisory group in the areas
of program management and planning, program evaluation, and information
systems.
III.  THE EVALUATION PROCESS

     As discussed, the evaluation framework includes four main elements:
quarterly reporting (on accountability measures), Regional self-evaluation
reports (on qualitative themes), follow-up Regional review visits, and
focused studies.

-------
                                                         OSWER Directive 9200.2-2
1.   Quarterly Reporting

     Program performance against accountability measures is assessed on
a quarterly basis.  At the close of each quarter, Regions supply accom-
plishment data to Headquarters through the SCAP updates.  The information
is then verified and interpreted by OERR and OWPE with input from the
Regions.  Follow-up communication, if needed, may involve informal
discussions, written letters, or meetings.

     The evaluation process involves no new reporting requirements.
Evaluation data are reported through the regular SCAP process, while
accomplishment targets mirror fiscal year and quarterly SCAP commitments.


     (a) Development of Accountability Measures

     Evaluation measures are developed in parallel with the major phases
of the planning cycle:

     •  During the first quarter of the operating fiscal year, HQ and
Regions conduct preliminary discussions to identify potential changes
to SCAP methodologies, items and definitions.  This lays the groundwork
for the development of accountability measures.

     •  In the second quarter, draft SCAP methodologies for the planning
FY are reviewed at Headquarters and Regions.  Included with the method-
ologies are definitions of all SCAP measures and tentative activity
targets based on the preliminary SCAP.  A draft set of evaluation
measures is also forwarded for review.

     •  Regions and HQ negotiate final SCAP commitments in the early
summer.  This process also leads to final accountability measures
reflecting the SCAP revisions.

     •  Evaluation measures are incorporated into the Handbook for
publication by the beginning of October.  The Superfund planning and
evaluation cycle starts again in the new fiscal year.


     Once the Handbook is issued, the evaluation measures remain in effect
for the full fiscal year.  In some cases, however, particular SCAP targets
may be revised through the SCAP amendment process; for example, the RI/FS
start target for a Region might be changed from 12 to 11.  This would
not change the Handbook's performance expectation; instead, the expectation
would simply apply to the new target (e.g., 100% of the new RI/FS start
target of 11).

-------
                                                         OSWER Directive 9200.2-2
                                     8
     (b) Tracking and Assessment

      Until the consolidated CERCLIS becomes operational, Regions will
continue to report quarterly accomplishments through the integrated SCAP,
CERCLIS, Case Management System (CMS), and Remedial/Removal Financial
Management System (RRFS).  All SCAP data should be reported to HQ
within four working days from the end of the quarter, as spelled out
in the SCAP guidance.  In a few cases, additional calculations are needed
to convert the SCAP data to the needed form (e.g., for Remedial measures
#5 or #6, the percentage of RI/FS completed on schedule will be derived
from SCAP data on RI/FS completions and the completion date).  Any
extra steps are performed by Headquarters.
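
     As an illustration of this derivation step, the sketch below (written in
Python, with hypothetical data) shows how a percentage of on-schedule RI/FS
completions could be computed from planned and actual completion quarters.  It
is a simplified illustration only; the actual conversion is performed by
Headquarters from SCAP data.

     def pct_on_schedule(completions, slip_allowed=0):
         """Percent of RI/FS completions finished on schedule.

         Illustrative sketch only.  `completions` is a list of
         (planned_quarter, actual_quarter) pairs drawn from hypothetical
         SCAP-style data; quarters are numbered within the fiscal year.
         With slip_allowed=0 this mirrors measure #5 (completed in the
         planned quarter); slip_allowed=1 mirrors measure #6 (within one
         quarter of the planned quarter).
         """
         if not completions:
             return 0.0
         on_time = sum(1 for planned, actual in completions
                       if actual - planned <= slip_allowed)
         return 100.0 * on_time / len(completions)

     # Hypothetical Regional data: three completions, one slipping a quarter.
     data = [(2, 2), (3, 4), (4, 4)]
     print(pct_on_schedule(data))                  # about 66.7 (measure #5)
     print(pct_on_schedule(data, slip_allowed=1))  # 100.0 (measure #6)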

      Within five weeks from the end of the quarter, OERR and OWPE
assemble the reporting data supplied by the Regions and prepare a report
for the respective program Directors.  Highlighted in the report are
significant accomplishments, patterns, or issues involving program
performance.  Before presenting the report, the divisions check the
figures for completeness and accuracy, with assistance from the Regions.
At this stage, Regions have full opportunity to provide any information
they feel should be considered as the data is reviewed and interpreted.
The findings are presented to the Assistant Administrator, OSWER, and
a written summary of the data is provided to the Region.

     These procedures will be streamlined when the planned changes to
CERCLIS are put into effect.  Implementation of the new system, scheduled
for FY 87, will allow consolidated reporting and analysis of data for
SCAP and SPMS.  Most needed information will then be available from a
single source that can be accessed by Regions and Headquarters.  It is
emphasized, however, that evaluation data can be obtained from present
reporting arrangements as the unified system is developed.


     (c) Follow-Up Action

     After quarterly data have been reviewed, there may be follow-up
communication on performance issues of interest or concern.  Very often,
this may be limited to informal discussions to verify data (if performance
expectations are achieved, such action may be waived entirely).  Program
offices are responsible for initiating follow-up.

    A more formal response is generally reserved for matters of serious
disagreement or concern.  Situations of this nature may occur if  (1) a
particular target is missed in successive quarters;  (2) actual performance
falls far below expectation; (3) a large number of targets are missed;
or (4) there is a common pattern or problem affecting several Regions.
Missing a particular target generally does not trigger this level of
response except in one of the listed cases.

-------
                                                      OSWER Directive 9200.2-2
     Follow-up action starts with a written request for information or
clarification and an offer of assistance in resolving any outstanding
issues.  The communication emphasizes the need for a cooperative approach
involving both HQ and Regions.  Regions are invited to express their
views freely and to identify actions needed from Headquarters, such as
policy clarification.

     Problems that are persistent or pervasive may be addressed through
focused studies on specific topics or through other follow-up actions.
The annual OSWER Regional visits, while focused on qualitative issues,
may provide some opportunity for reviewing progress.


     2.  Regional Self-Evaluation Reports

     Qualitative questions are best explored through the free exchange
of information and opinion.  This is achieved by means of a two-stage
process that blends data presentation with open discussion.

     •  On an annual basis, each Region prepares a written self-
        evaluation report addressing major themes outlined in the
        Evaluation Handbook;

     •  After HQ review of the report, a follow-up visit is conducted
        through the OSWER annual review process to discuss any issues
        that may be evident.

     The self-evaluation exercise allows Regions to present their own
perspective on the program while preparing for the OSWER review.  At
the same time, the review team is able to examine these presentations
in light of information available at HQ or requested from Regions.  The
review may also take up any issues remaining from the quarterly
reporting process.

     The evaluation cycle begins when the Handbook, containing the major
program themes, is published at the start of the fiscal year.  For each
theme,  Regions develop a written assessment of their performance,
explaining significant actions, accomplishments, and problems.  The
response should be built around the specific qualitative issues
related to the theme (Appendix B).  For example:
               Theme                        Specific Issues

     • Selecting permanent, cost-     • Use of alternative treatment/
       effective solutions              destruction technologies in
                                        remedial and removal response

                                      • Weighing cost and effectiveness
                                        in selecting a cleanup approach

-------
                                                         OSWER Directive. 9200.2-2

                                    10
     This framework ensures that Regional reports will be reasonably
consistent in format.  Beyond these basic requirements, Regions may
choose any response approach they consider useful and appropriate.
In developing the response, Regions are free to use the optional guideline
questions referenced in the list of themes/issues (Appendix B).  These
questions touch on specific points that may be useful in focusing the
discussion.  The self-evaluations should be succinct and well-focused;
quality is a more important factor than length.

     The Regional evaluation/review cycle generally begins in the first
quarter of the fiscal year.  Regions are reviewed sequentially over a
period of seven months or more.  This staggered timeframe allows all
self-evaluation reports and follow-up reviews to be performed in a
standard, two-stage sequence.  In addition, any extra work involved in
reviewing the self-evaluations is spread out evenly over the year.

     To allow time for analysis, each Region submits the completed self-
evaluation no later than six weeks before the scheduled date of the
review visit.  OSWER then reviews the report and develops a proposed
agenda for the review, outlining the schedule for individual sessions
and the specific topics to be addressed.  These topics are based on the
self-evaluation exercise, additional questions from Appendix B, issues
specific to a particular Region, or the findings of recent or ongoing
studies.  However, the agenda will not cover every question appearing
in the Handbook.  The proposed agenda is finalized after agreement on
format and scheduling.

     If the self-evaluation appears inconsistent with other program
data, OSWER may request that selected records and files be made available
to the review team for data validation.  File reviews are generally
limited to matters of priority and are discussed in advance with Regional
program managers.  In most cases, the inspections are performed by
staff-level personnel during the Regional review visits, or in the
preceding week.  Extensive data validation may require a separate visit
prior to the main OSWER review.
     3.  Annual Regional Reviews

     The OSWER Regional review consists of a 3- to 4-day series of
management-level discussions on key issues in the Superfund and RCRA
programs.  After an introductory briefing, concurrent breakout sessions
are held on individual program or policy areas, such as the delivery
of analytic services or judicial enforcement.  The visit concludes with
a senior management session, and a report of the findings is drafted on
location.

-------
                                                      OSWER Directive 9200.2-2
                                    11
     The basic structure of the OSWER review is not changed by any
requirement laid out in the Evaluation Handbook.  As in previous years,
the written agenda serves as the protocol for the review.  Program
offices have primary responsibility for selecting review team members
and work with OSWER on developing schedules and other logistical
arrangements.

     In the senior management session, agreement is sought on follow-up
actions to be performed by Regions and/or HQ, such as the institution of
management controls, expedited preparation of required documents, or
development or clarification of policy.  The agreement is spelled out in
a memorandum signed by the Directors of OERR and OWPE and the appropriate
Regional official within 30 days after the concluding session.  Respon-
sibility for monitoring the assignments is shared by OERR and OWPE.
These offices may also help to synthesize and interpret findings from
the reviews.
     4.  Focused Studies

     Some issues may need to be explored in greater depth following the
annual review.  To meet this need, the program may initiate studies
of varied scope and complexity that focus on identified concerns.  The
project concepts may stem from the OSWER Regional reviews, quarterly
reporting data, OERR/OWPE initiatives, or the FMFIA exercise.  For
illustration, some FY 86 studies examined:
     -  Procedures for ensuring the quality of RI/FS;

     -  Quality control for pre-remedial steps;

     -  Resource requirements for the CERCLA Enforcement Program.
     Depending on its purpose, the focused study may involve staff from
OSWER, HQ or Regional offices, special task forces or workgroups, or
other Agency offices.  Responsibility for coordination, monitoring, and
synthesis may be shared by the lead program office, management support
offices, or other units.

-------
                                      OSWER Directive  9200.2-2
                 APPENDIX A
       FY 1987 Accountability Measures
Accountability measures are defined in the FY
1987 SCAP Methodologies and the listing of SPMS
measures and commitments issued by the Office
of Policy, Planning, and Evaluation.

-------
                                                      OSWER Directive 9200.2-2
                              PRE-REMEDIAL ACTIVITY

OBJECTIVE:  Complete identification, assessment and scoring of hazardous waste
            sites in a timely manner.

                                              TARGET
                                          CURRENTLY IN       PERFORMANCE
       PERFORMANCE MEASURE                SCAP     SPMS      EXPECTATION

    1. Number of preliminary              Yes      Yes       100% of SCAP/
       assessments (PAs) completed                           SPMS target

    2. Number of site inspections         Yes      Yes       100% of SCAP/
       (SIs) completed                                       SPMS target

    3. Number of expanded site            Yes      No        100% of SCAP
       inspections completed                                 target

-------
                                                       OSWER Directive 9200.2-2
                               REMEDIAL PROGRAM

OBJECTIVE:  Develop and implement effective cleanup approaches at
            uncontrolled hazardous waste sites that threaten human
            health or the environment.

                                              TARGET
                                          CURRENTLY IN       PERFORMANCE
       PERFORMANCE MEASURE                SCAP     SPMS      EXPECTATION

  * 1. Number of first RI/FS starts       Yes      Yes       100% of combined SPMS
       (Fund-financed + PRP +                                target
       Federal facilities)

       a. Number of program-lead          Yes      No        ± 20% of program-lead
          first RI/FS starts at                              RI/FS target in final
          NPL sites                                          SCAP

       b. Number of enforcement-          Yes      No        100% of enforcement-
          lead first RI/FS starts                            lead RI/FS target in
          at NPL sites                                       final SCAP ± any lead
                                                             shifts or substitutions

       c. Number of PRP-lead first        Yes      No        100% of PRP-lead RI/FS
          RI/FS starts at NPL sites                          target in final SCAP
                                                             ± any lead shifts or
                                                             substitutions

    2. Number of subsequent RI/FS         Yes      Yes       100% of SCAP/SPMS
       starts at NPL sites                                   target

    3. Number of first RI/FS              Yes      Yes       100% of SCAP/SPMS
       completions at NPL sites                              target
       (ROD/EDD)

           * Combined targets are subject to stricter accountability
             than are subtargets for program, enforcement or PRP
             response.

-------
                                                      OSWER Directive 9200.2-2
                               REMEDIAL PROGRAM
                                              TARGET
                                          CURRENTLY IN       PERFORMANCE
       PERFORMANCE MEASURE                SCAP     SPMS      EXPECTATION

    4. Number of subsequent RI/FS         Yes      Yes       100% of SCAP/SPMS
       completions at NPL sites                              target

  † 5. Percent of RI/FS completed         No       No        100% of planned RI/FS
       in the planned completion                             completions identified
       quarter                                               in the final SCAP

  † 6. Percent of RI/FS completed         No       No        100% of planned RI/FS
       within one quarter of the                             completions identified
       planned completion quarter                            in the preliminary SCAP

    7. Number of expedited response       Yes      No        100% of SCAP target
       actions (ERAs) completed

  * 8. Number of first RD starts          Yes      No        100% of SCAP target
       at NPL sites

       a. Number of Fund-financed         Yes      No        ± 20% of Fund-financed
          first RD starts at NPL                             first RD start target
          sites                                              in final SCAP

       b. Number of PRP-financed          Yes      No        100% of PRP-financed
          first RD starts at NPL                             first RD start target
          sites                                              in final SCAP ± any
                                                             lead shifts or
                                                             substitutions

          † Accomplishments are evaluated against both the final
            SCAP (#5) and the preliminary SCAP (#6).  The latter
            is the basis for FTE allocation.

          * Combined targets are subject to stricter accountability
            than are subtargets for Fund or PRP response.

-------
                             REMEDIAL PROGRAM
                                                       OSWER Directive 9200.2-2
                                              TARGET
                                          CURRENTLY IN       PERFORMANCE
       PERFORMANCE MEASURE                SCAP     SPMS      EXPECTATION

    9. Number of subsequent RD            Yes      No        100% of SCAP target
       starts at NPL sites

   10. Number of remedial designs         Yes      No        100% of SCAP target
       completed at NPL sites

 † 11. Percent of RDs completed in        No       No        100% of planned RD
       the planned completion                                completions identified
       quarter                                               in final SCAP

 † 12. Percent of RDs completed           No       No        100% of planned RD
       within one quarter of the                             completions identified
       planned completion quarter                            in the preliminary SCAP

 * 13. Number of first remedial           Yes      Yes       100% of SCAP/SPMS
       action starts at NPL sites                            target

       a. Number of Fund-financed         Yes      No        ± 20% of SCAP target
          first RA starts at NPL                             for Fund-financed
          sites                                              first RA

       b. Number of PRP-financed          Yes      No        100% of SCAP target
          first RA starts at NPL                             for PRP-financed
          sites                                              first RA starts

          † Accomplishments are evaluated against both the final
            SCAP (#11) and the preliminary SCAP (#12).  The latter
            is the basis for FTE allocation.

          * Combined targets are subject to stricter accountability
            than are subtargets for Fund or PRP response.

-------
                                                       OSWER Directive 9200.2-2
                              REMEDIAL PROGRAM
                                              TARGET
                                          CURRENTLY IN       PERFORMANCE
       PERFORMANCE MEASURE                SCAP     SPMS      EXPECTATION

   14. Number of subsequent               Yes      Yes       100% of SCAP/SPMS
       remedial actions started at                           target
       NPL sites

   15. Number of remedial actions         Yes      Yes       100% of SCAP/SPMS
       completed at NPL sites                                target

 † 16. Percent of RAs completed in        No       No        100% of planned RA
       the planned completion                                completions identified
       quarter                                               in final SCAP

 † 17. Percent of RAs completed           No       No        100% of planned RA
       within one quarter of the                             completions identified
       planned completion quarter                            in the preliminary SCAP

   18. Number of sites where all          No       Yes       100% of SPMS target
       remedial implementation has
       been completed

   19. Number of sites where the          Yes      Yes       100% of SCAP/SPMS
       NPL deletion process has                              target
       been initiated

          † Accomplishments are evaluated against both the final
            SCAP (#16) and the preliminary SCAP (#17).  The latter
            is the basis for FTE allocation.

-------
                                                          OSWER Directive 9200.2-2
                              ENFORCEMENT ACTIVITIES

OBJECTIVE:  Achieve responsible party financed response or recover
            funds expended by the CERCLA Program at sites where
            responsible parties failed to conduct all needed
            response.

                                              TARGET
                                          CURRENTLY IN       PERFORMANCE
       PERFORMANCE MEASURE                SCAP     SPMS      EXPECTATION

 Pre-Enforcement/Administrative

    1. NPL sites with completed           Yes      No        100% of final SCAP
       PRP search                                            NPL Update target

    2. Non-NPL sites with completed       Yes      No        100% of final SCAP
       PRP search                                            first start removal
                                                             target at non-NPL
                                                             sites

 Judicial Enforcement

    1. §106, §107, §7003 Case             Yes      No        100% of final SCAP
       Resolution/Trial                                      target

    2. §106 RD/RA Referrals w/o           Yes      No        100% of final SCAP
       settlements                                           target

 Cost Recovery

    1. Removal Cost Recovery cases        Yes      Yes       100% of final SCAP
       >$500,000 referred to                                 target
       Headquarters

    2. Remedial Cost Recovery cases       Yes      Yes       100% of final SCAP
       referred to Headquarters                              target

-------
                                                         OSWER Directive 9200.2-2
                                REMOVAL PROGRAM

OBJECTIVE:  Provide appropriate Federal response to releases of hazardous
            substances or the threat of such releases.

                                              TARGET
                                          CURRENTLY IN       PERFORMANCE
       PERFORMANCE MEASURE                SCAP     SPMS      EXPECTATION

  * 1. Number of first removal            Yes      Yes       100% of SPMS
       starts at NPL sites                                   combined target
       (Fund-financed + PRP)

       a. Number of Fund-financed         Yes      No        ± 20% of Fund-
          first removal starts at                            financed subtarget
          NPL sites

       b. Number of PRP-financed          Yes      No        100% of PRP sub-
          first removal starts at                            target ± any lead
          NPL sites                                          shifts or substi-
                                                             tutions

    2. Number of removal restarts         Yes      Yes       100% of SPMS
       at NPL sites (Fund-financed                           combined measure
       + PRP)

  * 3. Number of NPL sites                Yes      Yes       100% of SPMS/
       stabilized through a                                  SCAP target
       removal action

       a. Number of NPL sites             Yes      No        ± 20% of Fund-
          stabilized through a                               financed subtarget
          Fund-financed removal
          action

       b. Number of NPL sites             Yes      No        100% of PRP sub-
          stabilized through a                               target ± any lead
          PRP-financed removal                               shifts or substi-
          action                                             tutions

             * Combined targets are subject to stricter accountability
               than are subtargets for Fund or PRP response.

-------
                                                         OSWER Directive 9200.2-2
                                REMOVAL PROGRAM

                                              TARGET
                                          CURRENTLY IN       PERFORMANCE
       PERFORMANCE MEASURE                SCAP     SPMS      EXPECTATION

    4. Number of first removal            Yes      No        At least 90%
       starts at non-NPL sites                               of SCAP measure
       (Fund-financed + PRP)

       a. Number of Fund-financed         Yes      No        ± 20% of Fund-
          first removal starts at                            financed submeasure
          non-NPL sites

       b. Number of PRP-financed          Yes      No        100% of PRP sub-
          first removal starts at                            target ± any lead
          non-NPL sites                                      shifts or substi-
                                                             tutions

    5. Number of removal restarts         Yes      No        At least 90% of
       at non-NPL sites (Fund-                               SCAP measure
       financed + PRP)

    6. Number of completed removal        Yes      No        At least 90% of
       actions at non-NPL sites                              SCAP target
       (Fund-financed + PRP)

       a. Number of Fund-financed         Yes      No        ± 20% of Fund-
          removal completions at                             financed subtarget
          non-NPL sites

       b. Number of PRP-financed          Yes      No        100% of PRP sub-
          removal completions at                             target ± any lead
          non-NPL sites                                      shifts or substi-
                                                             tutions

             * Combined targets are subject to stricter accountability
               than are subtargets for Fund or PRP response.

-------
                                                         OSWER Directive 9200.2-2


                                REMOVAL PROGRAM

                                              TARGET
                                          CURRENTLY IN       PERFORMANCE
       PERFORMANCE MEASURE                SCAP     SPMS      EXPECTATION

    7. Number of sites where a            Yes      No        At least 90% of
       removal investigation has                             SCAP measure
       been completed

    8. Number of hazardous                Yes      No        At least 90% of
       substance spill cleanups                              SCAP measure
       monitored on-scene
       (excluding oil spill
       cleanups performed with
       CWA §311 funds)

            ° These measures allow the program's planning and
              budgeting assumptions to be assessed through the
              evaluation process.  Headquarters and Regions
              share the responsibility for developing accurate
              projections.

-------
                                                         OSWER Directive 9200.2-2
                              OIL SPILL PROGRAM

OBJECTIVE:  Provide appropriate Federal response to oil spills under
            authority of the Clean Water Act, §311.

                                              TARGET
                                          CURRENTLY IN       PERFORMANCE
       PERFORMANCE MEASURE                SCAP     SPMS      EXPECTATION

    1. Number of oil spill cleanups       Yes      No        At least 90% of
       monitored on-scene                                    SCAP measure

    2. Number of oil spill cleanups       Yes      No        At least 90% of
       completed by Region using                             SCAP measure
       CWA §311 funds

    3. Number of SPCC inspections         Yes      No        At least 90% of
       completed                                             SCAP measure

            ° These measures allow the program's planning and
              budgeting assumptions to be assessed through the
              evaluation process.  Headquarters and Regions
              share the responsibility for developing accurate
              projections.

-------
                                                           OSWER Directive 9200.2-2
                            FINANCIAL MANAGEMENT

OBJECTIVE:  Effectively manage the resources of the response program.

                                              TARGET
                                          CURRENTLY IN       PERFORMANCE
       PERFORMANCE MEASURE                SCAP     SPMS      EXPECTATION

    1. Percent of Regional SCAP           No       No        Obligation targets
       remedial plan obligated                               by end of quarter:
       (YTD)
                                                             1st: at least 90%
                                                             of 1st quarter
                                                             obligation target
                                                             in final SCAP, + any
                                                             SCAP amendments
                                                             approved through the
                                                             quarter

                                                             2nd: at least 90%
                                                             of cumulative 2nd
                                                             quarter obligation
                                                             target in final
                                                             SCAP, + any SCAP
                                                             amendments approved
                                                             through the quarter

                                                             3rd: at least 90%
                                                             of cumulative 3rd
                                                             quarter obligation
                                                             target in final
                                                             SCAP, + any SCAP
                                                             amendments approved
                                                             through the quarter

                                                             4th: 100% of cumula-
                                                             tive 4th quarter
                                                             obligation target in
                                                             final SCAP, + any
                                                             SCAP amendments
                                                             approved through
                                                             9/1/87

-------
                                                          OSWER Directive 9200.2-2
                         FINANCIAL MANAGEMENT
                                              TARGET
                                          CURRENTLY IN       PERFORMANCE
       PERFORMANCE MEASURE                SCAP     SPMS      EXPECTATION

    2. Percent of Regional removal        No       No        Obligation targets
       SCAP plan obligated (YTD)                             by end of quarter:

                                                             1st: at least 85% of
                                                             1st quarter obligation
                                                             target in final SCAP,
                                                             adjusted to exclude
                                                             planned obligations
                                                             for sites that shifted
                                                             to RP response during
                                                             1st quarter

                                                             2nd: at least 85% of
                                                             2nd quarter obligation
                                                             target in final SCAP,
                                                             adjusted to exclude
                                                             planned obligations
                                                             for sites that shifted
                                                             to RP response during
                                                             2nd quarter

                                                             3rd: at least 85% of
                                                             3rd quarter obligation
                                                             target in final SCAP,
                                                             adjusted to exclude
                                                             planned obligations
                                                             for sites that shifted
                                                             to RP response during
                                                             3rd quarter

                                                             4th: at least 95% of
                                                             4th quarter obligation
                                                             target in final SCAP,
                                                             adjusted to exclude
                                                             planned obligations
                                                             for sites that shifted
                                                             to RP response during
                                                             4th quarter

-------
                                                         OSWER Directive 9200.2-2


                            FINANCIAL MANAGEMENT

                                              TARGET
                                          CURRENTLY IN       PERFORMANCE
       PERFORMANCE MEASURE                SCAP     SPMS      EXPECTATION

       Disbursement of Regional           Yes      No        ± 20% of SCAP
       response obligations                                  target
       (intramural and extramural
       TFAY9A)

-------
                                                          OSWER Directive 9200.2-2


                            ANALYTICAL SUPPORT PROGRAM


OBJECTIVE:  Maintain capacity, quality and timeliness of analytical support.

                                              TARGET
                                          CURRENTLY IN       PERFORMANCE
       PERFORMANCE MEASURE                SCAP     SPMS      EXPECTATION

    1. Percent of projected samples       Yes      No        Within +10% to
       requested by Region                                   -20% of SCAP
                                                             target

    2. Percent of booked samples          Yes      No        At least 85%
       shipped by the Region to lab                          of the SCAP
                                                             target

-------
                                                        OSWER Directive 9200.2-2
                              COMMUNITY RELATIONS

OBJECTIVE:  Enhance communication between Agency and public at cleanup
            sites and provide opportunity for public input.

                                              TARGET
                                          CURRENTLY IN       PERFORMANCE
       PERFORMANCE MEASURE                SCAP     SPMS      EXPECTATION

    1. a. Number of remedial sites        Yes      No        100% of RI/FS
          where a community                                  start target
          relations plan was                                 in final SCAP
          completed with first
          RI/FS start

       b. Number of remedial sites        Yes      No        100% of RI/FS
          where a public comment                             completion target
          period and responsiveness                          in final SCAP
          summary were completed
          with ROD/EDD signature

       c. Number of remedial sites        Yes      No        100% of NPL
          where a public comment                             deletion target
          period and responsiveness                          in final SCAP
          summary were completed
          with NPL deletion

-------
                                                        OSWER Directive 9200.2-2
                      CHEMICAL EMERGENCY PREPAREDNESS PROGRAM
OBJECTIVE:  Strengthen Federal, State and local preparedness for responding to
            releases of oil and hazardous substances.
                              TARGET
                             CURRENTLY             PERFORMANCE
 PERFORMANCE MEASURE            IN                 EXPECTATION
                             SCAP SPMS
1. Number of localities       Yes  Yes             100% of SPMS
   assisted in conducting                          target
   "Table-Top" and/or
   simulation exercises
   in priority and non-
   priority areas

-------
                                   OSWER Directive 9200.2-2
               APPENDIX B
FY 1987 Qualitative Themes and Questions

-------
                                                      OSWER Directive 9200.2-2


                         REGIONAL SELF-EVALUATION FORMAT

                          Summary of Qualitative Themes


     In the self-evaluation report, Regions are asked to provide a written
assessment of their accomplishments and problems in the following thematic
areas.  Assessments should focus on the specific topics that appear below
each heading.  In addition, the discussion should point out any actions or
assistance needed from Headquarters.

     In developing the discussion, Regions may find it helpful to refer to
the review questions referenced in parentheses.  The questions appear by
program area following this summary.  Their use is optional.



Implementing Superfund Reauthorization

    •  Actions initiated or planned to address:

         •  Designation of state standards, RMCLs, and federal
            water quality criteria as potentially applicable or
            relevant and appropriate  (Remedial questions #3-4)
         •  Administrative record requirements
            (Information Collection and Management question #4)
         •  Removal/remedial consistency  (Removal question #5)
         •  State involvement issues  (Remedial questions #47-50)
         •  Implementation of Chemical Emergency
            Preparedness Program  (CEPP questions #1-4)
         •  Health assessment requirements (Remedial questions #19-20)
         •  Hiring and training needs under reauthorized program
            (Management of Human Resources questions #1-4)

         Note:  Consideration of permanent remedies and alternative
                technologies is addressed under a separate theme.


Conducting Pre-Enforcement Activity
    (Pre-Enforcement and Negotiations questions #1-6)

    •  Procedures for identifying PRPs before initial enforcement
       action

    •  How information on site conditions is collected

    •  How scope of needed response activity is defined in
       communications with PRP

    •  Procedures for informing PRPs of their potential liability
       and evaluating PRP response to notice letters

-------
                                                      OSWER Directive 9200.2-2
Negotiating PRP Settlements
    (Pre-Enforcement and Negotiations questions #7-12)

    •  Factors causing negotiation delays

    •  Use of deadlines to bring negotiations to conclusion

    •  Level of detail sought in PRP settlements


Determining Health-based Cleanup Approaches

    •  How site-specific cleanup goals are set for soil and
       groundwater (Remedial questions #5-7)

    •  Achievement of applicable or relevant and appropriate
       requirements at response actions (Remedial questions #1-2,
       Removal question #10)

    •  Use of environmental models and field data in assessing
       human exposure to site-related contamination
       (Remedial questions #8, 18)

    •  How indicator chemicals and toxicity values are
       obtained (Remedial questions #16-17)


Selecting Permanent, Cost-Effective Solutions

    •  Use of alternative treatment/destruction technologies
       in remedial and removal response (Remedial questions #10-12,
       Removal questions #7-8)

    •  Weighing cost and effectiveness in selecting a
       cleanup approach (Remedial questions #14-15)


Improving PRP and Fund-financed Project Management

    •  Expediting the RI/FS process (Remedial questions #27-28,
       Analytical Support questions #1-4)

    •  Procedures for ensuring that PRP response conforms
       with cleanup standards

    •  Cost control for remedial and removal response
       (Remedial questions #35-38, Removal questions #19-20)

    •  Systems for tracking PRP response (Remedial question #51)

    •  Management of cooperative agreements
       (Remedial questions #40-42)

-------
                                                      OSWER Directive 9200.2-2
Coordinating EPA/State Enforcement Efforts  (Remedial questions #43-46)

    •  How enforcement responsibility is shared between Regions
       and State enforcement agencies

    •  PRP tracking procedures under State settlements

    •  PRP compliance rate for State versus EPA settlements
Delivering Analytic Services  (Analytical Support questions #1-8)

    •  Quality and timeliness of QA work performed by ESDs,
       ESAT or Waste Management Division

    •  Allocating analytical work among CLP contractors,
       ESAT, ESDs, and remedial/removal contractors

    •  Monitoring procedures for sample delivery and analysis
       and QA/QC
Improving Ccmrunication with the Public (Camunity Relations questions #1-4)

    0 Achieving coordination among community relations,
      enforcement and technical staff

    0 Public perceptions of community relations effort



                     QUALITATIVE REVIEW QUESTIONS
     This section provides a listing of issues and questions that
may be addressed in the FY 87 Regional reviews conducted by the
Office of Solid Waste and Emergency Response.  Before each review,
the program will develop a written agenda that identifies the
specific topics to be covered in that Region.  These topics may be
drawn from the Region's self-evaluation report, questions from this
appendix, or other questions specific to individual Regions.

     The questions may also be useful to the Region in developing
the self-evaluation.  Appropriate questions are referenced by number
in the preceding list of self-evaluation themes.

     Regions are not expected to address every question in the self-
assessment or annual review.  The questions are presented as
discussion guides.

                             PRE-REMEDIAL

     1.  What are the Region's quality control procedures for PAs
         and SIs?  Are they being strengthened?

            - FIT review of state-conducted PAs, SIs?

            - Review by Regional staff, including technical
              specialists (e.g., hydrogeologist)?

            - SI training for all reviewers?

     2.  What criteria are considered in making PA/SI decisions and
         priority rankings?  In assigning priority to CERCLIS sites
         for PAs?

     3.  Is the Region performing thorough QC on HRS scores calcu-
         lated by FIT, States and Regional staff?  What proportion
         of HRS scores are substantially revised after MITRE QA?
         How could the accuracy of HRS scoring be improved?

     4.  What is being done to implement the Expanded SI concept?
         How are sites screened for ESI selection?  How well does
         the preliminary HRS calculation predict the actual HRS
         score?

     5.  Do all SIs performed in the Region include analytical
         sampling?  Is the sampling data included in site files?

     6.  How does the Region allocate FIT resources among response
         sites?

7.  On average, how long does the Region take to initiate a
    preliminary assessment after receiving a PA petition
    from the public?  Have all PAs resulting from petitions
    been completed within one year after initiation?

8.  What proportion of petition PAs have led to site
    inspections?  How does this compare to non-petition PAs?
    How often have petition PAs been conducted at sites
    previously recommended for no further action?


            PRE-ENFORCEMENT ACTIVITY and NEGOTIATIONS

1.  How is the accuracy and completeness of PRP search
    information assured?

2.  What procedures exist for utilizing the enhanced
    administrative subpoena authority  in the PRP search?

3.  How are enforcement staff trained  in PRP search techniques?
    Do some PRP search personnel have experience in investi-
    gations?

4.  What criteria are used in site classification?

5.  At enforcement-lead sites, how often has the RI/FS been
    initiated 18 months or more after  the  site is proposed
    for the NPL?  What factors contributed to the delay?

6.  How are information requests and PRP responses used to
    further the PRP search?

7.  Describe the negotiation process  for removal actions,
    RI/FS, and RD/RAs.  What are the main  factors affecting
    the length of negotiations, value  of fixed negotiation
    timeframes, and level of detail provided in the settle-
    ment document?

8.  How does the Region use unilateral orders?  Discuss
    the choice of sites for unilateral orders and any
    negotiations that follow their issuance.

9.  To what extent, and at what times, does the Regional
    Counsel become involved in negotiations?

10. What screening process is used to  identify sites and
    PRPs for negotiation attempts?

11. How is the progress of negotiations assessed and
    monitored? How closely are regional managers  involved
    in negotiations?

12. What training opportunities are available  to staff
    involved in negotiations?  How has staff turnover
    affected the negotiation process?

                          REMEDIAL PROGRAM

Remedy Selection

1.  For RODs/EDDs prepared within the past year, how often  is
    the selected remedy in compliance with applicable or
    relevant and appropriate requirements of the Safe Drinking
    Water Act, RCRA,  and Clean Air Act?

2.  In what situations, if any, has the  Region invoked waivers
    provided in the NCP or the new waivers created under
    reauthorization (inconsistent application of State
    requirements, equivalent standard of performance)?

3.  For RODs/EDDs prepared after the date of reauthorization,
    to what extent is the selected remedy in compliance with
    applicable or relevant and appropriate requirements (ARARs)
    of State law, RMCLs, and federal water quality criteria?
    To what extent have RI/FS in progress been modified to
    ensure attainment of new ARARs?

4.  Has the Region determined whether any RODs/EDDs will need
    to be reopened to ensure that subsequent operable units
    are in compliance with new ARARs?

5.  How has the Region determined soil cleanup goals at remedial
    sites?  In what situations has the Region justified a
    cleanup goal above background through use of a site-specific
    risk assessment or a procedure equivalent to RCRA delisting?
    How often has cleanup to background  been chosen?

6.  In what situations have groundwater  cleanup targets been
    tied to MCLs, ACLs, background levels, or a combination of
    other standards and criteria (e.g.,  HEAs, ADIs)? At what
    points will the groundwater be safe  to drink?  What enforce-
    able measures or institutional controls have been imposed to
    prevent contamination beyond the compliance point?

7.  At sites with groundwater contamination, has the Region
    developed a range of alternatives satisfying Agency policy
    on target risk levels for carcinogens?  What risk levels
    are set for the "point of departure" alternative and other
    alternatives under consideration?

8.   In developing ground water alternatives, does the Region
     use extensive field data to  estimate the rate of restoration
     and extent of remediation for each approach?  Are there
     significant data gaps or uncertainties that may reduce
     confidence in the estimates?

9.   In what situations has the Region used  the  "To be considered"
     criteria,  advisories and guidance listed in the Compliance
     Policy manual?

10.  To what extent have selected remedies involved permanent
     treatment  approaches that significantly reduce the toxicity,
     mobility and volume of hazardous waste? Have alternative
     treatment  technologies been  considered  in every feasibility
     study?  How often, and in what situations, have such
     approaches been selected?

11.  What steps have been taken to encourage use of innovative
     technologies at remedial sites?  What  are the most significant
     barriers to their use: e.g., technology limitations,
     prohibitive cost, lack of capacity at facilities?

12.  Has the Region collected data in the RI/FS to establish
     performance targets for alternatives involving permanent
     remedies?  To what extent has pilot  testing been conducted?

13.  How is the Region addressing the findings of the FY86 study
     that examined how alternative technologies had been
     considered in remedy selection?

14.  How does the Region achieve  balance  between cost and effec-
     tiveness in selecting the remedial alternative?  How are
     long-term O&M costs considered?

15.  Where long-term O&M costs are significant, is a sensitivity
     analysis conducted that varies the discount rate?  What
     other factors are considered in the  sensitivity analysis?
     In what situations, if any,  has the  analysis affected the
     choice of a remedy?

16.  In selecting indicator chemicals at  remedial sites, does
     the Region screen all chemicals found at the site?  What
     factors are considered in the screening?

17.  For indicator chemicals without applicable or relevant
     and appropriate health-based standards, how does the
     Region obtain toxicity values for use in risk assessment
     — e.g., by selecting ADIs,  reference doses, TLVs, MEE
     values?  In what situations  is ORD consulted?

18.  How does the Region decide whether to use modeling  in
     exposure assessments?  If modeling is used, how are
     particular models selected?  Have any recommended models
     proven too complex for practical use?  What could be done
     to facilitate appropriate use of these  tools?

19.  Has ATSDR completed a public health assessment for each
     ROD/EDD signed in the Region?  Where this has not taken
     place, was ATSDR late  in providing the assessment?  Was
     field data provided to ATSDR on  schedule?

20.  What roles does ATSDR  perform in the  Region?  Has the
     ATSDR provided comment on  proposed remedies or issued
     recommendations on appropriate cleanup levels?

21.  What steps have been taken to ensure  that RCRA facility
     inspections are coordinated with anticipated CERCLA
     waste shipments?

22.  How has the Off-Site Policy affected  remedial actions?
     Have delays resulted from  insufficient capacity at
     permitted or interim status facilities?

23.  Does the Region ask RCRA personnel to review draft
     RODs on RCRA compliance issues?

Site Management Planning

24.  For what purposes does the Region use SMPs: as general
     program milestones (for Regional Administrators),
     detailed project management goals (for REMs), or both?
     How are SMPs used in SCAP  development?  Has remedial
     management benefited from  their  use?

25.  How current are the Region's SMPs?  How often are they
     updated?  Is the process automated?

26.  What role do States play in SMP  development (e.g.,
     initiating site plans)?

RI/FS Management

27.  Has the Region initiated key elements of the "Phased
     RI/FS" approach?

        - Use of interim authorizations to expedite
           field work pending final approval of work
          plan

        - Creation of a Technical Advisory Committee
          to perform initial screening of  remedial
          alternatives

        - Use of FS requirements in determining
          scope of RI and preparing more accurate
          work plan

        - Incorporation of  DQOs in sampling plans
          and QAPPs

28.  How long does the Region take  to  review work plans for
     RI/FS?  What steps have been taken to expedite the
     process?

29.  Does the Region issue formal work assignment amendments
     when the technical direction of the RI/FS undergoes
     change?  Over the past year, how  often have such
     amendments been issued,  and for what  reasons?

30.  How does the Region monitor RI/FS deliverables and ensure
     reasonable quality?  Does the RPM participate in regular
     project review meetings with contractors and/or States?

31.  In developing site schedules for SMPs and the SCAP, how
     does the Region ensure that the program workload is
     well balanced over the fiscal year?  How are staffing
     needs and availability considered in  establishing realistic
     schedules?

32.  Has the Region fully implemented  training and performance
     requirements for REMs and supervisors?

33.  To what extent has the Region  used the REM contract award
     fee process?  Have Regional award fee packages been sub-
     mitted in complete form and in a  timely manner?

34.  Have Regions followed applicable  guidance for preparing
     action memos for Expedited Response Actions?

Remedial Design and Action

35.  During remedial design,  does the  Region participate in
     design reviews (preliminary, prefinal, and final) with
     contractor/State/COE to ensure consistency with
     environmental standards and the ROD?

36.  For RDs currently underway or  completed within the last
     two quarters, what is the incidence of major design
     changes involving a ROD amendment or  a work interruption?

37.  How does the Region oversee the progress of RAs and assess
     the adequacy of direct field inspections performed by the
     lead agency or party?  How often  have RA work assignments
     been amended, and why?

38.  Has the Region experienced significant increases in project
     cost between ROD completion and RD completion?  After
     approval of the RA budget?  What  factors were responsible
     — e.g., project delays, problems with the ROD?  How often
     have cost estimates in the ROD/EDD come within +50% to
     -30% of actual project costs?

39.  After the construction phase, has the final  inspection
     revealed significant problems at any remedial  sites  in
     the Region?  Have problems become apparent through
     groundwater monitoring or in preparation of  the  final
     technical report?  To what extent were changes in site
     condition involved?

Oversight of State-lead Projects

40.  Has the Region completed reviews of state procurement
     programs and management of cooperative agreements
     under the Management Assistance Program (MAP)?

41.  What issues or problems have been disclosed  through
     MAP reviews performed by the Region?

42.  Has the Region conducted follow-up on completed
     reviews to ensure that States are moving to  correct
     identified problems?

43.  Are States consistently achieving PRP settlements or
     cleanups at State-lead sites?  What is their record
     on:

         - Meeting goals of the Interim Settlement  Policy;
         - Achieving adequate PRP compliance;
         - Enabling NPL deletions?

44.  How involved are Regional staff in State-lead enforcement
     actions, including negotiations, certifying  and  signing
     settlements, and overseeing PRP response?

45.  Are States submitting major deliverables in  a  timely
     manner:  quarterly reports, draft orders or consent
     decrees, RI/FS reports, etc.?   Do they contain the
     required information?

46.  To what extent have States met  the criteria  for  classifying
     NPL sites as State-lead enforcement?  For sites  in this
     category, does the Region execute a formal agreement with
     the State (Cooperative Agreements or EPA/State enforcement
     agreements)?

47.  Have States been timely in:

     (a) notifying the Agency of their intention to participate
         in PRP negotiations (within 30 days of notice)?

     (b) submitting comments on the  RI/FS, alternative remedies,
         and waivers from legally applicable State  standards
         (within 60 days of receipt)?

48.  Where §106 actions do not  conform to State standards,
     what avenues are the States pursuing, e.g.:

     0 concurring with the remedy and  becoming a signatory
       to the consent decree;

     0 intervening, before entry of the  consent decree, to
       have remedy conform to State standards;

     0 no recourse.

49.  What approach has the Region taken  to obtain information
     from States on potentially applicable or relevant and
     appropriate requirements (ARARs) of State law?  Has the
     State provided timely notification of potential ARARs
     for each feasibility study planned  or in progress?

50.  What is the Region doing to inform  States of reauthor-
     ization changes affecting their role?

RP Oversight

51.  How does the Region track RP response for removal actions,
     RI/FS, RDs, RAs?  Discuss the use of project management
     systems, factors that may delay project  completion, and
     steps taken to return projects to schedule.

52.  What review and approval procedures are  used for work
     plans, QAPjPs, health and safety plans, and  sampling
     plans?  What interim deliverables are required of
     RPs?

53.  How long does the Region take to review  required
     deliverables?

54.  On average, how many sites does a project officer review?
     To what extent does project officer turnover affect
     Agency efforts to monitor PRP compliance?

                      JUDICIAL ENFORCEMENT

1.   How are sites selected for a §106 referral for remedial
     relief or a §106 referral for penalties?  How are these
     referrals used to compel PRP response?

2.   How are the Regional Counsel and the Justice Department
     involved in the decision to pursue litigation?  In
     compliance monitoring?

3.   What forms of technical advice are required to support an
     ongoing case?  How is the TES contract vehicle used  for
     case support?

4.  How are technical staff involved in determining the
    adequacy of settlement proposals?

5.  What factors have led to delays in case completion
    and what is being done to expedite the process?

                         COST RECOVERY

1.  What procedures does the Region follow in initiating cost
    recovery actions?  How are sites selected for cost recovery
    action?

2.  Have any problems been observed with cost recovery documen-
    tation?

3.  How does the Region deal with bankrupt responsible parties?

4.  How does the Region move ongoing cost  recovery actions to
    settlement or trial?  Focus specifically on the role of
    technical staff in settlement negotiations and factors
    that may delay case resolution.

                        REMOVAL PROGRAM

Removal Implementation

1.  How has the Region handled response situations  involving
    close coordination between the  removal and remedial
    programs?  Have the REM and OSC made joint decisions on
    appropriate handling of such cases? What screening
    criteria are used to determine  whether an ERA should be
    performed?

2.  In what situations has the Region requested exemption
    from statutory limitations on removal actions?  How have
    such requests been justified?

3.  Are removal response decisions  fully supported  in required
    documentation (e.g., action memos, OSC reports) to
    facilitate cost recovery action?

4.  Under the revised NCP,  the Region has broader authority
    to respond to threats of release.  How has this affected
    the nature or number of removal actions performed in
    the Region?

5.  Under reauthorization,  the Agency is required,  to the
    extent practicable, to ensure that  removal actions
    "contribute to the efficient performance" of  long-term
    remedial actions.  How is  the Region implementing
    this provision?

6.   To what extent have releases or release threats recurred
     after completed removal cleanups?  What was the cause of
     the recurrences?   Have any removal restarts been related
     to difficulties with the initial response?

7.   To what extent has the  use of alternative technologies
     been considered in developing removal responses?  How
     often,  and in what situations, have such approaches been
     selected?

8.   How has the use of alternative technologies been affected
     by technology limitations, cost considerations, the
     need for urgent response, or administrative concerns
     (e.g.,  lack of incineration capacity)?

9.   Has the Region experienced any unreasonable delays in
     initiating removal action?  What steps in the process
     have been affected?

10.  How does the Region determine appropriate cleanup levels
     and trigger levels for  removal actions?  What is the role
     of "applicable or relevant and appropriate requirements"
     (ARARs) in this determination?  To what extent is the
     basis for setting cleanup levels documented in action
     memos,  OSC reports and  POLREPs?

11.  Has the Region appointed a contact person to monitor
     functions related to sample analysis and to work with
     Regional QA staff?

12.  In monitoring non-federal removal actions, how does the
     Region determine whether the State or local agency is
     handling the action properly?  How often has the Agency
     taken over removal actions initially performed by a State
     or local body?

13.  How has the Off-Site policy affected removal actions?
     Have delays resulted from insufficient capacity at
     permitted or interim status  facilities?

14.  What forms of training  are required of OSCs?  At what
     stage are new OSCs allowed  full  site responsibilities?

15.  How does the Region monitor  (federal) removal actions?
     To what extent are OSCs present  on-site?  When they cannot
     be on-site, how is monitoring accomplished?

16.  Are there protocols to ensure that important policy and
     guidance documents are  available to OSCs?  Does each
     OSC have a copy of key materials?

17.  How does the Region coordinate major actions with HQ?
     How accurate and timely are Regional data  submissions
     (e.g.,  SCAP,  SEWS)?  What personnel are responsible for
     ensuring that data are transmitted on  time?

Removal Contract Management

18.  What steps has the Region taken  to implement recommen-
     dations of the ERCS and TAT reviews conducted in 1984-86?

19.  What procedures are followed to  verify contractor
     charges and adherence to project workplans?  Are
     payment vouchers compared with OSC reports?  To what
     extent  are OSCs present on  site  to monitor project
     design and costs?  How often have  discrepancies been
     reported?

20.  Within the past year, what  proportion  of removal actions
     have stayed within the authorized  cost ceiling? How
     often has the limit been exceeded, and for what reasons?

21.  How well have removal timeframes reflected initial schedule
     projections?  In time-critical actions, have ERCS
     contractors mobilized within required  response times?

22.  What is the Region's overall assessment of ERCS and TAT
     performance?   What areas need improvement?

                    ANALYTICAL SUPPORT PROGRAM

1.   Has the Region established an  integrated  system for
     planning and tracking sample requests and shipments?
     How is this system related to  SMPs?  Does the Region
     have a system for monitoring sample analysis against
     projected schedules and costs?

2.   How do the Region's Waste Management Division and ESD
     coordinate field sampling activity with RSCC bookings?
     How does the RSCC establish  priorities among different
     analytic needs?

3.   Does the Region's ESD/ESAT perform quality assurance on
     sampling data according to prescribed procedures and
     timeframes?  Is the validation prioritized by site status?

4.   Does the Region's ESD provide  timely review of project
     QAPPs?  Does the Region review sampling plans for adequacy
     before making requests for CLP analysis?

5.  How is analytical work allocated among CLP contractors,
    the ESD, and remedial/removal contractors?  How does the
    Region monitor sample analysis conducted by non-CLP
    contractors?

6.  Does the Region actively participate in the CLP caucus
    process?

7.  How does the Region provide information to the SMO on
    special methods required for SAS?  Is this performed
    thoroughly and expeditiously?

8.  Does the Regional laboratory participate in the CERCLA QA
    programs by analyzing quarterly performance evaluation
    samples?

                       FINANCIAL MANAGEMENT

1.  Indicate whether the Region's funds control and financial
    document processing functions are performed by the program
    divisions or a  central administrative office  (e.g., Admin-
    istrative Services Division).

2(a).  If the program divisions have these duties:

        - Has the program division  staff received
          adequate training  for  these functions?

        - Are the procedures documented in writing?

        - What problems have arisen, and how have
          they been addressed?

2(b).  If an administrative  office has these duties:

        - Describe any problems in communication
          or coordination between the program
          division and the administrative office.

        - Indicate any problems  in  controlling and
          reporting on the overall  allowance,
          especially if the  remedial and removal
          components are handled by different offices.

3.  What systems does the Region use for tracking commit-
    ments, obligations and outlays (e.g., firstup, ORMs,
    PCs, manual OCRs)?  What are their advantages and dis-
    advantages?  Are the systems adequate to prevent over-
    obligations and project  future  funding needs?

4.  Does the Region's financial document processing flow
    reflect the 1984 Remedial and Removal  Financial Management
    guidance?  Explain any differences  that may exist and
    any reasons for delay.

                      COMMUNITY RELATIONS

1.  Does the community relations coordinator maintain regular
    contact with technical and enforcement  staff to coordinate
    their involvement in CR activity?  Are  members of the
    technical staff present at key public meetings and
    discussions with public officials?

2.  How does the Region monitor and evaluate CR activities at
    state-lead sites?  Are state quarterly  reports reviewed to
    assess state performance  on CR? Are any problems apparent?

3.  How could the CR program  be modified to achieve its goals
    more effectively?  What CR techniques have  worked parti-
    cularly well?  What assistance is needed from Headquarters?

4.  How is the Region's CR program perceived by the public?

            CHEMICAL EMERGENCY PREPAREDNESS PROGRAM

1.  Have all States in the Region completed State Implementation
    Memos designating CEPP priority areas?  If not, what States
    are missing, and how much work remains?  On what basis were
    priority areas selected?

2.  What contingency planning activities  (technical assistance
    or training) does the Region plan to  conduct with States?
    What is their current status?  Does the Region evaluate
    the adequacy of risk analysis performed by the States?

3.  Which table-top or field simulations  has the Region
    conducted or participated in?  Did the exercises appear
    successful?  Has the Region worked with any States to
    improve the exercise?

4.  How does the Region use contractors  (e.g., TAT) in CEPP
    activities?

                      TECHNOLOGY TRANSFER

1.  How does the Region disseminate  technical information
    among Regional staff,  contractors,  States and other
    Regions? What kinds of information  are circulated?

2.  Has a Technology Transfer Coordinator been designated?
    What share of his/her workload involves  facilitating
    information exchange on tech transfer?

3.  Over the past year, what actions has the Region taken to
    implement technology transfer initiatives?

4.  When questions arise on technology issues, what information
    sources does the Region most frequently  consult?  To what
    extent are inquiries directed to EPA HQ?

             INFORMATION COLLECTION AND MANAGEMENT

1.  What systems and procedures has the Region developed to
    manage the collection of activity data and ensure timely
    entry in SPMS, SCAP, and SMPs?

2.  How does the Region verify the accuracy and completeness
    of activity data obtained from RPMs, OSCs, and other
    providers?

3.  How has the Region used Regional  and  program-wide data
    systems (e.g., CERCLIS) to improve program and project
    management?

4.  Has the Region established an administrative  record
    for all Fund-financed and enforcement response actions?
    What items are included in the record?

                  MANAGEMENT OF HUMAN RESOURCES

1.  What plans are in place to ensure that  needed staff are
    hired and trained?   How have these plans been affected
    by reauthorization, including the specific provisions
    related to training?

2.  How has staff turnover affected the progress of  site
    activity?

3.  What is the relationship between the Region's training
    achievements and identified training needs?   How have
    those needs been assessed?

4.  How does the Region evaluate the effectiveness of its
    training program?  What are the results of the evaluation?

5.  How are available training FTEs being utilized?  Has a
    position been created for a Regional Training Coordinator?
