Office of Inspector General
Report of Review
Special Review of EPA's Information
Systems Program
Volume II
E1SKG3-15-0098-4400038
March 24, 1994
Recycled/Recyclable
Printed on paper that contains
at least 50% recycled fiber
-------
Inspector General Division
Conducting the Audit:
Region Covered:
Program Offices Involved:
Technical Assistance Division
Washington, D.C.
Agencywide
Agencywide
-------
Special Review of EPA's Information Systems Program
TABLE OF CONTENTS
VOLUME II
APPENDIXES
APPENDIX I: SENATE SUBCOMMITTEE ON SUPERFUND,
RECYCLING, AND SOLID WASTE MANAGEMENT
REQUEST TO INSPECTOR GENERAL
FOR A SPECIAL IRM REVIEW ............. I-1
APPENDIX II: SENATE SUBCOMMITTEE ON SUPERFUND,
RECYCLING, AND SOLID WASTE MANAGEMENT
REQUEST TO EPA ADMINISTRATOR
FOR A SPECIAL IRM REVIEW ............. II-1
APPENDIX III: JOINT AGENCY AND OFFICE OF
INSPECTOR GENERAL TEAM MEMBERS ...... III-1
APPENDIX IV: FOCUS GROUP SUMMARIES ............... IV-1
APPENDIX V: INTERVIEW SUMMARIES ................. V-1
APPENDIX VI: AGENCY PARTICIPANTS IN FOCUS
GROUPS AND INTERVIEWS ............... VI-1
APPENDIX VII: OIG REPORTS, GAO REPORTS, EPA
MANAGEMENT REPORTS, AND
CONGRESSIONAL TESTIMONY ............. VII-1
APPENDIX VIII: APPROACH AND METHODOLOGY ........... VIII-1
APPENDIX IX: IMPLEMENTATION STATUS OF PRIOR
HIGH-PRIORITY RECOMMENDATIONS ....... IX-1
APPENDIX X: CHART: EPA OFFICES WITH MAJOR
IRM RESPONSIBILITIES ................ X-1
APPENDIX XI: OIRM ISSUE PAPER: STRENGTHENING
IRM AT EPA .......................... XI-1
APPENDIX XII: GLOSSARY OF ACRONYMS AND
ABBREVIATIONS ....................... XII-1
APPENDIX XIII: REPORT DISTRIBUTION ............... XIII-1
Report No. E1SKG3-15-0098-4400038
-------
-------
APPENDIX I
September 27, 1993
Honorable John C. Martin
Inspector General
United States Environmental Protection Agency
Washington, DC 20460
Dear Mr. Martin:
Thank you for presenting such compelling testimony at our
hearing on June 10, 1993. Unfortunately, your efforts have raised
serious new questions about the Agency's management of the fiscal
and information systems programs, compounding the previously
identified problems with EPA's contract management.

Based on information you presented at our recent hearing and
the body of accumulated evidence on this issue, we are requesting
that your office perform a comprehensive management review of
EPA's fiscal and information systems programs, with as much
specific attention on the Superfund area as is possible.
We recognize that EPA has taken some important steps in the
past year to address its contract management problems. However,
there has been no comparable, no-holds-barred, systemic management
review of the fiscal and information systems programs of the
Agency.
The apparent lack of credible basic management information
about where Agency funds, including Superfund money, are being
spent and how, and the questions raised about what the resulting
accomplishments may be, go to the heart of all Agency programs,
including the Superfund program.
It is critical that these issues, to the extent they pertain
to the Superfund program, be addressed as part of the Superfund
reauthorization process over the next year. Therefore, we request
that you carry out the global management reviews and report the
results to us as soon as possible, but no later than January 1994.
We would ask that any legislative recommendations be provided to
the Subcommittee no later than November 1, 1993. The Subcommittee
understands that to accomplish your work in this timeframe, you
have asked, and the Agency has agreed, to be a participant in your
reviews.
I-1
-------
We truly appreciate all your efforts in identifying and
remedying the serious management problems at the Agency, and look
forward to receiving copies of the two reports requested by this
letter.
Sincerely,
Frank Lautenberg                              Dave Durenberger
Chairman, Subcommittee on Superfund,          Ranking Minority Member
Recycling, and Solid Waste Management

I-2
-------
APPENDIX II

United States Senate
Committee on Environment and Public Works
Washington, DC
September 27, 1993
Ms. Carol Browner
Administrator
U.S. Environmental Protection Agency
401 M Street
Washington, DC 20460
Dear Ms. Browner:
As you are aware, the Subcommittee has had serious concerns
about the Agency's management of fiscal and information systems. We
have requested that the Office of the Inspector General (OIG)
(letter attached) perform a comprehensive management review of
EPA's fiscal and information systems, with as much specific
attention on the Superfund area as is possible.

Given the pace of Superfund reauthorization, it is critical
that these issues are addressed over the next few months. We have
consequently asked that the Inspector General (IG) provide any
suggestions for legislative reform by November 1, 1993, and
complete his overall review of fiscal and information management
systems no later than January 1994.
In order for this to occur, it is important that the staff of
the OIG receive full cooperation and participation from fiscal and
information system as well as program officials during the course
of the studies. This cooperation should include assistance in
identifying problems and their causes, as well as solutions of
either a management or legislative nature.

The OIG will periodically brief staff of the Subcommittee
during the conduct of this work. We appreciate your staff's full
involvement in assisting the IG in reviewing these critical Agency
support systems.
Sincerely,

Frank Lautenberg                              Dave Durenberger
Chairman, Subcommittee on Superfund,          Ranking Minority Member
Recycling, and Solid Waste Management
United States Senate

II-1
-------
APPENDIX III
JOINT AGENCY AND OFFICE OF INSPECTOR GENERAL
REVIEW TEAM MEMBERS
Project Director
Gordon Milbourn III, Special Assistant to the
Assistant Inspector General for Audit
OIG
Project Manager
Craig Silverthorne, Chief
ADP Audits and Support Branch
OIG
Team Members
Tom Bennett, Auditor
ADP Audits and Support Branch
OIG
Eileen Falcone, Auditor
ADP Audits and Support Branch
OIG
Asa (Jack) Frost, Director
Information Management Staff
OSWER
William Gill, Computer Specialist
Office of Information Resources Management
OARM
Steve Hufford, Chief
Information Management Branch
Office of Information Resources Management
OARM
III-1
-------
APPENDIX IV
SPECIAL IRM REVIEW - CONGRESSIONAL REQUEST
FOCUS GROUP SUMMARIES
The following transcripts of the four focus groups are
provided to give the reader the complete results of the focus
group discussions.
IV-1
-------
FIRST FOCUS GROUP SUMMARY
The session started with the group voting, using a simple
ballot, to prioritize the previously identified problems and root
causes drawn from previous reports, and to add descriptions
of any other major IRM problems not on the list. The ballot was
structured so that respondents would answer based on their own
systems experiences. Respondents were also asked to indicate on
the ballot how they used the particular information system on
which they were basing their responses. The group identified
four high-priority problems and identified root causes and
solutions for each of them.
The Focus Group voted the following previously identified
problems as the three major problems:
1. Systems don't provide credible information on the resulting
accomplishments from money spent. (19 points)
2. Systems don't adequately address cross-media pollution
problems. (19 points)
3. Duplicate systems have been developed. (19 points)
One new problem surfaced that was not on the list of
previously identified problems.
4. Systems have low levels of utility and friendliness to
regional users.
IV-2
-------
Other Problems Identified (from initial ballot)

RCRIS
System is overly complicated & difficult to use. (med)
System continually needs upgrading & rewrites. (med)

CERCLIS
CERCLIS & IFMS/FMS are not electronically linked. (high)
We track too many things and definitions are too
convoluted. (med)
Purpose of system not clearly identified & kept
"narrow" for national systems. (high)
Systems developed with no clear customers. (med)
Usefulness of systems for EPA regional managers. (high)

FRDS
Doesn't allow easy user access. (high)
Doesn't contain parametric data, only violations. (med)
AIRS/AFS
The different data bases don't communicate/share (high)
info well - this will have a great impact on multi-
media issues if not addressed.
Overall direction & policy is lost in the day- (high)
to-day operations - need to take time to evaluate
and refocus if it's required.
Need to have standardized & consistent minimum (high)
data elements that are agreed to by all programs
using the system.
Need one identifying number to track a facility (high)
on (especially multi-media sources) - too many
different numbers used now!
Stop collecting data for data's sake - identify (med)
the need for & use of the data - use data
for program management in decision making.
11/17/93
IFMS/EPAYS/GICS/PPAS
Not user friendly. (high)
Don't provide info useful to regional/ (high)
first-line manager.
(EPAYS/IFMS): Systems are too difficult for a
manager to use. Need easy-to-use data systems
that provide managers info quickly.
IV-3
-------
SUMMARY
CROSS-CUTTING ROOT CAUSES: The following root causes were
identified as having impact on more than one priority problem.
The causes are keyed to the problems by number in
parentheses.
1. Program and data (IRM) staff don't work together to design
useful systems, therefore no motivation exists to assure
data quality. (1) (3)
2. National systems are designed to meet EPA Headquarters needs
(and those of Congress), and do not meet regional needs.
(1) (2)
3. National systems are media specific. (1) (3)
4. Program and data management and staff often lose sight of
the philosophy/reason behind the original development of a
system, i.e. the intended purpose of the data. (2) (4)
5. Automation is not always the answer. Systems are not the
best place for answering all questions. If activities
resulting in accomplishments are qualitative as opposed to
quantitative, these would be unmeasurable in quantitative
systems. (1) (2) (3) (4)
6. There is no comprehensive master plan [for IRM] and there
are unrealistic expectations. (1) (2) (3) (4)
SUMMARY
GROUP DISCUSSION OF POTENTIAL SOLUTIONS: The group considered
the following solutions before in-depth discussions to relate
solutions to root causes for the priority problems:
1. Developing a master plan for EPA information systems.
2. Setting clear expectations for systems rather than adding
functionality to existing systems ad infinitum.
3. Better planning for, and culling, the set of Agency
information systems.
4. Prioritizing customers and their competing needs.
5. Viewing data as timeless.
IV-4
-------
6. Addressing the mismatch between data system assumptions and
congressional expectations (e.g., a "best practicable
treatment" approach in legislation will never create
information systems that can answer specific questions about
ambient conditions at individual sites).
7. Seizing the opportunity that, after NAFTA, Mexico/SEDESOL
has to build good systems from the start.
8. Addressing problems with 25 FINDS numbers being assigned to
the same facility under the air program.
9. Recognizing automation is not the best answer to all
problems.
10. Avoiding an increase in internal regulations.
11. Creating core data elements across systems so States
wouldn't have to enter duplicate data.
This discussion led to developing one particular solution
(stakeholders forming a core committee to formulate an effective
IRM 5-year plan) and brainstorming to flesh out that solution.
The group then identified and categorized solutions to the root
causes.
ROOT CAUSES/SOLUTIONS
(Based on Previously Identified Problems/Root Causes
for IRM Problems Supplied to Participants)
PRIORITY PROBLEM #1: Systems don't provide credible information
on the resulting accomplishments from money spent. (19 points)
ROOT CAUSES FOR PRIORITY PROBLEM #1:
1. EPA establishes surrogate measures (activity measures and
"bean-counting" measures rather than trends/outcomes
measures) to quantify program success (e.g., compliance
rates, permits issued, inspections, penalties collected,
"best technology applied", etc.). This, in turn, is
because:
2. Data quality is less accurate for "environmental" data,
which is subject to many qualifiers (season, age, sex,
species) that must be interpreted.
3. Environmental data is expensive to collect and limited in
amount, geography, year, etc.
IV-5
-------
4. Federal/state/local stakeholders who would incur the
transaction costs to collect more environmental data do not
think it's worth it.
5. Reports like Reilly's "Risk Based Priorities" are a better
tool for relating program expenditures to environmental
results (systems aren't the best place to accomplish this
task).
PRIORITY PROBLEM #2: Systems don't adequately address cross-media
pollution problems. (19 points)
1. "Media" based organizational structure of the Agency does
not encourage system designs which address risk-based
environmental problems. Furthermore, systems tend to lack
common data elements (e.g., identifier numbers) which would
facilitate utility across organizational lines.
2. Separate media programs don't encourage cross media
communication.
3. Systems developed are local in purpose and focus on single
media concerns.
5. Needs from media to media are vastly different.
6. Systems are statutorily based, not risk or problem based.
7. Too many different identifier numbers exist, not one
number that is common.
8. Systems do not communicate to share common data elements.
PRIORITY PROBLEM #3: Duplicate systems have been developed. (19
points)
1. Information must serve multiple clients with multiple needs.
--National systems are developed by Headquarters offices to
serve, their needs without considering the needs of other
Headquarters offices, Regions, and States
--Headquarters and Regional managers have different data
needs
--State programs require more and different information and
detail
--Too little interaction exists by Regions at National level
2. National systems are too complex.
--Systems are overly complicated and too difficult for
managers or staff to use
IV-6
-------
--Systems cannot produce needed regional reports, therefore
Regions develop report writers
--System development or revision is difficult and takes too
long to obtain national consensus, therefore people
develop their own
--Developers with no environmental program knowledge develop
systems that are not useful to the Programs they should
serve, therefore people develop their own
3. There is no comprehensive Agency or national program office
planning for system development or identifying future needs.
--There is no master plan or life cycle management of
systems
--There is no agreement on standard Agency-level data fields
--Headquarters uses end of year money for unnecessary new
system enhancement
--Program is unwilling to resolve mainframe versus PC based
debate, so they decide to do both
4. Existing systems are inflexible and cannot be easily adapted
to meet changing needs.
--Canned reports only provide information valuable to
Headquarters or Congress
--New system developed contains almost the same information
--Data fields are not always normalized between systems,
requiring some overlap or duplication; e.g., the FRDS system
doesn't address all needs of users (states) because the
system carries only violation indicators, not parametric
values
--Developers fail to step back and refocus to correct the
direction; instead a new system is developed
PRIORITY PROBLEM #4. Systems have low levels of utility and
friendliness to regional users.
ROOT CAUSES FOR PRIORITY PROBLEM #4.
1. EPA's national systems are written in obsolete database management
software. This obsolete software (Focus, ADABAS, S2K,
Clipper, etc.) makes it impossible or too expensive to
create user-friendly systems. No amount of user
identification and user participation in the development
process can change this.
2. There is a need to better identify "who" the system must be
friendly/useful for.
IV-7
-------
3. The "user" group has changed over time to include everyone
with a PC on their desk.
4. Costs are high to make systems user-friendly.
5. Program staff and data staff have not worked together on
what's needed and what's to be done with what's been
collected.
6. We lose sight of the philosophy and reason behind the
original development of the system, and along with it, the
intention of how the data are to be used.
7. The systems become cluttered with add-on activities.
8. National requirements (for Congress) are not the same as
regional requirements.
9. We rely too much on contractors and work with artificial
deadlines.
SOLUTIONS: The group believed the first two of the following
solutions would have a large impact in addressing EPA's IRM
problems discussed during the two days. For those two solutions,
the group did a force-field analysis (looking at pros and cons)
of implementing those solutions:
1. Formulate an IRM 5-year plan by having stakeholders
(Administrator, Assistant Administrators, regional media
program managers and staff, state and local representatives)
form a core committee to:
a) conduct viability study of existing systems (to keep,
revise, or pull them).
b) identify the future direction of data use and data
management.
c) make hard decisions, prioritizing competing customer
needs (not just attempting to please all customers).
d) examine whether it would be cost-effective to pursue a
core data concept for the Agency, i.e., relational data
concept.
e) examine and answer the following questions [for
national systems]:
i) what is the decision that can/will be made from
this data/system?
IV-8
-------
ii) what level of data is necessary to provide basis
for the decision?
iii) is it necessary for this data to reside in a
national database system?
iv) are the data "timeless"?
v) are the customers transient and can we say "no" to
their needs?
FORCE-FIELD ANALYSIS FOR SOLUTION 1
PRO
- there is a legislative mandate to do so
- it aligns with a multimedia approach
- it is an opposing force against individuals who try to make
  the data systems work rather than question the usefulness
  of the system
- it would simplify identification of facilities and sources
- it would reduce the data reporting burden on businesses, states,
  and locals
- it provides for a broad-based approach to determine expectations
  and goals of what the system is to do
- Administrator would need to designate a "team" with a very
  succinct mission and tight schedule
- if the cost of maintaining current systems is factored in with the
  cost of duplicate data entry, the up-front costs would be worth
  the long-term benefit
CON
- bureaucratic intransigence to change
- senior managers don't have the attention span to provide meaningful
  involvement and direction
- attempting to reach consensus on core elements may make it
  difficult to define what is really "core"
- AAs may be territorial or resistant
- little historical interest by EPA top managers in information
  management issues
2. Enhance electronic transferability of core national data, so
that regions can develop their own modular front end data
systems (interfaces to the data) to meet regional data
needs.
FORCE-FIELD ANALYSIS FOR SOLUTION 2
PRO
- if the cost of maintaining current systems is factored in with the
  cost of duplicate entry at various levels, the up-front costs
  of this new "core system" may be worth the long-term benefit
- this would eliminate many disputes between HQ and regions and
  among regions selecting "important" data fields
- front-end programming expertise is developed at regions, not by
  transient HQ contractors
- the necessary technology is now available and affordable
- this would reduce time and cost in developing new national
  systems
- regional users would get what they want every time
- NPR culture would empower lower levels (regions) to take the lead
  in meeting their own needs (there may be 10 different front-end
  systems)
IV-9
-------
- this would make the use of data easier
- there would be regional management consensus behind this idea
- supports multi-media enforcement
- speeds up development of enforcement cases and comparison of
  statistics and trends
CON
- entrenched NCC bureaucracy doesn't want to relinquish turf to
  regions
- some regions have no front-end developing capability, and
  training costs are high to acquire expertise
- need for HQ "uniformity" makes it hard to let go of front-end
  systems
- current national system managers will be resistant to change
- HQ offices would have to "give up" some control; the territorial
  instinct in HQ is strong
- some individuals (programmers, data system managers) will continue
  to try to make the existing systems work rather than question
  the systems
OTHER SOLUTIONS
3. Implementation of HR 3425, which will establish a Chief Info
Officer and a Steering Committee at a high level
4. Look to the National Performance Review for recommendations
for how EPA can be better organized to address environmental
risks, i.e., addressing barriers we have within media
5. EPA should pay for systems management costs incurred by
States, to improve data quality (but the decision to do this
should be made on a system-by-system basis to ensure
cost-effectiveness)
6. Design specifications committees/workgroups must include
regional representation at the workgroup leadership level.
They must also include state/local reps if the systems are
to be used by them, or if they will provide data to the
systems.
7. HQ traditionally has had the lead on developing systems.
Shift the lead and contractor control to the region(s)
chosen by national consensus to be the lead.
8. Use a bottom-up approach rather than a top-down
development approach for selected systems.
9. In the absence of a large committee/global approach,
establish clear expectations for each system by asking:
a) what is the decision that can/will be made from this
data/system?
b) what level of data is necessary to provide basis for
the decision?
IV-10
-------
c) is it necessary for this data to reside in a national
database system?
d) are the data "timeless"?
e) are the customers transient and can we say "no" to
their needs?
IV-11
-------
SECOND FOCUS GROUP SUMMARY
PRIORITY PROBLEMS
1) Exclusion of information management from regulation and
guidance development. Lack of ownership of information in
EPA systems by program people (What information do we need
to run our programs?). (Success is defined by getting a
regulation "out the door".)
and
Lack of understanding or participation by upper management
(Office Director and above) in information resources
management. Information management programs are not treated
as core function or critical to EPA environmental
protection.
(23 Pts.)
2) Lack of consistent Agency architecture (hardware, software,
data) and strategy, and lack of power to enforce it. No
consolidated approach to EDI. [Indirectly trying to enforce
it through budget]. (13 Pts.)
3) Lack of centralized data administration function in Agency
(also none within program offices). (11 Pts.)
4) Un-implementable policies, standards, and guidance. (They
are basically "OK", but there is no power to enforce them
and offices often cannot afford them.) (10 Pts.)
5) Too much dependency on contractors and inadequate in-house
IRM technical expertise. (9 Pts.)
6) Lack of defined IRM infrastructure, communication within
infrastructure, and common understanding of roles and
responsibilities (how infrastructure works). (Read about
Lotus Notes as an Agency standard in "Government Computer
News")
Unwarranted complexity of getting work assignments through
existing contracts (takes too long to get something through
contracts). (6 Pts. each)
7) EPA does not view information as a valuable tool to empower
the "public" (local environmental groups, researchers,
labor, industry) to deal with environmental problems beyond
what the government can do. (3 Pts.)
8) Budget (realities) process precludes long-range, strategic
planning.
IV-12
-------
Lack of effective use of "life-cycle planning process" and
understanding of this by management (needs to be in more
understandable language).
State-of-the-art computer equipment is out of reach
(availability on Agency contracts lags, and there are
budgetary constraints).
Data integrity problems with EPA's mission critical systems.
GROUP COMMENTS: Data are usually developed for principal
users, not secondary users. The source of data is the
States, often voluntary; however, EPA is accountable. EPA is
dependent on others (trust) for data (because of statutes).
Often there is a Federal/State difference in interpretation
because delegated programs are not exact or precise.
Because of priorities and limited resources we are working
in a continuum. Upper management tends to "hang on" to
numbers (which are not real-time and are constantly
changing). States want regulatory flexibility, making it
difficult to aggregate data (esp. on a delegated
responsibility). This allows inconsistencies. There are also
interrelationships among problems. Public access may be
mandated, but we do not understand the costs (how much, and
what are the effectiveness/benefits) and have not determined
if it is even appropriate in many cases. The secondary user
often does not want to take the time to understand the data.
Example: the Chesapeake Bay program's "Chessie System" -
there are a negligible number of users, but a large amount
of money was spent which may have been better spent on
cleaning the environment.
(2 Pts. each)
9) No defined career paths for information management
specialists similar to that developed for scientists in EPA.
Development of duplicate systems.
GROUP COMMENTS: This is a problem. An OW example: a
Congressional add-on to report on contaminated sediments
caused creation of an emergency data base because of a
mandated date. A system was in development which could have
provided the data but would have taken longer to complete.
How many systems store sample data? Need to put a price tag
($ threshold) on what is a major system. We never "kill" old
systems. Often contractors get to know the systems, have
access to them, and market them to other offices,
perpetuating duplication. RACF has helped RCRIS by
preventing contractor access. Probably a problem, esp. with
IV-13
-------
tracking systems and with other Federal agencies. OW
example - USGS, NOAA (NCPDI gets data from PCS). Even
though they may cooperate, nobody gives up systems because
they may lose funding. Seven states have their own NPDES
systems because they say their systems are integrated.
In-house systems - management wants up-to-date information
(real-time) and begins to create its own systems. However,
people often use information that is not ready for release
but use it anyway. Management often will not let us (give us
time) do it right, but there is always time to do it again.
Often there are spin-off systems (LANs) because people do
not like the platform (mainframes). There is no standard
architecture, and programs are allowed to do it. No one has
the power to say no.
(1 Pt. each)
No Votes
10) Lack of credible basic management information about where
Agency funds, including Superfund, are being spent and how,
and the resulting accomplishments (previously identified
problem).
GROUP COMMENTS: Cannot account for the adequacy of dollars
spent (i.e., how do you know you are spending dollars on the
right thing?). Need to measure results on the environment
and how systems support that. Not an issue in some programs
(OSW/OPPT). Systems managers and SIRMOs do not play heavily
in the budget process. Congress often asks questions of
different people and does not get the same answers because
of the different perspectives of respondents (Congressional
staffs go to various sources, and often the protocol for QA
of responses is not followed in the agency). This is a
problem of communication and a lack of screening of
responses, and is a vulnerability leading to problems in
credibility. Everyone views "accomplishments" differently
(different customers/users). Congress may ask "Are waters
cleaner?", but systems were not designed to answer that.
11) Difficulties in identifying cross media pollution problems
(previously identified problem).
GROUP COMMENTS: This is a program definition/data
definition problem. Often we are asking questions of systems
that they were not designed to answer. No one (upper
management) can say no! Maybe we need to look at Congress as
a customer and ask what it is they need. No strong
leadership at the top (esp. in IRM). Never received
questions regarding integration. Maybe this is an access
problem. Often
IV-14
-------
Congress does not understand complexity of the environment.
Statutes and regulations often conflict and systems are
expected to perform this function. Agency is
compartmentalized. Because subject areas (media) are so
different would most users understand the data? Maybe we
are selling users short on abilities to understand data.
12) Significant cost overruns and delays in developing
information systems (previously identified problem).
GROUP COMMENTS: The group consensus was that this was a
result of other problems, not a problem in and of itself.
Budgets and statutes continually change. However, it also
happens in systems development regardless (has there been a
system that hasn't?). Inadequate staff and dependence on
contractors are problems. Sometimes Congress never provided
enough money to begin with. Regulations and guidance are
written with no thought of the impact on information
systems. You have the same problems (i.e., the result - cost
overruns/delays) in implementing programs. There is also
game playing. Sometimes systems are not a line item in a
budget. It would be better if they were.
13) Exposure of systems to unnecessary risks (access and other)
(rewording of previously identified problem - "Exposure of
Agency's financial payment systems to unnecessary access
risks").
GROUP COMMENTS: This is a problem for systems (as
reworded): exposure of systems to unnecessary risks.
SECOND FOCUS GROUP
ROOT CAUSES AND SOLUTIONS
PROBLEM: Lack of understanding or participation by upper
management (Office Director and above) in information
resources management. Information management programs
are not treated as core function or critical to EPA
environmental protection. Exclusion of information
management from regulation and guidance development.
Lack of ownership of information in EPA systems by
program people (What information do we need to run our
programs). (Success is defined by getting a regulation
"out the door").
ROOT CAUSES:
* Senior management does not understand linkage between
information systems and accomplishing the mission.
IV-15
-------
APPENDIX IV
Senior management does not understand role of data in
framing options in the decision-making process (Data
loses its identity).
Upper management (Office Directors and above and Branch
Chiefs in Regions) are accountable for things other
than IRM, i.e., program mission, and do not pay
attention to IRM unless they can see a return.
Designations of IRM functions not reflected in
performance standards and functions are not empowered
(responsibility without authority). (Function
examples: System managers, information management
coordinator, PC site coordinator (PCSC), LAN
Administrator/LAN Manager, SIRMO, RACF Administrator,
Account Manager, Records Managers, Telecommunications
Contacts, ADP Coordinators, ADP Training Coordinator,
EMail Coordinator, Contracts Management, etc.)
Too much turnover at the highest management levels
which does not support long-term investment in IRM -
working for short term results to improve resumes for
next job.
Lack of insulation from "political winds".
Management does not understand IRM (in terms of costs,
logistics, and resources).
Upper management "grew up" in an era that did not use
information systems for decision-making and is not
comfortable linking benefits of information systems to
decision-making.
EPA does not recognize long range investment/benefit of
information - too many new initiatives (initiative of
the month syndrome), play budget games, cut base
budgets.
Inadequate communication of IRM issues among upper
management. (Example: RACF and the effects of its
implementation.) Issues go to lowest level. Selection
process of contactees on IRM communications is faulty.
Organizational location of IRM function(s) misplaced in
the Agency and program offices. (Sometimes functions
within Division, and can only service that division).
NDPD physically remote/lack of communication and
understanding of programmatic IRM needs.
COMMENTS: One participant indicated remote location had
nothing to do with poor communication or understanding;
however, another participant indicated lack of travel
money impacted the ability of NDPD to be present at
meetings in which they should be represented.
* No long-term strategy for tying IRM to Agency mission.
Management by Committee - Too many committees, lack of
coordination between committees, no consideration of
impacts on information management and the other
committees' work and the effects they are having.
Committees have no authority to make decisions (i.e.,
State Capacity). Too much consensus building.
* Lack of budget, planning and resource (staffing)
stability in IRM.
* Agency is not a stable steward of information.
* Capabilities of computers (technology) are often
oversold - management does not understand the support that
is necessary to develop and maintain systems and what
can or cannot be done with computers. Some managers
may be proponents of computers but never use them.
* Undefined process for making IRM decisions in the
Agency.
* Program Managers make decisions without assessing
impacts or consulting OIRM/NDPD.
* NDPD and OIRM make unilateral decisions without
consulting about or assessing impacts on customers.
Lack of an effective communication plan and of a means of
obtaining information from the user community for
strategic planning.
* No follow-through on IRM decisions.
OTHER COMMENTS: Some participants indicated that several
years ago the SIRMOs met, formed a group and provided
solutions to OIRM/NDPD but there was no follow-through by
OARM. SIRMO group eventually disbanded.
SOLUTIONS:
Create an Assistant Administrator for IRM (Corporate
Information Officer - CIO) who reports to the
Administrator with no ancillary responsibilities other
than IRM (including IRM strategic planning). Must have
responsibility, authority and expertise. (There was
discussion on whether this should be a career position;
the need for someone to be a political appointee to
ensure adequate power was voiced.)
Administrator should hold AAs accountable and
responsible for IRM effectiveness (able to devote
resources).
Revise or restructure IRM in Agency (including giving
SIRMOs more authority). Need a fresh view.
Develop a consistent IRM structure (functions and
responsibilities) in program offices and have
consistent treatment of regional and headquarters IRM
functions.
Define IRM functional responsibilities and
accountability.
IRM functional positions need to be filled with
qualified people.
OARM/IRM and programs need to hire staff with IRM and
programmatic knowledge or obtain expertise through
rotations, details, etc.
Provide training in IRM functions.
Educate senior management on roles and responsibilities
in IRM.
IRM Community must improve communication with senior
management (use less jargon).
Develop IRM strategic plan tied to mission.
Any major IRM initiatives, standards, etc. that are
developed or changed need to go through green border
review process with cost/benefit analysis.
OARM (OIRM/NDPD) need to be customer oriented.
Establish working capital fund for IRM (tie budget
cycle to system life-cycle).
AA's need to be part of focus group with no delegation
of attendance.
Establish process for incorporating IRM in regulation
development and guidance.
PROBLEM: Lack of consistent Agency architecture (hardware,
software, data), strategy and lack of power to enforce
it. No consolidated approach to EDI. [Indirectly
trying to enforce it through budget].
ROOT CAUSES:
* No IRM leadership or direction (via plans) and no
enforcement. Program offices go out on their own
because there is no leadership.
* EPA can not respond effectively to rapidly developing
technology.
* The EPA procurement process stinks (ineffective and
cumbersome). Procurement drives hardware, software,
etc. (not architecture based on user needs). Lack of
sufficient ADP procurement expertise in OAM.
* Too much contractor input (at NDPD) on architecture.
* The people making architectural decisions (NDPD) do not
understand how computers (technology) are used in the
Agency or by the States/local governments (not paying
attention to customer's needs).
* No consideration of needs in exchange of
data/information in architectural decision-making
process (technology for technology's sake).
GROUP COMMENT: NDPD held a meeting last summer to
address architectural issues (only one participant knew
of the meeting). Only two systems managers were
invited from HQ. The rest were IRM Branch Chiefs and
contractors. The group remarked that no results have
been seen from this effort.
* NDPD does not talk/listen to customers.
* No mechanism for determining Agency standards or making
changes to existing architectures.
* Lack of data standards and the ability to implement or
enforce them.
* No common data definitions, especially through the
legislation/regulations.
* Current "stovepipe" standards, regulations, and
management inhibit standardization. Some special
purpose (vs. corporate) data is OK.
* SIRMOs have no power to ensure that standards are
enforced (offices avoid getting their signatures).
* Resources (money) not available to implement standards
(i.e., locational data policy).
* Items in the budget dealing with data standards are the
first ones cut.
* No consideration of data as a corporate resource;
responding to stovepipe statutes.
* NDPD/OIRM do not view themselves as a corporate
resource for data and training. NDPD offers no
training for mainframe packages (Statistical Analysis
System (SAS), etc.). They do not view themselves as
responsible for this corporate resource.
GROUP COMMENT: Several individuals did point out that NDPD
has WIC training but no one shows up. There is no
management (programmatic management) commitment for IRM
training.
SOLUTIONS:
Develop an implementation plan for EDI in the Agency
(not written by OPPE).
Develop a legal policy for EDI signatures for external
Agency entities.
Develop a process for determining what is corporate
data vs. special purpose data resulting in an
enforceable "corporate data" policy.
Establish a "data czar" who reports directly to the
Administrator.
Maintain relational reference tables (such as zip
codes, county FIP codes, etc.) as a corporate data
resource.
Ensure that there are sufficient resources in the
budget to implement standards.
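The "corporate data resource" solution above (shared relational reference tables for ZIP codes, county FIPS codes, and the like) can be illustrated with a minimal sketch. This is a hypothetical illustration only - the table layout, values, and function are not from the report - showing one shared lookup table that every program system queries, rather than each system carrying its own, potentially inconsistent, copy of the codes.

```python
import sqlite3

# Hypothetical shared reference table of county FIPS codes.
# In the arrangement the group describes, a single corporate
# table like this would be maintained centrally and consulted
# by all program systems.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE county_fips ("
    "  state_code TEXT, county_code TEXT, county_name TEXT,"
    "  PRIMARY KEY (state_code, county_code))"
)
conn.executemany(
    "INSERT INTO county_fips VALUES (?, ?, ?)",
    [("51", "059", "Fairfax"), ("24", "031", "Montgomery")],
)

def county_name(state_code: str, county_code: str) -> str:
    """Resolve a county FIPS pair against the shared table."""
    row = conn.execute(
        "SELECT county_name FROM county_fips "
        "WHERE state_code = ? AND county_code = ?",
        (state_code, county_code),
    ).fetchone()
    return row[0] if row else "UNKNOWN"

print(county_name("51", "059"))  # -> Fairfax
```

The point of the design is that a code change (for example, a renamed county) is made once, in one table, instead of separately in every program system's private copy.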
Require red or green border review (as appropriate) on
Agency standards (both new and revised), including a
cost/benefit analysis and an analysis of impacts on the
users and IRM community.
Exercise Agency discretionary authority in implementing
IG and GAO audit recommendations.
Undertake a comprehensive review of the Agency's
architectures (both data and technology).
Undertake a comprehensive review of how the Agency
spends its approx. $260 million annual IRM budget
(agency-wide IRM program budget).
Review and evaluate ADP procurement process (both large
and small purchases) to facilitate purchasing, increase
electronic commerce, and look for better models in
other agencies to emulate. Need to do this at two
levels - work to improve on issues outside the Agency
via the National Performance Review and also issues
internal to the Agency.
THIRD FOCUS GROUP SUMMARY
SUMMARY SOLUTIONS
At the completion of the focus group exercise, the group was
asked to develop five broad solutions based on the detailed
solutions presented by the group in response to specific
problems. The following broad solutions were presented by the
group.
1) The components of an information system (e.g.,
requirements, needs, etc.) need to be identified and agreed to up
front. There was general agreement that failure to identify and
agree to information system requirements up front results in
failure of the system to meet perceived needs later in the life
cycle.
2) Need to better anticipate needs through better communication
with customers (i.e., Congress and oversight agencies). Planning
needs to take place in a cooperative, non-adversarial environment.
3) There is a need for more resources. There is a need to
consider all resource requirements before designing information
systems. Resource requirements must then be prioritized and
compared to available resources. Priorities may then be
addressed with available resources. There was concern that
systems are implemented at the Regional level without funding the
resource requirements (unfunded liabilities) needed to operate
the systems (i.e., data entry, system maintenance, etc.).
4) Longer term stability of information systems is needed.
5) User input to information systems development is
critical. There was general consensus that better communication
with users is necessary to anticipate needs. In addition, the
group generally agreed that better communication would lend
itself to building a cooperative atmosphere and better commitment
for data input and data quality.
REACTIONS TO IDENTIFIED PROBLEMS
(Based on Previously Identified Problems/Root Causes
for IRM Problems Supplied to Participants)
The session started with the group discussing the previously
identified problems and root causes addressed from previous
audits and reports. The following were not viewed as Regional
problems; however, some of them were discussed in the comments
below:
PROBLEM: Difficulties in addressing cross-media pollution
problems. (Primarily a Superfund focus group and not
felt to be a Superfund issue).
PROBLEM: Development of duplicate systems.
PROBLEM: Exposure of financial payment systems to unnecessary
access risks.
COMMENTS ON IDENTIFIED PROBLEMS
Many of these comments have bearing on the problem of cost
overruns and delays in development. Many were carried over to
specifically identified problem discussions.
PROBLEM: Lack of credible basic management information about
where Agency funds, including Superfund money, are
being spent and how, and the resulting accomplishments.
1. There is a lack of linkage between how funds are spent
and accomplishments.
2. We do have "credible" management systems, but if
questions coming from the Hill change, we may not have
the answers in the data - Resource Limited.
3. Many of this summer's questions appeared to be directed
at Superfund reauthorization - Is Superfund fair? (And
may not be recurring questions).
4. Constantly changing focus/changing questions - i.e.,
Environmental justice -- there has been no
data/information collected on that in the past.
5. Sometimes by the time questions are answered there is
no longer a need for answers. Congress (and senior
management) needs to understand the impact of asking
questions (resource and $ costs).
6. IFMS/CERCLIS still not linked - Not a Regional problem
but a National systems problem. However, Region would
benefit from an integrated system.
PROBLEM: Difficulties in addressing cross-media pollution
problems.
1. Superfund does not play in multi-media enforcement -
not amenable because addressing past actions. Region
has used GIS successfully for multi-media issues
however. Region targeted major facilities for RCRA,
air, water - most generally OK. Small facilities
generally don't have multi-media problems.
2. Cross-media integration not an issue in Superfund. We
collect multi-media data for site. Some data never
need to be stored in a database; question of where to
store it (i.e., different media-specific databases).
Mostly deal with program accomplishment reporting.
Does Congress want an organic emphasis on reporting?
Reporting structure is to different Committees/Sub-
committees for different laws (i.e., specific program
elements).
3. Planning horizon short (because of budget
processes/politics), a longer planning horizon would
allow for more stability. Also differences in program
focus over time have occurred (technology vs. risk).
4. Congressional Sub-committee setup not encouraging
multi-media activities. Have to report on specific
program elements. Strategic planning can't work (tried
in one Region but failed) because resources tied to
legislation and there is an inability to move resources
to where they are needed (Superfund resources can't go
to water or vice versa).
5. Funds are program specific and not used for multi-media
tracking system, therefore can not report if there is
no activity. Does this include environmental data?
Typically EPA's tracking & info. systems do not.
6. Superfund collects multi-media data, but its relative
impact is small (dealing with localized problems). So
much planning goes into remedial actions that
collateral impacts are generally not issues.
7. Sub-committee chairmen need to coordinate with each
other.
PROBLEM: Data integrity problems with EPA's mission-critical
information systems.
1. Need user friendly, self-feeding systems (worry about
data quality). Have problems - reasons not always
apparent. EPA sometimes pushing IRM technology limits
- causes delays, etc. Major reason for cost overruns
last year because of loss of TOSS (loss of contract
personnel - prematurely - knew contract was going
away).
2. Data accuracy/quality concern in future years. Gaps in
budget formulation cause problems. Data element
definitions change, systems do not handle anomalies.
Happens as technology changes (mainframe replaced by
PCs).
3. No traceability between Regional budget submissions and
final budgets (after negotiations, etc.). SCAP data
elements constantly changing - no data comparability
from year-to-year - source of data quality problems
(related to Congress?).
PROBLEM: Development of duplicate systems
1. Systems developed by Headquarters not useful to Region
- therefore Region develops own applications.
2. Changes to mainframe system happen slowly, need to
supplement with PC systems.
3. Need to define what are common data elements (relate to
reporting needs) and which ones aren't (therefore allow
for variability in individual's needs). [Relates also
to data integrity problem].
PRIORITIZED PROBLEMS
Problems are presented in priority order based upon the
multi-voting scores given to each problem by the focus group
participants. Scores, presented as total points received, are
provided in parentheses following each problem.
1) Most information systems are designed for centralized
Headquarters management and are not useful to the
Regions. Systems need to be designed as tools for
staff, not merely to answer questions (although they
may do this). (25 points)
2) EPA does not do a good enough job of anticipating
questions from Congress. (23 points)
3) EPA receives data of questionable quality from the
States. This was of particular concern because EPA
receives most of its data from the States.
(14 points)
4) Congressional Committee structure. (13 points)
5) Data definitions. (10 points)
6) Linkage between financial system and program systems
(e.g., IFMS and CERCLIS). (9 points)
7) ... are asking different questions. (8 points)
8) Congress needs to revisit the Agency mission to give
more flexibility to programs. There is a continual
fight for resources. (5 points)
9) Overemphasis on contract support versus in-house
support. This is applicable to all Agency functions,
not just IRM. (4 points)
10) Need to develop information systems with more
flexibility so they can respond to changing needs.
Inflexible systems lead to data integrity problems.
There is no clear expectation or common understanding
of what is desired from information systems. (3
points)
11) Cost overruns. (0 points)
12) Need to take out negativism. (0 points)
ROOT CAUSES/SOLUTIONS
PRIORITY PROBLEM #1: Most information systems are designed for
centralized Headquarters management and are not useful to
Regions. Systems need to be designed as tools for the staff, not
to answer questions (although they may do this).
ROOT CAUSES FOR PRIORITY PROBLEM #1: Root causes for priority
problem #1 are presented in priority order as determined by group
multi-voting. Scores, presented as total points received, are
provided in parentheses following each root cause.
1. Centralized management systems are designed by
Headquarters for their needs. (41 points)
2. Misinformation from OIG, GAO, OMB. What appears in
audit reports and studies is often not an accurate
representation of the condition. (23 points)
3. Data system developers do not understand program or
staff needs. Part of this problem may be related to
the fact that contractors (versus Agency personnel)
develop most of our information systems. (21 points)
4. Information systems are used for different uses than
originally intended. For example, the HWDMS system was
originally designed as a tracking system. However, the
Agency tried to use it as a management system. (16 points)
5. Complex vs. simple information systems - need vs.
effort. There is a need for information at different
levels (e.g., public, field offices, Congress,
management, etc.) and different requirements (i.e.,
effort, cost, motivation) for obtaining and maintaining
data/information at these levels. A data quality
problem exists when Regional resources are required to
collect information to meet Headquarters needs when
perceived benefit and/or resources are not provided.
(14 points)
The following additional root causes were identified for priority
problem #1. However, because of our desire to address more of
the prioritized problems, we did not spend focus group efforts on
developing solutions for these root causes.
6. Information systems development has been reactive
(versus proactive) to Congressional needs. (9 points)
7. There are too many managers and too few staff in
Headquarters. (6 points)
8. Information systems are used by data managers to build
empires. (0 points)
SOLUTIONS: For clarity, each solution is keyed by the number in
parentheses to the root cause it addresses. Subsequent sessions
were not specific.
1. The people who are designing information systems need
to understand what is needed. (1)
2. Consultation must take place between Regional and
Headquarters personnel. (1)
3. Need to determine if other Federal agencies have
information systems we can use. (1)
4. Information system designers should be required to
function as a Regional project manager. (1)
5. Need pilots and ground truthing (like that used with
the RP2M program). (1)
6. Need to tie into existing information systems - close
the loops. (1)
7. Information systems need to allow flexibility for the
Regions. (1)
8. Headquarters needs to listen to the Regions and vice
versa. (1)
9. Headquarters needs to be more consistent in providing
either FTEs or contractor dollars to maintain and
perform data input into information systems. (1)
10. We need to identify what we truly need up front. (1)
11. We need to anticipate future IRM needs (data and
information). (1)
12. Information systems need to stop being all things to
all people. (1)
13. Need more experienced auditors (particularly experience
in program areas). (2)
14. More regional involvement in audits. (2)
15. Auditors need to discuss resources as they influence
outcomes. Auditors need to analyze if program
operations are doing the best they can with given
resources, instead of auditing against a standard that
assumes adequate resources. (2)
16. Audits need to identify positives. (2)
17. Regions need a second opportunity to review reports
before final reports are issued. (2)
18. OIG, GAO, OMB should care if information reported in
the audit (results of audit) is correct. (2)
19. Auditors need to follow the 1988 GAO audit guidance.
(2)
20. Programs should be less defensive and admit when there
are problems. (2)
21. Both sides need to be more objective in the spirit of
Total Quality Management. (2)
22. Programs need to do a better job responding to audit
reports (both draft/final). (2)
23. Need more in-house staff for certain development
functions. (3)
24. Information system developers need to listen to end
users. (4)
25. Information system developers need to understand needs.
(4)
26. Adequate resources need to be provided. (4)
27. Programs need to admit when a new information system is
needed, instead of trying to operate with a system that
no longer works (ex. CERCLIS is broken beyond repair,
nested loops in CERCLIS keep growing). (4)
28. Need to recognize that we need to have different
information systems to do different things. It is not
always necessary to combine all the satisfaction of all
needs into one system. Decisions of this nature need
to be discussed and resolved at a high level of
management. (4)
29. Don't assume you need an integrated information system.
(4)
30. Feed information as needed. (4)
31. People making decisions regarding IRM issues should have
a background in computers. (5)
32. Determine if one information system is needed (versus
multiple systems). (5)
33. Scrap old information systems when necessary (CERCLIS).
(5)
34. Design information systems properly. (5)
35. Assess cost of fulfilling data needs. (5)
36. Decision makers need to have background in ADP (should
not have to rely on contractor's technical expertise).
The Agency needs in-house expertise. (5)
37. Need long-term planning (high turnover change). (5)
38. Decisions should be made at the Program level (by those
persons committing resources) in concert with the
Administrator, GAO, and OMB. (5)
PRIORITY PROBLEM #2: Problems anticipating Congressional
questions.
ROOT CAUSES FOR PRIORITY PROBLEM #2: Root causes for priority
problem #2 are presented in priority order as determined by group
multi-voting. Scores, presented as total points received, are
provided in parentheses following each root cause.
1. The Agency is in a reactive mode. (43 points)
2. Poor communication with Congress. (39 points)
3. Too many hidden and/or different agendas. (28 points)
4. Too busy. (8 points)
5. No marketing plan for customer. (6 points)
6. Changing Congressmen/Congresswomen and changing
interest. (4 points)
7. Congress is not interested until it is time to get
votes. (2 points)
8. Have to think like Congress (anticipate congressional
needs). The group was not sure if this was a realistic
expectation. (0 points)
9. Don't debate diaries. (0 points)
SOLUTIONS:
1. More ongoing communication with Congress (borderline
lobbying).
2. Decide up front what data EPA needs in order to manage,
then be consistent (need long term stability).
3. Programs need to be proactive in asking Congress what
they want to know.
4. Congress needs to be more specific regarding what
they want (i.e., "specifics" vs. "general inquiries").
This level of specificity needs to take place
throughout the information systems development life
cycle.
5. Meetings between EPA and Congress should be attended by
higher level people. Subcommittee attendance at
Regional EPA briefings has been poor.
. 6. Need to focus more on IG, GAO, etc. (i.e., the ones who
are raising the issues - misinformation). Need to work
on relationship with these groups and need to work with
an open mind.
PRIORITY PROBLEM #3: Concern for quality of data from States
(most data comes from States).
ROOT CAUSES FOR PRIORITY PROBLEM #3: Root causes for priority
problem #3 are presented in the order in which they were
presented by the group. Because of our desire to address more
priority problems, we did not attempt to prioritize root causes
for this or subsequent problems.
1. State personnel are not clear on what data to input or
how to input data.
2. There is a general lack of resources (e.g., people,
equipment, computer systems linkups, procurement
system, etc.).
3. EPA always seems to want more data or changes the data
that it needs.
4. Information systems developers do not involve States in
system design, even though States provide most of the
data. According to the group, reductions in Headquarters
eliminated State involvement during data element
definition development.
5. There is no National or Regional consistency. This
results in States entering data under different
pretenses.
6. Do we have good data quality audit reports?
7. Need to emphasize to States the benefits of using the
information systems we provide. The group believes
that the States will be better motivated if they
recognize a benefit.
8. No way to enforce data quality at the State level.
SOLUTIONS:
1. Need more resources.
2. Better define what EPA needs are. This will help get
"buy-in" from States, distinguish between EPA and State
needs, and provide specifics that States could
implement in their own systems to pass information to
EPA.
3. Need to have State involvement in information system
design.
4. Need to allow States more flexibility in tailoring
information systems. We may be able to provide
resources for State information systems while
identifying core data requirements for EPA systems.
5. Assure we have good data audit reports. Need to
conduct audit work at the time of information system
design.
6. States need EPA assistance in the procurement of
equipment.
7. States need to receive more and continual training on
information systems. States have high turnover.
8. Headquarters needs to establish a hotline for data
definitions and information. This will support
National consistency.
9. Need to determine if we can use the grants process to
apply leverage to the States or determine if there is
another mechanism to provide money to States while
ensuring that data meets EPA needs.
PRIORITY PROBLEM #4: Congressional Committee structure.
The participants in the focus group agreed that this problem,
although valid, was not a problem for which we could brainstorm a
solution. Therefore, the group chose not to address this
problem.
PRIORITY PROBLEM #5: Data definitions.
ROOT CAUSES FOR PRIORITY PROBLEM #5: Root causes for priority
problem #5 are presented in the order in which they were
presented by the group. Because of our desire to address more
priority problems, we did not attempt to prioritize root causes
for this problem.
1. There is a continual change in the definitions of
programmatic outputs.
2. There is a redundancy in terms (i.e., a fine line
between definitions). The group believes that there is
a lack of understanding of definitions between some
constituents (e.g., EPA Programs and others such as the
Inspector General and Congress).
3. There is no common understanding of definitions.
4. There is a false assumption that Superfund sites have
common traits. The group expressed that each Superfund
site has unique characteristics which make "national"
definitions or standardization difficult.
SOLUTIONS:
1. Common agreement on definitions needs to be reached
among the constituencies (e.g., Congress, IG, GAO, OMB,
Headquarters).
2. Definitions need to be frozen for a period of time.
3. Definitions need to be revisited on a regular basis.
PRIORITY PROBLEM #6: There is no linkage between financial
systems and program systems. The group used the IFMS and CERCLIS
systems to discuss this problem.
ROOT CAUSES FOR PRIORITY PROBLEM #6: Root causes for priority
problem #6 are presented in the order in which they were
presented by the group. Because of our desire to address more
priority problems/ we did not attempt to prioritize root causes
for this problem.
1. Built as independent systems to deal with individual
problems
2. No common goals
3. Incompatible formats
4. Did not consider financial data when designing CERCLIS
5. Too difficult to bring together
6. No resources to link
7. Inadequate preplanning
8. Needs changed over time - accountability took priority
over tracking.
SOLUTIONS:
1. Up front communications, planning
2. Consider all directions you can go (for development) -
CERCLIS/Finance/Contracts (diversity of users)
3. Anticipate future needs
4. Do not build one big system, use modules that will be
easier to change
5. Need own technically competent people to build (govt.
function - primarily design to be able to communicate
during development)
FOURTH FOCUS GROUP SUMMARY
PROBLEMS IDENTIFIED
1. Systems were not designed to do what they are currently
asked to do, and an unwillingness to be forthright about it.
For example, few of our data bases interact. (52 Votes)
2. Data quality is not known. There is no comparability
mechanism to determine whether data supporting EPA's
mission-critical information systems is accurate or
inaccurate. (29 Votes)
3. Difficulties in addressing cross-media pollution problems.
(7 Votes)
4. Significant cost overruns and delays in developing and
implementing information systems. (7 Votes)
5. Development of duplicate systems. (3 Votes)
ROOT CAUSES (LIMITED TO TOP 2 PROBLEM AREAS)
Problem 1
Systems were not designed to do what they are currently asked to
do, and an unwillingness to be forthright about it. For example,
few of our data bases interact.
Root Causes for Problem 1
1. Lack of an "Information Systems Champion" and accountability
for information systems:
Lack of top management understanding of the importance
of information systems to the mission; attention to
information systems activities; and commitment to
supporting information systems.
Top management officials not accountable for
information systems management.
Senior Management does not understand the strategic
value of IRM.
Lack of knowledgeable, experienced, and forceful
leadership in the IRM arena.
Lack of commitment to enforce standards.
2. Inadequate planning and budgeting process for information
systems:
Lack of an overall Agency business plan with long range
goals, objectives, approaches to meet goals and
objectives, and associated costs.
Budget cycle not consistent with technology development
activities. System development projects are not line
item budget items--funding comes from program areas.
When program budgets get cut, the system development
projects suffer. For example, Integrated Contract
Management System (ICMS) was a 5-year project and is
currently floundering due to program budget cuts.
o Management lacks staying power in the budget
process.
o Inconsistent funding.
o Support services are the first item to go.
Administration, Congressional, and EPA top management
turnover change policies and priorities of information
systems management.
Management unwilling to redo big systems when
information requirements change.
Changes in technology overcome information systems--
systems become obsolete.
Technology becomes dated with delays.
Computer support not tied into performance measurement
(i.e., success/accomplishments/economic benefit).
Technology activities not tied to mission.
Difficulty in using raw data for meaningful reports.
Lack of an overall Agency business plan which ties
budget with IRM needs.
Limited resources invested by OIRM in planning, policy,
and oversight.
End users are not being considered in developing and
maintaining information systems. The role of States is
not being recognized. Systems are not user friendly.
3. Lack of built-in flexibility and integration of Agency
information systems.
No Agency requirements for multi-media approach to
pollution prevention and enforcement. For example, two
States are moving away from a single medium approach
for pollution prevention. One State is integrating
information regarding the air, water, and permit
activities of the top 400 facilities.
No centralized standardization of data elements.
Limited initiatives supporting data integration/manage-
ment. Good examples are the Integrated Task Force
Monitoring (NOAA, EPA, and USGS); GIIS (DOD, DOT, EPA)
involving the standardization of analytical data
definitions on Superfund Federal facilities; and the
Global Positioning System.
Lack of consistent data dictionaries (e.g., multiple
data dictionaries for common data elements with
different data definitions).
Lack of system development standards.
4. Unrealistic expectations for information from Agency
information systems.
EPA is seen as being on the cutting edge of technology,
which is not true. This raises false expectations for
EPA in the eyes of Congress, States, and employees.
Management is not forthright with top management and
Congress regarding the inability of our current systems
to support current requirements.
o Even when requirements are well-defined in the
beginning (e.g., RCRIS--$millions to define
requirements; best example of collaborative effort
with States and other users to define
requirements), new requirements may be difficult to
implement and result in complaints of not meeting
user needs.
Extreme difficulties are encountered in projecting
expectations for information.
5. End users not adequately consulted.
Difficult to identify and communicate information
system projects with all customers with vested interest
to get their involvement and contribution.
EPA is not recognizing the role of the States in
implementing and maintaining information systems.
6. Insufficient qualified information systems staffing because
budget and personnel policy constraints leave the Agency
unable to compete with the private sector.
Recommendations for Problem 1
1. Establish an information systems "Champion" (i.e., Chief
Information Officer) as a separate Assistant Administrator
position (or equivalent). The person selected for this
position must have good management, agency background, and
technical qualifications. This person must be bold and
skillful enough to describe the lack of systems capabilities
when appropriate. This position requires access to the
Administrator and the requisite power and authority to
execute its responsibilities. We recommend that the
Administrator look to the private sector for a model and/or
recruitment.
2. Develop an overall Agency business plan with long range
goals, objectives, approaches to meet goals and objectives,
and associated costs. The business plan should include high-
level costs, including a line item for information systems.
The Chief Information Officer needs to be involved in this
process.
3. Clearly define Agency-wide information systems needs.
Identify information systems requirements needed to meet the
Agency's missions. These needs should reflect the
requirements of the end users. Key customers should be
involved in this process.
4. Develop a strategic information systems plan which includes
detailed costs. Identify tasks to be completed annually
with the funds.
5. The strategic information systems plan should assure that
systems are flexible enough to deal with changing program
needs and changing technology.
6. Update Agency-wide information systems requirements and the
strategic information systems plan annually with associated
costs of changes. Factor in technology changes.
7. Information systems "Champion" must represent the strategic
plan implementation in the budget.
8. Develop personnel systems that allow recruiting and
adequately compensating people for jobs they do, rather than
for their credentials.
Problem 2
Data quality is not known. There is no comparability mechanism
to determine whether data supporting EPA's mission-critical
information systems is accurate or inaccurate. (29 Votes)
Root Causes for Problem 2
1. Lack of focus and attention to data administration/
management.
Most data administration/management related activity is
on a system by system basis as opposed to an Agency-
wide basis.
No data audit process by programs exists to ensure that
data elements/definitions within individual systems are
consistent.
2. Changes in data definitions by Headquarters program offices
on a year to year basis (e.g., CERCLIS--changed definition
of remedial investigation/feasibility study (RI/FS) for
Federal facilities in 1993).
3. Differences in interpretation of data definitions between
offices (e.g., site definitions).
4. Changing legislation and requirements.
New laws and regulations come out every year which
require changes to systems which are not funded.
Systems can't keep up with the changes because of lack
of funding and delays in getting contracts in place.
For example: The Agency's inability to make required
changes to the CERCLIS Accomplishment Report for 1993
due to contract problems and delays.
Lack of resources and priority for data administration/
management.
When budget cuts occur, ADP support services is one of
the first things to go, which reduces system
maintenance.
5. Complexity of design of systems.
Program systems are so large and complex that both ADP
and program knowledge are required to use the systems.
Systems are not user friendly.
Difficult to keep up with changing systems.
For example: PCS information is updated by the States
through interfaces with related State systems. Any
changes to PCS affect the States' systems and
interfaces.
Even report writer software is too complex.
6. Users of information not educated as to the capabilities and
contents of the systems (e.g., IFMS/CERCLIS, IFMS/DOCKET).
7. Lack of adequate training in some cases.
Changes in systems and constant turnover of personnel
require annual training.
8. Lack of interface between IFMS and CERCLIS (i.e., CERCLIS
obligation data not updated from IFMS).
System design problems in both systems preclude easy
interface.
9. Technical personnel gathering data have little or no vested
interest in the data gathered in some cases (e.g., CERCLIS).
Recommendations for Problem 2
1. Data management must be considered a "core" program,
carrying the same weight as other program functions, rather
than a secondary function within each program.
2. Senior management (i.e., Administrator) must be educated as
to the strategic importance of systems/data and of the
impact of programmatic changes to systems.
3. Establish management and staff accountability for data
(i.e., ownership of systems and data).
4. Trade information systems contractor support for more FTEs.
(Contractors have no vested interest in information
systems).
5. Provide more funds, people, equipment, and software for data
administration and management.
6. Increase Regional involvement in legislation/rule making
process. Regions have a better understanding of the
impacts.
7. Redesign systems to allow easier adaptation to change, and
control by Regions.
8. Allow Regions more control over their aspects of systems.
9. Change budget process to establish direct budget line items
for information systems activities in order to reflect
accountability for resource allocation for information
systems and Regional portion of that allocation.
10. Survey Regions and Headquarters to determine the actual
level of resources expended for data management and systems
support.
11. Simplify the contracting process to avoid delays and to
quickly respond to changing requirements. For example:
(a) 4-6 months to execute a delivery order on an existing
contract; (b) 9 months to get an 8(a) contract in place; (c)
2-3 years to get an IRM contract in place is unacceptable.
12. Allocate more information systems FTEs to the regions.
13. Establish Agency-wide directories of systems and data
(information).
14. Educate the users of the information from systems (not
system users) on the capabilities, contents, and limitations
of the information systems.
15. Develop standard data definitions across information systems
with common data elements.
APPENDIX V
SUMMARIES OF INTERVIEWS
BACKGROUND: This Appendix presents the transcripts of the series
of interviews conducted on special emphasis areas (such as data
integration) and with particular people--two Regional SIRMOs.
The transcripts are detailed narratives of the meetings.
SUMMARY OF HEADQUARTERS INTERVIEW
SUBJECT: DATA INTEGRATION
PROBLEMS IDENTIFIED
1. EPA is deficient in the area of data integration (DI):
internal links among EPA mainframe systems are not
there
external links to other agencies, States, etc. are not
there
There is no unified user interface in the area of performance
indicators, environmental quality, and other data fields.
2. EPA staff cannot always get information they need and they
have, over time, stopped asking many important questions because
we simply don't have the data to provide the answer or can't get
it out of the systems we have. Decisions based on limited
amounts of information become a general course of business.
This practice is then defined as the acceptable level.
For Example:
RCRIS/CERCLIS -- Reconciliation is difficult between
RCRIS and CERCLIS, and permit writing is constrained by
information availability. It's difficult for Superfund and RCRA
systems to answer the questions asked now. Five and a half years
ago different questions were asked. Systems were not designed
for the questions being asked now. It is not possible to
anticipate the types of questions that will be asked.
PCS -- Provides a pointed set of information on standards versus
discharges but doesn't bring in wetlands, population impacts,
ecosystems, or proximity (of contamination) to the well supply.
GIGS -- Congressional questions originally asked are no longer
asked because the information is not available.
GIS -- Programs resist because data is entered into separate
systems. GIS users showed management that the system had
incorrect data that showed Superfund facilities in the middle of
the Atlantic Ocean and at the North Pole.
STORET cannot show how much contamination is coming out into a
body of water.
3. There is a pervasive perception that EPA information systems
have no data integrity. Information systems have primary and
secondary users. The primary use of Agency data is successful,
i.e., CERCLIS helps SCAP preparers provide good data to the
Headquarters staff using it. The problem in DI is secondary use.
While primary users of CERCLIS have the information they need,
secondary users seeking the following information will not find
answers through CERCLIS: what is the nature of the hazard; is it
a PCP site; where is the hazard located; and what is the degree
of the threat posed by certain chemicals and sites. These
queries are not successful because this information was not
included in the system requirements during development. This
creates the perception that data are bad--even when the system
works for the primary user. The problem surfaces especially when
the Congress and others attempt to select information the system
never collected and was never intended to provide.
ROOT CAUSES
1. Systems independently designed and managed lack both EPA and
Government-wide data standards in many areas such as
organisms, naming conventions, and data architecture. Without
standards, system managers will continually be putting bad,
unintelligible information into employees' personal
computers.
TQM may provide a correction; however, the competent people
involved are not tuned in to the protection of the
environment. The culture doesn't foster meaningful
discussions to determine commonality; instead, each program
has its own life and no common vision.
2. EPA has not thought strategically. The Agency culture does
not value data: it is seen as of no use; its importance is
minimized. There is a parochial focus on data, even down to
the ecosystem level. Data/information are not considered
strategic resources. IRM is considered an overhead activity
rather than a key element of EPA business for the purpose of
enforcement and monitoring. Too much impetus behind current
system development has been on bean counting: how many
permits, inspections, and enforcement activities, not the
environmental data needed by users.
3. The statutory framework promulgated by Congress doesn't
address cross media. Program managers are driven by a
single Act, a requirement for reporting on activities.
Statutes compartmentalize and drive the Agency in that
direction, mandating bean counting rather than real
environmental results and DI. This orients managers toward
collecting specific information. EPA has not addressed this
problem as other agencies have successfully done. EPA was not
aggressive in writing regulations to facilitate integration.
There is very little movement afoot to go in an integrated
direction like TRIS and GIS. Five years ago resources to
support GIS were pulled from all over. Five FTEs were
identified from each Region. The outcry was such that
resources were pulled back from GIS. The Agency lacks
resources and funding for FTEs, which has increased its
dependence on contractors.
The DI problem is not unique. USDA historically had the
problem...everyone is under the gun to meet daily
operational needs...there is no slack to look at the big
picture. USDA has been successful because they have less
dependence on contractors.
4. Administrations don't emphasize IRM. Nineteen teams were
assembled to conduct the National Performance Review at EPA.
None addressed information management as the keystone of
strategic importance in the plan. Pollution Prevention is
addressed but the importance of IRM was missed. This
continues the pattern of neglecting IRM. The OE reorganization
is not emphasizing DI as one of the objectives.
5. It is still acceptable not to integrate data--there was no
penalty to be paid for not wanting integrated permits...it
was, however, not acceptable not to meet the mission or not
to bean count or not to get out the permits. Systems were
built to do specific things and are still doing them. The
data quality questions are highlighted but no harm came to
EPA.
EPA missed an opportunity to move forward on DI. TRIS, an
information program ordered by Congress, is hailed as one of
EPA's successes, yet the Agency hasn't expanded lessons
learned there to other systems. STORET has only 500 users
when it should have 5,000 users. The Office of Water is
taking a solid look at providing for secondary users and
seeking standards for water data.
6. Information systems have primary and secondary users. If a
system under development for a primary user has no need for
latitude and longitude data, it is not collected. Nobody
can force one program to meet the needs of another office
when that office does not control the resources of the
other. Current trends in demographic analysis that deal
with populations at risk now need boundary data and latitude
and longitude. At present, some Agency information systems
are required to provide geographical location. The Agency
still has not defined its business in terms of latitude and
longitude of information. There is risk involved (resource
and technical risks) to projects for incorporating
integration with other projects. Program managers believe,
"If I have to alter my plans to handshake with you I
introduce increased resources and risk; I won't do this
unless someone makes me." There is no requirement for
programs to consider secondary uses during systems
development. People may or may not be rewarded for extra
risk and effort; therefore, programs usually avoid the risk
by ignoring DI.
RECOMMENDATIONS
1. Top management needs to mandate DI to enable EPA to:
better assess risk and target areas needing action
implement a risk ranking program to provide a
systematic method to compare threats, and rank and
evaluate how we are doing against threats
measure outcomes
meet the Administrator's four priorities which cut across
all media:
o Ecosystem protection
o Pollution Prevention
o Environmental Justice/Equity
o Partnership building
revise the Agency culture such that people can and will
work together more on DI projects by encouraging
developers to consider other offices' needs
recognize that DI takes resources and creates risks that
may complicate one office's systems
development/operation efforts
integrate DI requirements into the day-to-day business,
i.e., enforce environmental equity into the process by
mandating the equity requirement be met before issuing
permits
2. Elevate OIRM in the organization. Establish a data
administration center, a central group for DI, and a CIO--a
strong General Patton type with status/stature/authority.
Solving the problem is a matter of the vision and will of
top management to:
Build sound IRM processes in the Programs
Build strong IRM programs in the Program areas
Put the existing resources and logistics to work under
strong leadership
Develop and enforce data standards
3. Sound thinking about IRM needs to be built into the thinking
of program personnel--raise the strategic role of information in
protecting the environment:
Consider an alternative to the IRM Steering Committee--
high level managers don't deal with issues
Program officials on the Committee should have more at
stake
Assistant Administrators don't know ADP/IRM lower level
issues. They don't have authority over them either.
EPA should follow the path taken by FAA, Patent and
Trademark, NWS, and SEC: view information as a core
business line; institute a major modernization effort.
Revisit lessons learned during one of EPA's major
efforts in ORD and OW, EMAP and STORET modernization;
determine if a link exists between systems.
4. Link in to the data highway bandwagon and open all non-CBI,
non-enforcement sensitive, non-Privacy Act information to the
public:
would identify data quality problems
would require a move toward continuous improvement in data
quality by making EPA more accountable for data quality (can
we do it without creating fear within EPA?); provide TQM
amnesty--line it up so that data managers will not suffer
repercussions. Be prepared for chaos.
would create greater demand for DI
ensure the customer knows where to go for information--set
up an 800 number
5. Create consistency, standards, and naming conventions. This is
the critical path to success. Review the case in point, the
toxic loadings model--the Great Lakes Project--which is dealing
with multiple data standards (for example, how much chlorine
loadings?).
(Also important is the) institutional ability to use DI. Do the
users have the ability to ask the questions that force DI?
What are the questions that demand DI in order to be answered and
who's asking them? The public's naive questions require DI. (How
are we achieving) ecosystem protection data/pollution prevention
data/where are the integrators? We have no background, no
experience.
SUMMARY OF HEADQUARTERS INTERVIEW
SUBJECT: OFFICE OF AIR AND RADIATION
PROBLEMS AND ROOT CAUSES
LACK OF CREDIBLE BASIC MANAGEMENT INFORMATION ABOUT WHERE AGENCY
FUNDS, INCLUDING SUPERFUND MONEY, ARE BEING SPENT AND HOW, AND
THE RESULTING ACCOMPLISHMENTS
This previously identified problem is not considered an
issue in the AIRS program. It was felt that they have a
"complete financial structure" and can identify where funds go,
how they are distributed, and what they are used for. This is
possible because the entire AIRS system budget is contained in
one Branch budget.
DIFFICULTIES IN ADDRESSING CROSS-MEDIA POLLUTION PROBLEMS
"Root causes" preventing information integration include
both programmatic and philosophical issues. An example of a
programmatic difficulty is related to the "lack of support" that
FINDS receives. This lack of support has resulted in programs
being reluctant to participate in the program and a "lack of
trust" by the programs. Although the FINDS program is a good
idea, OIRM is "dropping the ball" by not properly supporting it.
Philosophical differences are related to the different
"power groups" outside the Agency (i.e., OAR/NOAA and
OW/COE/USFWS/USGS) involved in the process. The fact that there
is not a lot of commonality between programs leads to serious
difficulties in cross-media integration. The recently
reorganized Office of Enforcement will have a hard time
addressing cross-media enforcement issues because of the lack of
commonality between programs. There is also a lack of
compatibility between programs.
Recent budget cuts are significantly setting back long-term
efforts at data integration.
SIGNIFICANT COST OVERRUNS AND DELAYS IN DEVELOPING AND
IMPLEMENTING INFORMATION SYSTEMS
They did not believe that cost overruns and delays are
necessarily problems (in the AIRS program). Cost overruns are of
a continuing nature and are necessary because information systems
continually need modification. For example, they recently
received $300,000 for IRM support to implement new requirements
related to the Clean Air Act authorization (caused expansion in
users, requirements, etc.).
Lack of centralized, unified data management guidance is
resulting in Regions and States "going their own way."
Additionally, Agency programmatic managers have not "bitten the
bullet" on identifying what information will be needed to run
programs in the future. This has resulted in no base budget
being established to develop the information systems necessary to
support future program operations. Small policy decisions can
change systems in a significant way.
There is no base budget to maintain software. As a result,
50% budget cuts translate to putting developed systems on the
shelf.
DATA INTEGRITY PROBLEMS WITH EPA'S MISSION-CRITICAL INFORMATION
SYSTEMS
Data integrity problems stem from the fact that there is no
consensus on who needs what information. For example, some
Regions rely totally upon the States to provide information while
other Regions do not. Additionally, managing the quality of data
received from the States is a problem. Data quality standards
are not always established and if they are established they are
not enforced. There is an inconsistency in data quality
standards. For example, the Agency was not able to force the
State of California to enter zip code information into the AIRS
system because they are not convinced that the information is
correct. Instead, the State enters XXXXX for the zip code.
DEVELOPMENT OF DUPLICATE SYSTEMS
There is a belief that use of the mainframe is the best way
for EPA to do business. However, there is tremendous pressure on
the programs to develop PC-based systems. For example, Region 4
has developed their own PC-based AIRS system for use in the
Region and the States within the Region. This system was
developed because Region 4 did not like AIRS (they found it hard
to use). Duplication exists because systems are developed in a
Regional office to meet a specific need and then are given to
other offices or States for use. These systems
are then viewed as "Agency" systems without any Agency
endorsement. They believed that a formal process would not help
the situation but only add another layer of bureaucracy.
Nationally, many State systems duplicate EPA systems and
this duplication causes problems. These systems are often not
integrated. It may be impractical to "force" one system (ex.
AIRS) on everybody, but data standards would certainly help.
AIRS is working with the ANSI standards committee to develop
national standards. The problem will likely get worse because of
increasing demands for information (ex. recent new Clean Air Act
requirements). The problem is further aggravated by the current
push against centralization and the push to empower States and
local governments and reduce oversight (i.e., NPR). The
reorganization of the OE and movement of the air enforcement
function to OE may lead to further fragmentation over time (as
has happened with permit systems). In addition, "Title V" forces
have created a need for more systems and more money to be
provided to the states.
One of the participants is chairman of an OAQPS data
management workgroup (task force) and the workgroup will produce
an "action oriented paper" around April 1994 addressing data
management issues. The document is part of a long-term plan with
the Department of Defense to identify similar systems to combine
and integrate data. The ability to integrate has been impaired
by recent budget cuts.
EXPOSURE OF AGENCY'S FINANCIAL PAYMENT SYSTEMS TO UNNECESSARY
ACCESS RISKS
This issue is not believed to be a problem in the AIRS
program.
ADDITIONAL PROBLEMS
One problem they have been dealing with for some time
(twenty-plus years) was the problem of data confidentiality.
There are cases in which data requirements are promulgated by EPA
and States refuse to provide the data because the data is
considered confidential by state statute and not confidential
(i.e., subject to FOIA requests) by Federal statute. One
participant has been unable to get the Attorney General and
Office of General Counsel to make a determination as to the
confidentiality of some data in these cases. In addition, there
are different standards of confidentiality for industry and
States resulting in data integrity problems (certain data is not
reported). In addition, vulnerability problems sometimes arise
when the level of security for particular data varies between
pieces of legislation.
A problem area in hardware and software compatibility is
related to the direction the Agency is taking technologically.
It is felt that with the new client-server technology, NDPD/OIRM
has not established a new hardware/software direction for the
Agency. Hardware standards are not being built and national
priorities are not being established (e.g., future
telecommunications needs). Users' needs are not being solicited.
OARM's consideration of cutting one of two NDPD mainframes
in response to FY94 budget cuts would be "catastrophic for
States" and EPA would "lose a customer base we'll never regain."
NDPD used to roll out new equipment all the time, but real
future planning by one of the NDPD branches stopped 2-3 years
ago. It was also felt that user funding (fee for service) isn't
really the answer because (1) the program's budgets don't have
money to pay for the services in the first place, and (2) the
program's budgets are also being cut. User funding would be very
disruptive in the short run.
The State/EPA Data Management program (SEDM) was felt to be
an excellent program in which State and EPA personnel (at the
working level) were able to work together on small multi-media
types of projects. The participants in this project were very
enthusiastic about the work being done and the project
facilitated good communication between the Agency and States.
They believed this program was cut last year (beginning of FY93)
because of budget cuts.
It was observed that States use AIRS in one of two ways,
directly or indirectly. Direct users input data into AIRS and
extract data from AIRS for their own use. Indirect users input
data into AIRS because they are required to provide the data, but
do not extract data from the system and feel this is an onerous
request. When Regional officials support a system like AIRS, the
States in that Region are more likely to cooperate than when the
Regional office does not support a system (States are more likely
to go their own way).
ORGANIZATIONAL ACCOUNTABILITY
Senior management recognizes the data collection function
but does not always recognize this function as a component of the
broad data management process.
They believe there have been problems resulting from the
termination of the TOSS contract. The termination of TOSS has
led to the dispersion of system development activities, and this
has resulted in less standardization. They feel the MOSES
contract is not a popular development vehicle because although it
can be used to develop a system, the programs cannot use it to
support the system.
OIRM showcasing products, introducing tools, and providing
models in the past was appreciated. There is concern that there
will be a shift, on OIRM's part, toward enforcement and away from
support.
One organization has had problems getting personal computers
off the Agency contract. Some PCs have been on order for "over a
year." The problem seems to be that NDPD does not get the
support they need from OAM. Another office felt that buying
equipment off the contract does not result in paying the lowest
prices for the equipment. They "get laughed at" when they buy
equipment at such high prices.
Regarding the FMFIA process, it was felt that material
weaknesses have no relationship to resources. For example, one
office can be performing all right with few resources and have a
need for additional resources, while another office can have a
lot of resources, report a material weakness, and get additional
resources. In fact, reporting an issue as a material weakness
often results in additional resources (i.e., failure breeds
success).
PLANNING AND RESOURCES
They believed that OIRM's policy development process does
not always get all programs' input and buy-in, creating a
"lose-lose" situation. Currently, programs are not plugged into
this process.
IRM functions are currently taking a double cut in budget
cutbacks. Program cutbacks reduce the IRM resources available
within programs, while cuts in OARM reduce the non-program IRM
resources on which the programs depend. The programs have very
little say in OARM discretionary budget cuts even though these
cuts directly affect program operations. The program office has
dealt with severe cuts for the past two years. Part of the
problem is related to a lack of understanding, by the programs,
of the non-discretionary portion of OARM's budget.
Communication does not always follow proper processes and
procedures, nor is there awareness of sensitivities. For example,
members of an audit team met with AIRS representatives to discuss
AIRS system maintenance. However, communication problems in the
OAR chain-of-command resulted in the audit team taking some time
to identify the correct group of people to interview. It was
felt that OIRM does not always keep programs adequately aware of
upcoming responsibilities. For example, the office received a
notice stating that a certain system security plan was due. They
noted that they had little warning that the plan was due and
pointed out that the memo (re: installation security) did not
identify, or give guidance on, specific security issues to be
addressed (i.e., PC security or data security).
Too many SIRMOs are part-time. OAR does not have the
resources to staff the SIRMO position full-time. OSWER was used
as an example of a better way to staff a SIRMO function.
The Agency is not adequately planning and funding
information systems. There is a feeling that the Agency is
behind in the client/server environment and there isn't enough
"muscle" in the hardware "architecture." Some efforts have been
made to examine the direction the Agency should take. Part of
the problem is that IRM does not get much visibility.
DATA INTEGRATION/DATA MANAGEMENT
FINDS is a great idea that has "not hit the mark." Some
problems exist with the reliability of the data maintained by
FINDS. For example, some States refuse to use the EPA ID
(assigned by FINDS) as a primary identifier (although that was an
original intention) because of data quality problems. It was
believed that Regions/system owners need incentives to clean up
data. The Agency cannot implement policy and then "walk away."
They felt that this is happening with the locational policy.
Apparently, several States are not willing to commit to
implementing this policy and nobody in EPA Headquarters is
forcing it. They believed there are many complicated issues
involved with the FINDS program but that the whole process is
further complicated because of the lack of continuity within the
FINDS program. It was pointed out that FINDS promotes linkage
rather than integration. In addition, policies are not enforced
through grants or by giving technology to States for free,
provided that States put resources into these projects. The
fundamental issues of what data is needed in the national air
repository, both now and in the future, have not been addressed.
QUALITY ASSURANCE AND PROTECTION OF DATA
Quality assurance was looked at during the FMFIA process and
a data quality plan is being developed. However, the level of
budget cuts is making things difficult. There are some
weaknesses. For example, emissions (air) does not have data
quality objectives or plans but needs to develop them. However,
Clean Air Act amendments have caused an overload and many things
are not getting done. There has been a fairly steady erosion in
maintaining quality in data that is being collected. Quality
assurance "falls by the wayside" when other requirements (e.g.,
court-ordered deadlines or top management's "hot" initiatives)
are a higher priority.
COMMUNICATION
It was felt that OIRM does not clearly identify and state
priorities, communicate these to everyone, provide training, or
solicit input on the effects of their efforts. It was felt that
OIRM loses credibility when they ask for information and never
follow up.
SOLUTIONS
SIGNIFICANT COST OVERRUNS AND DELAYS IN DEVELOPING AND
IMPLEMENTING INFORMATION SYSTEMS
The Agency needs centralized, unified data management
guidance promulgated by Headquarters.
Need national consistency in data (e.g., emission inventory).
DATA INTEGRITY PROBLEMS WITH EPA'S MISSION-CRITICAL INFORMATION
SYSTEMS
Data quality standards need to be enforced.
DEVELOPMENT OF DUPLICATE SYSTEMS
OIRM needs to explore the issue of duplicative systems and
"official" Agency endorsement of systems for use by other offices
and States to accomplish program missions (without adding
bureaucratic layers).
Establish data standards for national systems for use by
States in designing systems.
ADDITIONAL PROBLEMS
OIRM should establish a Bulletin Board System (BBS) to
provide advice on purchasing hardware and software to help solve
compatibility problems.
Looking into the future in terms of planning and providing
long-term commitment and stability to major national systems was
emphasized as being very important.
Bring back the SEDM program.
ORGANIZATIONAL ACCOUNTABILITY
The Senior IRM Steering Committee should move from an
advisory role to a decision making role and have strong technical
subcommittees providing input and advice to senior decision
making personnel.
Senior management needs to buy into the data management
process (including the data collection activity). There needs to
be a data management plan which would address needs from today to
5 or 10 years out.
It is "critical" that top management (political appointees)
understand and approve of data collection/systems development
efforts. Programs need to "close the loop" with senior
management on data requirements (need top level concurrence) and
that commitment must be received up front. "Don't create another
bureaucracy." Senior management buy-in needs to be informational
(i.e., understanding and agreement) versus "getting a signature
in the right blank." Senior management should share
accountability based on buy-in but buy-in need not be
"memorialized" in the form of a document. Another "paperwork
exercise" is not needed.
The SIRMOs should act as a "clearinghouse of information"
across their programs. In this capacity, the SIRMOs would be in
a position to reduce the duplication of information systems.
OIRM should have a "fostering" role in systems development
projects. Helplines could be used to support development
activities. It is "impossible" for OIRM to track development
activities and a lot of development work is being missed.
There needs to be better communication with the procurement
office, Office of Acquisition Management (OAM), NDPD, and the
program offices related to buying equipment off Agency contracts.
PLANNING AND RESOURCES
OIRM needs to get programs together and agree on policies
and standards.
Better communication needs to be established within the
Agency concerning IRM matters. Communication needs to follow a
proper process and procedures and be aware of sensitivities.
Move toward integrating PCs and mainframes in a
client/server environment to help the Agency "catch up" in the
IRM arena. The Agency needs a centralized organization (NDPD) to
take the lead to make this happen.
EPA needs to "put some muscle" into the hardware
architecture. There needs to be better centralized leadership to
address common technical problems, not "what are you doing in
AIRS."
DATA INTEGRATION/DATA MANAGEMENT
Develop incentives for Regions/system owners to clean up
data in FINDS.
OIRM needs to foster the importance of policy through better
policy guidance.
A national air repository is needed, and the fundamental
issues of what data is needed both now and in the future need to
be addressed. A long-term plan needs to be developed.
QUALITY ASSURANCE AND PROTECTION OF DATA
Emissions (air) needs to develop data quality objectives or
plans. To accomplish this, they need the full support of senior
management (Deputy Assistant Administrator (DAA) accountability).
COMMUNICATION
OIRM's priorities need to be clearly stated and communicated
to everyone. In addition, OIRM's role needs to be more
facilitating, showcasing, and helping, not just compliance. OIRM
needs to ask if their efforts are helping. They need to identify
priorities and provide training.
OTHER COMMENTS
This was the "first time in 20 years anybody asked us these
questions" and we are very pleased to have been included in the
review.
SUMMARY OF HEADQUARTERS INTERVIEW
SUBJECT: SYSTEM DEVELOPMENT CENTER
PROBLEMS AND ROOT CAUSES
MANAGEMENT CONTROL
* Change of people (turnover of Contracting Officers (COs) and
Project Officers (POs)) hasn't helped. Has led to
inconsistencies on EPA's side.
DIFFICULTIES IN ADDRESSING CROSS-MEDIA POLLUTION PROBLEMS
* Agency not created "holistically."
* When Agency not organized "holistically," info. and data
cannot be. There have been efforts to improve/combine data but
unless EPA is organized "holistically" EPA cannot adequately
address this.
* System Development Center (SDC) not hearing a need for this.
Only need for this (from big picture) is in the mind of top
management.
* Continue to be problems in complexity and difficulties in
interpreting data (e.g., Superfund cannot agree on what a
site is).
* Difficulties in trying to add structure to things that may
not be able to be structured.
* GATEWAY/ENVIROFACTS info. is not designed to be "meaningful"
when viewed together. Have not looked at public needs.
ORGANIZATIONAL ACCOUNTABILITY
* SDC sees spotty SIRMO involvement (weak link - pressures
from other organizational roles they have).
* Most Delivery Order Project Officers (DOPOs) need to be
prodded for performance measures in DOs.
* Programs had some problems factoring some areas such as
product assurance into their process.
PLANNING AND RESOURCES
* High turnover of COs/POs (MOSES).
* Expertise of DOPOs mixed (not consistent). Hard to get all
expertise in one DOPO.
* Hard to get away from treating on-site contractors as staff.
* Frequent conflict -pressure to get something done vs. doing
it right.
* Lack of knowledge of implications of this pressure and
balance of requirements vs. good IRM on staffing.
* Info, gathering in regulations not involved enough with IRM.
* No consistency from year-to-year for IRM involvement up
front.
QUALITY ASSURANCE AND DATA PROTECTION
* Inadequate systems documentation on systems coming into SDC,
hard to maintain, don't always know what changes will
affect.
* Cannot always get back to sources.
* SDC process forces policies, but there is a limit to what
can be enforced when no funds are available.
SYSTEMS DEVELOPMENT LIFE-CYCLE
* Most systems need constant enhancement, mostly because of
regulation changes (change in information needs).
* There is an unwillingness to do some things that are
policies (i.e., documentation, life-cycle planning).
COMMUNICATION
* DOPOs without background in IRM not knowledgeable or want to
avoid it. OSWER people generally aware of IRM guidance.
* IRM management not "plugged in" or not strong enough to deal
with Steering Committee.
* Re: Policy, Procedures, Standards - Some programs are good,
some bad.
SOLUTIONS, IMPROVEMENTS, AND ACCOMPLISHMENTS
MANAGEMENT CONTROL
* MOSES process was designed to improve SDLC.
* Management structure (both in SAIC contractor and EPA)
developed to oversee work in SDC helps ensure things are
done right (i.e., EPA oversight in development of project
plan through negotiation and revision of project plan).
* Audits have made a difference on program side. IG reports
have gotten program office management's attention. Program
offices trying to change including more management control
(esp. in RCRIS) and IRM decision making. Superfund is also
improving; Govt. approves all changes - due in part to
structure/management of contract (forces controlled change).
Have reviews of projects once per month focusing on
schedule/technical issues.
PRODUCT ASSURANCE
* Product assurance on each project (including configuration
management and software QA) has paid off, although there
have been complaints on added cost.
* Provides better control over changes. Contractor now
provides recommendations, but EPA meets with all principals
and agrees on changes.
* Getting better at prioritizing changes and selecting those
that can be implemented within budget. Better handle on
where money is spent.
DATA MANAGEMENT
* This summer's Superfund data collection was a success
because of central consolidation of data management in SDC.
DIFFICULTIES IN ADDRESSING CROSS-MEDIA POLLUTION PROBLEMS
* GATEWAY/ENVIROFACTS is a step toward pulling together
data/info, that exists. May be a front end to public
access. Pulling data from different systems points out
differences.
* Part of GATEWAY/ENVIROFACTS is also an effort to standardize
data elements through data modeling to see what data is
there.
SIGNIFICANT COST OVERRUNS AND DELAYS IN DEVELOPING AND
IMPLEMENTING INFORMATION SYSTEMS
* MOSES process is trying to preclude this by planning and
through development of a good project plan up front.
* However, MOSES is not dealing with entire Agency (only
Water/OSWER/some ORD - big programs). No acid rain work.
Regions do their own thing.
DATA INTEGRITY PROBLEMS WITH EPA'S MISSION-CRITICAL INFORMATION
SYSTEMS/DATA MANAGEMENT
* Data integrity rests more with programmatic area.
* RCRIS - States are putting more resources into data
integrity.
* PWSS - States anticipating needs to have data integrity
because they are involved in process.
* Currently doing data management at SDC.
* Working toward better data definition standards in IMDA.
* Efficiency & quality improvements have been noted from SDC
centralization. A 40% reduction accrued when geographically
separated functions (i.e., hotline problems and software
maintenance/development) were moved to a central location
(same system with geographically separated functions when
functions needed close communication—RCRIS).
* CERCLIS also becoming centralized.
* You get cross fertilization of experiences between different
systems when co-located.
* OERR has benefitted from sharing data since being moved to
the same location as other OSWER systems.
DEVELOPMENT OF DUPLICATE SYSTEMS
* SDC set up to try and avoid this through its process.
EXPOSURE OF AGENCY'S FINANCIAL PAYMENT SYSTEMS TO UNNECESSARY
ACCESS RISKS
* No financial systems in SDC other than IFMS, which is an
evolving picture.
ORGANIZATIONAL ACCOUNTABILITY
* Documentation approvals seen at higher levels (OSWER, OW,
OAR). Other offices are improving.
* Need an IRM organization (centralized) budget to support
Agency. NDPD is the only unifying force via architecture.
* Contractor estimation process records changes and effort in
projects for better estimates in future.
* Use of IEF (a Computer-Aided Software Engineering (CASE)
tool) helps identify performance measures.
* Govt. side of SDC providing better guidance for IGCE, adding
product assurance to cost estimation items (product
assurance mandatory).
PLANNING AND RESOURCES
* No sign that qualifications & training getting worse (not
necessarily getting better either).
* Meetings conducted to educate DOPOs on SDC process.
* Would like to see more OIRM/IMDA participation in process.
* SDC has self-discipline with its existing structure - i.e.,
separate facility, "off-site" meetings, etc.; therefore, no
day-to-day direction.
* SDC setup forces better planning.
* Certain things (functions) should be government function.
* If resources are not available, reduce what is being done or
reduce requirements to be met.
* More emphasis on IRM skills, life-cycle training, and
support, and continue to make sure DOPOs understand contract issues.
* Need programmatic pressure to make projects better.
* Need mechanism to address requirements in time period vs.
violation of good IRM.
* IRM needs to get "plugged in" to Congress/bills.
* When Congress does "how to" legislation, EPA/IRM needs to be
involved, or the legislation should not be so specific.
* Need better OMB interface on information collection forms,
complete with time frames and implications on systems.
* Need to identify system life, but technology is mainly the
driver in system life, plus NDPD support of technology
(80286 computers are still being supported).
* EPA needs FTEs to do certain functions such as LAN support
on-site.
QUALITY ASSURANCE AND DATA PROTECTION
* As work is done on "inherited" systems (existing systems
coming into the SDC), documentation improves. All new
releases are well documented.
* Data standards are being used when applicable and adhered
to.
* Contractor personnel have signed confidentiality statements.
* Risk analysis being done more frequently by program offices
and are also done during development.
* SDC supporting Agency security program.
* Considering adding cost for security as mandatory item much
like what is done for QA.
SYSTEMS DEVELOPMENT LIFE CYCLE
* Public Water System Supervision (PWSS) effort started with an
ISP which identified systems to develop first. The effort
included Regions and States for requirements. The use of
CASE helped to get immediate feedback and "buy in" with "on
the spot" documentation. Emphasis was on requirements with
validation. Also looked at public and local govt. needs.
Able to develop system more quickly (esp. through Rapid
Application Development (RAD) of subsystem identified in ISP).
Has gone to pilots in States and getting good feedback.
* Need more emphasis on flexibility within policy, procedures,
and standards.
* Need to identify what level of mandatory policy can be
enforced no matter what, ensure policies can be implemented
(some policies cannot be implemented, some are not
applicable).
* Need understanding of policies, standards, and procedures
plus budget to implement.
* May be useful to have an O&M SDC as well as a development
SDC. Would be difficult to support an SDC at each Region.
* Need some mechanism to determine if a system should be
scrapped.
COMMUNICATION
* Need exposure to policies, procedures and standards through
training.
* Keep up pressure (audits). Enforcement of policies is
crucial.
Note: This interview session included key EPA SDC management
personnel as well as key contractor SDC management
personnel.
SUMMARY OF HEADQUARTERS INTERVIEW
SUBJECT: SUPERFUND
Problems Identified
Have had some problems with definitional interpretation.
The CERCLIS information system interface with the Agency's
financial system has not been reestablished following
replacement of the system with IFMS.
The Office of Information Resources Management does not
have control over the Agency's IRM program.
The Agency has not adequately defined commonality between
information systems.
Root Causes
The Office of Information Resources Management does not
enforce IRM policies, procedures, and standards.
Agency management does not understand the cost of
information.
The Office of Information Resources Management does not
have a clear mission statement.
Recommendations
Recruit and place technical professionals in the program
offices. These professionals need to be EPA FTEs.
Make available a core staff of IRM professionals (Agency
FTEs with technical expertise).
Provide technical training (e.g., systems management
training).
Make end-users responsible for data quality.
Take steps to monitor data quality.
The Office of Information Resources Management needs to
enforce IRM policies, procedures, and standards if they
develop them.
System changes need to have buy-in from Headquarters and
the Regions. Need to build consensus.
Resist asking Regions to collect information that they do
not use or need.
Need to get data off of the mainframes and in the hands of
users and managers.
Need to conduct IRM planning up front.
Need to link the IRM plan to the budget.
Control of IRM needs to take place at the program level.
IRM support services should be provided as part of a
program-level mission support contract.
Additional Comments
Superfund information systems provide the information
necessary to manage the program.
Many of the core CERCLIS utilities were originally
developed in the Regions.
Senior management support is critical.
Support from the OSWER Information Management Staff has
been valuable.
Don't see a need to merge information systems that contain
unique information.
SUMMARY OF HEADQUARTERS INTERVIEW
SUBJECT: PERFORMANCE MEASUREMENT
The Review Team met with representatives from the OPPE Strategic
Planning and Management Division and the OARM Financial Management
Division. The purpose of the meeting was to discuss the Agency's
accomplishments reporting capabilities and its efforts to
implement the "Government Performance and Results Act of 1993."
STARS
Strategic Planning and Management Division is responsible for
running STARS. STARS is the Agency's official accomplishments
reporting system. An OPPE official explained that 97% of the
data in the STARS system comes from other Agency databases (CERCLIS,
RCRIS, etc.) and that the data relates primarily to "activities
that the Agency performs." The system contains a significant
amount of enforcement and Superfund data.
Past administrations have used information reported by the system
(in quarterly reports) to question AAs and RAs about
accomplishments. However, the immediate previous and current EPA
Deputy Administrators have not used STARS information heavily.
Mr. Habicht was briefed on the STARS information before visiting
a Regional office, but did not "question" Regional management
about information reported in STARS. OPPE officials believe that
the current low usage relates to the fact that program management
has not yet been fully established or that it does not fit the
management style of the new administration.
STARS II
The STARS II system, which is currently in the conceptual stage,
will get closer to identifying outcome related information
(versus purely "bean counting" accomplishment data). OPPE
believes this would be an improvement on the current system.
However, whether or not the system will receive outcome
information from Agency systems (as STARS now receives "bean
counting" accomplishment information) is not known at this time.
System planners do not envision the system including many
environmental indicators. Primary reasons for developing STARS
II are to link resources to accomplishments and to have
representation from all program offices. Targets/commitments are
set on an annual basis. Target setting for FY 1994 is going on
now. Targets are locked at the end of April.
STARS II documentation to date consisted of briefings given (in
the August timeframe). This is the latest information available
(and it may be slightly out of date, particularly from the
standpoint of incorporating the NPR results).
PERFORMANCE MEASUREMENT
A case study is currently being performed using the Chesapeake
Bay program. The purpose of the case study is to identify
obstacles, stumbling blocks, etc. An additional purpose is to
tie together resources, output, and outcomes. The problem with
this case study is that State resources are not being accumulated
as part of the project (which means that the measurement of
accomplishments per resource will be overstated).
The Agency anticipates participating at some level of response to
the Government Performance and Results Act of 1993 (GPRA). At
this time, however, EPA is not participating in any pilot
project. A GPRA workgroup is drafting a letter to OMB expressing
interest in a GPRA pilot project. The pilot area should be
identified in the near future.
The STARS II effort would be a performance measurement project
and is expected to tie into the budget system (link to IFMS
and/or RMIS) and would report GPRA results (including
accomplishments compared to goals). STARS II is expected to tie
planning, budgeting, and accomplishments together. Information
needs to be tied into measurement as well, but at this point the
Agency is still grappling with conceptual issues. No software
tools were being used to identify available sources of the data
needed for STARS II at the time of this meeting. The review team
pointed out that it is during this conceptual stage that software
tools are needed for identification of information sources.
While more than 90 percent of STARS data is submitted from
program information systems, OPPE does not perform data quality
work. However, there is a discrepancy resolution process whereby
the Regional "numbers" and Headquarters "numbers" are printed and
compared. Differences are formally hashed out at the Assistant
Administrator and Regional Administrator level. If no agreement
can be reached, the Headquarters number is used. Discrepancies
are usually the result of timing differences and are negotiated
between the Regional Administrators and Assistant Administrators.
SIRMO INTERVIEW SUMMARY No. 1
This SIRMO emphasized that much information is available, but
that a typical manager must usually rely on an intermediary
person to get the information. The ideal is for managers to have
readily accessible and more user-friendly systems that they can
use directly. The SIRMO acknowledged that Headquarters resource
constraints preclude making systems more user-friendly and
believes national systems provide useful information, but more is
needed. National systems that are designed to answer
Congressional questions rarely meet Regional needs. There
usually is little or no Regional ownership of the data being fed
into the national systems.
This Region has automated, or is in the process of automating, many
administrative systems: training, contracts management, travel,
and procurement. The SIRMO discussed advances in the Agency: the
goals project and THIS as success stories, and stated that she
believes the Agency is "on the cusp" of beginning to really use
its information well. Possible obstacles contributing to any
inability to provide basic management information through the
Agency's information systems were discussed.
OBSTACLES
Lack of a distinct program element for IRM in the Regions leads
to inadequate funding for IRM-related FTE resources, stemming from
the current practice of lumping this requirement with all other
resource management issues. Outmoded software and the need for a
cultural change, in which managers would be seen as end users
able to retrieve information without an intermediary, were also
described as obstacles. The Headquarters EPAYS system's
inability to interface with this Region's on-line training
system, requiring Regional staff to re-key training data in batch
to EPAYS, is a lost efficiency.
RECOMMENDATIONS
Establish a Headquarters IRM official organizationally placed
above all Assistant Administrators.
Establish an Agency standard relational database management
system for all applications.
Continue culture change that fosters significant user involvement
in system development projects.
Ensure all system development projects have clear understanding
of customers' and users' needs. For example, repeat the recent
effort where Regional programmers went to Headquarters to work on
ICMS.
Revise ADP resource acquisition requirements, which apply equally
to purchases of diskettes and toner cartridges as well as major
purchases of much greater value.
SIRMO INTERVIEW SUMMARY No. 2
This SIRMO indicated that generally data (environmental, permit,
water quality, etc.) is not easy to get at and does not serve
purposes of what users (public/local government) want or need it
for. By way of example he said that often they dial an "800"
number, get a wrong number, nobody answers it, and when they do
get through it's not what they thought or wanted. The SEDM
program was a good program and was allocating funds to enhance
State capacity, but this was dropped in FY 92 or 93. A Task
Force, started under Administrator Reilly, found that States do
not have tools to do the job. A Steering Committee is
implementing the Task Force report, with support from the
National Performance Review and the Cabinet status bill for EPA.
These initiatives are integrally involved in data management.
Management often relies on staff and often assumes that things
are taken care of, especially when they have good staff. If you
have good staff (under the SIRMO functions) you tend to devote
less time to the SIRMO function. He estimated that he spends
about 25% or less on SIRMO duties because he has a very good
staff. He felt his recent detail to Headquarters was very useful
because it helped to improve communication and relationships with
key Headquarters management.
The SIRMO expressed concern over whether we really protect data
and have the right people handling data. He felt that additional
work was needed in this area.
Regarding resources, he will get the client (program manager) to
provide people before committing his resources. After work is
initiated, the client is trained to provide continuity and reduce
need for his resources on a continuing basis. He felt that
planning is pretty good in the Region. He sends out call memos
at the beginning of each FY and mid-way through to determine what
IRM support is needed. There could be benefit from long-term
planning, however.
OBSTACLES
* The public and local governments are frequently not
identified as users up front in the process.
* Lack of the ability for management to quickly communicate
issues.
* Not enough attendance by SIRMOs at ARA meetings.
* People (program managers) not always willing to commit
resources for developing systems/applications.
* There is a tendency to add on to systems rather than enhance
what is there. Quality data is best when everybody has
access to it to do their job.
RECOMMENDATIONS
* State Data Capacity - one current effort trying to get
better communication, with minimum investment, via e-mail
between States/Regions/Headquarters to more quickly identify
and respond to issues.
* Management needs to be more proactive in getting information
from staff and ensuring that things are done.
* If you have senior staff substitute in some of the IRM
meetings, it is critical that you have good communication
both before and after the meeting.
* Need assessments to determine how best to handle data/who
should handle data.
*• Need to commit/invest time and dollars to get good data.
APPENDIX VI
AGENCY PARTICIPANTS IN FOCUS GROUPS AND INTERVIEWS
HEADQUARTERS
Office of Administration and Resources Management
Office of Information Resources Management
Jeff Byron, Chief, Information Systems Management Branch, Program
Systems Division
Steve Young, Information Systems Management Branch, Program
Systems Division
Barbara Jarvis, Program Systems Division, Systems
Development Center
Margarite Shovlin, Program Systems Division, Systems
Development Center
Office of the Comptroller
Debbie Ingram, Financial Management Division
System Development Center
SAIC
Charlie Stringfellow, Program Manager, MOSES Contract
Tom Thomason, Assistant Program Manager, OSWER Projects
Office of Enforcement
Office of Compliance Analysis and Program Operations
Bruce Rothrock, Chief, Information Management Branch
Office of Policy. Planning and Evaluation
Office of Strategic Planning and Environmental Data
Phil Ross, Director, Environmental Statistics and
Information Division
Chapman Gleason, National Environmental Statistics Branch
Sue Priftis, Strategic Planning and Management Division
Office of Water
Michelle Miller, Director, Communications and Information
Management Staff
Robert King, Office of Wetlands, Oceans and Watersheds
Phil Lindenstruth, Office of Wetlands, Oceans and Watersheds
Dela Ng, Office of Wastewater Enforcement and Compliance
Larry Weiner, Office of Ground Water and Drinking Water
Office of Solid Waste and Emergency Response
Thomas Sheckells, Director, Office of Program Management, Office
of Emergency and Remedial Response
Michael Cullen, Director, Management Systems Staff, Office of
Program Management
Linda Boornazian, Deputy Director, CERCLA Enforcement
Division, Office of Waste Programs Enforcement
Myra Galbreath, Chief, Information Management Branch, Office of
Solid Waste
Office of Air and Radiation
Office of Program Management Operations
Kelly N. Spencer, Acting Director, Resource Management Staff,
Office of Program Management Operations
Reginald Slade, Resource Management Staff
Office of Air Quality Planning and Standards
Research Triangle Park, NC
Robert G. Kellam, Acting Director, Technical Support Division
John Bosch, Chief, National Air Data Branch
David Mobley, Chief, Emission Inventory Branch
Andrea Kelsey, National Air Data Branch
Office of Prevention. Pesticides and Toxic Substances
George Bonina, Deputy Director, Information Management Division
Ruby Boyd, Information Management Division
Thomas Hooven, Office of Program Management Operations
Jim Willis, Deputy Director, Environmental Assistance Division
Office of Research and Development
Allen Johnson, Information Systems Staff
Clifford Moore, Chief, Information Systems Staff
Focus Group Facilitator
Steven Smith
REGION 2
Office of Policy and Management
Robert A. Messina, Chief, Information Management Branch
Barbara J. Pastalove, Chief, Planning and Evaluation Branch
Jo-Ann Velez, Financial Management Branch
Office of Regional Counsel
Janice Dudek
Air and Waste Management Division
Helen S. Beggun, Deputy Director
Water Management Division
Robert Vaughn, Chief, Water Permits and Compliance Branch
Emergency and Remedial Response Division •
William J. McCabe, Deputy Director for New York/Caribbean
Programs
Doug R. Garbarini, Deputy Director for New Jersey Superfund
Branch I
Vincent J. Pitruzzello, Chief, Program Support Branch
Richard C. Salkie, Associate Director for Removal and Emergency
Preparedness Programs
Environmental Services Division
Dr. Barbara M. Metzger, Director
National Enforcement Investigations Center
Richard Herman, Office of Criminal Investigations
Focus Group Facilitators
Allan Sommerman
Peter Brandt
REGION 4
Office of Policy and Management
Donald J. Guinyard, Assistant Regional Administrator
for Policy and Management
William A. Waldrop, Jr., Deputy Assistant Regional Administrator
for Policy and Management
Jack Sweeney, Chief, Information Management Branch
Randall Davis, Information Management Branch
Waste Management Division
Joseph Franzmathes, Director
Richard Green, Associate Director for Superfund and Emergency
Response
James Kutzman, Associate Director for RCRA and Federal Facilities
Elmer Akin, Director, Office of Health Assessment
Franklin Hill, Director, Office of Management Support
Doug Lair, Chief, Emergency Response and Removal Branch
Robert Jourdan III, Chief, North Superfund Remedial Branch
Douglas Mundrick, Chief, South Superfund Remedial Branch
Doug Murdock, South Superfund Remedial Branch
Jon Johnson, Chief, Federal Facilities Branch
H. Kirk Lucius, Chief, Waste Programs Branch
Jim Miller, Waste Programs Branch
Eddie Wright, Waste Management Division
Water Management Division
James Scarbrough, Chief, Water Permits and Enforcement Branch
Air, Pesticides and Toxics Management Division
Chester Wakamo, Director
Focus Group Facilitators
Lila Koroma
Annie Godfrey
REGION 9
Office of Policy and Management
Nora L. McGee, Assistant Regional Administrator
for Policy and Management
David S. Mowday, Deputy Assistant Regional Administrator
for Policy and Management
David C. Henderson, Chief, Information Management Branch
Angie Commisso, Information Management Branch
Hazardous Waste Management Division
David B. Jones, Chief, Remedial Action Branch
Michael T. Feeley, Chief, Permits and Solid Waste Branch
Betsy Curnow, Chief, Case Development Section
Tom McMenamin, Superfund Program Management Team
Water Management Division
Carey Houk, Data Base Administrator
Jon Merkle, Senior Environmental Scientist
Air and Toxics Management Division
Ed Snyder, Acting Chief, Compliance and Oversight Section
Focus Group Facilitator
Becky Tudisco
APPENDIX VII
OIG REPORTS, GAO REPORTS, EPA MANAGEMENT
REPORTS, AND CONGRESSIONAL TESTIMONY
Note: The codes following each document date (e.g., OIG-A-V1)
have been assigned uniquely to each report and testimony, and
provide a cross-reference to individual recommendations in
Appendix IX, column entitled "Reference/Page."
EPA INSPECTOR GENERAL REPORTS
1. Report on Special Review - ADCR IBM Mainframe Password
Exposure, Report No. E1NMGO-15-0023-0400003, dated
December 21, 1989. (OIG-A-V1)
2. Report on CERCLIS Reporting, Report No. E1SFF9-15-0023-
0100187, dated March 12, 1990. (OIG-B-V1)
3. Review of the Fiscal Year 1988 Superfund Report to Congress,
Report No. E1SFF9-11-0015-0100227, dated March 28, 1990.
(OIG-C-V1)
4. Flash Audit Report - Disclosure of User Passwords on EPA's
IBM 3090 Computer Mainframes, dated May 7, 1990. (OIG-D-V1)
5. Report on Special Review - CERCLIS Post-Implementation
Evaluation, Report No. E1SFGO-15-0020-0400019, dated
June 14, 1990. (OIG-E-V1)
6. Report on Special Review - Hotline Complaint Concerning the
Office of Research and Development's Modeling and Monitoring
Tracking System, Report No. E1NBGO-15-0038-0400037, dated
September 24, 1990. (OIG-F-V1)
7. Flash Audit Report - Vulnerability of Sensitive Payroll and
Personnel Files on the National Computer Center (NCC) IBM
3090 Computer System, dated September 26, 1990. (OIG-G-V1)
8. Review of the Fiscal Year 1989 Superfund Report to Congress,
Report No. E1SFFO-11-0018-1100026, dated October 18, 1990.
(OIG-H-V1)
9. Integrated Financial Management System: Managing
Implementation of the New Accounting System, Report No.
E1AMFO-11-0029-1100153, dated March 29, 1991. (OIG-I-V1)
10. Significant Savings Possible by Increasing IBM 3090 Computer
Operations Efficiency, Report No. E1NMBO-15-0021-1100152,
dated March 29, 1991. (OIG-J-V1)
11. Improvements Needed in EPA's Resource Access Control
Facility (RACF) Security Software, Report No. ElNMBO-15-
0027-1100151, dated March 29, 1991. (OIG-K-V1)
12. Annual Superfund Report to the Congress for Fiscal 1990,
Report No. P1SFFO-11-0032-1100385, dated September 16, 1991.
(OIG-L-V1)
13. Special Review of EPA's Major Information Systems, Report
No. E1RMG1-15-0041-1400061, dated September 30, 1991.
(OIG-M-V1)
14. Special Review on Follow-up of CERCLIS Reporting and Post-
Implementation, Report No. E1SFG1-15-5001-2400027, dated
March 27, 1992. (OIG-N-V1)
15. CONTRACT MANAGEMENT: EPA Needs to Strengthen the Acquisition
Process for ADP Support Services Contracts, Report No.
E1NMF1-15-0032-2100300, dated March 31, 1992. (OIG-O-V2)
16. Flash Report on Mainframe Access Control Weaknesses at the
National Computer Center, dated April 17, 1992. (OIG-P-V2)
17. SOFTWARE INTEGRITY: EPA Needs to Strengthen General Controls
over System Software, Report No. E1NMF1-15-0055-2100591,
dated September 22, 1992. (OIG-Q-V2)
18. COMPUTER SYSTEMS INTEGRITY: EPA Must Fully Address
Longstanding Information Resources Management Problems,
Report No. E1NMF1-15-0032-2100641, dated September 28, 1992.
(OIG-R-V2)
19. Annual Superfund Report to the Congress for Fiscal 1991,
Report No. P1SFF1-11-0026-2100660, dated September 30, 1992.
(OIG-S-V2)
20. Special Review of EDP Internal Controls for Selected
Pesticide Revolving Funds' Information Systems, Report No.
E1EPP2-15-7001-3400043, dated March 31, 1993. (OIG-T-V2)
21. Special Review of Allegations Regarding Copyright
Infringement Within the Office of Communications, Education,
and Public Affairs, Report No. E6AMG3-15-0071-3400042, dated
March 31, 1993. (OIG-U-V2)
22. Consolidated Report Regarding Fiscal 1992 CERCLIS Data,
Report No. E1SFF3-11-0016-3100392, dated September 29,
1993. (OIG-V-V2)
GENERAL ACCOUNTING OFFICE REPORTS
23. SUPERFUND: A More Vigorous and Better Managed Enforcement
Program Is Needed, Report No. GAO/RCED-90-22, dated
December 14, 1989. (GAO-A-LIB)
24. HAZARDOUS WASTE: EPA's Generation and Management Data Need
Further Improvement, Report No. GAO/PEMD-90-3, dated
February 9, 1990. (GAO-K-V1)
25. FINANCIAL AUDIT: EPA's Financial Statements for Fiscal Years
1988 and 1987, Report No. GAO/AFMD-90-20, dated March 16,
1990. (GAO-B-LIB)
26. PUBLIC ACCESS: Two Case Studies of Federal Electronic
Dissemination, Report No. GAO/IMTEC-90-44BR, dated May 14,
1990. (GAO-B-V1)
27. GEOGRAPHIC INFORMATION SYSTEMS: Status at Selected Agencies,
Report No. GAO/IMTEC-90-74FS, dated August 1, 1990. (GAO-C-
V1)
28. DISINFECTANTS: EPA Lacks Assurance They Work, Report No.
GAO/RCED-90-139, dated August 30, 1990. (GAO-L-V2)
29. DISINFECTANTS: Concerns Over the Integrity of EPA's Data
Bases, Report No. GAO/RCED-90-232, dated September 21, 1990.
(GAO-C-LIB)
30. PESTICIDES: EPA Could Do More to Minimize Groundwater
Contamination, Report No. GAO/RCED-91-75, dated April 29,
1991. (GAO-D-LIB)
31. HAZARDOUS WASTE: Data Management Problems Delay EPA's
Assessment of Minimization Efforts, Report No. GAO/RCED-91-
131, dated June 13, 1991. (GAO-E-LIB)
32. ENVIRONMENTAL ENFORCEMENT: Penalties May Not Recover
Economic Benefits Gained by Violators, Report No. GAO/RCED-
91-166, dated June 17, 1991. (GAO-F-LIB)
33. TOXIC CHEMICALS: EPA's Toxic Release Inventory Is Useful but
Can Be Improved, Report No. GAO/RCED-91-121, dated June 27,
1991. (GAO-M-V2)
34. WASTE MINIMIZATION: EPA Data Are Severely Flawed, Report No.
GAO/PEMD-91-21, dated August 5, 1991. (GAO-G-LIB)
35. PESTICIDES: Better Data Can Improve the Usefulness of EPA's
Benefit Assessments, Report No. GAO/RCED-92-32, dated
December 31, 1991. (GAO-N-V2)
36. FOOD SAFETY: USDA Data Program Not Supporting Critical
Pesticide Decisions, Report No. GAO/IMTEC-92-11, dated
January 31, 1992. (GAO-H-LIB)
37. INFORMATION RESOURCES: Summary of Federal Agencies'
Information Resources Management Problems, Report No.
GAO/IMTEC-92-13FS, dated February 13, 1992. (GAO-D-V1)
38. ASBESTOS REMOVAL AND DISPOSAL: EPA Needs to Improve
Compliance With Its Regulations, Report No. GAO/RCED-92-83,
dated February 25, 1992. (GAO-I-LIB)
39. WASTE MINIMIZATION: Major Problems of Data Reliability and
Validity Identified, Report No. GAO/PEMD-92-16, dated
March 23, 1992. (GAO-J-LIB)
40. ENVIRONMENTAL ENFORCEMENT: EPA Needs a Better Strategy to
Manage Its Cross-Media Information, Report No. GAO/IMTEC-92-
14, dated April 2, 1992. (GAO-E-V1)
41. ENVIRONMENTAL ENFORCEMENT: Alternative Enforcement
Organizations for EPA, Report No. GAO/RCED-92-107, dated
April 14, 1992. (GAO-O-V2)
42. SUPERFUND: Problems With the Completeness and Consistency of
Site Cleanup Plans, Report No. GAO/RCED-92-138, dated
May 18, 1992. (GAO-P-V2)
43. WATER POLLUTION MONITORING: EPA's Permit Compliance System
Could Be Used More Effectively, Report No. GAO/IMTEC-92-
58BR, dated June 22, 1992. (GAO-F-V1)
44. Perceived Barriers to Effective Information Resources
Management: Results of GAO Panel Discussions, Report No.
GAO/IMTEC-92-67, dated September 1992. (GAO-G-V1)
45. PESTICIDES: Information Systems Improvements Essential for
EPA's Reregistration Efforts, Report No. GAO/IMTEC-93-5,
dated November 23, 1992. (GAO-H-V1)
46. Information Management and Technology Issues, Transition
Series, Report No. GAO/OCG-93-5TR, dated December 1992.
(GAO-I-V1)
47. Environmental Protection Issues, Transition Series, Report
No. GAO/OCG-93-16TR, dated December 1992. (GAO-J-V1)
48. Superfund Program Management, High-Risk Series, Report No.
GAO/HR-93-10, dated December 1992. (GAO-A-V1)
49. HAZARDOUS WASTE: Much Work Remains to Accelerate Facility
Cleanups, Report No. GAO/RCED-93-15, dated January 19, 1993.
(GAO-Q-V2)
50. ENVIRONMENTAL ENFORCEMENT: EPA Cannot Ensure the Accuracy of
Self-Reported Compliance Monitoring Data, Report No.
GAO/RCED-93-21, dated March 31, 1993. (GAO-R-V2)
51. SUPERFUND: EPA Actions Could Have Minimized Program
Management Costs, Report No. GAO/RCED-93-136, dated
June 1993. (GAO-S-V2)
EPA MANAGEMENT REPORTS
52. National Archives and Records Administration - Records
Management in the Environmental Protection Agency, dated
February 19, 1992. (EPA-A)
53. EPA IRM Compliance Strategy Task Group Report, dated
October 1, 1992. (EPA-B)
54. Analysis of Materiality of Weaknesses in the United States
Environmental Protection Agency's Information Resources
Management Program, dated December 14, 1992 (prepared by
Federal Sources, Inc.). (EPA-C)
55. Written comments by Paul Wohlleben, Acting Director, Office
of Information Resources Management, on strengthening EPA's
IRM program, dated May 19, 1993. (EPA-D)
56. Financial Management Status Report and Five-Year Plan, dated
July 30, 1993. (EPA-E)
CONGRESSIONAL TESTIMONY
57. SUPERFUND: Current Progress and Issues Needing Further
Attention, GAO testimony before the Subcommittee on
Oversight, Committee on Ways and Means, U.S. House of
Representatives, Document No. GAO/T-RCED-92-56, dated
June 11, 1992. (T-GAO-A)
58. SUPERFUND: Problems With the Completeness and Consistency of
Site Cleanup Plans, GAO testimony before the
Subcommittee on Investigations and Oversight, Committee on
Public Works and Transportation, U.S. House of
Representatives, Document No. GAO/T-RCED-92-70, dated
June 30, 1992. (T-GAO-B)
59. SUPERFUND: EPA Needs to Better Focus Cleanup Technology
Development, GAO testimony before the Subcommittee on
Investigations and Oversight, Committee on Public Works and
Transportation, U.S. House of Representatives, Document No.
GAO/T-RCED-92-92, dated September 15, 1992. (T-GAO-C)
60. Nomination of Carol M. Browner, Hearing before the Committee
on Environment and Public Works, U.S. Senate, on June 11,
1993. (T-EPA-D)
61. Why EPA Should be a Cabinet Department, Statement by
Administrator Carol Browner to the Committee on Governmental
Affairs, U.S. Senate, on February 18, 1993. (T-EPA-E)
62. Creation of a Department of the Environment, GAO testimony
before the Committee on Governmental Affairs, U.S. Senate,
Document No. GAO/T-RCED-93-6, dated February 18, 1993. (T-
GAO-F)
63. Testimony of Administrator Carol M. Browner before the
Subcommittee on Oversight and Investigations, Committee on
Energy and Commerce, U.S. House of Representatives, on
March 10, 1993. (T-EPA-G)
64. Testimony of EPA Inspector General John C. Martin before the
Subcommittee on Legislation and National Security, and the
Subcommittee on Environment, Energy and Natural Resources of
the Committee on Government Operations, U.S. House of
Representatives, on March 29, 1993. (T-OIG-H)
65. Management Issues Facing the Environmental Protection
Agency, GAO testimony before the Subcommittee on Legislation
and National Security, and the Subcommittee on Environment,
Energy and Natural Resources of the Committee on Government
Operations, U.S. House of Representatives, Document No.
GAO/T-RCED-93-26, dated March 29, 1993. (T-GAO-I)
66. ENVIRONMENTAL PROTECTION: EPA's Actions to Improve
Longstanding Information Management Weaknesses, GAO
testimony before the Subcommittee on Legislation and
National Security, and the Subcommittee on Environment,
Energy and Natural Resources of the Committee on Government
Operations, U.S. House of Representatives, Document No.
GAO/T-IMTEC-93-4, dated March 29, 1993. (T-GAO-J)
67. SUPERFUND: Progress, Problems, and Reauthorization Issues,
GAO testimony before the Subcommittee on Transportation and
Hazardous Materials, Committee on Energy and Commerce, U.S.
House of Representatives, Document No. GAO/T-RCED-93-27,
dated April 21, 1993. (T-GAO-K)
68. Testimony of EPA Inspector General John C. Martin before the
Subcommittee on Legislation and National Security, and the
Subcommittee on Environment, Energy and Natural Resources of
the Committee on Government Operations, U.S. House of
Representatives, on May 6, 1993. (T-OIG-L)
69. Creation of a Department of Environmental Protection, GAO
testimony before the Subcommittee on Legislation and
National Security and the Subcommittee on Environment,
Energy, and Natural Resources, Committee on Government
Operations, U.S. House of Representatives, Document No.
GAO/T-RCED-93-39, dated May 6, 1993. (T-GAO-M)
70. Testimony of Administrator Carol M. Browner before the
Subcommittee on Superfund, Recycling and Solid Waste
Management of the Committee on Environment and Public Works,
U.S. Senate, on May 12, 1993. (T-EPA-N)
71. Testimony of EPA Inspector General John C.. Martin before the
Subcommittee on Superfund, Recycling and Solid Waste
Management of the Committee on Environment and Public Works,
U.S. Senate, on June 10, 1993. (T-OIG-O)
72. SUPERFUND: EPA Action Could Have Minimized Program
Management Costs, GAO testimony before the Subcommittee on
Superfund, Recycling, and Solid Waste Management, Committee
on Environment and Public Works, U.S. Senate, Document No.
GAO/T-RCED-93-50, dated June 10, 1993. (T-GAO-P)
73. Testimony of EPA Inspector General John C. Martin before the
Committee on Governmental Affairs, U.S. Senate, on June 22,
1993. (T-OIG-Q)
74. SUPERFUND: Little Use Made of Techniques to Reduce Legal
Expenses, GAO testimony before the Subcommittee on
Transportation and Hazardous Materials, Committee on Energy
and Commerce, U.S. House of Representatives, Document No.
GAO/T-RCED-93-60, dated June 30, 1993. (T-GAO-R)
75. ENVIRONMENTAL PROTECTION: EPA Faces Formidable Challenges
Managing Water Quality Data, GAO testimony before the Sub-
committee on Clean Water, Fisheries and Wildlife, Committee
on Environment and Public Works, U.S. Senate, Document No.
GAO/T-AIMD-93-2, dated August 5, 1993. (T-GAO-S)
APPENDIX VIII
APPROACH AND METHODOLOGY
I. Project Plan
The initial draft of the project plan was developed by the
OIG team members in September 1993. OIRM and OSWER team members
were selected and joined the team in October 1993. When the
entire team was assembled, the project plan was reviewed, changed
and agreed to by all team members.
The project plan identified the review objectives,
background, scope and methodology, location of the project work,
preliminary project work, and milestones and associated time
frames. The following sections describe the approach and
methodology in more detail.
II. Initial Field Work
This phase was completed by OIG staff prior to assembling
the entire team.
A. Collection of Previous Reports
Copies of prior OIG and GAO reports addressing the IRM
and Superfund programs, and OIG and GAO position papers and
interview summaries for ongoing reviews were obtained and
reviewed. Copies of any relevant documents, reports, or
studies (e.g., task force reports, contractor studies,
Superfund reauthorization studies, testimonies, etc.)
directed at resolving problems with the Agency's IRM
program, especially within Superfund, were also obtained and
reviewed. Throughout the rest of the review, as additional
documentation was identified it was reviewed and the
references were added to the master list. A complete list
of documents used in performing this review is contained in
Appendix VII.
The scope of the review was limited to audit reports
and studies completed since fiscal 1990 so that the most
current issues would be addressed.
B. Consolidation and Analysis of Previous Report
Recommendations
OIG staff analyzed the assembled documents and listed
all the recommendations to identify root causes, problems,
issues and concerns, which were then prioritized and
categorized into subject areas. This effort resulted in an
initial listing of 7 categories. Five statements of problem
issues and concerns were developed, and 23 root causes were
identified and distributed among the 7 categories. From
this analysis a two-page summary entitled "Previously
Identified Root Causes for IRM Problems" was developed.
A spreadsheet of all previous recommendations was
developed based on the assembled documentation. These
previous recommendations were then grouped by the identified
root causes. The spreadsheet further identified the office
responsible for implementing each recommendation, the Agency
response, any OIG comments, a document reference and page,
additional or associated causes, and a priority (high,
medium, or low). The recommendations were initially
prioritized by consideration of the relationship to the root
cause(s), issue/concern, and mission; breadth of impact
(Agencywide vs. system-specific); relationship to Superfund;
whether it was a recurring issue (severity of the problem
area); currency of the report; complexity of the
recommendation; potential cost; and timing of scheduled
implementation.
III. Previous Recommendations' Priority Determination and
Refinement of Initial Field Work Results
Once the joint OIG/Agency review team was assembled, it
reviewed, revised and approved the documents prepared to that
point. Where possible, additional documents were identified and
obtained. For example, Superfund officials initiated a
significant data collection effort during the summer and related
activities were continuing in the Superfund program in
preparation for Congressional reauthorization. The review team
collected and analyzed relevant data for inclusion in the review
process.
A second review of the listing of problem issues/concerns
and root causes resulted in increasing the problem issues/
concerns statements to six and redistributing root causes among
five rather than seven categories. The two-page summary,
"Previously Identified Root Causes for IRM Problems," was
revised.
Finally, the priorities assigned in the spreadsheet of
recommendations were reviewed. A refined ranking was based on:
the relationship to root cause(s), issues/concerns, and mission;
the breadth of the recommendation's impact (addressing a broad
IRM problem area); the relationship to Superfund; whether the
issue is a single or recurring issue; and the currency of the
report.
IV. Development of Legislative Issues
After initial analysis and focus group sessions, high-level
IRM-related legislative issues were developed by the team. These
issues were relevant to Superfund reauthorization and/or
legislation to elevate the Agency to Departmental status.
Throughout the project, periodic briefings were given to key
Congressional staff, which included these issues.
V. Identification of and Notification to Participants
Headquarters and Regional IRM and program senior staff were
invited to participate in Total Quality Management (TQM)-style
focus groups and interviews to validate the problems, root
causes, and solutions identified by the review team analysis.
Our goal was to obtain input from a cross-section of the
Agency and focus as much as possible on Superfund, as
specifically identified in the Congressional request. Superfund
emphasis provided the framework for selection of Regional offices
and led to inclusion of a separate Headquarters interview with
Superfund system managers.
Selection of Regions was based on Superfund budget
information, the nature and variability of Superfund work in the
Regions, review team size, time and travel constraints, and
participant impact considerations. Three Regional offices were
selected for focus group sessions: Region 2 (New York, NY),
Region 4 (Atlanta, GA), and Region 9 (San Francisco, CA).
Headquarters organizational representation focused on
obtaining a cross-section of the Agency programs: (1) SIRMOs from
all program offices, including selected systems managers
recommended by the SIRMOs, were identified to take part in a
focus group session to provide feedback from a programmatic
perspective; (2) inclusion of OPPE was designed to address
performance measurement and Agency strategic planning; (3) a
"Data Integration" session was designed to obtain feedback on
specific data integration issues; (4) the Systems Development
Center (SDC) session was selected for feedback on system
development life cycle issues related to major information
systems, with a focus on the Superfund program area; and (5) a
separate teleconference with Air program officials was held
because the AIRS major system program component is located in
Research Triangle Park, NC.
A. Identification of Participants
Identification of individual participants was based on
the criteria of obtaining a management cross-section of
programs, focusing on major IRM initiatives and automated
systems, and talking to as many SIRMOs as possible. The
overriding constraints were keeping focus group sessions to
a workable size of 12 to 15 people, and keeping interviews
to 6 people or less (optimally 2-3). These group sizes were
suggested optimums based on individual team member
experience and discussions with experienced focus group
facilitators.
All SIRMOs in the selected Regions were interviewed
separately to reduce impacts on their time. To obtain
management perspective and programmatic overview, the focus
groups were comprised of Regional Division Directors/Deputy
Directors and Branch Chiefs. In addition, the IRM Branch
Chiefs and other knowledgeable staff from each "Region were
invited when possible. Participation was weighted toward
the Superfund program.
All Headquarters SIRMOs were invited to participate in
a focus group, along with a major information system manager
from their office. However, the SIRMOs from the Office of
Enforcement participated in a separate interview session,
and the SIRMO from the Office of Air and Radiation attended
a teleconference.
OPPE participation included individuals involved in
performance measurement and the STARS system. The "Data
Integration" session participants were based on leadership
in the two largest data integration initiatives,
Envirofacts/Gateway (OIRM) and IDEA (OE). The Superfund
session participants were selected from CERCLIS system/
program managers. Finally, the SDC interview participants
were selected from both EPA and the contractor.
B. Notification and Refinement of Participants
In advance of the meetings, each participant was
notified in writing and received a standard information
package, including the summary of previously identified root
causes for IRM problems, to help them prepare to
participate. Each Assistant Regional Administrator (ARA)
(who functions as the Regional SIRMO), Headquarters SIRMO,
and "special group" participant was contacted by the project
manager or a team member to confirm participation. We
encouraged the ARAs to review the participant list to ensure
the participants were the best representatives of the
Region's IRM and Superfund efforts and to replace invitees
where appropriate.
VI. Conduct of Interviews and Focus Groups and Verification of
Feedback
A. Planning and Preparation
Team member participation in focus sessions,
interviews, and teleconferences was based on team size and
individual team member schedules. Each team member was
assigned to a "sub-team" consisting of one OIG staff member and one
Agency staff member. Each sub-team was responsible for
setting up and attending one Regional focus group.
Attendance at the Headquarters focus group and interviews
was determined primarily by availability during the
scheduled time (some sessions were held concurrently) and
expertise in the subject matter. At least two team members
attended each session.
Team members attended each focus group session to
provide introduction and background, answer questions, and
clarify and record results. During the interviews and
teleconferences, team members also asked questions to ensure
all areas of concern were addressed.
The focus group sessions were split into two half-day
sessions, based on discussions with experienced facilitators
who indicated that was the minimum time necessary to arrive
at meaningful results while minimizing impacts on management
resources.
The list of "Previously Identified Root Causes for IRM
Problems" provided guidance to the focus group facilitators
as a basis for obtaining feedback. The sessions were to
verify the problem areas and root causes, and allowed the
participants to reject them and articulate their own
perceived problems and associated root causes. Finally, the
participants were asked to identify solutions for the root
causes. This approach was discussed with each facilitator
and was used throughout the focus group sessions.
B. Conduct of Focus Groups, Interviews and Teleconferences
Review team members and facilitators discussed and
planned all sessions in advance. All sessions used the
notification package summarizing previously identified root
causes for IRM problems as the basis for obtaining feedback.
The focus group sessions at the Regional offices and
Headquarters were moderately structured. Although the
sessions were conducted under the same framework, some
flexibility was allowed based on determinations of whether
the sessions were progressing enough to elicit sufficient
feedback. One focus group had the benefit of two
facilitators, which allowed use of smaller groups to address
more root causes and solutions. In that focus group, the
results of the small groups were always discussed and
verified in the larger group. The other focus groups had
one facilitator and the process was conducted in one large
group.
The first focus group session was evaluated both during
the session and after its conclusion to provide feedback on
positive and negative aspects of the process for subsequent
facilitated sessions. All focus group sessions were
evaluated throughout, to continually improve the process and
ensure that sufficient feedback was obtained. Although
minor details of methods may have varied between facilitated
sessions (based on facilitators' individual experiences and
training), all sessions were conducted in a generally
consistent framework as follows.
After discussing the previously identified root
causes, each of the problems was either verified as relevant
to the participants or rejected. The participants were then
invited to identify additional problems. These problems
were prioritized by multi-level voting. There was usually a
distinct break point in voting scores that determined the
highest priority problems.
Addressing the problems in priority order, the
participants identified root causes for each. Once all the
root causes were identified, they were likewise prioritized
by multi-level voting. The highest priority root causes
were further discussed to determine solutions. The choice
of where to break the list of high priority root causes was
discussed between the review team members and the
facilitator(s) to determine what was achievable in the
remaining time. At the end of the session the results were
summarized.
Interviews and teleconferences were shorter and
consisted of a more free-form flow and expression of
problems, root causes and solutions. Team members
functioned to clarify statements, ask questions to maintain
the pace of the interview, and address specific areas of
concern. Although useful feedback was obtained, this method
made analysis of results more difficult, and relative
priorities of problems, root causes and associated solutions
could not be determined.
C. Verification of Feedback
All session results were typed and returned to the
individual participants for verification. Participants were
given at least one week to respond with additions or
corrections. They were not required to respond if no
additions or errors were noted.
VII. Summary and Synthesis of Results
The review team evaluated the results of the focus groups.
Comprehensive assessments of the previously identified root cause
subject area headings were completed, and all results were
organized into additional or revised subject areas as identified
in the focus groups. Also, the analyses determined whether these
subject areas were valid and logical for categorizing problems
and solutions. Additional subject areas were identified, such as
the area of data management.
The logical progression of the analysis led to consolidating
the problems, root causes, and solutions statements from all
sessions. Once consolidated statements were developed, the
problem statements were matched to root causes.
The consolidated lists of solution statements were ranked
and placed into clusters of related solutions. The review team
ensured consistency and integrity in the report by tracing back
all the solution statements to their respective root causes, and
the root causes to their respective problem areas. In this
manner, the team ensured that all root causes were addressed by
the solutions, and that the solutions were fully responsive to
the root causes. As a final analysis, the review team identified
the solutions that cut across multiple problem areas, and used
this analysis to assist in refining the presentation of the
recommendations in the report.
VIII.Follow-up on Previous Recommendations
Using the priority list of previous recommendations, members
of the Agency's IRM community were asked to provide status
updates for the high-priority recommendations. These members of
the IRM community were identified based on the responsible
offices for each recommendation, identification in automated
reporting and tracking systems, and the review team's subjective
judgment of which individuals within the offices would have
direct awareness of the current implementation status.
Existing audit reporting and tracking systems were used
where possible to generate the most current reports, and the
scope of the request for status information was kept tightly
focused. When automated tracking reports were not sufficient or
had not been recently updated, the respondents were contacted and
asked to provide:
r, STATUS (is the recommendation completed, in progress, or
not started?),
• DATE (date completed or target date for completion), and
• COMMENTS (a few sentences describing the effort and/or
upcoming milestones).
Because of the breadth of prior recommendations, the request
for status information went to many organizations. These
included OPPTS, OSWER, OE, Office of Acquisition Management,
Office of the Comptroller, NDPD, and all divisions and staffs of
OIRM. In most cases, electronic mail was used to convey the
initial request, with follow-up for clarification by phone, fax,
or face-to-face discussion.
The review team formatted all responses similarly, and
arranged them into the matrix contained in Appendix IX. Where
timing and review team resources allowed, there was additional
follow-up to obtain missing dates and to resolve any confusion
about particular comments. Finally, responses were sent back to
the respondents for verification.
APPENDIX IX
IMPLEMENTATION STATUS OF PRIOR
HIGH-PRIORITY RECOMMENDATIONS
BACKGROUND: This appendix contains information gathered from
selected members of the Agency's IRM community, chosen because of
their personal, hands-on knowledge of EPA's responses to prior
audit recommendations. The responses are based on the official
status of corrective actions, as tracked in various management
audit tracking systems, and include additional details obtained
directly from those who implemented the recommendations. The
responses tend to be more detailed and specific than those
available through the formal audit implementation tracking
channels.
The recommendations highlighted in this appendix come from a
variety of OIG, GAO, and Agency reports issued since 1990. These
high-priority recommendations are a small subset of all the
recommendations contained in the documents listed in Appendix VII.
These particular recommendations were prioritized for followup
because the review team judged them to be especially important.
Further details about how the review team established the relative
priorities of prior recommendations are provided in Appendix VIII.
Recommendation: Notify the user community of the new NDPD IBM Mainframe Security Policy, which specifically addresses generic User-IDs (SYSTEM), shared User-IDs, User Support authentication issues, and several production control security issues.
Responsible: AA-OARM. Reference: OIG-A-V1/3. Status: Completed. Date: Various.
Comments: NDPD has published user memos, brochures, and other documents describing the NDPD security policy, RACF, etc.

Recommendation: Source program and load modules will reside in a centralized library.
Responsible: AA-OSWER. Reference: OIG-B-V1/14. Status: Completed. Date: Dec-89.
Comments: OSWER has established the reports librarian function and revised the reports development procedures. All five of the following audit recommendations have been incorporated into the reports development procedures.

Recommendation: Signoff from the EPA program office (report owner) will be required for all report specifications, reports library documentation, and test/sample reports.
Responsible: AA-OSWER. Reference: OIG-B-V1/19. Status: Completed. Date: Dec-89.
Comments: As above.

Recommendation: Signoff from the EPA program office (report owner) will be required for all report modifications prior to reinstatement to the production reports menu.
Responsible: AA-OSWER. Reference: OIG-B-V1/14. Status: Completed. Date: Dec-89.
Comments: As above.

Recommendation: Any changes to source code will be recorded within the program in the form of comments. Additionally, a report change log must be updated each time a report is modified.
Responsible: AA-OSWER. Reference: OIG-B-V1/14. Status: Completed. Date: Dec-89.
Comments: As above.

Recommendation: When a change is made to one program, report developers will consult the reports librarian to see if related or affected programs need to be changed.
Responsible: AA-OSWER. Reference: OIG-B-V1/14. Status: Completed. Date: Dec-89.
Comments: As above.

Recommendation: OEM/USDS will review reports usage analysis and solicit user comments to identify all reports critical to end-of-year reporting and FY90 planning. These reports will be given highest priority for review and correction. As problems are identified, users will be informed.
Responsible: AA-OSWER. Reference: OIG-B-V1/17. Status: Completed. Date: Dec-89.
Comments: This was completed by establishing the reports librarian.

Recommendation: All reports not identified as critical to end-of-year reporting or FY90 planning will be removed from the CERCLIS National Reports Menu and made available only through a special menu on the production system until such time as each report can be verified, tested, and released onto the National Reports Menu. As part of this complete audit of CERCLIS reports, reports will also be identified for deletion or combination.
Responsible: AA-OSWER. Reference: OIG-B-V1/17. Status: Completed. Date: Dec-89.
Comments: This was completed by establishing the reports librarian.
NOTE: See Appendix VII for full citation of each reference.
Recommendation: Activate the RACF option PROTECTALL.
Responsible: D-NDPD. Reference: OIG-D-V1/2. Status: Completed. Date: Dec-93.
Comments: The RACF PROTECTALL option was implemented in December 1993.

Recommendation: Require that a complete assessment of CERCLIS software be performed as soon as feasible.
Responsible: AA-OSWER. Reference: OIG-E-V1/8. Status: Completed. Date: Jun-90.
Comments: There was a joint OIG/OSWER/OERR decision not to perform an assessment of CERCLIS while System 2000 was the DBMS, since the Agency has made a decision to move all S2K users to new platforms and assessment of CERCLIS in the current environment was a moot point.

Recommendation: Require that evaluations be made of the four areas discussed herein: data management, change controls, data base integrity, and security.
Responsible: AA-OSWER. Reference: OIG-E-V1/6. Status: Completed. Date: Jun-90.
Comments: As above.

Recommendation: Require the Director of the OSWER Information Management Staff to include the requirement for independent testing and verification procedures in the performance of system evaluations.
Responsible: AA-OSWER. Reference: OIG-E-V1/6. Status: Completed. Date: Dec-89.
Comments: OSWER/IM has established an ongoing delivery order for IV&V using the MOSES contract, which provides for independent validation and verification of all systems upon request.

Recommendation: Require that RACF training be developed and made mandatory for system and account managers.
Responsible: D-OIRM. Reference: OIG-G-V1/3. Status: Completed. Date: Dec-93.
Comments: NDPD account managers were trained by 12/92; NDPD then trained other account managers by 12/93. In addition, all RACF security administrators received mandatory training.

Recommendation: Immediately initiate a review of PAK and TAPP to determine the appropriate access levels for individual users and eliminate access to those users that do not have an absolute need.
Responsible: D-OIRM. Reference: OIG-G-V1/3. Status: Completed. Date: Nov-93.
Comments: Review of profiles and access levels is complete, with ongoing reassessment to eliminate access for users who do not have an absolute need.

Recommendation: Promulgate formal EPA guidance regarding the system decision process during the development and implementation of large information systems.
Responsible: AA-OARM. Reference: OIG-I-V1/18. Status: In progress. Date: Jan-94.
Comments: Revised SLCM policy drafted 6/93; review of draft in progress; green border review forthcoming; procedures and standards to be developed. Related changes have also been made to the draft revised charter for the IRM Steering Committee.

Recommendation: Involve all user groups in developing and implementing the Agency's financial management system, and document that their needs and priorities were considered in deciding the direction and plans for the system.
Responsible: AA-OARM. Reference: OIG-I-V1/32. Status: Completed. Date: Sep-90.
Comments: Two formal groups exist to manage IFMS. There is an executive management group and a systems management group. Both ensure that user needs are documented and addressed in deciding the directions for the system.

Recommendation: Establish procedures to coordinate the update of the Mainframe capacity report, master facility plan, and budget tracking system report to continually reflect current workload trends and revised requirements in all three documents.
Responsible: D-OARM/RTP. Reference: OIG-J-V1/10. Status: Completed. Date: Sep-92.
Comments: NDPD updates the reports on a quarterly basis at a minimum. The Mainframe Capacity Report information is used for the Master Facility Plan and the Budget tracking system.
Recommendation: Establish annual reviews of user-developed applications to determine which applications should be updated and, based on the results of those reviews, take appropriate action.
Responsible: D-OIRM. Reference: OIG-J-V1/13. Status: Closed. Date: Feb-92.
Comments: This recommendation falls outside the scope of NDPD's mission. The audit was closed by OIG on 2/2/92. Periodic reviews of the Agency's applications are required per Agency Directive 2100.

Recommendation: Perform a formal cost-benefit analysis to determine which major applications should be rewritten to increase their performance and reduce overall costs and, based on the review results, take appropriate action.
Responsible: D-OIRM. Reference: OIG-J-V1/13. Status: Closed. Date: Feb-92.
Comments: This recommendation falls outside the scope of NDPD's mission. The audit was closed by OIG on 2/2/92.

Recommendation: Establish production control configuration management and quality assurance procedures for the user community, which includes the incorporation of a limited access central data set for all jobs placed in production.
Responsible: D-OARM/RTP. Reference: OIG-J-V1/28. Status: Completed. Date: Jan-91.
Comments: Implementation of the job scheduling package JOBTRACK satisfied this recommendation.

Recommendation: Charter the task force to conduct an analysis of data file and table creation, storage, and retention to determine if efficiencies can be gained by eliminating processing steps.
Responsible: D-OARM/RTP. Reference: OIG-J-V1/26. Status: Closed. Date: Jan-91.
Comments: The objectives of this recommendation will be achieved through implementation of the job scheduling package. During this process, the JCL will be reviewed and any processing inefficiencies can be addressed.

Recommendation: Provide guidance and training to account managers regarding their roles and responsibilities for controlling the issuance of access authorities.
Responsible: D-OIRM, D-OARM/RTP. Reference: OIG-K-V1/12. Status: Completed. Date: Nov-93.
Comments: All RSAs have been trained regarding these issues. The RACF Security Administrator's Guide was completed 11/92. All NDPD and other account managers have been trained.

Recommendation: Determine the IBM mainframe accounts that process highly sensitive data and develop requirements for protecting resources under those accounts.
Responsible: COMPTROLLER, D-OIRM, D-OARM/RTP. Reference: OIG-K-V1/19. Status: Completed. Date: Apr-93.
Comments: NDPD has trained account managers how to protect highly sensitive accounts with RACF. However, NDPD does not have the resources to analyze each application running on the NCC mainframes. Directive 2195 establishes the requirements for protecting automated information resources.

Recommendation: Give account managers the primary responsibility to control issuance of access authorities (CREATE, GRPACC, etc.) for users assigned to their accounts.
Responsible: D-OIRM, D-OARM/RTP. Reference: OIG-K-V1/12. Status: Completed. Date: Dec-93.
Comments: Through the decentralization project, NDPD is training account managers to assume the recommended responsibilities.

Recommendation: Reduce and maintain the number of users with SPECIAL, OPERATIONS, AUDITOR, and ALTER access authorities to an absolute minimum.
Responsible: D-OIRM, D-OARM/RTP. Reference: OIG-K-V1/12. Status: Completed. Date: May-93.
Comments: The number of users with high level system RACF authorities has been reduced to the minimum level required to maintain security and service level goals.

Recommendation: Activate the audit trail features of the RACF control mechanism for highly sensitive accounts.
Responsible: D-OIRM, D-OARM/RTP. Reference: OIG-K-V1/12. Status: Completed. Date: Dec-93.
Comments: All RACF audit trail features have been activated for sensitive accounts.
Recommendation: Turn on the RACF OPERAUDIT feature to monitor activities of users assigned the SYSTEM OPERATIONS authority.
Responsible: D-OIRM, D-OARM/RTP. Reference: OIG-K-V1/12. Status: Completed. Date: Jun-92.
Comments: OPERAUDIT is on and is being monitored on a weekly basis.

Recommendation: Revoke the ALTER authority of those users assigned highly sensitive accounts who do not warrant this authority and maintain a minimum number of users with this authority.
Responsible: D-OIRM, D-OARM/RTP. Reference: OIG-K-V1/12. Status: Completed. Date: May-93.
Comments: The number of users with high level system RACF authorities has been reduced to the minimum level required to maintain security and service level goals.

Recommendation: Establish and implement a plan to phase in the RACF option PROTECTALL with milestone dates for completing the plan and all phases of the implementation.
Responsible: D-OARM/RTP. Reference: OIG-K-V1/18. Status: Completed. Date: Nov-93.
Comments: The Implementation Strategy for RACF Features was completed 6/91 and subsequently revised. PROTECTALL was fully implemented on 11/30/93.

Recommendation: Identify and delete account users in 'revoke' status who no longer warrant system access.
Responsible: D-OARM/RTP. Reference: OIG-K-V1/18. Status: Completed. Date: Jan-94.
Comments: RACF security administrators now receive reports identifying users in revoke status, and are responsible for USERID management.

Recommendation: Plan and implement a method of periodically reporting to account managers the status of users with RACF profiles assigned under the mainframe accounts for which they are responsible.
Responsible: D-OIRM, D-OARM/RTP. Reference: OIG-K-V1/18. Status: Completed. Date: Dec-93.
Comments: All account managers received RACF training. The training empowers the account managers to obtain RACF profiles as needed.

Recommendation: Determine what highly sensitive files in Agency financial systems and systems processing privacy data would obtain maximum security by utilizing the RACF audit trail and/or ERASE features.
Responsible: COMPTROLLER, D-OIRM, D-OARM/RTP. Reference: OIG-K-V1/18. Status: Completed. Date: Nov-93.
Comments: NDPD has trained account managers how to protect sensitive files. However, NDPD does not have the resources to analyze each mainframe application.

Recommendation: Develop a RACF implementation plan, security objectives, and quality assurance procedures. Include milestone dates for each of these components.
Responsible: D-OARM/RTP. Reference: OIG-K-V1/24. Status: Completed. Date: Jun-91.
Comments: The Implementation Strategy for RACF Features was completed on 6/91 and subsequently revised. The implementation plan and procedures are completed and are being implemented.

Recommendation: Enhance Agency software to provide for complete and accurate transfer and audit trail of data between the Contract Payment System and IFMS.
Responsible: AA-OARM. Reference: OIG-L-V1/8. Status: Completed. Date: Jun-91.
Comments: Data are transmitted from CPS to IFMS via an automated nightly cycle. Each morning, personnel review a reject report to address any transactions that did not successfully reach IFMS. On at least a monthly basis, an automated CPS/IFMS reconciliation report is run to verify account balances between CPS and IFMS.

Recommendation: Perform a software assessment to determine if CERCLIS can be altered to provide more flexibility for information retrieval.
Responsible: AA-OSWER, AA-OARM. Reference: OIG-N-V1/33. Status: Completed. Date: Mar-93.
Comments: The software assessment report was published by the Systems Development Center in March of 1993.
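The CPS-to-IFMS reconciliation described in the comments above can be illustrated with a short sketch. The account numbers and balances below are hypothetical stand-ins; the Agency's actual report was produced by an automated mainframe cycle, not Python.

```python
# Hypothetical sketch of a balance reconciliation between two systems
# (stand-ins for CPS and IFMS): report every account whose balances
# disagree, including accounts present on only one side.

cps_balances = {"68-01-7310": 125_000.00, "68-W9-0037": 40_250.50}
ifms_balances = {"68-01-7310": 125_000.00, "68-W9-0037": 40_000.50}

def reconcile(source, target):
    """Return {account: (source_balance, target_balance)} for mismatches;
    an account missing from one side appears with None for that side."""
    mismatches = {}
    for acct in source.keys() | target.keys():
        s, t = source.get(acct), target.get(acct)
        if s != t:
            mismatches[acct] = (s, t)
    return mismatches

report = reconcile(cps_balances, ifms_balances)
```

A non-empty result plays the role of the reject/reconciliation report the comment describes: each listed account needs follow-up before the two systems can be considered in agreement.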
Recommendation: Discontinue hard-coding of parameter values subject to change into the source code and replace them with a table that may be read in from a data file, unless it can be shown through a cost/benefit analysis that hard-coding is the more efficient and more cost-effective approach.
Responsible: AA-OSWER, AA-OARM. Reference: OIG-N-V1/33. Status: Completed. Date: Mar-93.
Comments: Reports using hard-coding of parameter values are being phased out at major revision steps of the system.

Recommendation: Establish procedures to include essential reasonableness, completeness, and edit checks in programs to preclude the reporting of erroneous data.
Responsible: AA-OSWER. Reference: OIG-N-V1/20. Status: Completed. Date: Feb-93.
Comments: This is addressed by the draft report writing manual, published in February of 1993. The manual is scheduled to be completed in final during FY94.

Recommendation: — Signoff upon the completion of report programming with a certification that a 3-way check and comprehensive documentation review has been completed.
Responsible: AA-OSWER. Reference: OIG-N-V1/27. Status: Completed. Date: Feb-93.
Comments: As above.
Recommendation: — Insert comments into the source code and modify the CERCLIS reports library to reflect changes when reports are examined for inclusion in the National Reports Library.
Responsible: AA-OSWER. Reference: OIG-N-V1/27. Status: Completed. Date: Feb-93.
Comments: As above.

Recommendation: — Perform a 3-way check of consistency between the Report Specification Form, the CERCLIS Reports Library, and the source code upon completion of report programming.
Responsible: AA-OSWER. Reference: OIG-N-V1/27. Status: Completed. Date: Feb-93.
Comments: As above.

Recommendation: Require IFMS, FMD, and CERCLIS officials to work together to establish accuracy in IFMS/CERCLIS related data.
Responsible: AA-OSWER, AA-OARM. Reference: OIG-N-V1/14. Status: Completed. Date: Sep-92.
Comments: IFMS, FMD, and CERCLIS representatives have worked together to improve data accuracy. An automated interface to export data from IFMS to CERCLIS has been created, though the interface is not fully operational.

Recommendation: Correct the specific deficiencies identified in this finding.
Responsible: AA-OSWER. Reference: OIG-N-V1/20. Status: Completed. Date: Sep-92.
Comments: As above.

Recommendation: Develop error reports to separately capture inaccurate/incomplete CERCLIS transactions for all reports in production.
Responsible: AA-OSWER. Reference: OIG-N-V1/20. Status: Completed. Date: Mar-93.
Comments: Production of error reports began in March of 1993.

Recommendation: Take appropriate actions to eliminate the specific deficiencies identified in the five reports discussed in the finding.
Responsible: AA-OSWER. Reference: OIG-N-V1/27. Status: Completed. Date: Sep-92.
Comments: The reports have all been corrected or archived.
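The recommendation above on discontinuing hard-coded parameter values can be illustrated with a brief sketch. The parameter names and values are hypothetical, and the original systems were mainframe report programs rather than Python; the point is only the technique of reading changeable values from a data file.

```python
# Hypothetical sketch of externalizing parameters: values subject to change
# are read from a small data file at run time instead of being hard-coded
# into the report source.

import csv
import io

# Stand-in for a parameters file maintained outside the source code.
PARAMS_FILE = """name,value
fiscal_year,1994
region_count,10
report_cutoff_day,15
"""

def load_params(text):
    """Parse name/value rows into a lookup table of integers."""
    return {row["name"]: int(row["value"])
            for row in csv.DictReader(io.StringIO(text))}

params = load_params(PARAMS_FILE)
# Changing the cutoff day now means editing the data file only, with no
# change to (or re-release of) the report program itself.
```

This is the trade-off the recommendation weighs: a table read at run time costs a little I/O but removes the need to modify source code every time a parameter changes.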
Recommendation: Develop written procedures and controls regarding report programming documentation requiring CERCLIS report owners to:
Responsible: AA-OSWER. Reference: OIG-N-V1/27. Status: Completed. Date: Mar-93.
Comments: This is addressed in the draft report writing manual, first published in February of 1992, and scheduled to be made final in FY94.

Recommendation: — Perform a comprehensive review of report programming evaluating adherence to selection criteria standards and documentation standards prescribed by the National Bureau of Standards guidance on software maintenance.
Responsible: AA-OSWER. Reference: OIG-N-V1/27. Status: Completed. Date: Sep-92.
Comments: This is also part of the draft report writing procedures manual.

Recommendation: Establish procedures to accumulate costs on ADP support services contracts for: (1) ADP equipment; (2) proprietary software; (3) maintenance services; (4) ADP services; and (5) ADP support services, to ensure that the Agency obtains required approvals from GSA.
Responsible: AA-OARM. Reference: OIG-O-V2/30. Status: Completed. Date: Jun-93.
Comments: OIRM has issued interim policy requiring the accumulation of costs by FIP resource categories as defined by the FIRMR. This requirement is also included in OIRM's draft FIP Resources Acquisition Manual.

Recommendation: Establish senior DOPO/technical manager positions with ADP technical skill requirements commensurate with the skill levels needed to technically monitor the development of ADP DOs and deliverables, or consider the use of independent evaluation contractors to perform the technical reviews of DOs.
Responsible: AA-OARM. Reference: OIG-O-V2/22. Status: Completed. Date: Oct-93.
Comments: For all national IRM contracts, OIRM performs technical evaluation of delivery orders containing significant IRM content or issues. Upon request, OIRM also provides DOPOs technical assistance in preparing DOs and evaluating contractor performance. Use of GSA zone contracts also ensures independent review of contract delivery orders.

Recommendation: Perform a study to determine the appropriate procurement office staffing to administer the current and proposed ADP support services contracts.
Responsible: AA-OARM. Reference: OIG-O-V2/22. Status: Completed. Date: Sep-93.
Comments: OAM conducted workload analyses in support of the FY95 budget submission. This analysis addressed current and projected resource needs for properly managing existing and projected ADP contracts for the Agency.

Recommendation: Add to the Agency's contract management program Agencywide mandatory and formal IRM standards, project management guidance, and SDLC requirements, to efficiently manage ADP support services contracts and information resources.
Responsible: AA-OARM. Reference: OIG-O-V2/22. Status: Completed. Date: Aug-91.
Comments: The IRM clause was added to the EPAAR in August of 1991. As of 8/92, all current contracts involving FIPs were modified to reflect the EPAAR IRM requirements. The APDS system puts the IRM clause automatically into all new solicitations for FIP resources as a mandatory clause. In addition, IRM training for OAM is currently in development by OIRM.

Recommendation: Separate the duties of Government employees who perform system operations and system security activities.
Responsible: AA-OARM. Reference: OIG-P-V2/7. Status: Completed. Date: Apr-92.
Comments: As of 4/92, the full-time NDPD Security Officer began reporting to the Director, NDPD, regarding all security matters.
Recommendation: Add RACF protection to the following: all CPS data sets; ADCR budget and obligation data; and all Mills program, budget, and budget reconciliation data.
Responsible: AA-OARM. Reference: OIG-P-V2/6. Status: Completed. Date: Nov-93.
Comments: All data sets running on the NCC IBM compatible mainframes are protected by RACF.

Recommendation: Based on the specific job responsibilities, reduce the number of users with: SPECIAL, OPERATIONS, and AUDITOR attributes; ALTER access to data sets controlling RACF; ALTER access to operating system data sets; ALTER access to IFMS and UFA; and GRPACC capabilities and CREATE authority in several groups.
Responsible: AA-OARM. Reference: OIG-P-V2/7. Status: Completed. Date: May-93.
Comments: The number of individuals with high level system RACF attributes has been reduced to a minimum.

Recommendation: Develop performance standards for system software maintenance, security administration, and DASD management, and require the contractor to develop and document specific procedures to meet the standards.
Responsible: AA-OARM. Reference: OIG-Q-V2/53. Status: Completed. Date: Jun-93.
Comments: NDPD developed an MVS procedures manual (6/93) and a DASD management plan (2/93) that cover the issues in this recommendation.

Recommendation: Based on specific individual job responsibilities, reduce the number of users with: a. SPECIAL, OPERATIONS, and AUDITOR attributes; b. ALTER access to data sets controlling RACF; c. ALTER access to operating system data sets; and d. GRPACC capabilities and CREATE authority in several groups.
Responsible: AA-OARM. Reference: OIG-Q-V2/45. Status: Completed. Date: May-93.
Comments: The number of individuals with high level system RACF attributes has been reduced to a minimum.

Recommendation: Develop information security policies, standards, and procedures for data protection, including: a. determining the level of protection required under RACF; b. developing RACF profiles which provide protection while maintaining adequate internal controls such as separation of duties; and c. developing and publishing standards of performance for information security, with specific attention to utilization of RACF to protect sensitive data on the IBM systems.
Responsible: AA-OARM. Reference: OIG-Q-V2/14. Status: Completed. Dates: Nov-92 through Feb-93.
Comments: NDPD created the Application RACF Security Administrator's Guide, which addresses many of these topics (Nov-92). NDPD provides comprehensive recommendations on these topics as part of training other organizations to become RACF aware; Agency directives 2195 and 2197 are also responsive to these recommendations, and training for almost all of the Agency accounts was completed by 11/93 (Nov-93). NDPD has complied with item c through the NDPD Operational Directives (Feb-93).
Recommendation: Strengthen NCC security policy to define: a. controls over sensitive application data; b. controls over access to operating system data sets, to include requirements for minimum access authority; c. controls over powerful RACF privileges; and d. division of responsibilities for RACF administration.
Responsible: AA-OARM. Reference: OIG-Q-V2/15. Status: Completed. Date: Feb-93.
Comments: The NDPD security policy has been updated to reflect the OIG recommendations.

Recommendation: Update written policies, standards, and procedures for systems software activities to include proper APF administration and SVC installation. The policies should include the requirement to preserve IBM system integrity, and follow vendor guidelines for maintaining system integrity.
Responsible: AA-OARM. Reference: OIG-Q-V2/24. Status: Completed. Date: Jun-93.
Comments: NDPD has developed the MVS Standards and Procedures Manual, which addresses these issues.

Recommendation: Issue a requirement that all support services contracts incorporate performance requirements that meet the intent of the OMB A-76 performance-oriented work statement and quality assurance surveillance plan.
Responsible: AA-OARM. Reference: OIG-Q-V2/53. Status: In progress. Date: Jul-93.
Comments: The TOSS Task Force reviewed all TOSS SOWs to ensure the use of performance-oriented requirements. This requirement is also spelled out in the FIP Resources Acquisition Manual, to be issued as final in 2/94. NDPD requires that all support services contracts comply with Circular A-76 requirements, including those regarding performance-oriented work statements and QA surveillance.

Recommendation: Separate the duties of Government employees who perform system operations and system security activities.
Responsible: AA-OARM. Reference: OIG-Q-V2/14. Status: Completed. Date: Apr-92.
Comments: As of 4/92, the full-time NDPD Security Officer began reporting to the Director, NDPD, regarding all security matters.

Recommendation: Establish a high level IRM Steering Committee which acts as a decision-making body for significant IRM activities, headed by the DA or DSO, with members being senior executives with authority to commit offices to action.
Responsible: DA-EPA. Reference: OIG-R-V2/19. Status: Completed. Date: Dec-92.
Comments: The Steering Committee charter has been revised to reflect AA/OARM as chairman. Further refinements to the charter and Committee membership are in progress. AAs are scheduled to meet with the AA/OARM in January '94.

Recommendation: Formally designate a DSO in accordance with the PRA at the AA level.
Responsible: A-EPA. Reference: OIG-R-V2/19. Status: Completed. Date: Dec-92.
Comments: AA/OARM was formally designated as the Agency's Senior Official for IRM in Delegation 1-84.

Recommendation: Delegate the authority and responsibilities for all the IRM functions to the DSO in accordance with the PRA, and clearly define any redelegations.
Responsible: A-EPA. Reference: OIG-R-V2/19. Status: Completed. Date: Dec-92.
Comments: Delegation 1-84 redelegates specific portions of the IRM program to OPPE.

Recommendation: Establish a clear chain of command under the DSO for all IRM activities, especially between OIRM and NDPD.
Responsible: DA-EPA. Reference: OIG-R-V2/19. Status: In progress. Date: TBD.
Comments: Further revisions to Delegation 1-84 have been drafted to establish a clear chain of redelegations to the office level.
Recommendation: Develop a comprehensive Agencywide oversight and enforcement program which focuses on software quality and the system development life cycle and which at a minimum should include a provision of training on information system quality assurance.
Responsible: AA-OARM. Reference: OIG-R-V2/58. Status: In progress. Date: Jul-93.
Comments: The MOSES program has presented briefings on the System Devel. Ctr. product development process, which describe the product assurance process during development of deliverables. A 3-day class on managing software requirements and the role of product assurance during a software project has been presented twice. Another 3-day class, more focused on product assurance, is being considered for presentation during FY94. These offerings are being considered for training components of a comprehensive Agencywide oversight and enforcement program. OIRM has also established the oversight and QA function in the Oversight and Compliance Support Staff in OIRM/MES, which ensures new acquisitions comply with EPA policies.

Recommendation: Establish a formal, Agencywide, integrated planning process for the direction, coordination, and control of IRM activities and resources that will provide management involvement and accountability at all levels, which at a minimum should include the:
Responsible: DA-EPA. Reference: OIG-R-V2/46. Status: In progress. Date: Jan-94.
Comments: The IRM Planning Group was established to develop these functions. In addition, at a Jan. '94 meeting of the AAs, they agreed to work directly on an IRM Strategic Plan for the Agency.

Recommendation: a. Development and implementation of an action plan to accomplish Agencywide mission-based bottom-up IRM planning.
Responsible: DA-EPA. Reference: OIG-R-V2/46. Status: Completed. Date: Apr-93.
Comments: The action plan was completed in April of 1993. It is currently being implemented.

Recommendation: b. Establishment of an evaluation and review process for program offices' IRM mission-based plans to ensure the plans support a consolidated Agencywide mission-based IRM plan.
Responsible: DA-EPA. Reference: OIG-R-V2/46. Status: In progress. Date: Nov-93.
Comments: Formal procedures are under development. Responses to the 1993 integrated data call for IRM planning and budgeting are currently being reviewed and will be part of the Agency IRM Strategic Planning process.

Recommendation: c. Integration of the responsibilities for IRM planning and budgeting.
Responsible: DA-EPA. Reference: OIG-R-V2/46. Status: In progress. Date: Dec-92.
Comments: The revised charter of the Executive Steering Committee for IRM integrates these responsibilities at a high level, as did formal designation in December 1992 of the AA/OARM as EPA's Senior Official for IRM.

Recommendation: d. Modification of the methodology for IRM planning to include clear policies and procedures for linkage of the planning and budgeting processes.
Responsible: DA-EPA. Reference: OIG-R-V2/46. Status: In progress. Date: May-93.
Comments: The formal procedures are currently in development. The Agency's first integrated IRM planning and budget data call was issued in 1993, and is currently being analyzed.
Recommendation: Develop a comprehensive Agencywide oversight and enforcement program which focuses on software quality and the SDLC and which at a minimum should include the:
Responsible: AA-OARM. Reference: OIG-R-V2/57. Status: In progress. Date: TBD.

Recommendation: Development and implementation of an action plan to accomplish Agencywide quality assurance for major information systems.
Responsible: AA-OARM. Reference: OIG-R-V2/58. Status: In progress. Date: TBD.
Comments: The Systems Development Center was created to emphasize the development of high quality information systems. Under EPA direction, the MOSES contractor has developed a quality assurance program and has created a separate QA group within the SDC, reporting directly to the program manager. Product assurance procedures are being developed, documented, and practiced, and will be refined over the life of the MOSES contract. It is OIRM's intention that such procedures will be broadened for adaptation to other EPA IRM contracts.

Recommendation: Establishment of an oversight and enforcement function to be responsible for the overall information systems quality assurance program, to include independently reviewing and evaluating major information systems.
Responsible: AA-OARM. Reference: OIG-R-V2/58. Status: Completed. Date: Jul-93.
Comments: The oversight and quality assurance function has been established in OIRM/MES and implemented by the Oversight and Compliance Support Staff. This Staff ensures new acquisitions comply with EPA policies for information systems design, development, and maintenance. In addition, OIRM assures compliance and adherence to appropriate lifecycle practices for projects run under the MOSES contract for national systems development. Finally, starting in FY94, OIRM will initiate systems reviews to monitor compliance with Agency and Federal IRM policies and standards.

Recommendation: Formalize and prioritize a plan for developing and revising policies, standards, and procedures which addresses the issues presented in this finding, which also includes the following actions:
Responsible: DA-EPA. Reference: OIG-R-V2/35. Status: Completed. Date: Jun-93.
Comments: A comprehensive IRM policy workplan has been developed and is being used as the basis for prioritizing revision of IRM policies, standards, and procedures.

Recommendation: Review existing IRM guidance documents and incorporate them as necessary into IRM policies, standards, and procedures under Directive 1315.
Responsible: DA-EPA. Reference: OIG-R-V2/35. Status: Completed. Date: Jun-93.
Comments: OIRM coordinated with NDPD and SIRMOs to review existing guidance documents and formalize a plan for developing and revising policies, standards, and procedures. The plan identifies subject matter experts and sets priorities.

Recommendation: Immediately issue temporary directives for informal guidance and standards as set forth in Directive 1315 on critical IRM guidance documents until green border review can be performed.
Responsible: DA-EPA. Reference: OIG-R-V2/35. Status: Completed. Date: Apr-93.
Comments: OIRM developed a list of candidate policy documents to be issued as temporary directives, had this list reviewed by the Agency IRM community, and had a set of 8 critical IRM policy documents approved as formal directives on 4/30/93.

Recommendation: Develop additional comprehensive, formal, authoritative IRM policies, standards, and procedures which would cover all minimum Federal and EPA IRM requirements.
Responsible: DA-EPA. Reference: OIG-R-V2/35. Status: Completed. Date: Jun-93.
Comments: Two additional policies recently completed include a policy on access to computer equipment by the disabled, and a policy on use of electronic signatures within EPA. Additional policies have been drafted on topics such as telecommunications, background investigations for IRM contractors, HW/SW standardization, and systems lifecycle management.
[1] NOTE: See Appendix VII for full citation of reference.
-------
Audit Recommendation
Responsible
Updating and establishment of clear policies, standards, AA-OARM
procedures, and guidelines on information systems
quality assurance and incorporation of them into
formal EPA directives.
Establish and maintain a central repository for DA-EPA
IRM policies, standards, procedures, and guidance.
Include segments on managing electronic records A-EPA
in Agency records management training sessions. In
addition, develop presentations for IRM officials
and program staff responsible for electronic records
systems that inform them of their responsibilities in
relation to electronic records, particularly in terms
of building in maintenance and disposition at the
system development phase, and of creating and
maintaining appropriate documentation.
Revise and expand the guidance for creation, A-EPA
maintenance and use, and disposition of electronic
records already in place to make it more complete
and current. In particular, incorporate, as appropriate
and in a form applicable to EPA, the guidance in
NARA's regulations on electronic recordkeeping,
found at 36 CFR 1234, and in NARA's handbook
entitled "Managing Electronic Records." Ensure
that records maintained on all types of systems are
included. As part of the EPA directive system, this
guidance should be disseminated to all records managers
and administrators and IRM staff, both in headquarters
and the field. Clear assignment of responsibility
for developing and maintaining documentation for
electronic records should be included in the directive.
Reference/Page [1]
OIG-R-V2/58
Action
Status
Date Comments
OIG-R-V2/35
EPA-A/52
EPA-A/51
In progress Jun-94 Draft of system lifecycle mgmt. policy is in pre-green border
review. Procedures and guidelines will be developed to
complement the policy. The current system design and
development guidance was issued as temporary Agency
directive 2182 in April 1993.
Completed Feb-93 Copies of documents in the IRM policy document inventory
are maintained in OIRM/IMSD. OA/MOD retains the central
repository of all official Agency directives and is currently
in the process of examining all directives as part of the
administration's streamlining initiative.
Completed Dec-93 An electronic records segment has been included in the
standard records management training given to program staff.
We made many presentations to IRM branch chiefs, SIRMOs,
etc., on electronic records management and developed records
disposition schedules for most information systems listed in the
Agency's information systems inventory. We featured elec.
records mgmt. concerns in the 4/92 issue of INFOACCESS,
and included other articles on elec. records mgmt. policy in
other issues. We plan to develop a 2-hour training session for
all records managers on electronic records for FY95.
Completed Dec-93 We included revised wording on managing electronic records
in the revised version of Ch. 10 of the IRM Manual (records
management) scheduled to go through green border review
in FY94. We completed a study of existing policy, procedures,
and guidance on electronic records that indicates areas where
our policies need improvement. We plan to revise Ch. 8 of
the Agency Records Management Manual (Directive 2160)
during FY94, and prepare it for informal review during FY95.
We distributed copies of NARA's "Managing Electronic
Records" to all records officers and SIRMOs.
-------
Audit Recommendation
Develop a process to ensure IRM considerations get
addressed during development of proposed regulations
and information collection requests (ICRs) by
engaging OIRM participation in the Agency Steering
Committee and associated regulatory workgroups.
An integral component of the review should focus on
whether the collections are in compliance with
established Agency IRM policy and standards.
Responsible
D-OIRM
Reference/Page [1]
EPAB/18
Action
Status
Date Comments
In progress TBD
Establish more formally designated roles and
responsibilities for all members of the Agency IRM
community, reflected in performance standards.
Require AA/RA-level reviews of all office/regional
IRM organizational structures and require formal
designation of authority and responsibility for all
positions in these structures.
Develop a model of a recommended IRM
organizational unit in program offices.
Revisit and redefine SIRMO concept, PDs, and
performance standards in light of the SIRMO's
threefold role: information conduit, approval
authority, and technical resource.
D-OIRM
EPA-B/9
under
consideration
TBD
D-OIRM
EPA-B/9
under
consideration
D-OIRM
EPA-B/9
In progress
D-OIRM
EPA-B/9
In progress
OPPE is leading an effort to revise the regulatory development
process. The existing process did not emphasize up front
analysis, or training, tools, and guidance for workgroup chairs
or participants regarding IRM issues. OARM reps, have
been actively involved with the workgroups developing
proposals for the new process. Our approach is to push for
strong upfront guidance for workgroup chairs and participants,
coupled with an evaluation program that allows us to determine
where additional guidance or tools are needed. We will
maximize the impact of our experts by providing lists of
contacts on specialized issues to workgroups upfront so that
they can access experts on an as-needed basis.
The model IRM program study will define appropriate roles and
responsibilities for the SIRMOs, and in so doing will show
the relationships between the various elements of the IRM
community. More attention would need to be focused on
central IRM's specific roles and responsibilities for adequate
perf. standards to be developed for the entire community.
Decisions on implementation of the model IRM program will
probably result in review of AA-level IRM structures and
responsibilities. Top management support will be required to
formally designate authorities and have them implemented.
There is a model for the Regions that appears to be working,
but this has not been addressed in the program office model
IRM study.
May-94 The model will include recommended IRM organizational
unit placement, staffing, and functions for program offices.
The model will be drafted by the end of January,
and will be shared with the IRM community in Spring,
after review and approval by OIRM/IMSD.
May-94 In essence, the SIRMO concept gets defined in the model
study. Responsibilities of the IRM unit in program offices
will be defined by function, thus these could serve as the
basis for job descriptions and functional statements.
TBD
-------
Audit Recommendation Responsible
Establish a scalable, flowcharted review and approval D-OIRM
process for information system development projects,
focusing the review on projects involving high dollar
and/or mission critical systems. Inherent in this
recommendation is the need to clearly define the
thresholds of projects requiring OIRM/SIRMO review.
Define criteria and thresholds for system D-OIRM
development/enhancement efforts requiring joint
OIRM/SIRMO formal review/approval. This includes
new system development initiatives and/or major
changes to existing systems. Joint review and approval
at designated intervals will involve review of the
life cycle products and documented decisions to
proceed or not proceed with the proposed work, and
conclude with the formal decision to retire an
information system. It should be noted that the
review of the project management plan for all system
development and enhancement efforts should address
the need to comply with IRM policies and standards.
Provide tailored IRM training for various groups, D-OIRM
based on levels of need and areas of responsibility.
Target groups include:
System development Delivery Order Project
Officers and Work Assignment managers;
Reference/Page [1]
EPAB/15
Action
Status
Date Comments
In progress TBD
EPAB/18
In progress TBD
This review and approval process is described in the draft
system lifecycle management policy: Thresholds
are clearly spelled out, as are the reviews by SIRMOs,
central IRM offices, and the Executive Steering Committee
for IRM.
Criteria and thresholds for system development and/or
enhancement efforts are clearly spelled out in the
draft systems lifecycle management policy.
The project management plan is one of the most
important documents referenced in the policy.
EPA-B/13
In progress Various Dates for specific training courses vary.
Completed
The Sys. Devel. Center established under the MOSES
contract is required to operate based upon standard operating
Jan-94 procedures which describe how projects receive technical
direction from DOPOs, limits on such direction, and controls
over the deliverable development process. The contractor has
briefed DOPOs on how work flows through the SDC & more
briefings are scheduled. We are considering making the
briefing mandatory training for MOSES DOPOs. (Other
components may be determined by OAM in addition to the
current general DOPO training requirements.) In addition
over 15 DOPOs attended a 3-day course on the importance
of clearly stating requirements, controlling changes to
req'ts, and the interaction between requirements, design,
the software being developed, and quality assurance.
Contractors working on system development
dealing with issues that are EPA-specific; and
In progress
Jan-94 Beginning in Jan. 1994, SAIC will be providing its MOSES
contract employees with information security training. These
sessions will continue until all MOSES contract employees
have received training.
-------
Audit Recommendation
(continued...)
Program staff developing regulations involving
information collection.
Responsible
Reference/Page [1]
Action
Status
Completed
Date Comments
Jan-94 Briefings will be offered in January to program managers at
the office director, division director, and branch chief levels on
the Paperwork Reduction Act, its goals and requirements,
the ICR process, and how to comply with the Act. Direct
training is also provided to reg. workgroups upon request
to supplement what they learn from reading the ICR manual.
Decrease EPA's dependence on contract services,
increase the ratio of EPA to contract personnel, and
increase the number of Agency IRM FTEs by initiating
a concerted effort to plan for, justify, and request a
significant increase in Federal IRM FTEs.
Require that Agency programs develop mission-based
IRM plans which will clearly identify IRM expenditures
in their annual budget and operating plans.
Develop specific policies and processes to ensure that
Agency IRM systems developed under any EPA contract
comply with Federal and EPA IRM policies and
standards. Make certain that IRM work under these
contracts which is considered incidental to the support
of the contract effort is only for the contractor's
internal use and will not be transferred to Agency
personnel or programs.
Require that all EPA contracts, grants, and interagency
agreements must include provisions for compliance
with EPA and other Federal IRM policies.
Review all Agency IRM policies, standards, and
guidance at regular intervals to identify gaps and
ensure they are in line with current Federal
requirements and support the Agency's primary goals
and objectives.
Establish a standard, formal process for issuance,
periodic review, and maintenance of IRM policies,
standards, procedures, and guidance which shall
include review by program and regional office staff
having functional IRM duties and responsibilities.
D-OIRM
EPAB/20
In progress TBD
D-OIRM
D-OIRM
EPA-B/6
EPA-B/6
EPA's actions to correct a declared material weakness in IRM
planning will support the development of Agency and program
IRM plans that can provide solid justification for requesting
additional FTEs. Efforts are also underway to investigate
possibilities for converting IRM contractor funds to FTEs.
In progress Nov-93 Formal procedures to require this are being developed.
Program offices were required to submit budget, mission, and
acquisition information for the first integrated data call.
Completed Aug-91 The IRM clause of the EPAAR requires that systems
developed under any Agency contract must comply
with Federal and EPA IRM policies and standards.
In addition, the APDS system puts boilerplate language
for IRM automatically into all new solicitations
for FIP resources, as a mandatory clause.
D-OIRM
D-OIRM
EPA-B/18
EPA-B/5
Partially
Completed
Completed
D-OIRM
EPA-B/5
Completed
TBD All contracts must include provisions for IRM compliance.
Grants and IAGs are now addressed on a case-by-case basis,
and may not be readily amenable to a comprehensive solution.
May-93 All policy documents were reviewed to identify gaps during
creation of the IRM policy workplan.
Jun-93 The Agency's formal green border review process is the
standardized issuance process. The IRM policy workplan
provides the framework for periodic review and
maintenance of the IRM policy documents.
-------
Audit Recommendation
Create/maintain an effective online system to
disseminate all IRM policies (including cross-
references, purpose, issuing organization, points
of contact, review date, and compliance requirements).
Assess user needs for the IDEA system, including
analytical capabilities; develop a formal test plan;
properly test existing software; and document the
IDEA system design and software before developing
additional software for the system.
Responsible
D-OIRM
Reference/Page [1]
EPA-B/11
Action
Status
In progress
Date Comments
Sep-93 Many IRM policy documents are now disseminated
electronically via NDPD's EPADOC CD-ROM. Options
for on-line distribution are being investigated
and piloted by OA/MOD.
AA-OE
GAO-E-V1/13 In progress Sep-94
Address data quality problems in the FINDS
-------
Audit Recommendation
Develop an Agencywide information systems
architecture that explains the structure of and
communications among the Agency's information
resources that are needed to achieve its single- and
cross-media mission.
Responsible
A-EPA
Reference/Page [1]
GAO-E-V1/14
Action
Status
In progress
Complete the Agency's cross-media strategy by
developing policies and guidance and instituting
management procedures to plan, coordinate, and
budget for cross-media information resources and
activities.
A-EPA
GAO-E-V1/13 In progress
Date Comments
Sep-94 The IMDA Group is leading the effort to develop an info.
architecture for EPA, with initial concentration on a data
architecture. The IMDA group is taking an approach based on
the Zachman Framework for Information Systems Architecture.
John Zachman has given a presentation to OIRM and program
office staff. Work is underway to specify the expected content
of the architectures for EPA and to prepare a development
plan. Since this is an expensive task requiring involvement of
many EPA staff, an education effort has been started by
inviting a nationally recognized speaker to present the
business case for architectures to EPA staff. Also in
preparation for this work, several EPA staff have attended
Zachman and related seminars on information system
architectures. Planning and education efforts will continue in
FY94, with architecture development being initiated in FY95.
Jan-94 The IMDA Group's work to develop an Agency data
administration policy is supportive of this recommendation.
The draft policy will begin green border review during FY94.
During FY93 work was initiated to define conceptual,
logical, and physical level models for EPA and to develop
quality metrics for them. Procedures for reviewing these
models are currently under development. Recent articles also
indicate that the White House Office of Environmental Policy
and OMB are discussing incorporating ecosystem planning
issues in budgeting, and a possible exec, order on ecosystem
management that would encourage interagency, state, and
federal cooperation. Admin. Browner has created a sr. mgmt.
workgroup on ecosystem protection, which will present
recommendations to Browner by 3/15.
Strengthen OPP's conformance with federal guidance
and generally accepted practices for automated systems
development so that OPP's information systems are
consistently planned, developed, and enhanced. As
part of this effort, OPP should ensure that the
pesticide information needs of all users involved in
administering and managing EPA's pesticide
reregistration process are defined and linked to an
overall program management plan.
AA-OPPTS GAO-H-V1/8
Completed Jul-93 Developing a multi-year OPP/OCM strategic planning process
and linking this process with OPP's budgetary process in FY92
and FY93 has created a cross-divisional path for IRM project
planning and resource allocation and is a major step toward
ensuring overall program office mission goals and objectives
are reflected in OPP's IRM activities. In addition, this planning
provides IRM project managers with somewhat more realistic
and timely information about expected mission timetables and
resources for developing full-scale IRM project plans. This
greatly facilitates system lifecycle management and contributes
to more fully coordinated and controlled systems development.
-------
Audit Recommendation
Establish data management policies and implement a
plan, with milestones, for resolving OPP systems'
data integrity problems.
Responsible
Reference/Page [1]
AA-OPPTS GAO-H-V1/8
Action
Status
Completed
As OPP moves toward systems integration activities,
ensure that requirements analyses, feasibility studies,
and cost/benefit analyses are conducted to support
OPP's automated systems solutions.
AA-OPPTS GAO-H-V1/8
Completed
and
Ongoing
Date Comments
Jul-93 The central issues in OPP's data mgmt. process are being
addressed in two major integration efforts that follow the goals
in the OPP/OCM strategic plan. These 2 teams span organiz-
ational lines and involve both clients and developers. The
teams reflect data environment lines, with core enterprise
data addressed in one team and activity tracking data in the
other. The data management issues that led to a decision to
launch full-scale integration efforts are these:
* core data in our systems is being derived from multiple
sources - leading to data validity problems
* core data in our systems is variously defined, both from a
system design and from a logical use standpoint — leading to
data reliability and interpretation problems.
* multiple update streams and non-synchronous update timing
across systems leads to data validity problems.
All these issues as well as system architectural issues are
being addressed in a standard controlled systems lifecycle
management approach, with full-scale documentation and an
OPP-wide data dictionary.
Jul-93 In 1993, OPP integrated three different pesticide reregistration
systems (ALISS, SMARTS, and DCI system) into one system
called the Chemical Review Management System (CRMS).
For this system, a requirements analysis and a feasibility study
were completed prior to developing the system. The cost/
benefit analysis is in progress and will be completed shortly.
The other systems lifecycle documents were prepared in
accordance with EPA Directive 2182, on systems design and
development. The office plans to follow Agency policy and
standards as it moves forward with other integration projects
in 1994.
-------
Audit Recommendation
Take appropriate steps to enhance its information
system development process and fully ensure that
data collection efforts complement each other and
support the program mission. Specifically, a
comprehensive data collection plan should be developed.
Steps should be taken to improve the assignment
of responsibilities for planning and directing the
development of information system components by
increasing the authority of the central coordinating
office to develop data collection efforts and ensure
consistency. Finally, the life cycle management system
should be refined to ensure the complete and
detailed analysis and documentation of each stage
of the cycle for major system components.
Ensure that state data collection and quality control
efforts receive fully adequate support and include
specific indicators related to data collection and
verification in the Agency's mechanism for
monitoring state performance.
Amend federal recordkeeping and reporting
regulations so that states are required to collect
and provide standard data elements in a
disaggregated form and hazardous waste handlers are
required to provide sufficiently detailed data.
Ensure that the toxic chemical release inventory
reporting system complements other hazardous
waste data collection efforts so that the data it
provides on toxic chemical concentrations can be
used to their maximum potential.
Responsible
Reference/Page [1]
Action
Status
AA-OSWER GAO-K-V1/39 Completed
AA-OSWER GAO-K-V1/103 Completed
AA-OSWER GAO-K-V1/103 Completed
AA-OSWER GAO-K-V1/103 Completed
A-EPA
Ensure that the new accounting system and property
control system (1) provide accurate and reliable
financial and management control records (including
the type of asset, date of acquisition, cost, estimated
useful life, applicable depreciation data, physical location, and
identity of custodial officers) to account for and control property assets
and (2) contain a common data element(s) or interface(s)
to permit the reconciliation of accounting and property systems data.
GAOBLIB/19 In progress
Date Comments
Oct-92 The Agency prepared an amendment to the biennial report
regulations that would have ensured that all states collected
consistent data on hazardous waste generation and mgmt.
This amendment would have required consistent data collection
and would have enabled more consistent national reporting of
the data. However, EPA has decided to defer the rule until
after the 1993 and 1995 data collection cycles are complete.
Per discussions with States and other interested parties, EPA has
hypothesized that collection of consistent data will not require
a change to the regulation, but rather nation-wide use of a
consistent form for data collection and system for data entry.
In 1987, 18 states were using the BR forms. In 1991, 47 states
used the BR form. The Agency has followed all important
lifecycle management procedures.
Oct-92 The Agency developed improved biennial reporting data
entry and report software for the states which improved state
data collection and data quality control efforts. The Agency
developed an amendment to the federal recordkeeping and
reporting regulations so that states would be required to collect
and provide standard data elements, and hazardous waste
Oct-92 handlers would be required to provide sufficiently detailed
data. However, EPA has decided to defer the rule until
after the 1993 and 1995 data collection cycles are complete.
Per discussions with States and other interested parties, EPA has
hypothesized that collection of consistent data will not require
a change to the regulation, but rather nation-wide use of a
Oct-92 consistent form for data collection and system for data entry.
In 1987, 18 states were using the BR forms. In 1991, 47 states
used the BR form.
The Agency undertook a year-long study of possible
linkage between BRS and TRI. The result of this effort was to
show a less than 25% overlap in facilities reporting.
Sep-93 PMSD and PMD convened a quality action team (QAT)
during FY93 that examined requirements for a combined
property and financial accounting system, and recommended
acquiring an additional module of commercial software (the
PPS fixed assets module). The final decision on acquiring the
module is still pending.
-------
EPA OFFICES WITH MAJOR
IRM RESPONSIBILITIES
APPENDIX X
[Organization chart: Administrator; Deputy Administrator; OPPE; OARM (DSO); OIRM; OARM RTP; OARM Cinn; NDPD; IRMD]
X-1
-------
APPENDIX XI
STRENGTHENING IRM AT EPA
OFFICE OF INFORMATION RESOURCES MANAGEMENT
Issue Paper Updated February 1994
Background: This Issue Paper was provided by OIRM officials to
show the reader the aggregate, collective efforts the IRM
community is undertaking to improve information management at the
Agency. It presents a summary of many of EPA's current IRM
improvement initiatives, and gives a brief description of the
context in which they were developed.
XI-1
-------
APPENDIX XI
Introduction
The purpose of this paper is to present the IRM improvement
program and our overall strategy for addressing weaknesses that
have been identified with EPA's IRM program. The paper is
characterized as "strengthening" the IRM program, not building
anew, since many of the components for a solid IRM program are in
place, and some have been judged in the past to be excellent.
Historical Perspective
The EPA IRM program parallels the structure and operating
philosophy of EPA as a whole - centralized planning, policy, and
oversight, and decentralized implementation.
This modus operandi served the Agency fairly well in
providing systems to support the numerous stand-alone
environmental statutes and programs. To support this
arrangement, OIRM invested limited resources in planning, policy,
and oversight, and applied the bulk of its resources to services
that supported implementations, or to implementations directly.
Two factors began to emerge in the late 1980's that have
made the traditional EPA approach to IRM less effective. One,
the Agency's business has been informally changing to be more
oriented to cross-media and integrated approaches. Using the
traditional EPA model and emphasis, IRM has not been able to
support this new business very well. Second, several statutes
and implementing regulations focused on improving Federal IRM
were issued or amended (Paperwork Reduction Act, Computer
Security Act, A-130). These authorities require a stronger
central IRM presence in EPA. Despite these two factors, EPA OIRM
has had difficulty disengaging from its services and
implementation work and reorienting resources towards the
strongly central activities of planning, policy, and oversight.
Recent budgetary pressures have led to reduced OIRM budgets, and
have further hampered this reorientation.
Accomplishments
Despite the effect of several areas of concern, and a
historical IRM strategy that may not provide the basis for
providing strong support to the Agency's strategic direction
towards integrated, cross-media analysis, the IRM community has
achieved a number of notable accomplishments. Below are listed
some selected, representative examples:
o Implemented the Systems Development Center to improve
EPA systems through development methods and discipline.
XI-2
-------
APPENDIX XI
EPA received an award from "Government Computer News"
recognizing this effort.
o Installed a supercomputer to support environmental
research at EPA's new facility at Bay City, Michigan.
o Operated the National Computer Center at a level of
efficiency, as adjudged by industry expert Nolan, Norton
and Company, in excess of 10% more cost effective than
the norm for data centers of like size.
o Established, out of base, a unit dedicated to
establishing the framework for data management in EPA,
and developed a set of initial standards. This key
activity is establishing the "rules" required to support
data integration across EPA programs and with the outside
world.
o Established disaster processing capabilities for three
key information systems in EPA, to enable EPA to continue
to operate as an organization if the data center
experienced an incapacitating event. EPA received an
award from "Government Computer News" recognizing this
achievement.
o Developed and installed in a number of offices in HQ and
the Regions the Office Forms Facilitator (OFF) suite of
systems to enhance the productivity of EPA offices. A
January 1994 study of OFF usage in typical office
settings documented significant productivity gains and
efficiency improvements for the administrative processes
automated by the system.
As an overall indication that the EPA IRM community has
managed the overall program quite well, Congressman Jack Brooks,
former Chairman of the House Government Operations Committee,
was quoted in "Government Computer News" several years ago as
indicating that EPA was one of a select few agencies "doing a
good job" in acquiring and using computer and communications
systems and services.
Areas of Concern
The OIG has repeatedly identified IRM as a candidate for
consideration by the Senior Council on Management Controls as a
material weakness. GAO has also identified IRM weaknesses as
contributing to problems the Enforcement and Pesticides programs
have encountered in performing their missions.
XI-3
-------
APPENDIX XI
In reviewing the work and conclusions of such oversight
organizations, OIRM has organized the primary IRM concerns into
five areas: 1) IRM Planning - This area includes creating a 5-
year IRM plan and integrating that plan into the budget process;
2) Formal IRM Policies - This area focuses on formalizing through
the green border process a number of policies/standards/etc., with
Agencywide impact, developing additional policies to cover some
identified gaps, and improving communications, training, and
assistance for those affected by the policies; 3) IRM Security -
This area deals with creating a comprehensive security program
which assures cost-effective protection of sensitive Agency
information, by focusing on training and oversight, covering a
few policy gaps, and improving the mainframe environment at NCC
by enhancing certain general controls; 4) IRM Quality
Assurance/Oversight - This item focuses on enforcement of IRM
policies pertaining to system development, operations, and
maintenance, and evaluating systems' effectiveness; and 5) IRM
Contracting - This area deals with formalizing controls to ensure
IRM contracting is conducted efficiently and in accordance with
Federal regulations, both during the pre-award phase of acquiring
Federal information processing (FIP) resources and throughout
contract administration.
OIRM has examined and evaluated these five areas in
accordance with EPA's material weakness criteria. On the basis
of these analyses, OIRM acknowledged that all five areas have
certain weaknesses. In the case of IRM Planning and IRM
Security, OIRM deemed them to be sufficiently material to be
reported to the President (see the Administrator's December 1992
FMFIA letter). In the case of IRM Policies, IRM Quality
Assurance/Oversight, and IRM Contracting, OIRM deemed the
weaknesses to be serious, but not of sufficient importance to
report to the President, and has declared these weaknesses at the
Agency level. Corrective action plans have been developed
and are being implemented.
The Improvement Program
OIRM is vigorously addressing, in cooperation with the
Agency IRM community, the underlying circumstances that have
resulted in these areas of concern by working in five key areas:
1. Revising the OIRM Emphasis - Historically, OIRM has
been substantially invested in activities related to implementing
systems. This emphasis has diverted resources from the
traditional functions of a Federal IRM organization of planning,
policy, and oversight/review.
The lack of emphasis on these stewardship functions of
policy, planning, and oversight has contributed to certain
XI-4
-------
APPENDIX XI
weaknesses in the IRM program. To address this matter we are
taking several actions:
o We declared IRM planning a material weakness, immediately
bolstered our planning function, and established an
action plan to create a 5-year IRM plan for EPA that is
linked to the budget. The first planning cycle will be
completed in February 1994. In addition, with our
support, the Assistant Administrators gathered in January
1994 and initiated work together on IRM strategic
planning. At the same time, a subgroup of the National
Advisory Council on Environmental Policy and Technology
was used to engage the Agency's external stakeholders in
IRM strategic planning activities.
o Out of base resources, we placed senior, experienced IRM
staff at the head of our information security program.
We have developed, and are supporting, a network of
information security officers for information technology
installations across the Agency. We have launched a
solid security awareness training program and ensured
that every data set on EPA's mainframe has been
evaluated, with a security decision made by its owner,
regarding the appropriate level of protection to be
implemented via the Resource Access Control Facility.
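The control described in this bullet, that no mainframe data set is left unprotected until its owner makes an explicit decision, can be sketched in miniature. The following Python fragment is a hypothetical illustration only; the data set names and access levels are invented, and it is not EPA code or actual RACF syntax:

```python
# Hypothetical sketch of the "every data set evaluated by its owner"
# control. Names and levels are invented for illustration.

# Owner decisions recorded per data set: the general-access level
# granted to all users. "NONE" means no general access.
OWNER_DECISIONS = {
    "EPA.FINANCE.LEDGER": "NONE",   # sensitive: owner allows no general access
    "EPA.PUBLIC.REPORTS": "READ",   # owner allows read-only general access
}

def general_access(dataset: str) -> str:
    """Return the general-access level for a data set.

    A data set with no recorded owner decision defaults to the most
    restrictive level, so an unevaluated data set is never left open.
    """
    return OWNER_DECISIONS.get(dataset, "NONE")

print(general_access("EPA.PUBLIC.REPORTS"))   # READ
print(general_access("EPA.UNREVIEWED.DATA"))  # NONE (default deny)
```

The design point mirrored here is default deny: absent an explicit owner decision, the most restrictive protection applies.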
o We have increased attention to the IRM policy area. To
date, we have inventoried all Agency IRM policies,
standards, procedures, and guidance, created a
prioritized listing of additional policies needed, and
formally issued a number of key IRM documents as
temporary Agency directives. We recently established
Agency policies on access to computer equipment by the
disabled and on use of electronic signatures within EPA.
We are currently revising the Agency's systems life-cycle
policy and guidance, telecommunications policy, and
information security-related policies. We are developing
anew EPA hardware/software standards and policy, a
comprehensive data administration policy, and a policy
requiring appropriate suitability investigations for IRM
contractors. In the spirit of the National Performance
Review, we are focusing our policies more on desired,
measurable outcomes (the "what"), and less on specific
procedures (the "how").
o We established an organization to review IRM
acquisitions, major systems, and IRM organizations in the
Agency. To ensure inclusion of relevant IRM controls, this
organization has comprehensively reviewed all requests for
proposals (RFPs) for FIP resources and all
delivery orders for OIRM-provided national IRM contracts.
We have implemented, at our Systems Development Center
under the MOSES contract, a very thorough set of
operating principles, policies, and procedures that
ensure good contracting practices, improved planning, and
useful, high-quality deliverables.
o To more clearly identify the functions that need to be
performed in EPA to ensure good IRM is practiced, last
fall we initiated an IRM business planning effort. After
the functions of IRM, both for central organizations and
the programs/regions, are clearly established, OIRM will
examine its current organization to determine if any
changes are necessary to support the central
responsibilities. If so, a formal reorganization will be
proposed.
2. Assuring Senior-Level Engagement in IRM - We formalized
the delegation from the Administrator to the Assistant
Administrator for OARM establishing the AA as EPA's Designated
Senior Official for IRM (the "DSO," as required by the Paperwork
Reduction Act and OMB Circular A-130), and we are working to
clarify related redelegations of responsibility and authority.
We have strengthened the IRM Steering Committee by ensuring
direct DSO leadership, raising the level of seniority of
Committee members, and engaging the Committee directly in key
decisions on IRM planning, contracting, policies, and major
systems activities. The Committee's new charter is presently in
the approval process.
As one of its first tasks in fiscal 1994, the Committee is
presently focusing the Assistant Administrators' and Regional
Administrators' attention on creating a strategic IRM vision.
The Committee is obtaining the views of the Agency's key external
stakeholders by soliciting input on IRM strategies from the
National Advisory Council on Environmental Policy and Technology
(NACEPT).
3. Strengthening IRM Components in the Programs - IRM is
patterned after the organizational culture in EPA - decentralized
implementation and operation. We are actively working with the
SIRMOs in the Programs to strengthen that position, and the
support available to organizations performing or
utilizing IRM in the Programs. This work is primarily in the
form of a Model IRM Program study. After completing the study in
March 1994, we will take steps to formally delegate appropriate
functions to the programs/regions, and to create proposed
organizational structures, position descriptions, etc., for the
SIRMO organizations.
We are also working to strengthen IRM components in certain
programs by increasing their awareness and knowledge of IRM
issues. This effort will focus on periodic internal
communications, technical assistance, and formal training. For
example, we recently provided IRM training, which was very well
received, to 42 Contract Officers and Specialists within the
Headquarters Office of Acquisition Management and the Contracts
Management Division in Cincinnati.
4. Supporting Data Integration - We are involved in this
area in at least five ways. First, we have dedicated resources
to "set the rules" by which data are defined and managed. We
established the Information Management/Data Administration
program in 1992 towards this end. This includes EPA data
policies which provide the links to integrate data, like the
Facility ID and the Locational Data Policy. Setting and
enforcing these rules is critical to success in this area.
Second, we are working to improve the utility of mechanisms that
are in place to support integration, like the OIRM-operated
FINDS. An improved version 2.0 of FINDS was released in late
1993, and we plan to implement version 3.0 during FY94. Third,
we have proposed a major initiative to build an integrated
environmental database for EPA called "ENVIROFACTS," and to
provide initial capabilities focused on selected geographic
initiatives. This "data warehouse" is envisioned as an interim
approach to enable better integration and public access, until
key program systems undergo modernization and reengineering.
Fourth, as part of this same effort, we are developing access
tools (GATEWAY) to support analysis using ENVIROFACTS data.
Fifth, we are working in concert with other Federal agencies to
ensure GATEWAY, ENVIROFACTS, and other information locators are
consistent with national and international efforts towards
environmental data integration. We are also actively engaged in
positioning EPA for successful contributions to the information
superhighway, via Internet.
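The integrating role of a shared key such as the Facility ID can be illustrated with a small sketch. Everything below (identifiers, field names, and records) is invented for illustration and does not reflect any actual EPA system or data:

```python
# Hypothetical sketch: a common facility identifier is the "rule"
# that lets records from separate program databases be joined into
# one integrated, warehouse-style view.

air_emissions = {   # invented air-program records, keyed by Facility ID
    "FAC-001": {"pollutant": "SO2", "tons": 12.5},
}
water_permits = {   # invented water-program records, same key
    "FAC-001": {"permit": "NPDES-4471", "status": "active"},
}

def facility_profile(fac_id: str) -> dict:
    """Merge whatever each program system knows about one facility."""
    profile = {"facility_id": fac_id}
    profile.update(air_emissions.get(fac_id, {}))
    profile.update(water_permits.get(fac_id, {}))
    return profile

print(facility_profile("FAC-001"))
```

The design point is that each program system keeps its own records; the shared identifier alone is what makes the integrated cross-program view possible.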
5. Fixing Specific Deficiencies - We are aggressively
addressing the many specific recommendations made by OIG in
various IRM-related audits. Overall, we believe that implementing
the recommendations will result in a stronger IRM program. When,
on occasion, we disagree with the specifics of how to implement a
particular recommendation, our strategy is to implement solutions
that are responsive to the recommendation's intent.
We have in place specific action plans responding to
recommendations from the following audits:
o EPA's Management of Computer Sciences Corporation's
Contract Activities - Of the 98 recommendations in this
audit, 32 were related and assigned to OIRM for action.
All actions assigned to OIRM were completed by the end of
September 1993.
o EPA Needs to Strengthen the Acquisition Process for ADP
Support Services Contracts - This audit recommended that
14 specific actions be taken. All actions were completed
by the end of September 1993.
o EPA Needs to Strengthen General Controls Over System
Software - This audit recommended that 20 specific
actions be taken. All actions were completed by
September 30, 1993.
o EPA Must Fully Address Longstanding Information Resources
Management Problems - This audit recommended that 28
specific actions be taken. All actions will be completed
by the end of October 1995.
Addressing the IRM concerns raised in the recent past is
considered a very serious matter by our senior leadership team.
We are engaged in addressing these concerns on a daily basis,
and intend to make permanent changes to improve those areas of
weakness, while retaining the solid capabilities in IRM that have
served the Agency well over the years.
-------
APPENDIX XII
GLOSSARY OF ACRONYMS AND ABBREVIATIONS
AA Assistant Administrator
ADP Automated Data Processing
AIRS Aerometric Information Retrieval System
BBS Bulletin Board System
CASE Computer Aided Software Engineering
CERCLIS Comprehensive Environmental Response, Compensation and
Liability Information System
CFO Chief Financial Officer
CIO Chief Information Officer
CO Contract Officer
DAA Deputy Assistant Administrator
DI Data Integration
DOPO Delivery Order Project Officer
DSO Designated Senior Official
EDI Electronic Data Interchange
EPA Environmental Protection Agency
FINDS Facility Index System
FTE Full Time Equivalency
GAO General Accounting Office
GIS Geographic Information Systems
GPRA Government Performance and Results Act of 1993
ICMS Integrated Contract Management System
IDEA Integrated Data for Enforcement Analysis
IFMS Integrated Financial Management System
IMDA Information Management/Data Administration
IRM Information Resources Management
IRMD Information Resources Management Division-Cincinnati
IRMPG IRM Planning Group
LAN Local Area Network
NDPD National Data Processing Division
NPR National Performance Review
OAM Office of Acquisition Management
OARM Office of Administration and Resources Management
OE Office of Enforcement
OIG Office of Inspector General
OIRM Office of Information Resources Management
OMB Office of Management and Budget
OPPE Office of Policy, Planning and Evaluation
OPPTS Office of Prevention, Pesticides, and Toxic Substances
ORME Office of Regulatory Management and Evaluation
OSWER Office of Solid Waste and Emergency Response
PCS Permit Compliance System
PCSC Personal Computer Site Coordinator
PO Project Officer
PWSS Public Water System Supervision
RAD Rapid Application Design
RCRIS Resource Conservation and Recovery Information System
RI/FS Remedial Investigation/Feasibility Study
RTP Research Triangle Park
SDLC System Development Life Cycle
SEDM State/EPA Data Management program
SIRMO Senior Information Resource Management Official
STARS Strategic Targeted Activities for Results System
TQM Total Quality Management
TRIS Toxic Release Inventory System
-------
APPENDIX XIII
REPORT DISTRIBUTION
Office of Inspector General
Inspector General (2410)
Deputy Inspector General (2410)
EPA Headquarters
Administrator (1101)
Deputy Administrator (1102)
Assistant Administrator for Administration and Resources
Management (3101)
Assistant Administrator for Policy, Planning, and
Evaluation (2111)
Assistant Administrator for Enforcement (2211)
Office of General Counsel (2310)
Assistant Administrator for Water (4101)
Assistant Administrator for Solid Waste and Emergency
Response (5101)
Assistant Administrator for Air and Radiation (6101)
Assistant Administrator for Prevention, Pesticides and
Toxic Substances (7101)
Assistant Administrator for Research and Development (8101)
Associate Administrator for Regional Operations &
State/Local Relations (1501)
Director, Office of Information Resources Management (3401)
Agency Followup Official (3304)
Attn: Director, Resource Management Division
Agency Followup Official (3101)
Audit Followup Coordinator (3102)
Attn: Program & Coordination Office
Director, Congressional Liaison Division (1302)
Director, Public Liaison Division (1702)
Regional Offices
Regional Administrator, Region 1
Regional Administrator, Region 2
Regional Administrator, Region 3
Regional Administrator, Region 4
Regional Administrator, Region 5
Regional Administrator, Region 6
Regional Administrator, Region 7
Regional Administrator, Region 8
Regional Administrator, Region 9
Regional Administrator, Region 10
Research Triangle Park, North Carolina
Director, Office of Administration and Resources
Management (MD-20)
Director, National Data Processing Division/OARM (MD-34)
------- |