-------
EPA TRANSMITTAL
CLASSIFICATION NO.: 2115    1984 Edition
October 17, 1984
GUIDE FOR ADP REVIEWS
1. PURPOSE. This Transmittal issues a Guide for ADP Reviews.
2. EXPLANATION. This Guide supplements the Automatic Data
Processing (ADP) Manual by providing procedures for conducting
required reviews of ADP Systems.
3. FILING INSTRUCTIONS. Please note that the Classification
Number for this newly-issued Guide is 2115.
The ADP Manual Classification Number was 2800; it is now 2110. Make this
number change ("pen and ink") from 2800 to 2110 on the ADP Manual and file
the ADP Manual in its proper place. Then file this Guide behind the ADP
Manual and post receipt of this Transmittal on the Checklist in front of the
Guide.
Gary
Management and Organization Division
ORIGINATOR: Office of Information Resources Management
-------
TABLE OF CONTENTS
Page
Foreword i
1. Introduction
1.1 Purpose 1-1
1.2 Background 1-1
1.3 Types of ADP Reviews 1-4
1.3.1 Application Systems Reviews 1-4
1.3.2 Data Center Reviews 1-6
1.3.3 Reviews of Local Computing Capacity 1-6
1.4 Federal Guidelines For ADP Reviews 1-7
2. ADP Review Methodology
2.1 Overview 2-1
2.2 Principles for Conducting Reviews 2-1
2.3 Scoping and Planning a Review 2-3
2.3.1 Scoping a Review 2-5
2.3.2 Task Planning 2-6
2.3.3 Planning Reports 2-10
2.4 Collecting Information 2-14
2.4.1 Protocol for Initiation of the Review 2-14
2.4.2 Workpapers 2-14
2.4.3 Guidelines for Interviewing 2-16
2.4.4 Guidelines for Checklists and Questionnaires 2-17
2.4.5 Guidelines for Reviewing Documentation 2-18
2.4.6 Guidelines for Direct Observation 2-19
2.5 Synthesizing Information and Ranking Problem Areas 2-19
2.6 Formulating Recommendations 2-20
2.7 Preparing Final Report 2-22
2.7.1 Guidelines on Content of Review Reports 2-22
2.7.2 Guidelines on Style of Review Reports 2-23
3. Application System Reviews
3.1 Definition and Scope 3-1
-------
3.2 Components of a Review of Operational Application
Systems 3-1
3.2.1 System Background Data 3-3
3.2.2 Technical Review Components 3-4
3.2.3 Management Review Components 3-12
3.2.4 Mission Support Review Components 3-16
3.3 Post-Implementation Reviews 3-19
3.3.1 Definition of Post-Implementation Review 3-19
3.3.2 Post-Implementation Review Components 3-21
4. Reviews of Families of Systems 4-1
5. Data Center Reviews
5.1 Definition and Scope 5-1
5.2 Data Center Review Components 5-1
5.2.1 Management Review Components 5-2
5.2.2 Technical Review Components 5-8
6. Reviews of Local Computing Capacity
6.1 Definition and Scope 6-1
6.2 Local Computing Review Components 6-1
Appendix A: Bibliography A-1
Appendix B: Sample Memoranda
B1. Transmittal of Review Report B-1
B2. Response To Findings B-1
Appendix C: Policy on Periodic Review of ADP Systems
From EPA ADP Manual C-1
-------
FOREWORD
The Automatic Data Processing Manual establishes Agency policies,
responsibilities, and procedures for effective management of ADP resources.
Chapter 2 Section 11 of the ADP Manual, "Periodic Review," establishes a
requirement for conducting regular reviews to determine whether ADP systems
continue to satisfy valid EPA requirements, are performing adequately, and
contain adequate provisions for security.
This document, Guide for ADP Reviews, contains guidelines for conducting
periodic reviews of ADP systems. It defines the types of reviews, describes
the methodology for conducting reviews and provides references for additional
guidance. The Guide for ADP Reviews will be used for ADP system reviews by the
Office of Information Resources Management (OIRM). It is also intended for use
by program managers in other parts of the Agency to carry out reviews of their
own ADP systems and services on a regular basis, as required by the ADP Manual.
The ADP review activity complements other required reviews. For example,
the results of ADP reviews should aid EPA in complying with the requirements of
Federal statutes and regulations such as the Federal Managers Financial
Integrity Act (P.L. 97-255) which requires agencies to evaluate their systems
of administrative control, OMB Circular A-123 which requires agency heads to
develop and maintain adequate systems of internal control, and OMB Circular
A-71, which defines the policy and responsibilities for the development and
implementation of computer security programs by agencies. The ADP review
program and methods described herein also respond to the Paperwork Reduction
Act (P.L. 96-511) which states that each agency shall periodically review its
information management activities and, in particular, ensure that its
information systems do not overlap each other.
In EPA, all ADP review activity will be coordinated with the Office of the
Inspector General (OIG) which has the lead responsibility to perform
independent reviews of EPA's activities. OIRM will work closely with OIG in
all phases of the ADP review and will share with OIG all materials and reports
generated through the ADP review activity. Program managers conducting their
own ADP reviews should adopt the same policy and are encouraged to consult with
both OIG and OIRM in planning system reviews.
-------
1-1
1. INTRODUCTION
1.1 Purpose
This guide provides information for individuals who will be planning and
conducting ADP reviews as part of the evaluation program within the Office of
Information Resources Management, and for program managers and staff in other
parts of the Agency who will be reviewing their own ADP systems as directed by
the ADP Manual. An ADP Review, for the purpose of this guide, is defined as an
evaluation of an information system, automatic data processing (ADP) equipment,
operations, or an ADP organization, to determine if the intended or expected
functions are being accomplished. The general purpose of an ADP review is to
improve management of information resources by ensuring that ADP systems and
services are operating efficiently, effectively, and in compliance with
standards, operating procedures, and policies.
ADP review is primarily an oversight function. It is not the primary
agent for improving the cost-effectiveness of ADP service delivery, for
instance. Rather, it is the evaluation of the effectiveness of existing
controls of ADP systems and services which should ensure that they are being
managed in compliance with standards, operating procedures, and policies.
The guide provides a conceptual framework for the planning and execution
of reviews. The guide is in the form of a general review methodology in
Chapter 2 and discussions of the method, scope, and objectives of different
types of ADP reviews in Chapters 3 through 6. The guide is both methodological
(addressing the substance of how to do a review) and procedural (describing the
protocols and steps for approval of review-related products).
1.2 Background
In 1982, the Assistant Administrator for Administration initiated an ADP
Systems Evaluation Program in consideration of the Agency's $50 million annual
expenditure on ADP systems development, operations, and equipment. Three
prototype ADP reviews were conducted, focusing on various aspects of
-------
1-2
applications systems.
The ADP mission review of the hazardous waste information systems focused
on the information requirements and systems of the offices responsible for the
Resource Conservation and Recovery Act (RCRA), the Comprehensive Environmental
Response, Compensation, and Liability Act (CERCLA), and Section 311 of the
Clean Water Act. The review covered the Offices of Solid Waste and Emergency
Response (OSWER), Legal and Enforcement Counsel (OLEC), and Research and
Development (ORD).
The Storage and Retrieval of Water Related Data System (STORET) is used to
store, retrieve, and analyze information about the quality of water in the
nation's streams, rivers, and lakes. It is used by EPA, other Federal
organizations, and by states. The ADP management review of STORET focused on
the efficiency and effectiveness with which EPA manages an existing system.
The Grants Information and Control System (GICS) is a management
information system developed in 1972 as a tool to support EPA Headquarters
offices in administering the Agency's grant programs. GICS has been expanded
to support the needs of EPA Regional Offices and state agencies. The technical
review of GICS focused on management direction of the system and efficiency of
the software.
OIRM recently directed a review of the National Computer Center (NCC),
encompassing operations, asset control, security, and overall management.
1.3 Types of ADP Reviews
An ADP review is characterized by the type of the review (a system, a data
center, etc.) and by the purpose of the review. The Agency's Evaluation
Program consists of the following types of reviews:
Application System Reviews
-- Operational reviews
-- Post-implementation reviews
Reviews of Families of Systems
-------
1-3
Data Center Reviews
Reviews of Local Computing Capacity
1.3.1 Application System Reviews
The term application system review includes reviews of operational
systems, "post-implementation reviews," and reviews of families of systems. In
general, an application system review investigates whether a single system or
group of related systems is meeting the needs of the user community in a
cost-effective manner and whether enhancements or changes in operating
procedures are warranted.
There are three categories of operational application systems reviews:
(1) A technical review is an ADP evaluation of such areas as
system functional capabilities, performance, data quality,
system design quality, system coding quality, responsive-
ness, security provisions, back-up processes, internal
review trails, operational procedures, user support,
training and documentation. A technical review does not
address the degree to which the system meets users'
perceived information requirements, or the
cost-effectiveness of the system in meeting the Agency's
programmatic objectives.
(2) The middle category, an ADP system management review, looks
at the management processes used to plan, develop, refine,
operate, and control usage of the system, considering cost,
staff, contract support, user satisfaction and capacity
management. The results of the review are recommendations
concerning management control improvements to ensure
efficient and effective operations.
(3) The broadest review category, a mission support review,
entails determining whether legislative or regulatory
-------
1-4
requirements continue to exist and whether budget and staff
support are available. At the same time, an evaluation is
needed of how well the system is supporting the mission, how
useful the system is for management decision-making or
operational processes, and how cost-effective the system is.
The results of a mission support review are analyses of the
cost/benefit of the system -- the overall cost of the system
versus the importance of the mission the system supports.
These three categories are useful to identify the principal type of a
review, but there is inevitable overlap in the information collected and the
issues raised. Chapter 3 "Application System Reviews" discusses the common
information essential to all information system reviews, and identifies
components that are typical of the three categories of reviews listed above.
A mission support review should generally precede the ADP system
management and technical reviews. This would ensure that scarce ADP review
resources are applied only to systems where there is a definite or essential
mission need for the system. Where there is no question of a current mission
need for the system, ADP management and technical reviews can be carried out
without a prior mission support review, and may be conducted concurrently
within a single applications review.
A post-implementation review is an evaluation of an operational system
soon after it is first installed. The purpose of a post-implementation review
is to validate that the installed system meets the functional and performance
goals set for it at the time the system was approved for development, and to
assure that the system's controls are adequate and operating reliably. A
post-implementation review also compares the actual development costs against
the planned costs, and examines such new-system components as training,
documentation, and initialization of data bases.
1.3.2 Reviews of Families of Systems
A review of a family of systems examines a group of related systems
to determine if they could be combined or integrated to function better, and
-------
1-5
whether, as a group, they adequately serve the needs of their users.
1.3.3 Data Center Reviews
A data center review investigates whether existing equipment is
adequate to provide ADP services and whether it is being operated most
efficiently from both the technical and management viewpoints. ADP equipment
reviews frequently have components of security, operations, cost accounting and
resource utilization, pricing of services and accurate recordkeeping.
A data center review generally does not address the specifics of the
applications run on the computer, except as they require aggregate computing
resources to operate.
1.3.4 Reviews of Local Computing Capacity
A review of local computing capacity is an evaluation of computer
operations in which the user, the operator, and the management of equipment are
not clearly separated. This is true when computing equipment (including
equipment to support office administrative functions) is installed in Agency
offices and laboratories and is highly accessible to staff to operate as
needed. Such reviews are a hybrid of application system and data center
reviews: they address whether the "applications" being operated adequately
serve the users, and they examine the efficient use and security of
the equipment itself.
1.4 Federal Guidelines For ADP Reviews
The General Accounting Office has published several guides designed to aid
-------
1-6
in planning and conducting ADP reviews. These are described below.*
(1) Questions Designed to Aid Managers and Auditors in Assessing
the ADP Planning Process (Special Publication, Document
Number 119637).
This publication cites 58 elements considered essential to
good ADP planning. These essential elements were identified
from research in the literature and from the results of GAO
reviews of Federal agencies, and have been amplified into
specific criteria which can be used as a reference base for
evaluating the management of the ADP planning process. This
is done by use of a question and answer format.
(2) Assessing Reliability of Computer Output (GAO-AFMD-81-91).
(3) Evaluating Internal Controls in Computer-Based Systems
(GAO-AFMD-81-76).
This is a very substantial document that covers a wide range
of technical and management topics. It includes a textual
narrative of the principles in each area, plus numerous
detailed questionnaires that can be adopted for EPA use.
The Office of Management and Budget (OMB) issues occasional directives
that relate to agencies' internal review programs. Although these primarily
prescribe agency-level policies, the OMB directives provide part of the
* GAO publications are available in limited quantities at no cost through
the GAO Document Facility in Washington, D.C. The telephone number for
ordering them is 275-6241 (FTS).
-------
Government-wide guidance on reviews. The following OMB documents are
especially relevant.
(1) Circular A-71 "Responsibilities for the Administration and
Management of Automatic Data Processing Activities"
Transmittal Memorandum No. 1.
Requires reviews of sensitive computer applications every
three years, and prescribes computer security requirements.
(2) Circular A-123 "Internal Control Systems" (revised August
16, 1983).
Prescribes agency policies on internal controls (revised
pursuant to the Federal Managers Financial Integrity Act).
The National Bureau of Standards publishes the "Computer Science and
Technology Series," some of which are directly applicable to ADP reviews. For
example, NBS's computer security publications are especially relevant, and the
Management Guide For Software Documentation (NBS Special Publication 500-87)
contains an excellent guide to the literature on documentation of ADP systems.
To obtain information about publications in this series, write to:
ICST
A209 Administration
National Bureau of Standards
Washington, D.C. 20234
and request NBS Publications List 88, "The Computer Science and Technology
Series." That document includes the various order forms.
Appendix A of this Guide lists additional GAO and OMB documents related to
ADP reviews.
2 OMB documents are available at no cost from the OMB Documents Office. The
telephone number to order copies is 395-3610 (FTS).
-------
2-1
2. ADP REVIEW METHODOLOGY
2.1 Overview
Successful reviews of ADP systems depend on the reviewer's having a good
understanding of basic evaluation methods and then applying creativity and
insight to the particular situation at hand. This chapter provides the basic
framework around which every review is built. An ADP review has five phases:
Scoping and Planning
Collecting Information
Synthesizing Information and Ranking Problem Areas
Formulating Recommendations
Preparing the Final Report
Each of these phases is described below. The methodology in this chapter
applies to all reviews performed by the ADP Evaluation Program. Specific
review components for different types of reviews are in Chapters 3 through 6.
2.2 Principles For Conducting Reviews
There are a number of principles that a reviewer should keep in mind when
conducting ADP reviews. These principles are not specific methods or standards,
but they provide guidelines to the proper attitude for conducting
reviews.
(1) Question Everything.
The successful reviewer learns to question every aspect of the
activity being reviewed. Learn not only what is done, but understand
the more subtle issues of how and why things are done. The table
below illustrates this approach.
-------
2-2

    Instead of asking          You should ask

    When is it done?           Why is it being done then?
                               When should it be done?
                               What would happen if it were not done then?

    How is it done?            Why is it done this way?
                               How else could it be done?
                               How should it be done?

    Who does it?               Who else could do it?
                               Who should do it?
(2) Don't disrupt operations.
Do not permit the review of a particular system or operation to
impede the operation or progress of the system or any concurrent
programmatic activities. Coordinate the review with the management
of the reviewed activity.
(3) Respect protocol.
Always give notice of your intention to perform a review. Conduct an
"entry interview" and an "exit interview". Section 2.4.1 has
guidelines for "Protocol For Initiation of the Review." Question
and/or survey program staff concerning their requirements or
evaluation of systems only with the prior knowledge and approval of
their managers.
(4) Respect confidentiality of sources.
When an individual provides information to a reviewer, whether in an
interview or a survey, that information is presumed to be given in
confidence. Do not disclose the sources of information to anyone
outside the review team. This condition is desirable to encourage
-------
2-3
full and open disclosure. If for any reason strict confidentiality
cannot be maintained, try to make all individuals within the reviewed
activity aware of any potential disclosure.
(5) Encourage mutual respect between reviewers and the staff of the
reviewed activity.
Remember that staff being reviewed typically operate under priorities
and pressures that affect how they specify, design, build, and
operate information systems. Try to understand these circumstances
and respect the professionalism of the staff being reviewed. Try to
discover, not criticize. By maintaining an attitude of respect and
objectivity, you will encourage the cooperation and respect of the
staff being reviewed.
2.3 Phase 1: Scoping And Planning A Review
Each fiscal year the managers of the ADP Evaluation Program establish the
basic OIRM agenda for ADP review. This section contains guidelines on how to
establish the scope and define the plan for individual reviews. In other
program areas, planning of ADP reviews may be part of the program planning
done at the beginning of the operating year. Other ADP reviews may be
initiated during the year if a major issue arises.
The manager of the review team should complete the plan for each review.
The standard outline for the ADP review planning document is shown in Figure
2-1. The scope of the review corresponds roughly to items 1-6 in this
outline. The scope should be agreed upon before detailed task planning is
done. The task plan corresponds roughly to items 7-9.
The plan must be approved before the data collection phase can begin.
Frequently, several drafts may be necessary to reach a consensus on the scope
and methodology for a review. The review plan may be in a single document or
it may be in two documents, one a "scoping" paper and the other a detailed
"analytical plan" as was done for the STORET review. In either case
-------
2-4
FIGURE 2-1
Topics To Be Included In Review Plan
1. Background
2. Client of the Review
3. Purpose of the Review
4. Systems, Equipment and Organization(s) to be Included in the Review
5. Resources to be Applied
6. Review Components
7. Data Collection Methods
8. Sequence of Tasks -- Milestones and Deliverables
9. Preliminary Outline of Final Report
-------
2-5
essentially the same subjects should be covered.
2.3.1 Scoping A Review
The scope of an ADP review consists of the following elements:
Review "client"
Purpose of the review
System(s), equipment and organization(s) to be included in
the review
Resources to be applied to the review
First, it is important to identify the "client" of the review -- the EPA
manager who will formally receive the report and who will have the
responsibility to carry out the recommendations. In most cases the client
will be relatively easy to identify. In other cases (for example, a review
dealing with a family of systems) the client may be difficult to identify, and
there may be more than one client if the review cuts across program areas.
It is not feasible to review every aspect of a system. The value of
scoping a review is to focus the subsequent phases on those components con-
sidered most important. The scoping phase starts with the general type of
review (application system, data center, local computing capability, etc.) and
the category of the review (mission support, management, technical), and
proceeds to narrow the focus to a set of components which is manageable given
the resources available. For example, the management review of the STORET
system focused on the following components and subcomponents:
User Support
-- Training
-- Documentation
-- User assistance
Timesharing Budgeting and Billing
-- Timesharing budgeting
-- Billing administration
-------
2-6
System Operations and Management
-- NCC and STORET/Washington operations
-- STORET enhancements
The statement of review purposes constitutes the top of the "top-down"
planning process described in the following section. In some cases, it may
be useful to specifically identify issues that were considered for a review
but were determined to be out of scope. This will help avoid misunderstandings while
the review is in progress if these issues arise again.
The individuals to be included in the scope of the review and the re-
sources to be applied must be established. These two are largely inter-
dependent. The nature and number of staff to be interviewed or surveyed must
be consistent with the review purpose and with the resources available. For
example, a technical review of a system must cover the individuals familiar
with the details of the system's design and operation; a management review of
a national system like STORET may not be able to include all users.
Frequently, the total resources for a review are allocated before the
scoping phase is completed. In this case, the process of resource allocation
is the distribution of those resources to the different phases in order to
maximize the effectiveness of the review. The review must be carefully scoped
to coincide with the resources available.
The review plan should include a high-level estimate of direct costs
(staff time, contractor costs, time of the reviewed organization, and travel
costs) of the review. It is not necessary to estimate costs for each phase
and task. Figure 2-2 shows the standard format for estimating review costs.
The total labor as shown in Figure 2-2 should be consistent with the labor
distribution as shown in the Gantt chart in Figure 2-4.
2.3.2 Task Planning
A task is a unit of work with a definite start and end. Typically
a task has a product or event that signifies completion of the task. Such
-------
2-7
FIGURE 2-2
Cost Estimate Form

RESOURCES ALLOCATED

PROJECT TITLE: Hazardous Waste Support Systems
               (Agency-Wide Strategy and Mission Support Review)
PERIOD OF PERFORMANCE: March 1, 1982 - November 5, 1982

PROJECT LABOR                        EST. DAYS       EST. COST
  Lead Computer Analyst                128.0        $22,528.00
  Clerical                              38.8          2,483.20
  Junior Analyst                         3.0            384.00
TIMESHARE                                           $ 2,000.00
TRAVEL                                              $ 1,101.00
OVERHEAD                                            $ 3,444.00
CONTRACTS                                           $        0
SPECIAL EQUIPMENT                                   $        0
TOTAL COMPUTER DIVISION                             $31,940.20

OTHER EPA LABOR
  EVALUATION DIVISION ANALYST                       $ 9,040.00

TOTAL PROJECT COSTS                                 $40,980.20

Estimated Impact on Program Staff:

                              Estimated      Estimated       Estimated
                              Number of      Total Hours/    Total Hours/
  Type of Contact             People         Person          Contact Type
  Senior Level Interviews         7              16              112
  Headquarters                   20               1               20
  Regions (4)                    32               1               32
  Total                          59                              164
-------
2-8
products or events are called milestones because they are observable facts
that can be used to measure the progress of the review. A task may have
several milestones, or only one at the end of the task. The combination of
all the tasks constitutes the review. Task planning is the definition and
structuring of the work that will be done in the review, including:
Identification of review measures
Selection of data collection methods
Work breakdown into tasks
Scheduling of tasks
2.3.2.1 Identification of Detailed Review Measures
The detailed plan for a review should be based on the
statement of purpose developed in the scoping phase. Each review is defined
by a series of attributes that define the review with increasing specificity.
Those attributes are as follows:
Type of ADP Review - classifies an evaluation review of an
application system, a family of systems, a data center, or
local computing capacity. This classification was introduced
in Section 1.3 Types of ADP Reviews.
Category - further defines an application system review as an
evaluation of the mission support, management, or technical
aspects of the system under review.
Components of a review - are the specific areas of the ADP
environment which will be reviewed. Examples of review
components are "User Support," "Data Quality," and "Staffing
and Personnel." Components of a review should usually be
broken down into subcomponents that identify more specific
areas to be examined. Subcomponents for the components above
might include "User Documentation," "Source Document Control
Procedures," and "Performance Appraisals."
-------
2-9
Objectives - are statements of the desired condition or action
which provide reasonable assurance that ADP resources are used
efficiently, effectively, and in compliance with standards and
regulations.
Measures or Indicators - are questions designed to help the
reviewer assess whether the components of a review meet the
stated objectives. Review indicators may lead, in turn, to
highly detailed questions that are specific to each system
under review.
The type of review, the category of review, and the purpose of the review
should be established before detailed planning begins. The detailed planning
establishes the components for the review and moves to more specific
subcomponents and indicators that address those subcomponents. This process is
shown in Figure 2-3, which illustrates part of the planning for the review of
the STORET system. Four components were identified as the focus of the review:
User Support, Cost Consciousness, System Operation and Management, and Data
Quality. In the User Support area, three subcomponents were identified:
Training, User Assistance, and User Documentation. For User Documentation, two
measures of performance were identified, with several detailed questions to
support the conclusions reached.
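
One way to keep this derivation organized is to record it as a simple nested
structure. The sketch below is illustrative only; the component, subcomponent,
measures, and questions are taken from the STORET example above, but the data
layout and the short script that flattens it into a checklist are assumptions,
not a prescribed format.

    # Illustrative only: recording the planning hierarchy for the STORET
    # "User Support / User Documentation" example described in the text.
    review_plan = {
        "type": "Application System Review",
        "category": "Management",
        "components": {
            "User Support": {
                "User Documentation": {
                    "Do users have up-to-date, complete documentation?": [
                        "How do users request documentation?",
                        "How long does it take for documentation to be sent?",
                        "What procedures exist for updating user documentation?",
                    ],
                    "Is the documentation itself of high quality?": [
                        "Does the documentation fully describe all STORET functions?",
                        "Is the documentation clear, well-organized, easy to understand?",
                    ],
                },
            },
        },
    }

    # Flatten the hierarchy into a checklist of detailed questions.
    for component, subcomponents in review_plan["components"].items():
        for subcomponent, measures in subcomponents.items():
            for measure, questions in measures.items():
                for question in questions:
                    print(component, "/", subcomponent, "/", measure, "->", question)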
2.3.2.2 Selection of Data Collection Methods
The next step in task planning is to decide how the
information will be collected to address the detailed component/measures. The
methods available are:
Interviews
Questionnaires and Checklists
Surveys
-- Mail
-- Telephone
Documentation Review
Direct Observation
-- Using the computer
-------
2-10

FIGURE 2-3
Derivation of Review Criteria or Measures

COMPONENTS:  User Support -- Cost Consciousness -- System Operation and
             Management -- Data Quality

SUBCOMPONENTS (of User Support):  Training -- User Assistance -- User Documentation

MEASURES (for User Documentation) AND DETAILED QUESTIONS:

  Do users have up-to-date, complete documentation?
    How do users request documentation?
    How long does it take for documentation to be sent?
    Do users have all relevant documents? Are they up-to-date?
    What procedures exist for updating user documentation?
    How are these updates delivered to users? Are these deliveries effective?

  Is the documentation itself of high quality?
    Does the documentation fully describe all STORET functions?
    Is the documentation clear, well-organized, easy to understand?
    Is it easy to incorporate and use documentation updates?
    Is there any systematic way of identifying problems with the documentation
    (e.g., typographical errors, unclear examples) and correcting them?
-------
2-11
-- Not using the computer.
Different data collection methods may be applied to different review
components and in practice some combination of these techniques is almost
always used. Guidelines for using each of these methods are in Section 2.4
Phase 2: Collecting Information.
2.3.2.3 Definition and Scheduling of Tasks
The review tasks should be documented in the plan in the
form of a time line chart (a "Gantt chart") like that shown in Figure 2-4, and
also described in textual form. Tasks should be defined in enough detail to
permit tracking the status of the review (e.g., Are interviews complete? Has
the briefing package been completed?) but still leave the reviewer discretion
in the day-to-day performance of the review. Milestones should generally be
planned at least two weeks apart for a particular task. (Of course, planned
milestone dates for different tasks could occur on the same day.)
In scheduling tasks, allow for unknown contingencies, administrative and
supervisory duties, and time not available due to holidays and leave. Allow
for leave plans of key individuals who will need to be contacted during the
review, as well as for leave plans of the review team. The staff available,
the staff days needed to accomplish a task, and the scheduled dates for
completion must all be consistent.
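
A small, automatable consistency check can catch obvious scheduling problems
early. The sketch below uses hypothetical task data and a deliberately rough
working-days rule; it only illustrates the two checks described above
(milestones on one task at least two weeks apart, and staff-days fitting the
calendar window), not an Agency scheduling tool.

    # Hypothetical task data: check that milestones within a task are at least
    # two weeks apart and that the assigned staff-days fit the calendar window.
    from datetime import date, timedelta

    tasks = {
        "Collect Information": {
            "start": date(1984, 3, 1),
            "end": date(1984, 5, 15),
            "staff_assigned": 2,
            "staff_days_needed": 60,
            "milestones": [date(1984, 3, 30), date(1984, 4, 20), date(1984, 5, 15)],
        },
    }

    for name, task in tasks.items():
        # Milestones on the same task should be at least two weeks apart.
        for earlier, later in zip(task["milestones"], task["milestones"][1:]):
            if later - earlier < timedelta(weeks=2):
                print(name, ": milestones", earlier, "and", later, "are too close together")
        # Rough capacity check: five working days per seven calendar days;
        # holidays and leave are ignored in this sketch.
        working_days = (task["end"] - task["start"]).days * 5 / 7
        if working_days * task["staff_assigned"] < task["staff_days_needed"]:
            print(name, ": staff-days needed do not fit the scheduled window")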
2.3.3 Planning Reports
The following items regarding reports should be decided in
the planning stage:
Report(s) to be produced
Intended recipient(s) of reports
Principal authors of reports
Schedule for drafts and final reports
Most reviews should result in only a single, final report. Although some
-------
FIGURE 2-4
Gantt Chart
GICS EVALUATION SCHEDULING AND STAFFING PLAN

TASK AREA
1.1 Prepare Work Plan
1.2 Collect Information to Support the Audit
    o Prepare Data Collection Plan
    o Identify Interviewees
    o Schedule and Conduct Interviews
    o Gather Technical and Cost Data
    o Review Software and Documentation
1.3 Synthesize Data and Identify and Rank Problem Areas
    o Document All Findings
    o Prepare a Ranked List of Problems
    o EPA Review and Revisions
1.4 Identify and Analyze Alternative Solutions
    o Identify Alternatives
    o Define Evaluation Methodology
    o Conduct Analysis
    o Brief EPA on Results
1.5 Formulate Recommendations and Prepare Final Report
    o Prepare Draft Report
    o Brief EPA
    o Prepare Final Report

(The chart plots each task area against the project schedule in weeks and
breaks out project manning in mandays for the project supervisor, project
manager, consultant, and analyst positions, with totals by task and by
position.)
-------
2-13
reviews have been planned to produce an interim report, this is usually not a
good idea. "Interim" conclusions and recommendations are actually prelimi-
nary, and should usually be conveyed to the recipient verbally for the purpose
of receiving feedback and soliciting support. Preliminary conclusions that
are based on incorrect facts or that antagonize those being reviewed can only
have negative results.
The plan should state who will be principally responsible for the report
and the time allotted for producing the report. The schedule should allow for
the following steps.
Initial drafting and typing of report
Review and revision of draft
Submission of draft for comment to the organization, being
reviewed
Review of draft based on comments
Final typing and production
One useful technique is to create a tentative outline, estimate the number
of pages likely to be included in each chapter or section, and then estimate
the number of pages per day likely to be produced (a worked example appears
after the list below). Keep in mind that most of the text is to be original
writing and that the substance of recommendations and their phrasing is likely
to be discussed and revised during the original writing. The planning documents
for Agency reviews have included estimates of from 5 to 9 weeks for report
writing, from the time the team finished the bulk
of analysis until the final report was produced. The estimate for a
particular review should take into account the following aspects of the
report:
Complexity and scope of the review
Expected number of recommendations
Experience of the principal author(s) in writing reports
Estimates were: GICS -- 9 weeks, STORET -- 5 weeks, Hazardous Waste -- 9 weeks;
NCC -- 5 weeks.
-------
2-14
Production support (will contractor or EPA be producing report?)
Number of individuals outside the review team who will review and
comment on draft.
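
As a worked illustration of the page-count technique described before this
list, with a hypothetical chapter outline and an assumed productivity figure:

    # Hypothetical figures: estimate report-writing time from a tentative outline.
    outline_pages = {
        "Introduction": 4,
        "Findings": 18,
        "Recommendations": 10,
        "Appendices": 8,
    }
    pages_per_day = 2.5          # assumed productivity for original writing

    total_pages = sum(outline_pages.values())      # 40 pages
    writing_days = total_pages / pages_per_day     # 16 working days
    print(total_pages, "pages at", pages_per_day, "pages/day:",
          round(writing_days), "working days, about",
          round(writing_days / 5), "weeks of drafting before review and production")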
2.4 Phase 2: Collecting Information
The information collection phase follows approval of the plan for the
review. This section describes the procedures for initiating contact with a
reviewed organization, guidelines for review workpapers, and guidelines for
the various methods of collecting information for a review.
2.4.1 Protocol For Initiation of the Review
The principal managers of the system or organization to be
reviewed will usually be informed of the review when the decision to do the
review is first made, and they may even participate to a limited degree in the
task planning for the review. Once the plan is approved, however, it is
important to inform the relevant managers in writing of the review's scope and
schedule.
An entry interview is the first formal meeting with the staff of the
organization being reviewed. All of the review team should be at the entry
interview, and the management and supervisors of the reviewed activity should
be there. In the meeting, the lead reviewer should explain what will be
expected of the staff, how long the review will take, how the report will be
handled, and why their system or organization was selected for a review. If
specific documentation was not requested in advance, it should be requested at
the entry interview.
2.4.2 Workpapers
Workpapers are required to support the findings and conclusions in
the report. Workpapers should include the following:
Notes on interviews and observations
Questionnaire and survey returns, including analysis
-------
2-15
All documentation collected in the process of the review
Programs and other computer output (runstreams, reports,
file lists, etc.)
Internal review team memoranda and documents.
Collectively, the workpapers provide the evidence to support the conclu-
sions of the review. The criteria for evidence are that it must be sufficient
(is there enough?), competent (is it reliable and will it support the
findings?), and relevant (is it applicable?).
Good workpapers are not just a collection of documents. Workpapers
should be arranged and filed to present a coherent package suitable for
external review. Workpapers should be neat, legible, complete, and accurate.
The following guidelines apply to the creation of workpapers:
(1) Organize and separate workpapers by review and component. Label and
date every workpaper to identify the review it relates to.
(2) Use a consistent format for interview notes (date, interviewees, in-
terviewers, location, purpose of interview). Use brief, simple
statements ("bullets") where appropriate. Use "side captions"
liberally.
(3) Clearly identify the interviewer's comments within writeups to
distinguish them from interviewees' remarks.
(4) Keep workpaper bundles to a manageable size, generally no more than
1.5 to 2 inches thick.
(5) Make large, bulky documents "standalone" appendices to the
workpapers.
(6) Allow time in the schedule for the preparation of workpapers.
-------
2-16
Review the workpapers periodically during the review for conformance with
these guidelines.
2.4.3 Guidelines For Interviewing
Every review is likely to include some interviews. The following
guidelines are designed to make the interview process more productive:
(1) Consider group interviews. Group interactions sometimes bring out
points that one-on-one interviews fail to elicit. A group interview
also saves time. On the other hand, try to avoid scheduling a group
that would be dominated by one individual.
(2) Know the interviewee(s). Learn their position in the organization,
responsibilities, and possible biases before the interview.
(3) Tell the interviewee the purpose of the interview so that the
context is understood.
(4) Inform the interviewee if the intent is to keep the content of dis-
cussions confidential.
(5) Prepare for interviews with written questions. Verify that the
questions adequately relate to the review criteria.
(6) Repeat answers to important questions to verify what the interviewee
has said.
(7) Record key facts immediately. Don't depend on your memory.
(8) Conduct enough interviews, but not too many. Since interviews are
very time consuming, schedule only as many interviews as needed to
adequately address the review component.
(9) Try to independently validate statements made in interviews. For
example, ask several people the same question at different
-------
2-17
interviews, or corroborate statements with documentary evidence.
Don't accept as fact an unverified statement made in an interview.
2.4.4 Guidelines For Checklists and Questionnaires
Checklists and questionnaires are useful tools for collecting
information. The topics covered must be tailored to provide information that
is responsive to the components, measures, and criteria developed in the
planning phase. Questionnaires and checklists may be developed totally from
scratch, or they may be adapted from existing samples. In most cases a
"model" will exist but it should be customized for the particular review
being conducted. One of the best sources for prototype checklists and
questionnaires is the GAO guide "Evaluating Internal Controls in
Computer-Based Systems". That guide contains questionnaires that cover the
following major areas of review relevant to Agency ADP reviews:
(1) Background information on ADP department
(2) System design, development, and modification controls
- System development life cycle
- Documentation
- Program testing and system acceptance
- Program changes
(3) Data center management controls
(4) Data center protection controls
(5) System software controls
(6) Hardware controls
(7) Data origination controls
(8) Data input controls
(9) Data processing controls
(10) Data output controls
The following guidelines apply to the development and analysis of new or
modified questionnaires for Agency ADP reviews.
(1) Derive the questionnaire directly from the components and measures/
indicators. Ask "When this question has been answered, what will it
-------
2-18
tell me about the review components?"
(2) Plan the analysis of the questionnaire before distributing it. This
will frequently reveal problems with the way that questions are
posed. The analysis plan must precede, and then be part of, the
design of the questions.
(3) Define the target group precisely. If the target group contains
"subpopulations," include questions to identify them.
(4) Develop a brief set of instructions for the recipient.
(5) Pay attention to the physical layout and codes used for responses to
facilitate data entry and analysis.
(6) Include a control number on each questionnaire distributed, even if
the form also contains the name of the respondent.
(7) Test the questionnaire on a nonparticipant for clarity and ease in
understanding. For a large survey, consider a formal "pretest" to
detect problems in the survey form or questions.
(8) Use numeric codes for responses. Use a consistent code (e.g., "9")
for "Don't know" or "Not applicable" responses. That will simplify
computerized analysis of questionnaires. Avoid open-ended responses
like "Other (Specify)". Avoid zero as a valid code.
(9) As the first step in analysis, prepare a frequency distribution of
each question and examine the distribution for reasonableness, then
check for consistency among responses (for example a person who re-
ports "Have never used reports" should not also indicate that
reports are "very useful in my daily work").
2.4.5 Guidelines For Reviewing Documentation
The term documentation refers to any documentary evidence
-------
2-19
received in the course of a review, not just technical documentation of
computer systems. Documentation may be collected to verify other evidence
(for example, statements made in an interview), or it may be a principal source
of information. Index all documentation collected in the course of a review,
and include the index in the work papers. Determine whether each document
collected is to be returned or can be kept in the permanent work papers.
2.4.6 Guidelines For Direct Observation
Direct observation is one of the most valuable methods of
collecting information because the reviewer sees at first-hand the activity
being reviewed. Observation is particularly useful when an activity is
complex or when there may be reason to question testimonial evidence. For
example, the easiest way for the reviewer to determine that all access doors
to the computer room are actually locked is to test them. Similarly, the best
way to cross-validate typical online response time may be to observe users
doing actual production work.
The principal risk of direct observation is that the observed activity
may not be functioning typically at the moment it is observed -- whether or
not that is intentional. If the reviewer's intention is known in advance, the
doors may be locked, or online response time may be improved temporarily. In
general, if a reviewer observes a deficiency, then the deficiency probably
exists. If an alleged deficiency is not observed, then the reviewer should
probe further to reconcile the discrepancy.
2.5 Phase 3: Synthesizing Information And Ranking Problem Areas
In this phase of a review, the evidence collected is analyzed and
correlated with the objectives and criteria established earlier. Typically,
certain objectives will be fully met (the activity will get a "clean bill of
health" in those areas) and other objectives will either not be met or
additional data will be needed. Synthesizing information requires the
reviewer to reach conclusions based on the evidence, and the available
evidence is often circumstantial.
-------
2-20
The problem areas identified in this phase should be ranked according to
the vulnerability associated with each problem area. That vulnerability can
be assessed in terms of the size of the potential loss, the nature of the
loss, and the probability of a loss occurring. For example, a situation that
exposes the Agency to significant immediate risk of a major loss would have
top priority. The "loss" involved could include the inadvertent disclosure of
confidential data, for example, or the inability to provide information when
needed. The inefficient use of resources is a "loss" for purposes of ranking
problems, because unnecessary costs are being incurred by the Agency.
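
One simple way to make such a ranking explicit is to score each problem area
by its approximate expected loss. The sketch below uses hypothetical problem
areas and purely illustrative loss and probability figures; the nature of the
loss still has to be weighed by the review team.

    # Hypothetical problem areas with illustrative loss and probability figures.
    problem_areas = [
        {"problem": "No tested file backup procedures",  "loss": 500_000, "probability": 0.10},
        {"problem": "Computer room doors left unlocked", "loss": 200_000, "probability": 0.30},
        {"problem": "Inefficient report production",     "loss": 25_000,  "probability": 0.90},
    ]

    # Rank by approximate expected loss (magnitude of loss times probability).
    for area in problem_areas:
        area["expected_loss"] = area["loss"] * area["probability"]

    for area in sorted(problem_areas, key=lambda a: a["expected_loss"], reverse=True):
        print(area["problem"], "-- expected loss $", round(area["expected_loss"]))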
Synthesizing information and ranking problem areas are not discrete
tasks. They should be done several times when enough data is available to
make the results useful. At a minimum, evidence should be reviewed about
halfway through the data collection phase, and at the end before final
recommendations and report writing.
2.6 Phase 4: Formulating Recommendations
Recommendations are the specific actions identified by the review team to
correct or improve conditions revealed by the review. In the long run, the
recommendations are the most important part of a review because they determine
the effect that the review will have on the reviewed system or organization.
The following procedures apply to the formulation and review of recom-
mendations:
(1) When possible, try to develop recommendations interactively with the
interviewees.
(2) Review the preliminary findings and recommendations with those being
reviewed. The initial review should include only a verbal summary,
or at most a "briefing-type" paper.
(3) Correct any factual errors and revise the tentative recommendations
accordingly.
(4) Incorporate the revised recommendations in the draft report.
-------
2-21
In formulating recommendations, keep in mind that all recommendations must
follow from undisputed evidence in the workpapers. The following guidelines
apply to recommendations in a report:
(1) Recommendations should be specific and clearly stated.
(2) Recommendations must be objective. They must be based on facts that
the reviewee does not dispute, and they should avoid editorializing,
moralizing, or preaching.
(3) Recommendations must be realistic. For example, they must be
consistent with agency policy and reasonable from a budget point of
view.
(4) Recommendations should identify the party responsible for implement-
ing the recommendation. For example, the recommendation should be
phrased "The Support Service Branch should alter the weekly reports
to include month-to-date usage," instead of "The weekly reports
should be changed to include month-to-date usage."
(5) Avoid recommendations for further study or periodic reports.
(6) If alternative courses of action (recommendations) are acceptable,
they should be described, with their advantages and disadvantages.
(7) Recommendations with significant costs should have their approximate
resources described. All recommendations should have their benefits
clearly spelled out.
(8) Avoid "hang-on" recommendations not logically related to other
recommendations or not supported by the evidence.
(9) Avoid trivial recommendations. They give the impression of "nit-
picking" and distract from the more substantive items.
-------
2-22
2.7 Phase 5: Preparing the Final Report
The final report is the end-product of most reviews and the aspect of a
review that most people will remember and reference. It is important, there-
fore, that the report fairly, accurately, and completely reflect the findings
and recommendations of the review team. The following procedures apply to
reports prepared in OIRM:
(1) Initial findings and recommendations will be presented in the
form of a briefing to the Director, Office of Information
Resources Management.
(2) The draft report will be sent to the manager of the
organization under review for comment and correction.
(3) The report will be prepared in final form and sent to the
Director, OIRM.
(4) The Director, OIRM may then formally transmit the review report
to the Assistant Administrator for Administration and Resources
Management.
(5) OIRM will share with the OIG all materials collected and
reports written.
When a program office performs a review of its ADP systems, it should
follow the procedures described in Chapter 2, Section 11 of the ADP Manual.
(Appendix C of this Guide). It should submit the final report to OIRM for
review of its technical content. OIRM will return the report with its
comments. Program managers are also encouraged to consult with the OIG
concerning the contents of their ADP reviews.
2.7.1 Guidelines on Content of Review Reports
In addition to the guidelines for recommendations in Section 2.6,
-------
3-1
3. APPLICATION SYSTEM REVIEWS
3.1 Definition and Scope
An application system is a coherent set of automated procedures, computer
programs, and data files that serve a defined function. For review purposes,
an application system also includes the Agency program policies and goals that
justify the system, its users (broadly interpreted), the manual procedures
that precede and follow computer processing, and the input data and output
products of the system. An application system review investigates whether a
single system or a group of related systems is performing its intended
function in a cost-effective manner, and whether enhancements or changes in
operating procedures are warranted.
The components, or specific areas of the ADP environment under review,
may be at a general level, addressing the overall costs and benefits of the
system; they may be at an intermediate level addressing the effectiveness of
the system in meeting users' requirements; or they may be at a detailed
technical level addressing the data processing design and controls to ensure
proper processing. These three categories of reviews are called mission
support reviews, management reviews, and technical reviews. There is also a
core of background information essential to every application system review,
no matter what its purpose. The relation between these three categories of
application reviews is shown in Figure 3-1. The components covered in a
management review frequently overlap the technical components and the mission
support components. However, technical reviews and mission support reviews
are so different that their only overlap is in the core background information
related to the systems under review.
3.2 Components of a Review of Operational Application Systems
This section identifies potential components for reviews of operational
application systems. Post-implementation reviews are considered in Section
3.3. The information listed below under system background data should be doc-
umented for every application system review regardless of the review's
components. Components typical of technical reviews, management reviews and
-------
3-2
FIGURE 3-1
Overlap in Components of an ADP Review
Technical, Management, and Mission Support Reviews

TECHNICAL REVIEW
  SOURCE DOCUMENT CONTROL PROCEDURES
  INPUT CONTROL PROCEDURES
  PROCESSING CONTROL PROCEDURES
  OUTPUT CONTROL PROCEDURES
  DATA BASE INTEGRITY PROCEDURES
  FILE AND DATA BASE STRUCTURES
  SOFTWARE QUALITY
  TECHNICAL DOCUMENTATION
  CONTINGENCY PLANS

MANAGEMENT REVIEW
SYSTEM BACKGROUND DATA
  USER SUPPORT
  USER UNDERSTANDING
  DATA ACCURACY
  OPERATIONAL INTEGRITY
  RESOURCES
  MAINTENANCE
  COMPLIANCE WITH REGS
  AUDITABILITY

MISSION SUPPORT REVIEW
  PROGRAM GOALS
  PROGRAM ORGANIZATION AND RESOURCES
  INFORMATION SYSTEM SUPPORT FUNCTIONS
  INFORMATION SYSTEM RESOURCES
  EVALUATION OF INFORMATION SYSTEMS SUPPORT
-------
3-3
mission support reviews are listed in Sections 3.2.2 through 3.2.4.
3.2.1 System Background Data
The core system background data for every applications system
review includes the following:
(1) Owner of system
Identify the program, office, and individual(s) who are
principally responsible for the existence of the system.
For systems operated and maintained by the Agency's ADP
organization, this owner will generally be the office or
division that provides the resources to support the
system.
(2) Users of system
Identify the principal offices and individuals who use
the products of the system, and the nature and frequency
of their use. Determine to what degree they depend on
the system (its criticality to them).
(3) Operator of system
Identify the office and individual(s) who have principal
day-to-day responsibility for operating the system --
submitting runs, performing program maintenance, dealing
with irate users. For some systems, "operations" may be
performed by one or more program offices, and "mainte-
nance" may be performed by the Agency's ADP organiza-
tion.
(4) Functions and products of the system
Identify the principal products of the system and what
-------
3-4
they are used for. Identify other functions (e.g.,
online query, archive) that the system serves.
(5) Data processed and stored in system
Identify the types of data processed by the system
(record types and principal data elements). What real
life entities are represented in the data base, and are
they represented at a "transaction" or "summary" level?
(6) Processing environment
Identify the hardware, software, and telecommunications
used by the system. For a system that operates at NCC,
it is only necessary to state whether the system runs on
IBM, UNIVAC, or other processors, since the NCC hardware
environment is well documented. Specify software
generically -- COBOL programs, COBOL with ADABAS
interface, RPG, etc.
(7) Processing cycles
Describe the major updating and reporting cycles.
Identify on-demand update and reporting capabilities.
3.2.2 Technical Review Components
(1) Source document control procedures
Identify control procedures over manual preparation,
collection and processing of source documents to make
sure no data are lost, added, or altered prior to
conversion of data to machine readable form. Potential
indicators for this component include:
a. Is there adequate separation of preparation,
-------
3-5
approval, and data entry duties (this may not
apply to certain systems like project tracking
systems)?
b. Does the system use customized data entry forms
(numbered, if appropriate) designed to minimize
errors?
c. Are there authorizing signatures on all source
documents?
d. Is there a data control group to monitor input
of transactions?
e. Are there documented procedures for preparation
and submission of forms?
f. Are there effective procedures for treatment of
error transactions (logs, followup, etc.)?
g. Is there adequate feedback to the source of
data to report successful entry of
transactions, or to correct error transactions?
(2) Input Control Procedures
Identify control procedures over data entry to assure
that data are input accurately and with an optimum use
of computerized validation and editing. Potential
indicators for this component include:
a. Is there an effective data processing control
group to monitor batch numbers, transaction
counts, etc?
b. Does the system have controls to avoid
-------
3-6
duplicate processing?
c. Are edits documented in a user manual or data
dictionary? Are error messages clear and well
documented?
d. For online systems, are there adequate access
controls and update acknowledgements? Does the
system use failsafe update procedures, rigorous
edits, and formatted data entry?
e. Do the system's data entry procedures include
check digits or other automatic validation
techniques? (A check-digit sketch follows this
list of indicators.)
f. Are there controls to prevent bypassing of data
edits?
g. Are procedures documented for handling rejected
input data?
h. Is there an effective error suspense file? Are
transactions in suspense monitored?
i. Is there effective monitoring of error rates
and accumulated error volumes?
j. Are transactions edited as much as possible --
not just rejected after the first detected
error?
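
As an illustration of indicator e above, the sketch below shows one common
automatic validation technique, a modulus-10 (Luhn) check digit. The
identifier shown is a standard test value, not an EPA data element, and real
systems may use other check-digit schemes.

    # Modulus-10 (Luhn) check digit validation for a numeric identifier.
    # The identifier format is hypothetical; other check-digit schemes exist.
    def check_digit_valid(identifier: str) -> bool:
        digits = [int(c) for c in identifier if c.isdigit()]
        total = 0
        # Double every second digit from the right; subtract 9 if the result exceeds 9.
        for position, digit in enumerate(reversed(digits)):
            if position % 2 == 1:
                digit *= 2
                if digit > 9:
                    digit -= 9
            total += digit
        return total % 10 == 0

    print(check_digit_valid("79927398713"))   # True: a standard valid Luhn test number
    print(check_digit_valid("79927398714"))   # False: last digit altered, so the edit rejects it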
(3) Processing control procedures
Identify control procedures to assure that data are
accurately processed through the application and that no
data are added, lost, or altered during program
-------
3-7
execution. Potential indicators for this component
include:
a. Are there up to date operator instructions (for
online users or batch system operator, as
appropriate)?
b. Do run-to-run controls include tape label
checks, cycle number checks, control total
checks, etc?
c. Does the system check that input record counts
equal output record counts?
d. Does the system perform data edit and
validation early in the processing cycle to
prevent file update based on erroneous data?
e. Does the system create adequate review trails
of file updates -- especially for direct update
by online users? (Ideally, the review trail
should include a before and after image of
updated data, with a date/time and a user-ID
stamp; a sample record layout follows this list
of indicators.)
f. Are there adequate controls for coordination of
distributed data bases?
g. Are there adequate procedures for handling
rerun jobs, including notification of users?
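
The sketch below illustrates the kind of review-trail record described in
indicator e above. The field names and example values are assumptions for
illustration, not a prescribed record layout.

    # Illustrative review-trail record for a file update: before and after images
    # of the changed data plus a date/time and user-ID stamp. Field names are assumed.
    from datetime import datetime, timezone

    def trail_record(record_key, field, before, after, user_id):
        return {
            "record_key": record_key,
            "field": field,
            "before_image": before,
            "after_image": after,
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "user_id": user_id,
        }

    # Example: an online user corrects a permit expiration date.
    print(trail_record("PERMIT-00123", "expiration_date", "1984-06-30", "1985-06-30", "JSMITH"))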
(4) Output Control Procedures
Assess control procedures designed to assure that data
processing results are reliable and that reports are
distributed to users in a timely manner. Potential
indicators for this component include:
a. Are all system outputs identified by
application, title of report, date and time
prepared, effective date or period covered, and
user name or distribution code?
b. Are there written instructions to explain to
users the procedures to reconcile and balance
reports and totals?
c. Are reports reviewed for quality assurance
before their release or distribution to final
users?
d. Is access to output devices secure, and are
finished reports secured to prevent
unauthorized access to sensitive or
confidential data?
e. Are procedures to explain proper handling and
distribution of outputs documented and reviewed
with operations staff?
(5) Data Base Integrity Procedures
Assess the system's data base design and software
controls to assure that the files and data base used by
the application are protected from inadvertent damage,
and that there are provisions to restore the files in
case of data loss. Potential indicators for this
component include:
a. Are restart/recovery procedures defined? Are
they tested periodically?
b. Are there adequate periodic file backup
procedures?
c. Are DBMS access controls and data protection
mechanisms used effectively?
d. Is the data base effectively isolated from
arbitrary update from any source?
e. Is a periodic "data base integrity" scan made
of the entire master file to check for valid
codes and internal consistency of data values
and record structure?
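Indicator e above calls for a periodic scan of the entire master file. The
Python sketch below illustrates such a scan; the code table and the
consistency rule shown are hypothetical examples, not actual EPA edits.

    VALID_STATUS_CODES = {"A", "I", "P"}   # hypothetical code table

    def scan_master_file(records):
        """Scan every master record for invalid codes and internal inconsistency.

        Returns a list of (record id, problem) pairs for follow-up."""
        problems = []
        for rec in records:
            if rec.get("status") not in VALID_STATUS_CODES:
                problems.append((rec.get("id"), "invalid status code"))
            # Hypothetical rule: a closed date requires inactive status "I".
            if rec.get("closed_date") and rec.get("status") != "I":
                problems.append((rec.get("id"), "closed_date present but status not 'I'"))
        return problems

    sample = [
        {"id": 1, "status": "A", "closed_date": None},
        {"id": 2, "status": "X", "closed_date": None},          # invalid code
        {"id": 3, "status": "A", "closed_date": "1984-06-30"},  # inconsistent
    ]
    print(scan_master_file(sample))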
(6) File and Data Base Structures
Review the file and data base design to ensure that
economic and effective use is made of bulk data storage
appropriate to the processing requirements of the
application. Potential indicators for this component
include:
a. Are logical data structures designed to
facilitate ease of programming and adequate
representation of the system's data?
b. Are physical data structures designed for
efficient use of storage and for efficient
program operation?
c. Are data structures and data elements
adequately documented for users? For program
maintenance?
d. Are there adequate procedures to modify data
base schema (or the equivalent for non-DBMS
systems)?
(7) Software Quality
Review the module structure, job control procedures, and
program code to assure that application software meets
operational requirements and technical standards.
Potential indicators for this component include:
a. Is application software performing according to
specifications?
b. Are programs, data bases, and procedures
designed to be modified with little difficulty?
c. Are job execution parameters specified for
optimum performance? Are there adequate
run-to-run controls? Is operator documentation
adequate?
d. Is software designed to be modular? Is the
choice of languages appropriate?
e. Is program code readable? For example, are
there adequate comments? Is the format of
program statements consistent? Are data and
program names used consistently? Is the code
efficient? Is the code well structured?
f. Are testing procedures adequate? For example,
is there a test data base? Are there test
libraries and are they used in a structured
way? Is there adequate control over test and
production versions? Are test cases well
documented?
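Readability and commenting (indicator e above) are judgment calls, but a
reviewer can compute rough indicators to decide where to look first. The
Python sketch below counts non-blank lines and comment density per source
member; the directory, file pattern, and comment marker are assumptions
supplied by the reviewer, and the threshold shown is illustrative only.

    from pathlib import Path

    def code_readability_stats(source_dir, pattern="*.cbl", comment_marker="*"):
        """Rough indicators per source member: non-blank lines and comment density.

        The comment marker is supplied by the reviewer; this sketch only looks
        at the first non-blank character of each line."""
        stats = []
        for member in Path(source_dir).glob(pattern):
            lines = [ln for ln in member.read_text(errors="replace").splitlines() if ln.strip()]
            comments = [ln for ln in lines if ln.strip().startswith(comment_marker)]
            density = len(comments) / len(lines) if lines else 0.0
            stats.append((member.name, len(lines), density))
        return stats

    # Example: flag members with very low comment density for closer reading.
    for name, size, density in code_readability_stats("src"):
        if density < 0.05:
            print(f"{name}: {size} lines, comment density {density:.0%}")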
(8) Technical Documentation
Review all application-related documentation for
appropriate contents and style including accuracy,
currency, availability, and readability. Potential
indicators for this component include:
a. Are there alternate documents (User Guide, Program
Maintenance Manual, etc.) for different audiences?
b. Do all documents conform with FIPS and EPA
documentation standards?
c. Is responsibility for creation, maintenance, and
distribution of documentation clearly assigned?
d. Is documentation available to those who need it? Do
users have the most current version of documents?
(9) Contingency Plans
Assess the ability of the system to sustain a brief or
prolonged failure of computing resources. Potential
indicators for this component include:
a. Is an alternate computing facility available in
case of major service disruption?
b. What would be the effect of a service dis-
ruption on time-critical program operations?
c. Are there adequate plans for an all-manual
back-up operation if that is required?
d. Who has responsibility for initiating
alternative actions? Is that person aware of
this responsibility?
3.2.3 Management Review Components
(1) User Support
Assess how well the system provides its community of
users the information they need. Potential indicators
for this component include:
a. Are reports and the data in them timely?
b. Are reports actually read and used? Are
duplicative manual records kept?
c. Are users trained adequately to understand
procedures and reports?
d. Are there established mechanisms for suggesting
enhancements?
e. Is report distribution appropriate?
f. Are there potential users (and uses) of the
system not being exploited? Are "ultimate" as
well as "direct" users known? Are their needs
provided for?
(2) User Understanding
Review how well the users and suppliers of data
understand the processing that takes place and the
meaning of information reported to them by the system.
Potential indicators for this component include:
a. Do originators of data understand the uses to
which information is put? Do they appreciate
the consequences of not reporting complete and
accurate data?
b. Do users of reports understand the meaning of
data on the reports (both detail and summary)?
c. Do users have access to explanations of how
data on reports is generated (timeliness of
data; filters applied; algorithms for derived
variables; etc.)?
d. Do users have access to and use appropriate
documentation on the system?
e. Do users understand all the capabilities of the
system?
(3) Data Accuracy
Assess the overall reliability of data in the system's
data base. The components of this objective are broader
than the source document control procedures and input
control procedures listed under technical reviews. The
assessment of reliability should examine actual data
error rates and relate them to all potential sources of
data errors. Potential indicators for this component
include:
a. What are the particular sources of possible
error in the system's data?
b. What error rates are perceived by users, and do
they affect the usefulness of the data base?
c. What is the correlation between the perceived
error rate and actual error rate in data base?
d. What factors may prevent transactions from
being initiated when appropriate?
e. What facilities does the system have for
detecting and correcting errors, and resolving
data-related issues?
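Indicators b and c above compare perceived and actual error rates. One way to
estimate the actual rate is to verify a random sample of records against
source documents; the Python sketch below is illustrative, and the sample size
and the normal-approximation confidence interval are assumptions to adjust.

    import math
    import random

    def estimate_error_rate(record_ids, verify, sample_size=200, seed=1984):
        """Estimate the data base error rate from a random sample of records.

        `verify` is a caller-supplied function that returns True when a stored
        record matches its source document. Returns the point estimate and an
        approximate 95 percent confidence interval."""
        rng = random.Random(seed)
        sample = rng.sample(record_ids, min(sample_size, len(record_ids)))
        errors = sum(1 for rid in sample if not verify(rid))
        p = errors / len(sample)
        half_width = 1.96 * math.sqrt(p * (1 - p) / len(sample))
        return p, (max(0.0, p - half_width), min(1.0, p + half_width))

    # Example with a stand-in verification function (true error rate of 2 percent).
    ids = list(range(10_000))
    rate, interval = estimate_error_rate(ids, verify=lambda rid: rid % 50 != 0)
    print(f"estimated error rate {rate:.1%}, approximate 95% interval {interval}")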
(4) Operational Integrity
Assess the reliability of the system's operation as
perceived by the system's users. Potential indicators
for this component include:
a. How much confidence do users have in the
reliability of system processing (as distinct
from errors in the data base)?
b. Are there known problems in the system? How are
problems communicated to other users? What
methods are used for dealing with problems?
c. How well does the system deal with programmatic
requirements (e.g., are special cases handled
correctly? Are program-specific calculations
performed correctly?)
(5) Resources
Review the resources to operate the system and the
allocation of resources to users and their organi-
zations. Overall costs should be reasonable and costs
should be borne by the "owners" and "users" of the
system. Potential indicators for this component
include:
a. Are costs accounted for by the "owner" of the
system?
b. What costs are recorded? How are they
reflected in the owner's formal budget?
c. How are different users' costs determined? Do
users account for and budget costs?
d. How are "direct" data processing costs
determined? Is the cost allocation reasonable?
e. Are there alternative computer processing
arrangements that would give lower costs -- to
the program? To the users? To EPA? To the
Government?
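To make indicator c concrete, the Python sketch below allocates a system's
recorded monthly cost across user organizations in proportion to metered
usage. The usage measure, the dollar figure, and the organization names are
hypothetical.

    def allocate_costs(total_cost, usage_by_org):
        """Allocate a system's total monthly cost in proportion to metered usage.

        usage_by_org maps an organization to its usage (for example CPU-seconds
        or billable transactions); returns each organization's dollar share."""
        total_usage = sum(usage_by_org.values())
        return {org: round(total_cost * usage / total_usage, 2)
                for org, usage in usage_by_org.items()}

    # Hypothetical month: $12,400 of recorded cost, usage metered in CPU-seconds.
    print(allocate_costs(12_400.00, {"Program Office A": 5_200,
                                     "Regional Office B": 2_600,
                                     "Laboratory C": 1_200}))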
(6) Maintenance
Assess procedures for maintaining the applications
software and responding to new requirements
(enhancements). Potential indicators for this component
include:
a. Who performs program maintenance? Is the
assignment formal?
b. How are maintenance priorities established? Do
users have any input to maintenance priorities?
c. Are adequate resources dedicated to
maintenance? How are maintenance costs
allocated (labor and computer resources)?
d. Are there formal procedures for testing,
approval, and turnover of new releases?
3.2.4 Mission Support Review Components
(1) Program Goals
Identify the Agency-level goals of the program(s)
supported by the system(s) under review. Potential
indicators for this component include:
a. What are the legislative goals of the program?
b. What are the agency short-term and long-term
goals?
c. What information is needed to measure success
in reaching those goals?
d. What is the system's relation to other Agency
programs and goals?
e. Who has responsibility for achieving those
goals?
(2) Program Organization and Resources
Determine the resources and the organizations within the
Agency responsible for achieving the program goals.
Potential indicators for this component include:
a. What Agency resources (staff and budget) are
dedicated to the program?
b. What Office(s) are responsible for achieving
the goals? What group(s) are responsible for
information regarding program requirements?
c. What is the relation between Federal, state,
and local governments for establishing
priorities, budget, and operational management
of the program or activity supported by the
system?
(3) Information System Support Functions
Determine how the Agency's approach to meeting program
goals depends on information resources. Potential
indicators for this component include:
a. What is the relation of information and
information systems to meeting program goals?
In particular, how is information used:
-- to support strategic resource allocation?
-- to support short-term decisions?
-- to monitor progress against goals?
-- to provide summary information, or to
report details?
b. Does the system provide support that is more
similar to "file cabinet" functions or to
"decision support" functions?
c. Does the system use centralized input or
decentralized sources of data?
d. Is the system design based on centralized use
(and value) of information systems, or
decentralized use and value?
e. Does the program have an intrinsic dependence
on information systems?
f. Is there an "idealized" information system
model to support programmatic goals and
operational requirements? How closely does the
actual system resemble the ideal system? In
what ways does it differ?
(4) Information System Resources
Identify the adequacy of the resources actually
committed to the program (past, present, and planned).
Potential indicators for this component include:
a. What is the history of information systems
developed in support of the program? Identify
the systems, users, cost, etc.
b. What current system(s) support the program?
Identify the system functions, users, and
costs.
c. What resources have been allocated to planning
information systems? What resources have been
requested?
(5) Evaluation of Information Systems Support
Correlate the actual resources, services provided, and
user satisfaction to the intrinsic requirements of the
program and the ideal information system model.
Potential indicators for this component include:
a. Is there a defined long-term information system
strategy? Is responsibility for such a
strategy defined?
b. Is the strategy appropriate to the scope,
information requirements, and geographic
dispersion of the mission?
c. Are there conflicting goals (e.g., reduce
overhead costs, improve services) that make
planning and development difficult?
d. Are there overlapping data collection re-
quirements or duplicative data bases?
e. Are the information resources planned and
requested consistent with the value of
information to the success of the program?
3.3 Post-Implementation Reviews
3.3.1 Definition of Post-Implementation Reviews
A post-implementation review is the first evaluation of an
operational application system after the system has been installed and reached
a substantial level of stability. Depending on the system, this will
typically be from three months to one year after initial installation. A
post-implementation review includes explicit consideration of the development
cycle, including review of the feasibility study, detailed requirements
analyses, development plan, and actual development experience. The purpose of
a post-implementation review is to determine whether a new system is meeting
the users' expectations and needs that justified its development. The
existence of a post-implementation review program also has some indirect bene-
fits in the development of new systems. The following benefits result from
the anticipation of a post-implementation review by users and system
development staff:
(1) It encourages the system team and users to define goals, func-
tions, scope, benefits, and costs with a good degree of care in
anticipation of the post-installation review.
(2) It encourages ADP staff, users and reviewers to define suitable
measures of performance, early in the development phase,
against which to evaluate system results.
(3) It provides an opportunity to uncover problem areas,
oversights, and unanticipated problems for subsequent
resolution in a manner that leads to action, as opposed to
token resolutions of problems by users or the development team.
It is appropriate to conduct a "post-implementation review" after major
enhancements to an operational system, as well as after initial installation.
Although such a review may not be as extensive as an initial review, it will
have the same benefit of validating a major software expenditure.
Part of a post-implementation review should be a review of the original
statement of requirements to determine whether the requirements are still
valid. Several factors may account for differences between the requirements
analysis and the system actually installed.
(1) Management may not have approved all the requirements
identified in the requirements analysis for incorporation in
the initial automated system. The reviewer should try to
identify requirements finally approved as being within the
scope of the initial system. This may be difficult if this
stage was not well documented.
(2) Approved features may not be installed in the system.
(3) Requirements valid earlier in the system life cycle may no
longer be valid.
In addition, new needs often develop after the original feasibility study
and requirements analysis are completed, and it is important that a system
reflect the users' current requirements.
3.3.2 Post-Implementation Review Components
(1) Effectiveness and Acceptance By Users
Determine whether the system is fulfilling its functions and
whether it is being used in the manner planned. Potential
indicators for this component include:
a. To what degree are users satisfied with the system?
b. How do the current requirements as seen by users
compare to--
o documented requirements analysis?
o system in operation?
Does the system reflect current requirements?
c. Are there possible unexploited uses of the system to
meet needs not recognized earlier by users?
d. How effective is the user team/support team coopera-
tion?
e. Are the human engineering aspects of the system
adequate? E.g., are report layouts and data entry
forms easy to use?
f. What have been the trends in satisfaction and
reliability since initial installation (are major
problems continuing, or becoming less frequent?)
g. Are there any outstanding requests or plans for
system enhancements?
h. How effective are problem reporting and fixing
procedures?
i. What is the actual scope of the system compared to
users' original expectations?
(2) Development and Operating Experience
Determine how the time and cost of the actual development of
the system compared to the plan, and how the operating cost
and operational benefits compare to the original plan.
Potential indicators for this component include:
a. How did actual development costs compare to
projected development costs?
b. How did the actual development schedule compare to
the projected development schedule?
c. How did the actual operating costs compare to
projected operating costs?
d. How do the actual operational characteristics and
benefits compare to those projected?
e. What are the causes and net effects of variances
from plans?
f. What have been the trends in usage and operating
costs since initial installation?
g. What lessons were learned from the development of
the system?
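A simple way to organize indicators a through e is a table of planned versus
actual figures with variances. A minimal Python sketch, using hypothetical
numbers, follows:

    def variance_report(planned, actual):
        """Compare planned and actual figures; compute absolute and percent variances."""
        report = {}
        for item, plan in planned.items():
            act = actual.get(item)
            if act is None or plan == 0:
                report[item] = None   # no actual recorded, or no basis for a percentage
                continue
            report[item] = {"plan": plan, "actual": act,
                            "variance": act - plan,
                            "percent": round(100 * (act - plan) / plan, 1)}
        return report

    # Hypothetical post-implementation figures.
    planned = {"development cost ($K)": 250, "schedule (months)": 12,
               "annual operating cost ($K)": 80}
    actual = {"development cost ($K)": 310, "schedule (months)": 16,
              "annual operating cost ($K)": 95}
    for item, row in variance_report(planned, actual).items():
        print(item, row)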
(3) Compliance with Standards and Controls
Determine whether the system as currently operating is in
compliance with applicable FIPS and Agency standards, and
whether all controls planned for the system are operating
effectively. Potential indicators for this component
include:
a. What were the originally planned system controls?
b. What is the effectiveness of operational controls?
c. Are there system shortcomings and control
weaknesses?
d. Are adequate resources devoted to monitoring data
accuracy?
e. Is program code in compliance with Agency program-
ming standards?
f. Are data definitions, naming, and use in compliance
with Agency data standards?
g. What is the impact of the system on other systems?
4. REVIEWS OF FAMILIES OF SYSTEMS
A review of a family of systems is an evaluation of certain aspects of
the systems that compose a family. A family of systems is a group of ADP
systems that are related in one of three ways:
(1) "Coupled systems" that have a dependent relation. These systems
usually can only function properly when combined with some other
systems in the same family. For example, a data entry system, a
mainline processing and data base system, and a customized reporting
system may be in the same family. Dependent systems of this kind
are usually treated as a single system for review purposes because
it is difficult to evaluate the operation of any one system without
examining the others at the same time.
(2) Systems related according to the broad Agency function that they
support. For example, administrative support systems are related in
this way. Although no two systems may contain the same data, these
systems generally are transaction driven, frequently support finan-
cial activities of the Agency (payments to vendors, employees,
grantees), and have much in common with similar systems in other
agencies. The function of such systems tends to be dictated by
Government-wide regulations.
Another family of this type might be research systems that contain
experimental data. These systems tend to have static or "add only"
data bases and to be used with sophisticated analytic and
statistical processing programs.
(3) Systems related according to the contents of the data bases in the
systems. The systems in this type of family are built around data
bases that relate in some way to a common set of "real world" enti-
ties. For example, the set of all hazardous waste systems could be
treated as a family; the set of chemical information systems could
be treated as a family; or the set of systems to track facility
registration and permits could be treated as a family.
There is no single basis for defining a family of systems for a review, and a
system may be considered to be part of different families depending on the
purpose of the grouping.
The focus of this type of review is the attributes of the systems that make
them a family: for example, the users they serve, the data they contain, and
the Agency functions or programs they support. One of the premises of such a
review is that the systems in a family (being related in some material way)
should be part of an integrated planning process. Because of the relations
among the systems in a family, a large change to one of them, or the creation
of a totally new system, could affect the other systems in the family.
The principal purpose of a review of a family of systems is to determine
whether the systems exist at the proper level of integration. For example, is
there excessive overlap in data collected and stored in different systems?
Are the owners of related systems aware of the data available in other systems
in the same family? Are the agency's overall objectives best served by sys-
tems containing data with similar (but inconsistent) definitions? Are there
steps that could be taken to increase the effectiveness of data systems or to
reduce the cost of data collection or data processing for related systems?
These issues are not generally addressed in reviews (even mission support
reviews) of a single system.
Integration of systems does not necessarily mean the redesign of existing
systems into one all-encompassing "super system." Frequently that would be
technically inefficient, and of no value to users of individual systems.
Integration does mean that systems that store equivalent data should use the
same names for that data; that equivalent data in different systems is kept
synchronized; that equivalent information is entered only once into Agency
information systems; and that related data (even in different systems) can be
identified and combined for purposes of analysis.
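One concrete test of integration is to compare the data element inventories
of the systems in a family. The Python sketch below flags elements stored by
more than one system, even when recorded under different local names; the
inventories and the synonym table shown are hypothetical.

    def find_overlap(inventories, synonyms=None):
        """Report data elements stored by more than one system in a family.

        inventories maps system name -> set of data element names; synonyms
        maps a local name -> the agreed common name, so that equivalent data
        recorded under different names is still matched."""
        synonyms = synonyms or {}
        normalized = {system: {synonyms.get(name, name) for name in elements}
                      for system, elements in inventories.items()}
        overlap = {}
        for system, elements in normalized.items():
            for element in elements:
                overlap.setdefault(element, set()).add(system)
        return {e: sorted(s) for e, s in overlap.items() if len(s) > 1}

    inventories = {
        "System A": {"facility_id", "facility_name", "waste_code"},
        "System B": {"permit_no", "facility_name", "permit_status"},
        "System C": {"fac_id", "facility_name"},
    }
    print(find_overlap(inventories, synonyms={"fac_id": "facility_id"}))
    # facility_name is stored in all three systems; facility_id in A and C.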
Reviews of families of systems will frequently cross organizational and
programmatic boundaries in the Agency. Because of this, the issues raised may
be very sensitive, and the interests of one group will not necessarily be
consistent with the interests of another office or program. In looking at
families of systems, the reviewer must attempt to take an Agency-wide point of
view and to identify the benefits and the costs of the current configuration
of systems that comprise a "family."
The following components that relate to families of systems should be
considered in designing a review of a family of systems. These components are
in addition to the appropriate components listed under 3.2.3 Management Review
Components and 3.2.4 Mission Support Review Components. The information described in
3.2.1 System Background Data should be collected for all systems determined to
be in the family.
(1) Family membership
Determine the precise basis for membership in the family of systems
and confirm the systems to be included in the review. Potential
indicators for this component include:
a. What are the primary attributes that define the family?
What are secondary attributes possessed by all (or most) of
the systems? The "system background data" is useful for
this.
b. Could the group of systems be made more homogeneous by
dropping inconsistent system(s) or adding system(s) that
share common attributes?
(2) Family costs
Determine the resources dedicated to the systems in the family,
including non-Agency costs such as the public data collection
burden. Potential indicators for this component include:
a. What are the direct processing costs of the systems? What
are the data entry, program maintenance, training,
and other related costs of the systems?
b. What are the costs (labor and computer resources) consumed
by transferring data among related systems and validating
data in different systems for consistency? (This occurs
frequently in administrative systems.)
c. What are the costs of data collection associated with all
of the systems? Are external suppliers of data required to
provide the same (or similar) data to different systems or
different offices within the Agency? If so, are all those
systems included in the family being reviewed?
(3) Family Integration and Non-integration
Evaluate the degree to which the Agency might benefit from increased
integration among the systems in a family. Potential indicators for
this component include:
a. Are there systems in the family that appear to be likely
candidates to be merged (similar data, consistent
definitions, close linkages between systems)?
b. What is the effect of consolidating the program maintenance
function for related systems? Are there possible economies
from having one ADP group maintain related systems?
c. What is the rationale for keeping separate systems? For
example, does DBMS technology make it possible to maintain
different logical views of data?
d. How strong are the non-technical arguments for maintaining
separate systems? Are the owners of the systems giving
arguments based on their "proprietary rights" to the data
or arguments based on preferences for systems developed and
maintained locally?
(4) Planning For Families of Systems
Determine the degree to which planning for the systems takes
into account those systems' roles as members of a related
family of systems. Potential indicators for this component
include:
a. Is there a short term plan for integrating systems in the
family? Is there a long term plan? What group has
responsibility for planning at the "family" level?
b. Has funding been secured to implement the plan?
c. What is the role of the ADP organization in planning and
executing changes for families of systems?
d. Are there analyses or functions that could be supported by
more integrated systems that cannot be supported
effectively by separate systems? How are such unmet needs
included in the family-wide planning process?
e. Do individual system managers and planners acknowledge
their roles as participants in a larger (family) planning
process?
5. DATA CENTER REVIEWS
5.1 Definition and Scope
A data center review is an evaluation of a major computer center,
focusing on the management and technical aspects of the center's operation,
excluding the individual applications that are run at the installation. The
Agency's major data centers are the main National Computer Center (NCC)
facility at RTP and the Washington Information Center at Waterside Mall that
serves as an extension of the NCC. The review components listed in this
chapter are meant as a guide for periodic reviews of the NCC. A single review
need not exhaustively evaluate every aspect of data center operation.
However, this chapter provides a "menu" of possible components to be
considered for any one review.
The review team for a data center review should be selected based on its
experience in evaluating data center operations and on its independence and
objectivity.
5.2 Data Center Review Components
The data center review components described below address the management
and technical functions of data center operations. Data centers may have
unique administrative problems. Data center managers must have an
understanding of the technology, provide service at competitive rates, and
deal with technically sophisticated personnel. The management review
components for a data center review are listed in the sections on:
Budgeting and Resource Accounting
Procurement Procedures and Asset Management
Staffing and Personnel
Planning
Overall Management.
The technical aspects of data center management deal directly with the
operational procedures for the day-to-day functioning of the data center.
These objectives are less administrative and are generally the responsibility
of senior team leaders and first line supervisors. The technical components
for data center reviews are described in sections on:
Operations
Performance Management and Capacity Planning
Security
Disaster and Contingency Planning
Telecommunications.
5.2.1 Management Review Components
(1) Budgeting and Resource Accounting
Assess the data center's budgeting procedures for hardware, ser-
vices, and supplies. Assess the procedures used to account for
data center costs and to allocate costs to users of the
facility. Potential indicators for this component include:
a. What items are budgeted, and at what level of detail?
What is the basis for out-year budget projections?
b. Do ADP budget submissions conform to Agency budget
request procedures? Are Agency budget schedules
consistent with schedule requirements for contract
renewals and negotiations?
c. Have budget limitations affected ADP operations or
level of support?
d. Is the formal data center budget consistent with
short-term and long range ADP plans? (See (4)
Planning, below.)
e. Does the data center account for the full cost of
operations? Are costs recorded for --
Separate cost centers?
Hardware and software?
Individual applications?
Are software and hardware costs capitalized?
f. Regarding costs for individual applications --
How are applications identified?
How are non-hardware costs assigned to ap-
plications (overhead costs and application-
specific costs)?
How are DBMS-related and other shared costs
allocated to applications?
How is priority processing accounted for and
assigned?
g. Do rates charged reflect actual costs of components
(CPU time, memory, disk storage, EXCPs, etc.)?
h. How effectively does the data center report costs to
users?
Totals for an application or project?
Individual steps or sessions?
Are reports verifiable against users' run control
sheets?
i. Do users understand, review, and act on cost reports
to control future costs?
To discourage prime-time use?
To discourage high priority runs?
To encourage efficient use of secondary
storage?
To compare actual billed costs to budget?
To initiate discussions on cost or perfor-
mance with the data center?
j. Are charging policies consistent with GAO guidance on
allocating the cost of computers? (See, for example,
Federal Accounting Pamphlet No. 4, Guidelines For
Accounting For Automatic Data Processing Costs, GAO,
1978; OMB Circular A-71, "Responsibilities For the
Administration and Management of Automatic Data
Processing Activities," 1978; and OMB Circular A-121,
"Cost Accounting, Cost Recovery, and Interagency
Sharing of Data Processing Facilities," 1980.)
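Indicator g above asks whether rates reflect the actual costs of components.
The Python sketch below shows how a job's charge might be computed from unit
rates; the rates and resource names are hypothetical and are not actual NCC
rates.

    # Hypothetical unit rates -- not actual NCC rates.
    RATES = {
        "cpu_seconds": 0.18,      # dollars per CPU second
        "excps": 0.0005,          # dollars per EXCP (I/O operation)
        "disk_track_days": 0.02,  # dollars per track-day of disk storage
        "connect_hours": 1.50,    # dollars per terminal connect hour
    }

    def charge_for_job(usage, rates=RATES, priority_multiplier=1.0):
        """Compute a job's charge as usage times unit rate for each component,
        with an optional multiplier applied to the total for priority processing."""
        detail = {resource: round(quantity * rates[resource], 2)
                  for resource, quantity in usage.items()}
        total = round(sum(detail.values()) * priority_multiplier, 2)
        return detail, total

    detail, total = charge_for_job(
        {"cpu_seconds": 420, "excps": 15_000, "disk_track_days": 900, "connect_hours": 2.5},
        priority_multiplier=1.5,  # premium for high-priority processing
    )
    print(detail, total)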
(2) Procurement Procedures and Asset Management
Assure that proper procedures are used to acquire hardware, ser-
vices, and supplies. Review procedures to account for and con-
trol assets assigned to the data center. Potential indicators
for this component include:
a. Are staff familiar with Agency ADP procurement
procedures, and the terms of GSA ADP procurement
regulations in FPMR 101-35 and FPR 1-4.1.1?
b. Are proper procedures followed for all procurements?
Do Agency and contractor staff understand the limits
of their procurement authority?
c. Are deliveries verified against purchase orders for
items delivered, quantity, and price? Do ad-
ministrative procedures assure payment only for
delivered goods and services?
d. Does the center maintain adequate physical inventory
records for all ADP equipment, including terminals and
modems?
e. Is portable equipment (terminals) adequately con-
trolled to identify current users and potential
losses?
f. Are inventory records periodically reconciled to
vendor invoices for monthly leases (including serial
numbers and special features)?
g. Is responsibility clearly assigned for monitoring
assets?
h. Is the facilities management contract providing
adequate control over Government Furnished Equipment
(GFE)?
(3) Staffing and Personnel
Review the appropriateness of staffing and personnel policies at
the data center, including staffing levels, staff selection,
definition of duties, performance evaluation, and training.
Potential indicators for this component include:
a. Evaluate the center's organization and authorized
Agency and contractor staffing levels. Are all
positions filled? What is the effect of unfilled
slots?
b. How are qualified personnel selected to fill va-
cancies? Are education and experience requirements
defined and enforced? Are new contractor personnel
submitted for approval?
c. Is there an approved Position Description for each position?
Are they reviewed and updated periodically? Do staff
have a copy of their Position Descriptions?
d. Is there a career development program? Are career
paths well defined?
e. Is there an adequate training program to avoid
technical obsolescence? Is professional development
encouraged? Are training resources adequate (budget,
time allotted, training materials, access to non-local
training)?
f. Do individuals have an annual training plan?
Is it monitored by management?
g. Are there written procedures for personnel appraisal?
Are they followed?
h. Do employees understand the performance appraisal
system? Are they provided written performance
criteria at the beginning of each evaluation period?
Do they receive written performance appraisals?
(4) Planning
Review the planning process, including planning for equipment,
applications, staffing, software, and physical facilities. Dis-
tinguish between short-term and long-range plans. Potential
indicators for this component include:
a. Is there a formal planning methodology? Are the
policies, procedures, and guidelines sufficiently
clear and easy to follow? Who reviews and approves
plans?
b. Identify the different plans used at the data center.
Are they appropriately detailed? Are they realistic?
Are they reviewed and revised on any regular schedule?
c. Who prepares plans? Do they reflect input from the
appropriate individuals and groups within the Agency?
d. Do plans reflect regulatory constraints (e.g.,
procurement lead times); life cycle projections of
hardware and software; availability of staff;
projected costs; alternatives?
e. Are goals and milestones specific enough to permit
objective evaluation of progress and accomplishments?
f. How is data center planning coordinated with planning
in the central ADP organization and planning for ADP
needs in the program offices?
g. Do plans adequately describe current staffing,
hardware, software, applications inventory, and
resource utilization? Is there a description of
budgets by group or facility; recent performance
compared to objectives; and overall strategy?
(5) Overall Management
Assess the effectiveness of data center management, with parti-
cular reference to level of service, relations with user groups,
and contract management. Potential indicators for this
component include:
a. Has the center developed measures of user satis-
faction? Are they tracked and compared to goals?
b. Are measures of service level or satisfaction included
in service orders? Are critical applications
monitored for service level?
c. Does management encourage adequate communication with
users? Is there a formal customer service function?
Is there a newsletter or current user's guide? Can
users reach data center management?
d. What measures of contractor performance are included
in the facilities management contract? What
incentives are used to encourage superior performance?
Do performance measures and incentives adequately
reflect the data center's performance objectives?
e. If an award-fee type contract is being used for
facilities management, is the contract properly
administered according to the award-fee terms? Is
there adequate review of award-fee conditions for each
contract term? Has the contractor appealed any of the
fee determinations? Why?
f. Is the balance of responsibilities between Agency
staff and the contractor appropriate? Is there
adequate direction, supervision, and review of
contractor performance?
g. Is the organization and staffing level of the data
center appropriate? Are any areas overstaffed?
Understaffed? Are staffing imbalances due to budget
limitations, limitations of the facilities management
contract, management discretion, or other causes?
5.2.2 Technical Review Components
(1) Operations
Assess the efficiency of computer operations, with particular
attention to safeguards and controls designed to protect the in-
tegrity of data processed at the center. Potential indicators
for this component include:
a. Review procedures for physical care and handling of
magnetic media. For example, are all tapes not in use
returned to the library? Are all tapes in canisters
when not mounted?
b. Review procedures for control of tapes and disks from
the center's library and from external sources.
Procedures should include strict physical access
controls and logs of usage. An automated tape library
control system is desirable. All tapes should be
labelled and contain an expiration date.
c. How well does the data center address preventive
maintenance on hardware? Are policies documented?
Are users informed of data center availability?
d. Is new equipment fully tested? Are new media (tapes,
disks) acceptance tested?
e. How effectively does the center handle hardware
malfunctions? Are forms and procedures adequate to
permit maintenance personnel to isolate and correct
malfunctions?
f. Is there a complete and accurate inventory of system
software including release levels, options, local
patches, etc?
g. How does the center assure that software systems are
reliable? Are procedures for installing and testing
new products and new releases adequate? Are users
adequately informed of new software releases,
including new features and possible incompatibilities?
h. Are errors detected in system software adequately
managed? Are system errors logged, analyzed, and
reported to the appropriate software maintenance group
(possibly a third-party vendor)? Are users informed
of known software problems?
i. How adequate are procedures pertaining to --
Cleanliness (smoking, food, trash, etc.)?
Shift changes?
Daily activity logs (malfunctions, job aborts,
unusual occurrences, etc.)?
Personnel to be notified and actions to be taken
when emergencies or serious problems occur?
Handling of media and supplies?
Are procedures documented? Are they followed?
j. Is the physical layout adequate with respect to
efficient work flow? Addition of equipment? Emergency
procedures?
k. Are the responsibilities of operations staff clearly
defined, including backup for scheduled or unscheduled
absences? Is there one designated senior operations
person for each shift?
(2) Performance Management and Capacity Planning
Assess the center's efforts to provide adequate hardware and
software facilities at reasonable cost. Potential indicators
for this component include:
a. Is there a group formally responsible for performance
management and capacity planning? What is the group's
authority? What are the group's qualifications?
b. How well has the center characterized its workload --
by subsystem (batch, TSO, ADABAS, etc.)?
by user application?
by time of day or day of month?
c. How adequately is performance monitored? What are the
measures of performance?
d. How well have workloads been projected? What is the
basis for projecting workload?
e. Is there a current performance optimization program?
What tools are used? Is the performance after
upgrades compared to projected performance increases?
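Indicator d above asks how workloads are projected. One simple approach is to
fit a least-squares trend to recent monthly usage and extrapolate it to the
installed capacity; the Python sketch below uses hypothetical figures.

    def linear_trend(values):
        """Fit y = a + b*x by least squares to equally spaced observations."""
        n = len(values)
        xs = range(n)
        x_mean = sum(xs) / n
        y_mean = sum(values) / n
        b = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, values))
             / sum((x - x_mean) ** 2 for x in xs))
        return y_mean - b * x_mean, b

    def months_until_capacity(monthly_cpu_hours, capacity_cpu_hours):
        """Extrapolate the fitted trend until it exceeds installed capacity."""
        a, b = linear_trend(monthly_cpu_hours)
        if b <= 0:
            return None  # workload flat or declining
        month = len(monthly_cpu_hours)
        while a + b * month < capacity_cpu_hours:
            month += 1
        return month - len(monthly_cpu_hours)

    # Hypothetical: last 12 months of prime-shift CPU hours, capacity of 600 hours.
    history = [380, 395, 400, 420, 430, 455, 460, 470, 490, 505, 515, 530]
    print(months_until_capacity(history, capacity_cpu_hours=600))  # 5 months of headroom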
(3) Security
Assess the overall security of ADP resources, including physical
security and protection from unauthorized access to or tampering
with data or programs. Potential indicators for this component
include:
a. Does the center follow GSA guidance contained in FPMR
101-35.3 "Security of Federal ADP and
Telecommunications Systems," and 101-36.7
"Environmental and Physical Security"? Is the center
familiar with the NBS publications series on ADP
security? Are users made aware of their
responsibilities for maintaining effective security?
b. Is there an effective security program? Is there an
individual clearly designated as responsible for
security at each site (i.e., RTP, WIC)?
c. Are sensitive applications and data bases known? Has
a formal risk assessment been performed for each?
d. How well does the data center control access to system
software, including sensitive utilities like ZAP,
DBMS, and user accounting programs? How is access
controlled? Do the controls work?
e. How does the center control physical access to
facilities?
(4) Disaster and Contingency Planning
Assess the provisions for continued service in the event of un-
expected loss of ability to continue operations. Potential
indicators for this component include:
a. Does the center have an effective contingency plan?
What types of disasters does it provide for? Does the
alternate site have sufficient excess capacity to
absorb critical agency processing?
b. What high priority applications systems are to be
supported in the event of a disaster? How are they
selected?
c. Is the contingency plan tested periodically? Who has
authority to invoke the plan?
d. Are complete backup files maintained offsite for
current systems software and applications programs?
Are copies of procedures and other documentation
maintained off-site?
e. What arrangements exist for conversion of tele-
communications services in the event of service
disruption?
(5) Telecommunications
Assess the adequacy, reliability, and utility of telecommuni-
cations services used to access the center's facilities.
Potential indicators for this component include:
a. To what degree does the center's operation depend on
telecommunications? Describe the facilities,
principal systems that use those facilities, and
users' perceptions of ease of use and reliability.
b. Is the configuration of the telecommunications system
fully documented? Are procedures for use fully
documented?
c. Is telecommunications included as part of the long
range data center plan?
d. Are controls over access adequate? Who has access?
Are there programmed access controls (passwords,
account numbers, etc.)?
e. How is the telecommunications system maintained? What
is the record of availability and reliability of
network facilities? What is the recent record of
reliability of individual terminals?
f. Is telecommunications performance monitored, and are
monitors accurate?
g. Are there contingency plans for telecommunications?
What are the principal risks to which the
telecommunications system is exposed?
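Indicator e above asks for the record of availability and reliability. The
Python sketch below computes percent availability for a reporting period from
an outage log; the log format and the dates shown are hypothetical.

    from datetime import datetime

    def availability(period_start, period_end, outages):
        """Percent availability for a period, given a list of
        (outage_start, outage_end) timestamps as ISO-8601 strings."""
        parse = datetime.fromisoformat
        period = (parse(period_end) - parse(period_start)).total_seconds()
        downtime = sum((parse(end) - parse(start)).total_seconds()
                       for start, end in outages)
        return round(100 * (1 - downtime / period), 2)

    # Hypothetical month with two network outages.
    print(availability(
        "1984-09-01T00:00", "1984-10-01T00:00",
        outages=[("1984-09-07T09:15", "1984-09-07T11:45"),
                 ("1984-09-21T14:00", "1984-09-21T14:40")],
    ))  # 99.56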
6. REVIEWS OF LOCAL COMPUTING CAPACITY
6.1 Definition and Scope
Local computing capacity is ADP equipment available to Agency staff that
is not under control of the NCC. This includes computers operated by regional
offices and Office of Research and Development (ORD) laboratories, office
automation computers (the PRIMES), and personal computers (PCs) available to
Agency staff. The totality of such equipment represents a significant capital
investment, and also represents significant risks to the Agency in terms of
its potential for waste, misuse, and misappropriation.
Local computing capacity tends to be characterized by the following
attributes, compared to a major data processing facility like the NCC:
Fewer administrative barriers to use
Fewer technical barriers to use
Less formal accounting for use (often, no accounting for use)
More direct user control of hardware
Fewer standards for design and operation of applications.
Those aspects of local computing make it attractive to users, but those same
aspects make local computing capacity an important area for audit review.
6.2 Local Computing Review Components
(1) Guidelines For Use
Assess the suitability of administrative and technical guidelines on
the use of local computing. Potential indicators for this component
include:
a. Are there guidelines that prescribe the proper uses for
local computing capacity (considering such aspects as life
cycle cost, data base size, volume of data processed, need
for distributed access, sensitivity or confidentiality of
data, linkages to other systems, or risks)?
b. What procedures are required to initiate a new application?
Can those procedures be bypassed?
c. Are there procedures for periodically evaluating the
suitability of systems or applications for continued use?
d. How are scheduling conflicts resolved? Do procedures
permit one user to monopolize the resource without
authorization or notification of other users?
e. Do procedures provide adequate safeguards against un-
authorized use of equipment for personal purposes?
f. Is use of the equipment accounted for? Are users or
projects charged for use?
(2) Quality Assurance
Assess whether the procedures and practices used on local computing
resources meet standards consistent with the purpose and value of the
data being processed. Potential indicators for this component
include:
a. Are personnel adequately trained in data processing to
create reliable, well engineered systems on local computing
equipment? Are adequate training materials and courses
available to users?
b. Do systems provide for adequate audit trails to determine
the history of updates to data bases?
c. Does the system provide adequate backup to protect against
unintentional loss of data or programs?
d. Does the system provide adequate software safeguards
against unauthorized access to restricted data? Does the
data management system provide adequate security controls?
Is there any supervisory review of systems containing
sensitive data?
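Indicator c above concerns backup against unintentional loss. The Python
sketch below makes a timestamped copy of a data file and verifies the copy by
checksum; the file and directory names are hypothetical.

    import hashlib
    import shutil
    from datetime import datetime
    from pathlib import Path

    def sha256_of(path):
        """Return the SHA-256 digest of a file, read in chunks."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def backup_and_verify(data_file, backup_dir):
        """Copy a data file to a timestamped backup and verify the copy by checksum."""
        source = Path(data_file)
        target_dir = Path(backup_dir)
        target_dir.mkdir(parents=True, exist_ok=True)
        stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
        target = target_dir / f"{source.name}.{stamp}.bak"
        shutil.copy2(source, target)
        if sha256_of(source) != sha256_of(target):
            raise IOError(f"backup of {source} failed checksum verification")
        return target

    # Example (hypothetical paths):
    # backup_and_verify("lab_results.dat", "backups/")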
(3) Controls
Assess the adequacy of administrative and physical controls over the
use of local computing resources. Potential indicators for this
component include:
a. Is the physical environment satisfactory for the mode of
use of the equipment? Are there adequate storage
facilities for off-line storage, with adequate security?
b. Is equipment adequately protected against theft?
c. Are there administrative controls designed to prevent
personal use?
d. Are adequate controls in place to ensure that unauthorized
individuals do not have access to sensitive information
that has been approved for local processing?
APPENDIX A: BIBLIOGRAPHY
EPA DOCUMENTS
1. ADP Audit Evaluation Strategy (AMS Report to EPA), December, 1982.
2. ADP Systems Audit Program; March 1982 to October 1982. Summary and
Conclusions. Mary Lou Melley, November 4, 1982.
3. ADP Audit Report: Hazardous Waste Information Systems.
(EPA-ADP-Audit-82-03), September 30, 1982.
4. Assessing EPA's Long Range Information Systems Needs; Draft Final Report
of the Long Range ADP User Requirements Task Force. EPA Office of
Administration, July 29, 1983.
5. Audit and Evaluation of the Grants Information and Control System GICS.
EPA Office of Administration, December, 1982 (Prepared under
Contract No. 68-01-5146 DTO-51).
6. Audit Plan For Office Systems - Draft (Booz Allen report to EPA),
September 1982.
7. Audit Status Report (EPA-ADP-Audit-Status-83-01), Mary Lou Melley, May 10,
1983.
8. Critique of the ADP System Audit Program First Phase - January 1982 to
October 1982. Mary Lou Melley, October 19, 1982.
9. "Final Analytical Plan for the STORET Audit," Memorandum from Larry
Seidel, Jack Mathias to John Elliott, Mary Lou Melley, Don Rosene,
and Beverly Gregory, July 27, 1982.
10. Project Plan (Proposal). Hazardous Waste Support Systems (Agency-wide
Strategy and Mission Support Audit). Judith M. Lebowich, April 7,
1982.
11. Project Plan (Proposal). Management Control Level and Technical Level
Audits of the Grants Information and Control System (GICS). Ruth E.
White. February 26, 1982.
12. Proposal To The Office of Management Information and Support Services,
U.S. Environmental Protection Agency For Audit of EPA's Main Data
Processing Facilities. AMS, May 19, 1983.
13. STORET Audit Report. EPA Office of Administration, September 30, 1982.
14. Strategy Paper For ADP Audit Program (Draft). M. Melley and E. Poole,
December 18, 1981.
GAO DOCUMENTS
1. Assessing Reliability of Computer Output. GAO-AFMD-82-91.
2. Bibliography of Documents Issued by GAO on Matters Related to ADP.
AFMD-82-50.
3. EDP Auditing: An Annotated Bibliography. Bibliography Series OLS 83-01,
Published by GAO library.
4. Evaluating Internal Control in Computer-Based Systems. GAO-AFMD-81-76.
5. Questions Designed to Aid Managers and Auditors in Assessing the ADP
Planning Process. Special Publication, GAO Accession Number 119637.
6. Standards For Audit of Governmental Organizations, Programs, Activities, and
Functions. 1981 Revision.
OMB
1. Circular No. A-50: "Audit Follow-Up." January 15, 1979.
2. Circular No. A-71, Transmittal Memorandum No. 1: "Responsibilities for
the Administration and Management of Automatic Data Processing
Activities," July 27, 1978.
3. Circular No. A-73 Revised: "Audit of Federal Operations and Programs,"
June 20, 1983.
4. Circular No. A-123: "Internal Control Systems," August 16, 1983.
OTHER SOURCES
1. Auditor General of the Navy, Audit Program No. 19A-EDP Facility Audits
(Draft). AUDGENAVNOTE 7500, July 17, 1979.
2. Auerbach Publishers, Inc., EDP Auditing, 1983.
3. Canadian Institute of Chartered Accountants, Computer Audit Guidelines,
1975.
4. Canadian Institute of Chartered Accountants, Computer Control Guidelines.
1970.
5. Hermanson, Loeb, Saada, and Strawser, Auditing Theory and Practice,
Richard D. Irwin, 1976.
6. Kuong, Javier F., Audit and Control of Computerized Systems, Management
Advisory Publications, 1979.
TECHNICAL WRITING GUIDES
1. Barzun, Jacques, Simple & Direct: A Rhetoric For Writers, New York,
Harper and Row, 1975. Especially useful are Chapter 9 "MEANING or
What Do I Want to Say?" and Chapter 10 "REVISION or What Have I
Actually Said?"
2. Lanham, Richard A. Revising Business Prose. New York: Charles Scribner's
Sons, 1981. Lots of examples. Especially good for anyone whose
style is wordy.
3. Mathes, J.C. and Dwight W. Stevenson. Designing Technical Reports.
Indianapolis: Bobbs-Merrill Educational Publishing, 1976. A
standard technical writing text.
4. Williams, Joseph M. Style: Ten Lessons in Clarity & Grace. Glenview,
Illinois: Scott, Foresman and Company, 1981. Sentence practice in
revising for clarity and smoothness. Tells which grammatical rules
are musts and which are optional, and provides exercises.
5. Willis, Hulon. Grammar and Composition. New York: Holt, Rinehart and
Winston, 1976. Sentence combining with exercises. Basic practices
for those who write in the "programmer's" style.
APPENDIX B: SAMPLE MEMORANDA
B1: Transmittal of Audit Report, from Assistant Administrator for
Administration.
B2: Response To Audit Findings, from Assistant Administrator, Office of Water.