             Office of Inspector General

             Report of Audit


         INFORMATION TECHNOLOGY
                 User Satisfaction of

               the Grants Information

                 and Control System


                    E1NMF6-15-3044-7100237

                        June 24, 1997
                                  U.S. EPA Headquarters Library
                                  Mail code 3201
                                  1200 Pennsylvania Avenue NW
                                  Washington DC 20460

-------
Inspector General Division(s)      ADP Audits and Assistance Staff
 Conducting the Audit               Washington, DC

Region(s) covered                  Headquarters

Program Office(s) Involved         Grants Administration Division

-------
                  UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
                                 WASHINGTON, D.C. 20460
                                                                              OFFICE OF
                                     JUN 24 1997                       THE INSPECTOR GENERAL
MEMORANDUM
SUBJECT:   Report of Audit - User Satisfaction of the
             Grants Information and Control System
             Audit Report No.  E1NMF6-15-3044-7100237
FROM:      Patricia H. Hill, Director
               ADP Audits and Assistance Staff (242 F)

TO:          Gary Katz, Director
               Grants Administration Division (3903F)
       Attached is our final report entitled "User Satisfaction of the Grants Information and
Control System."  The objectives of this audit were to: (1) assess user satisfaction with the
integrity of the Grants Information and Control System's (GICS) data and system functionality;
and (2) test the accuracy, completeness, consistency, and timeliness of the information contained
in GICS. This is the first report of a two-phase audit and will address the first objective. The
second phase will be a data integrity review.

       This audit report describes problems and recommended corrective actions the Office of
Inspector General (OIG) has identified. The report represents the opinion of the OIG.  Final
determinations on the matters in the report will be made by EPA managers in accordance with
established EPA audit resolution procedures. Accordingly, the findings described in this report do
not necessarily represent the final EPA position.

       In accordance with EPA Order 2750, you, as the action official, are required to provide
this office a written response to the audit report within 90 days of the final report date. For
corrective actions planned but not completed by your response date, reference to specific
milestone dates will  assist this office in deciding whether to close this report. In addition, please
track all action plans and milestone dates in the Management Audit Tracking System.

       We appreciate your positive response to the recommendations presented in the report and
the many actions you and your staff have initiated to improve the Grants Information and Control
System.

-------
       We have no objection to the further release of this report to the public.  Should you or
your staff have any questions about this report, please contact Patricia Hill, Director, ADP Audits
and Assistance Staff on (202) 260-3615.

Attachment

-------
                 EXECUTIVE SUMMARY
PURPOSE

The objectives of this audit are to: (1) assess user satisfaction with the integrity of the Grants
Information and Control System's (GICS) data and system functionality; and (2) test the accuracy,
completeness, consistency, and timeliness of the information contained in GICS. This is the first
report of a two-phase audit and will address the first objective. The second phase will be a data
integrity review.
RESULTS IN BRIEF
Our audit included a survey on the accuracy of data and user satisfaction with GICS. The survey
was sent to all (326) EPA employees with GICS accounts. We received 198 responses; however,
42 individuals responded that they are not current GICS users. The survey was sent to 265
Regional users and 61 Headquarters users. One hundred thirty-four (50.57%) Regional users and
23 (37.70%) Headquarters users responded. Overall, over 81 percent considered GICS data to
be very or somewhat important in relation to the work of their division/department/office. Over
62 percent of the respondents use GICS data on a daily basis, and over 63 percent of respondents
describe GICS data as always or almost always useful.  A majority of the respondents use GICS
for staying abreast of grants, progress reports, and reporting milestones.  Results of the survey's
objective questions are in Appendix II, page 15. Chapter 2 of this report addresses the users'
comments to the survey's open-ended questions.

GICS users generally agreed the system is a good national database because it stores a lot of
information; however, many users noted deficiencies with the way data is entered into the system,
the lack of a grantee table, the intensive coding within GICS, and difficulty in reporting.
Specifically: (1) Regional and Headquarters users enter the same types of grant information using
different data entry screens; (2) grant specialists or administrative assistants are required to type in
general information (i.e., grantee name, address, employer identification number, etc.) each time a
grant is established; (3) some data elements have an inordinate number of legitimate codes
associated with them; and (4) some of the listed reports within the GICS reporting system contain
incomplete data. As a result of these weaknesses, users cannot rely on the integrity of data in
reports produced by the system.
                                                                    Report No. 7100237

-------
 These weaknesses were caused by the lack of policies over a consistent method to enter data, lack
 of management oversight, and insufficient edits. Currently, there is no formal policy governing
 how data should be entered. Also, policies are needed to clarify codes and to specify the
 consistent use of codes within GICS. Furthermore, management should take a more active role in
 the operation of GICS. Finally, GICS does not have adequate edit checks.
RECOMMENDATIONS

The report includes seven recommendations to improve the electronic grants process.  The
recommendations include the development of policies over consistent data entry of grants
information; combining and clarifying data elements and codes, as appropriate; and following
proper system development life-cycle procedures.

In a memorandum dated June 20, 1997, the Director for the Grants Administration Division
responded to our draft report (see Appendix I). In summary, the Agency substantially agreed
with the recommendations with the exception of Recommendation 5.  Recommendation 5 called
for the analysis of the costs and benefits of developing and implementing a grantee table. We
modified Recommendation 5 in response to GAD's comments to recommend the inclusion of a
grantee table in the proposed Integrated Grants Management System (IGMS). However, if
IGMS is not implemented, we recommend analyzing the costs and benefits of developing a
grantee table in GICS or in any other system used to manage grants.

With respect to Recommendations 1-3, GAD is chartering a workgroup comprised of
Headquarters and Regional grants office personnel to identify policy, data element, code and
process requirement options and raise them to management for a national decision. These
decisions will impact future practices, and will be documented, along with Recommendations
4 - 6, in the requirements analysis for IGMS.

Consistent with Recommendation 7, GAD is following system development life cycle procedures
for the development of IGMS.

-------
                TABLE OF CONTENTS
                                                                      Page

EXECUTIVE SUMMARY	i

CHAPTERS

1  -   Introduction	1
      Purpose	1
      Background	1
      Scope and Methodology	3
      Prior Audit Coverage	3

2  -   Users Are Dissatisfied with the Grants Information and Control System	5
      Data Entry Inconsistencies in Regions and Headquarters  	5
      GICS Does Not Utilize a Grantee Table	6
      GICS Is Code Intensive	7
      Reporting In GICS Is Difficult	8
      Questionable Integrity of GICS Reports	9
      Data Entry Policy and Greater Management Oversight Needed  	9
      Integrated Grants Management System 	10
      Recommendations	11
      Agency Comments and OIG Evaluation	12

APPENDICES

 I  -   Grants Administration Division's Response to Draft Report	13

II  -   Summary of Objective Survey Questions	15

III -   System Development Lifecycle Process 	21

IV -   Report Distribution  	27

-------

-------
                                 CHAPTER  1
                                      Introduction
Purpose                         The objectives of this audit are to: (1) assess user satisfaction with
                                the integrity of the Grants Information and Control System's
                                (GICS) data and system functionality; and (2) test the accuracy,
                                completeness, consistency, and timeliness of the information
                                contained in GICS. This is the first report of a two-phase audit and
                                will address the first objective. The second phase will be a data
                                integrity review.
Background                      The Grants Administration Division (GAD) is responsible for the
                                 administrative management of all of the Environmental Protection
                                 Agency's (EPA) assistance programs (e.g., grants, cooperative
                                 agreements, and interagency agreements) and is the national
                                 program manager for GICS.  GICS is EPA's official information
                                 management system containing administrative, project and financial
                                 information on EPA's assistance programs.  As National Program
                                 Manager, GAD's role is to develop a well-coordinated and focused
                                 approach to automation that successfully supports the Agency's
                                 grants management system. GICS supports two major client
                                 organizations: the GAD for all non-construction grant programs
                                 and the Municipal Construction Division of the Office of Water
                                 (OW) for the construction grant and state revolving fund programs.

                                 GICS was created in 1972 to track EPA Research and
                                 Demonstration grants and has expanded to administratively track all
                                 EPA grant programs, interagency agreements, and fellowships. In
                                 1985, a modernization effort began to convert GICS to a state-of-
                                 the-art database management system. GICS currently supports on-
                                 line updating and editing, and ad hoc standardized reporting. GICS
                                 is an ADABAS database residing on EPA's IBM mainframe in
                                 Research Triangle Park, North Carolina. It is accessed nationwide
                                 by means of the telecommunications network maintained by the
                                 Enterprise Technology Services Division. It consists of 11  distinct
                                 databases: 10 Regional GICS databases and a single database for
                                 Headquarters. The GICS System can best be illustrated by Figure 1,
                                 on the next page.



-------
                  [Figure 1: The GICS system - the Headquarters database and the
                  10 Regional databases, each with their subsystems.]

NPS - Non Point Source   Const - Construction Grants   NonCon - Non Construction Grants
              SRF - State Revolving Fund      IAGs - Interagency Agreements
                       Each Region has four subsystems: Non Point Source; State
                       Revolving Fund, Construction, and Non-Construction. In addition,
                       Headquarters has four subsystems:  Non-Construction grants,
                       Interagency Agreements, Fellowships, and Asbestos.  Each
                        subsystem has a core group of national data elements common
                        throughout all databases, as well as their own Regional-specific
                        data elements.  These numerous GICS subsystems evolved in
                       response to user needs.

                       Assistance data is input through direct on-line data entry screens,
                       from users all over the country including Headquarters, Regional
                       Grants Management Offices, Regional Program Offices, and
                       various State governmental agencies. Grant Management Offices
                       primarily use the data to create award agreements. Program
                       Offices retrieve project level data from GICS and store project level

-------
                                data in their own unique subsystems that share administrative data
                                with the grants management subsystems. GICS represents the
                                primary management information system for the Grants
                                Management Offices and selected program offices.  However,
                                although State offices input project level data, they do not generally
                                use the system as their primary management system.
Scope and                        We performed our audit in accordance with Government Auditing
Methodology                      Standards issued by the Comptroller General of the United States.
                                 Our audit included a survey on the accuracy of data and user
                                 satisfaction with GICS. The survey was sent to all (326) EPA
                                 employees with GICS accounts. We received 198 responses;
                                 however, 42 individuals responded that they are not current GICS
                                 users. The survey was divided into three sections: [...] and
                                 system functionality. [...] of eight objective and open-ended
                                 questions. To corroborate survey results, we followed [...]
                                 grant specialists inputting [...], the Enterprise Systems
                                 Division, and OW.  In addition, [...] procedures governing [...].

Prior Audit                      A [...] report entitled "Management of [...] EPA" identified
Coverage                         software [...], including GICS. [...] eleven conditions.  In
                                 GICS, [...] eleven test conditions. [...] testing and acceptance
                                 by [...] maintenance plans, absence of [...] adequate coding
                                 standards [...]. Of the [...] conditions, one (i.e., inadequate
                                 [...]) is not applicable and [...] testing, had partially [...].

-------

-------
                                   CHAPTER 2
      Users Are Dissatisfied with the Grants Information and Control System
Data Entry Inconsistencies
in Regions and
Headquarters
GICS users generally agreed the system is a good national database
because it stores a lot of information; however, many users noted
deficiencies with the way data is entered into the system, the lack of
a grantee table, the intensive coding within GICS, and difficulty in
reporting. As a result of these weaknesses, users cannot rely on the
integrity of data in reports produced by the system.  These
weaknesses were caused by the lack of policies over a consistent
method to enter data, lack of management oversight, and
insufficient system edits.


Regional and Headquarters users enter the same types of grant
information using different data entry screens.  The Regional grants
databases have eight standard data entry screens under the project
add/change menu. However, users in Region 5 have two additional
screens, while users in Region 7 have four additional screens, and
users in Region 9 have one additional screen.  Furthermore, there
are 12 screens for Headquarters users, only three of which match
the standard Regional screens.

In addition, the data elements required to establish a grant differ
between Headquarters and Regions. Users of the Headquarters
system are required to enter eight data elements to establish an
assistance agreement, whereas Regional users are required to enter
nine data elements. The following lists the required elements:
                                        Headquarters

                                        Applicant County
                                        Application Received Date

                                        Employer ID Number
                                        Long Description
                                        Name
                                        Program
                                        Specialist
                                        State
                                  Regions
                                  Action Code/Date
                                  Applicant County
                                  Application Received Date
                                  City
                                  Name
                                  Program

                                  State
                                  Street
                                  Zip Code

-------
GICS Does Not Utilize a
Grantee Table
 In order to accurately track, analyze, and report on assistance
 agreements, managers need consistent information available to
 them.  Without a requirement for the same data elements between
 Regions and Headquarters, important information may not be
 getting stored within GICS. Achieving consistent information
 between Regions and Headquarters currently depends on whether
 the person entering the data enters information for the non-required
 data elements. However, the  system neither alerts users that a non-
 required data entry field was left blank nor prevents the user from
 continuing processing. Without consistent data, the tracking,
 analyzing, and reporting of grants may be incomplete because the
 system may not include all pertinent information.


 Grant specialists or administrative assistants are required to type in
 general information (i.e., grantee name, address, employer
 identification number, etc.) each time a grant is established. The
 field length for the grantee name is limited to 38 characters,
 therefore many of the grantee names need to be abbreviated. As a
 result, many grantees are listed in the system using several different
 abbreviated names. GICS treats "Univ" differently than "Univ.",
 and grantee names which start "The Grantee Name" are treated
 differently than "Grantee Name." For example, "The Ohio State
 University Research Foundation" is listed 24 different ways using
 various abbreviations. The list below shows 5 of the 24 variations:

       OHIO STATE UNIV RESEARCH FDN.
       OHIO STATE UNIV. RESEARCH FDN.
       OHIO STATE UNIV. RESEARCH FOUND.
       THE OHIO STATE UNIV. OF RESEARCH FDN.
       OHIO STATE UNIV. NATL REG. RES. DSTST.

 Compounding the abbreviation problem is the fact that grant
 specialists now enter data using both upper and lowercase letters.
 However, Natural1, which is used for generating GICS reports, is a
 case sensitive language.  Therefore, "Univ" is treated differently
 than "UNIV" when creating reports.
                                        1 Natural is a programming language tailored for use with
                                  ADABAS, EPA's standard database management system.

-------
GICS Is Code Intensive
Furthermore, some grantees are listed with multiple employer
identification numbers. For example, three employer identification
numbers are listed for "The Ohio State University Research
Foundation":

      316015919
      316401599
      316401500A1

In another example, one employer identification number (636000619)
was associated with six distinctly different grantee names:

      ALABAMA DEPARTMENT OF ENVIR. MGMT.
      ALABAMA DEPARTMENT OF PUBLIC HEALTH
      ALABAMA DEPT. OF ECON. & COMM. AFFAIRS
      ALABAMA DEPT. OF EMERGENCY MANAGEMENT
      SOUTHERN ENVIR. ENFORCEMENT NETWORK
      STATE OF ALABAMA

A table that includes general information about a grantee (e.g.,
grantee name, address, and employer identification number, etc.)
would eliminate the need to re-enter data every time a grant is
initiated. In addition,  a grantee table would eliminate the
inconsistencies related to abbreviations and misspellings of grantee
names, and errors in employer identification numbers.


Some data elements have an inordinate number of legitimate codes
associated with them.  For example, the data element "action code"
has 41 different legitimate entries. However, many grant
specialists stated they do not use the action codes, or only use a few
codes, because the number of codes is confusing. In addition, when
some entries are made in the action code data element, the user is
instructed to enter another code in a different data element. For
example, if the code "F" is entered as the action code, then one of
five codes needs to be entered in the data element "record-type".

Also, within a data element the same code can have different
meanings. For example, the definition for the data element
"Region" has different meanings based on who administers the
assistance program.

-------
Reporting In GICS Is
Difficult
                                  Program Administrator              Definition for "Region"

                                  Headquarters                       The EPA Region in which the
                                                                     headquarters of the recipient
                                                                     is located.

                                  Regions                            The number of the EPA Region
                                                                     which is administering the
                                                                     assistance program.

                                  Recipients in foreign countries    "00"

                                  State Revolving Fund projects      The number of the EPA Region
                                  administered by a State            in which the State is located.
                                 In another example, the data element "Amount Requested" has
                                 different meanings based on the type of assistance program being
                                 administered.
                                  Assistance Program                 "Amount Requested"

                                  Wastewater Treatment               The total pre-award amount of
                                  Construction                       a request for EPA funds to be
                                                                     used to carry out a proposed
                                                                     project within a specified
                                                                     budget period established in
                                                                     the award document.

                                  Assistance Adjustment Notice       The amount of the balance of
                                  for Research, Demonstration        unobligated Federal funds
                                  and Training Grants and            being withdrawn from the
                                  Fellowships                        approved award amount.
Extensive or confusing coding results in codes not being used, or
used inappropriately, because the user needs to guess at which code
is appropriate for a particular grant.  Therefore, information may
not be available, or may be incorrect, for tracking, analyzing and
reporting purposes.

Many users identified problems with the GICS report generator.
GICS utilizes a program called Report Writer to process reports.
In addition to standard reports, the Report Writer allows the user to
create custom reports.  Each Region has a menu listing the reports

-------
Questionable Integrity of
GICS Reports
Data Entry Policy and
Greater Management
Oversight Needed
                                  regularly used by that Region or Headquarters office. However,
                                  some of the listed reports contain incomplete data. For example, in
                                  many cases the report entitled "Active Projects Grouped By
                                  Consulting Engineer" does not contain important information
                                  relating to the engineer, such as consultant number, the consultant's
                                  name or phone number. In addition, these reports do not include
                                  summary information, such as totals (e.g., funds awarded,
                                  outstanding grants, etc.).  Users who generate customized reports
                                  described Report Writer as difficult and a process of trial and error.
                                  For example, users cannot be guaranteed that all pertinent data will
                                  be captured should they want to query the system for all grants
                                  pertaining to a particular grantee but are unable to guess all of the
                                  abbreviations and iterations used during data entry. In the case of
                                  "The Ohio State University Research Foundation," the user would
                                  need to know and include all 24 abbreviated iterations of the
                                  grantee name. Also, if a user wants to query the system based on
                                  the employer identification number for "The Ohio State University
                                  Research Foundation," they must include all three employer
                                  identification numbers to ensure all information is obtained.
As a result of the weaknesses previously discussed, users cannot
rely on the integrity of data in GICS reports.  When selecting
criteria for a report, users need to include all iterations of grantee
names, including all abbreviation and punctuation differences in
names, to ensure they are included in the report. Furthermore,
extensive and confusing codes result in codes not being used or
being used inappropriately. When users create a report based on
codes as the selection criteria, they cannot be certain the
information is complete and accurate.  Therefore, it is difficult to
ensure all information for a particular grantee is captured for
reporting purposes. In addition, resources are wasted because
reports often need to be re-run due to trial and error attempts to
capture complete information. Finally, many GICS users maintain
separate tracking systems (i.e., manual files, personal spreadsheets)
for grants information because they do not rely on the accuracy of
the data within GICS. Maintaining separate tracking systems
results in duplicate data entry and wasted resources.


The previously discussed weaknesses were caused by the lack of
policies over a consistent method to enter data, lack of management
oversight, and insufficient edits.  Currently, there is no formal
policy governing how data should be entered.  With the absence of

-------
Integrated Grants
Management System
a grantee table, policies should address the consistent use of
abbreviations (e.g., use the first three letters of a word for all
abbreviations)  or prohibit the use of any abbreviations.  Also,
policies are needed to clarify codes and to specify the consistent use
of codes within GICS. Furthermore,  management should take a
more active role in the operation of GICS.  One user stated that
there is a lack of upper management support and guidance from
Headquarters as to what data is required for entry and, therefore,
Regions cannot clearly define to the States what data element
requirements they must meet. In addition,  GAD does not have a
supervisory relationship with Regions. Although GAD is the owner
of GICS and its data, Regional users stated they report to the
Regional Administrator, not GAD. This is evident in the fact that
the screens are not standardized between Regions and
Headquarters.  Different Regions requested different changes to the
system; rather  than trying to consolidate these changes, GAD and
the Enterprise  Services Division made changes to the Regional
screens on a Region by Region basis.  This type of decentralized
approach contributed to the inconsistencies previously discussed.
Finally, GICS does not have adequate edit  checks. It is difficult, if
not impossible, to ensure all iterations of a  grantee name are
included in a report because there is no edit check to ensure grantee
names are spelled, abbreviated, or capitalized the same way every
time, or the employer identification number is correct. In addition,
there are no edits which ensure consistent information is entered
between Headquarters and Regions.

GAD is currently working with the Technology, Planning, and
Management Corporation (TPMC) as part of the Information
Technology Architecture Support (ITAS) contract to determine if
a new Integrated Grants Management System should be developed.
GAD and TPMC are developing a feasibility study and needs
analysis.  The feasibility study will analyze the grant application,
award, and post award management processes to identify
opportunities for streamlining these processes and eliminating
unnecessary review layers.  In addition, the study will analyze the
current technical environment and develop  specific
recommendations on the optimal technical solutions for improving
the Agency's grants information system.  Finally, if a new
integrated grants management system is recommended, GAD plans
on developing  and implementing the  system in "modules" so that
incremental benefits may be achieved  based on funding levels. To
assist GAD in this task, we have included a description of the
                                              10
                                          Report No. 7100237

-------
                                  system development lifecycle (SDLC) processes and phases as
                                  Appendix III.


Recommendations                We recommend the Director of the Grants Administration Division:

                                  1.      Develop, issue and monitor policies governing the
                                          consistent data entry of grants information.  Specifically,
                                         these policies should address:
                                         a.      the use of abbreviations;
                                         b.      the clarification and consistent use of codes; and
                                         c.      standardized data elements required to establish a
                                                grant.

                                  2.      Review the GICS data elements and codes to determine if
                                         they are necessary and clearly defined. Based on this
                                         review:
                                         a.      combine related data elements and codes, if
                                                appropriate;
                                         b.      clarify the definitions and use of codes; and
                                          c.      eliminate unnecessary data elements and codes.

                                  3.      Take an active role in clarifying and standardizing the
                                         electronic grants process to ensure it is consistent between
                                         Headquarters and Regions.

                                   4.      Require that reports from the Integrated Grants
                                         Management System include summary information, such as
                                         totals (e.g., funds awarded, outstanding grants, etc.) and
                                         meet user needs.

                                  5.      Include a grantee table that would include general
                                         information about the grantee (e.g., grantee name, address,
                                         employer identification number, etc.) in the Integrated
                                         Grants Management System.  If the Integrated Grants
                                          Management System is not implemented, analyze the costs
                                         and benefits of developing and implementing a grantee table
                                         in the Grants Information and Control System or in any
                                         other system used to manage grants.

                                  6.     Include adequate edit checks in the Integrated Grants
                                         Management System in order to facilitate consistent data
                                         entry.

-------
7.     Follow appropriate system development life-cycle
       procedures, as outlined in Appendix III of this report,
       during the development of the Integrated Grants
       Management System.


Agency Comments and
OIG Evaluation


In a memorandum dated June 20, 1997, the Director for the Grants
Administration Division responded to our draft report (see
Appendix I). In summary, the Agency substantially agreed with the
recommendations with the exception of Recommendation 5.
Recommendation 5 called for the analysis of the costs and benefits
of developing and implementing a grantee table. We modified
Recommendation 5 in response to GAD's comments to recommend
the inclusion of a grantee table in the proposed IGMS. However, if
IGMS is not implemented, we recommend analyzing the costs and
benefits of developing a grantee table in GICS or in any other
system used to manage grants.

With respect to Recommendations 1-3, GAD is chartering a
workgroup  comprised of Headquarters and Regional grants office
personnel to identify policy, data element, code  and process
requirement options and raise them to management for a national
decision.  These decisions will impact future practices, and will be
documented, along with Recommendations 4 - 6, in the
requirements analysis for IGMS.

Consistent with Recommendation 7, GAD is following system
development life cycle procedures for the development of IGMS.

-------
                                       APPENDIX I
                Grants Administration Division's Response to Draft Report
                     UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
                                WASHINGTON, D.C. 20460
                                     JUN 20 1997
                                                                          OFFICE OF
                                                                       ADMINISTRATION
                                                                       AND RESOURCES
                                                                        MANAGEMENT

     MEMORANDUM

     SUBJECT:   Draft Report: User Satisfaction of the Grants Information and Control System
                Report No.

     FROM:      Gary M. Katz, Director
                Grants Administration Division (3903F)

     TO:        Patricia H. Hill, Director
                ADP Audits and Assistance Staff (2421)

            We have reviewed the "Draft Report: User Satisfaction of the Grants Information and
     Control System," and agree with its findings.  Many of the weaknesses identified in this user
     survey were also identified in an FY96 survey of grants project officers and grants specialists and
     documented in the business case analysis for the replacement system for GICS, titled "Business
     Process Analysis: Identification of Problem Areas" (March 1997).

            In addition, we are substantially in agreement with the recommendations identified in the
     report, with the exception of Recommendation 5.  That recommendation calls for the analysis of
     the costs and benefits of developing and implementing a grantee table.  We believe such an
     analysis is superfluous since your study and our own clearly document the need for a grantee
     table.  Such a table not only significantly increases the integrity and consistency of the data, but
     also reduces repetitive data entry for frequent applicants.  We suggest instead that you
     recommend a grantee table be included in the IGMS design.

            With respect to the first three recommendations, we are chartering a workgroup
     comprised of Headquarters and Regional grants office personnel to identify policy, data element,
     code and process requirement options and raise them to management for national decision by
     August 30, 1997.  These decisions will govern future practice under the current GICS and will be
     documented, as appropriate, along with Recommendations 4 - 6, in the requirements analysis for
     IGMS due September 30, 1997.

-------
                                                                                       Appendix I
                                                                                       Page 2 of 2


Needs Survey and Feasibility Study are complete, and a Security Plan/Risk Analysis has been
drafted for Phase one of the system.

       We appreciate the opportunity to comment on the Draft Report ... final document.  If you
have any further questions, you may call me at 26...

-------
                            APPENDIX II
                    Summary of Objective Survey Questions
      326               Surveys mailed
      156   47.85%      Surveys Received and Answered
      3     0.92%       Surveys Returned To Sender
       34   10.43%      Respondent no longer uses GICS
      7     2.15%       Respondent never used GICS
      1     0.31%       Respondent plans to use GICS in the future
      125   38.34%      Outstanding Surveys

The following is based on the 156 surveys received and answered.

      Employee location:

          23    14.65%   Headquarters
         134    85.35%   Region


GENERAL INFORMATION

A. 1.   What subsystems of GICS do you or your office work in?

               12.85%   HAGDS
               38.96%   RAGDS
               28.51%   IAMS
                3.61%   FADS
                6.83%   Other
                9.24%   No response

A.2.   In relation to the work of your division/department/office, GICS data is:

               71.15%   Very Important
               10.90%   Somewhat Important
                8.33%   Important
                6.41%   Somewhat unimportant
                1.92%   Unimportant
                1.28%   No response



-------
                                                                           Appendix II
                                                                             Page 2 of 6
A.3.   How often do you or your office use GICS data?

                 62.82%   Daily
                 12.18%   Weekly
                  9.62%   Monthly
                  6.41%   Quarterly
                  1.92%   Semi-annually
                  2.56%   Annually
                  3.21%   Other
                  1.28%   No response

A.4.  Which of the following best describes the usefulness of GICS data?

                 41.67%    Always useful
                 22.44%    Almost always useful
                 17.31%    Frequently useful
                 16.03%    Sometimes useful
                  0.00%    Never useful
                  2.56%    No response

A.5.  How do you or your office use GICS data?

                 34.51%   Stay abreast of grants
                 22.28%   Progress reports
                 17.12%   Report milestones
                  7.88%   Report to Congress
                 17.66%   Other
                  0.54%   No response


ACCURACY OF GICS DATA

B. 1   Generally, which of the following best describes the accuracy of GICS data reported?

                 17.95%    Always accurate
                 44.87%    Almost always accurate
                 16.67%    Frequently accurate
                  8.97%    Sometimes accurate
                  0.00%    Never accurate
                 11.54%    No response

-------
                                                                              Appendix II
                                                                               Page 3 of 6
B.2    Explain/provide examples of inaccurate GICS data elements.

                 44.23%    Examples provided
                 37.82%    No examples provided
                 17.95%    Question skipped due to accurate data

B.3    In your opinion, which of the following best describes why GICS data is inaccurate?

                   5.90%    Delayed data base update
                  11.52%    Delayed data entry
                   3.93%    Delayed document processing
                 16.57%    Data input errors
                   8.99%    Data/Grants not in system
                   3.37%    Data field not in system
                 12.08%    Insufficient controls over data quality
                   7.58%    Insufficient definition of data elements
                   8.43%    Incomplete data
                   3.37%    Unknown
                   5.06%    Other
                   7.87%    Question skipped due to accurate data
                   5.34%    No response

B.4    Which of the following best describes the action you take to achieve accuracy of GICS
       data?

                 24.29%    Identify errors and request new run
                 24.76%    Reconcile with manual records
                   8.57%    Reconcile with automated records from another system
                 11.43%    No action, use report as best as possible
                   8.57%    Other
                 13.33%    Question skipped due to accurate data
                   9.05%    No response

-------
                                                                             Appendix II
                                                                              Page 4 of 6
B.5    How does inaccurate GICS data affect the performance of your job tasks?

                  7.50%    Inaccurate reports to Congress
                  6.07%    Inaccurate data transferred to IFMS
                  3.57%    Over-obligation of grant dollars
                  4.29%    Under-obligation of grant dollars
                  13.21%    Grants not accounted for
                  12.86%    Grant dollars not accounted for
                  16.79%    Grants not closed out timely
                  15.36%    Other
                  10.00%    Question skipped due to accurate data
                  10.36%    No response
B.6    How would you improve the accuracy of GICS data?

                 14.87%    Include more edit checks
                 12.31%    Create a standard data entry sheet
                 13.85%    Modify data entry sheet to match fields in system
                 21.54%    Other
                 14.36%    Question skipped due to accurate data
                 23.08%    No response
SYSTEM FUNCTIONALITY

C.1   Which of the following best describes your general experience with the system response time?

                 16.03%   Very fast
                 37.18%   Fast
                 28.85%   Moderate
                  7.69%   Slow
                  0.00%   Delayed
                 10.26%   No response

-------
                                                                             Appendix II
                                                                              Page 5 of 6
C.2   Generally, how would you describe the system's operational availability record over the past six
      months?

                 12.74%   Always available
                 34.39%   Available, except for preventive maintenance
                 35.03%   Some intermittent down-time
                  4.46%   Frequent down-time
                  0.64%   Extensive down-time
                  0.00%   Not available
                 12.74%   No response

C.3   Is the system easy to learn?

                   5.77%    Extremely easy
                  9.62%   Very easy
                 46.79%   Easy
                 21.15%   Difficult
                  8.33%   Very difficult
                  8.33%   No response

C.4   Is the system easy to use?

                  7.05%   Extremely easy
                  8.97%   Very easy
                 48.08%   Easy
                 21.79%   Difficult
                  5.77%   Very difficult
                  8.33%   No response

C.5   Does the screen layout facilitate data entry?

                 57.69%   Yes
                 24.36%   No
                 17.95%   No response

-------
                                                                             Appendix II
                                                                               Page 6 of 6
C.6    Did you receive adequate training to effectively use the system?

                  50.97%    Yes
                  37.42%    No
                  11.61%    No response

C.7    GICS data is:
                  14.10%     Very easy to understand
                 35.90%     Somewhat easy to understand
                 29.49%     Neither easy nor difficult to understand
                  8.97%     Somewhat difficult to understand
                  2.56%     Very difficult to understand
                   8.97%     No response

C.8   Does the data within GICS duplicate any other information received?

                 54.84%   No
                 22.58%   Yes
                 22.58%   No response

-------
                                 APPENDIX III
                         System Development Lifecycle Process
The SDLC can be separated into six major phases—initiation, definition, system design, programming and
training, evaluation and acceptance, and installation and operation. Each of these phases encompasses
several steps which result in the production of key SDLC documentation. Each phase must be completed
before the next can be started.  At the completion of each phase, all previous work is reviewed, and a
"go/no go" decision is made. This progression provides a structured approach to the development
process. Figure 2 provides a graphical depiction of the SDLC phases and the flow of documents.
        Figure 2: SDLC Process and Documentation Flow
DOCUMENTATION REQUIREMENTS CODES
A.  Needs Statement
B.  Feasibility Study
C.  Risk Analysis
D.  Cost/Benefit Analysis
E.  System Decision Paper
F.  Audit Plan
G.  Project Plan
H.  Functional Requirements Document
H.' Functional Security and Internal
    Control Requirements Document
I.  Data Requirements Document
I.'  Data Sensitivity/Criticality Description
J.  System/Subsystem, Program & Data Base Specifications
J.'  Security and Internal Control Related Specifications
K.  Validation, Verification and Testing Plan and Specifications
L.  User Manual
M. Operations/Maintenance Manual
N.  Installation & Conversion Plan
O.  Test Analysis & Security Evaluation Report
Note: Document subscripts refer to successive iterations of that document.

-------
                                                                               Appendix III
                                                                               Page 2 of 6
A description of each phase follows.

Phase 1: Initiation

The Initiation Phase begins with the recognition of a problem and the identification of a need. During this
phase, the need is validated, alternative functional concepts to satisfy the need are explored, and a
functional recommendation is developed.  If the project is approved by management, it continues through
the remaining phases of systems development.  The decision to pursue a solution must be based upon a
clear understanding of the problem, a preliminary investigation of alternative solutions, and a comparison
of the expected benefits versus costs. Common documents produced during this phase include:

       •  Needs Statement -  A Needs Statement should describe the deficiencies in existing
          capabilities, new or changed program requirements, or opportunities for increased
          economy and efficiency. It should justify the exploration of alternative solutions.

       •  Feasibility Study - The purpose of the Feasibility Study is to provide: (1) an analysis of
          the objectives, requirements and system concepts; (2) an evaluation of alternative
          approaches for reasonably achieving the objectives; and (3) identification of a
          proposed approach.

       •  Risk Analysis -  The purpose of the Risk Analysis is to identify internal control and
          security vulnerabilities of a system.  In addition, the Risk Analysis should provide
          managers, designers, systems security specialists and auditors with recommended
          safeguards to be included during development.

       •  Cost/Benefit Analysis -  The Cost/Benefit Analysis document provides managers,
          users, designers, systems security specialists and auditors with adequate cost and
          benefit information.  It should include the impact of security, privacy and internal
          control requirements of the information, as well as  evaluate alternative approaches to
          meeting mission deficiencies.

        •  System Decision Paper - The System Decision Paper provides the information and
           framework critical to management's decision-making process during the SDLC.

Phase 2: Definition

In this phase, the functional requirements are defined, and detailed planning for the development of an
operable system begins. The functional requirements and processes to be automated are documented and
approved by senior management before the development effort is started. The requirements identification
and analysis of potential risk should be an iterative process.  It is critical  that  specific internal control and

-------
                                                                                Appendix III
                                                                                Page 3 of 6
security requirements are identified during this process.  Requirements may be, and commonly are,
modified in later phases as a better understanding of the problem is gained.  The documents produced
during this phase include:

        •  Audit Plan - The objective is to assess the adequacy of internal ADP controls and
          provide the "reasonable assurances" to management that the system will comply with
          GAO's Government Auditing Standards (Yellow Book).

        •  Project Plan - The Project Plan specifies the strategy for managing the development
          project. It defines the goals and activities for all subsequent phases, and includes
          resource estimates and milestones. The Project Plan should describe the unique SDLC
          methodology to be used during the life of the particular project and include the
          methods for design, documentation, problem reporting, and change control.

        •  Functional Requirements Document - The Functional Requirements Document
           provides a basis for the mutual understanding between users and designers of the
          system. It should include the requirements, operating environment, and development
          plan.

       •  Functional Security and Internal Control Requirements Document - The purpose of
          this document is to focus the attention of the user and system designer on the
          security/internal control needs of the system. The basis for this document is the
           vulnerabilities identified during the Risk Analysis and established internal control
          standards.

       •  Data Requirements Document - The purpose of this document is to provide a data
          description and technical information about data collection requirements.
       •  Data Sensitivity/Criticality Description - This document provides a preliminary
          determination of data sensitivity and  a general statement of the nature and magnitude
          of potential threats for use in the formal Risk Analysis.

Phase 3: System Design

The objective of this phase is to develop detailed design specifications which describe the physical
solution to the system requirements developed during Phase 2. The solution provides a specific high-
level definition including information flows, logical processing steps, as well as all major interfaces and
their inputs and outputs. During this phase management should define and approve internal control and
security specifications prior to acquiring or starting formal development of the applications. The
validation, verification, and testing (VV&T) goals are also identified during this phase, and a plan for
achieving these goals is developed. The Project Plan (schedules, budgets, deliverables, etc.) and Risk


-------
                                                                               Appendix III
                                                                               Page 4 of 6
Analysis are reviewed and revised as required. The Initiation and Definition Phases are designed to
clarify and document user needs and requirements, whereas the System Design Phase takes those
requirements and converts them into specifications for a computerized system. Common documents
produced during this phase include:

       •  System/Subsystem, Program and Data Base Specifications - The purpose of this
          document is to specify the requirements, operating environment, design characteristics,
          and program specifications.  Also, the physical and logical characteristics of a
          particular data base should be specified.

       •  Security and Internal Control Related Specifications - The purpose of this is to
          document the security and internal control specifications needed to meet functional
          security and internal control requirements.

        •  VV&T Plan and Specifications - The purpose of the VV&T Plan is to establish the
           plan for evaluating the quality and correctness of the software.  The VV&T Plan includes
          plans for the testing of software, including detailed program specifications,
          descriptions, internal controls and security specifications, and procedures for all tests,
          as well as test data reduction and evaluation criteria.

Phase 4: Programming and Training

During this phase, programs will be developed and tested. Programming is the process of implementing
the detailed design specifications into code. The process of converting specifications to executable code
is primarily dependent upon the completeness and specificity of the program design.  If the program is
well defined, the process of programming is not technically complex. Completed code will then undergo
unit testing.  Training is critical to the success of the system, and therefore, attention needs to be paid to
the proper development and use of training materials.  The success  of the system will be directly
attributable to how well the users are trained. For those parts of the system which they do not
understand well, the probability exists that the users will not use those features, or use them incorrectly.
Common documents produced during this phase include:

       •  User Manual - The purpose of the User Manual is to sufficiently describe the
           functions performed by the software in non-ADP terminology, so that users can
          determine its applicability, as well as when and how to use it.

        •  Operations/Maintenance Manual - Two separate manuals may be necessary. The
          purpose of the Operations Manual is to provide computer operations personnel with a
          description  of the software and the operational environment so the software can run.
          The Program Maintenance Manual provides the maintenance programmer with the

-------
                                                                                Appendix III
                                                                                 Page 5 of 6
          information and source code necessary to understand the programs, their operating
          environment, and their maintenance procedures and security requirements.

       •   Installation and Conversion Plan - The Implementation and Conversion Procedures
          are a tool for directing the installation or implementation of a system at locations other
          than the test site. This tool is used after testing has been completed.

Phase 5: Evaluation and Acceptance Phase

The objective of this phase is to ensure the system is acceptable to the users, prior to  placing it into a
production environment. Generally, three types of program testing are performed during this phase:
(1) unit testing which validates each individual unit; (2) integration testing which validates the interfaces
between the units and the operating environment; and (3) system testing which validates the interaction
between the application system and the user area. The results of these tests will provide user
management with the information necessary to make a decision  on acceptance, modification, or rejection
of the system.  Also, the system should be field tested in one or  more representative operational sites.
For particularly sensitive systems, disaster recovery and continuity of operations plans should be fully
documented and operationally tested as well.  If designated a "sensitive" system, it should be certified for
technical adequacy in meeting its security requirements by an appropriate authority, prior to accreditation
and installation.  Before certification, all VV&T test results should be documented and a comparison of
actual and expected results made. A common document produced during this phase is the:

       •   Test Analysis and Security Evaluation Report - The purpose of this Report is to:
          (1) document the test analysis results and findings; (2) present the demonstrated
          system capabilities and deficiencies, (including the Security Evaluation Report needed
          for certification of the system); and (3) provide a basis for preparing a statement of
          system/software  readiness for implementation.

Phase 6: Installation and  Operation

The purpose of this final SDLC phase is to: (1) implement the approved operational plan, including
extension/installation at other sites; (2) continue approved operations; (3) budget adequately; and
(4) control all changes and maintain/modify the system during its remaining life. Problem reporting,
change requests, and a formal system of controls must be in place whereby unforeseen problems or
changing user requirements  can be addressed.  At a minimum, such a system should provide for:
(1) formal, documented change requests and approvals; (2) thorough testing; (3) VV&T and
recertification; and (4) user  acceptance.
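
The four minimum controls listed above can be sketched as a record that tracks each change request through every required step before installation. The field and step names below are illustrative assumptions, not an EPA-prescribed format.

```python
from dataclasses import dataclass, field

# The four minimum controls named in the text, in illustrative wording.
REQUIRED_STEPS = (
    "documented approval",   # (1) formal, documented change requests and approvals
    "testing",               # (2) thorough testing
    "recertification",       # (3) VV&T and recertification
    "user acceptance",       # (4) user acceptance
)

@dataclass
class ChangeRequest:
    """Hypothetical change-control record; field names are assumptions."""
    request_id: str
    description: str
    completed_steps: list = field(default_factory=list)

    def complete(self, step: str) -> None:
        """Record completion of one control step."""
        if step not in REQUIRED_STEPS:
            raise ValueError(f"unknown control step: {step}")
        self.completed_steps.append(step)

    def ready_to_install(self) -> bool:
        """A change may be installed only after all four controls are met."""
        return all(s in self.completed_steps for s in REQUIRED_STEPS)
```

Under this sketch, a request with only testing complete is not installable; every step, including user acceptance, must be recorded first.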

In addition, periodic performance measurement and evaluation activities are performed to ensure the
system continues to meet its requirements in a cost-effective manner in the context of a changing system

                                               25                              Report No. 7100237

-------
                                                                                Appendix III
                                                                                Page 6 of 6
environment. These reviews should be of the entire system (both manual and automated processes), and
they should ensure the system continues to: (1) meet user requirements; (2) maintain the necessary
internal controls and security mechanisms to consistently produce reliable results; and (3) operate in
accordance with Agency and Federal standards, and approved design specifications.
                                              26                             Report No. 7100237

-------
                               APPENDIX IV
                                  Report Distribution
  Office of Inspector General

     Acting Inspector General (2410)

     Assistant Inspector General for Audit (2421)

     Principal Deputy Assistant Inspector General for Audit (2421)

     Deputy Assistant Inspector General for Internal Audits (2421)


  EPA Headquarters

     Director, Office of Grants and Debarment (3901F)

     Director, Grants Administration Division (3903F)

     Agency Audit Followup Official (3101)
      Attn: Assistant Administrator for Administration and Resources Management

     Agency Audit Followup Coordinator (2710)
      Attn: Audit Management Team

     Audit Liaison (3102)
      Attn: Office of Policy and Resources Management

     Audit Liaison (3402)
      Attn: IRM Policy and Evaluation Division

     EPA HQs Library
                                            27                         Report No. 7100237
