 MEMORANDUM

 SUBJECT:  Report on Special Review CERCLIS Post- Implementation
           Evaluation, Report No. E1SFGO-15-0020-0400019

 FROM:     Kenneth A. Konz
           Assistant Inspector General for Audit

 TO:       Don R. Clay, Assistant Administrator
           Office of Solid Waste and Emergency Response
 SCOPE AND OBJECTIVES

 We have completed a review of the CERCLIS (Comprehensive
 Environmental Response, Compensation, and Liability Information
 System) post-implementation evaluation.  The objective was to
 evaluate the adequacy of the work performed and the response of
 management to its conclusions and recommendations.

 The work we performed is considered a  special review.   It is not
 an audit in accordance with generally  accepted governmental
 auditing standards.  Special reviews are short-term studies of
 EPA activities.  It is our belief that such quick turnaround
 studies are useful to management by giving a  timely, informative,
 independently obtained picture of operations.  The goal of a
 special review is to produce timely constructive change while
 minimizing the resources invested in studying and documenting the
 areas at issue.

 The review was conducted at EPA Headquarters.  We examined the
 study's work plan and the final report.  We  interviewed the
 CERCLIS Project Officer and the Director of OSWER's Information
 Management Staff.  The review was performed at the suggestion of
 the Director, Information Management Staff,  and included an
 examination of the work plan prior to  the  start of the study.

 The predominant part of our field work was performed between
 October and December 1989.  In July  1989,  we  reviewed and
 commented upon the adequacy of the work plan, which was  issued
 June 26, 1989.  We had reservations  concerning the absence of
 testing and verification procedures  for some of the proposed
 tasks.  Our comments were sent to the Project Officer with a copy
 to the Director, OSWER Information Management Staff.  We did not
 receive a response to those comments.

 SUMMARY OF FINDINGS

 The overall value of  the CERCLIS post-implementation evaluation
 is questionable because four critical tasks were not performed
 adequately.   In addition,  an assessment of the quality of the
 system's software was not performed.  Because of the vital nature
 of these omissions, additional evaluation work must be performed
 at additional cost.

 Sufficient independent testing and verification were not
 performed on four critical tasks.  These tasks involved data
 management,  change controls, data base integrity, and security.
 These areas affect the ability of system users to rely on the
 information produced by CERCLIS in making both day-to-day and
 strategic decisions.   A weakness in any of these areas could
 jeopardize the system's life because utilization would be so low
 that system costs could not be justified.  The contract team
 performing the study  relied solely on interviews, surveys and
 whatever documentation was made available in their evaluation  of
 the system.   Without  independent verification and testing of
 procedures,  there can be no assurance that controls are actually
 in place and effective.  As a result, any conclusions drawn by
 the study for these tasks  must be considered suspect and, due  to
 the critical nature of the tasks, the overall value of the
 evaluation questioned.  The need to expand the scope of these
 tasks to include  independent testing and verification was
 discussed in our  critique  of the work plan which we issued to  the
 Project  Officer in July 1989.  Had testing and verification
 guidelines for system evaluations been included in OSWER System
 Life Cycle Management Guidance, it is conceivable that this issue
 would not have surfaced.

 An assessment of  software  quality was not performed as originally
 planned.   The evaluation was to have included the adequacy of
 edits, the ability of data elements to support reporting
 requirements, and ease of  maintenance.  Furthermore, responsible
 officials have stated that it will not be done in the future
 because  of the work on CERCLIS Reporting  (draft report  issued
 12/13/89)  performed by the Office of Inspector General  (OIG).
 The  OIG  limited its review to report programming.  Without  a
 thorough review of software quality, there can be no assurance
 that information  needs are being met, that information  is
accurate,  and that future  maintainers of the system can be
effective and efficient.

We provided a draft report, dated February 20,  1990,  to the
Assistant Administrator for the Office of Solid Waste and
Emergency Response.  The Assistant Administrator's complete
response to that draft is included in this report as an
attachment.

ACTION REQUIRED

In accordance with EPA Directive 2750, please provide us with
your written comments on the recommendations contained in this
report within 90 days of the date of the report.  In
particular, your comments should address the actions taken or
planned.  For corrective action that is still in process, please
provide a written action plan that includes the establishment of
specific milestone dates for the completion of the corrective
action.

We appreciated the cooperation and assistance of your staff.
Should you or your staff have any questions, please have them
contact Sheldon Kantrowitz, Acting Chief, ADP and Statistics
Unit, on 382-7603.
BACKGROUND

The Comprehensive Environmental Response, Compensation, and
Liability Information System (CERCLIS) is a data base that was
developed to aid EPA Headquarters and regional personnel with
Superfund site, program, and project management.  CERCLIS is the
required and sole source of Superfund planning and accomplishment
data.  As such, information reported from it serves as the
primary basis for strategic decision-making for the multi-billion
dollar Superfund program.

The purpose of a post-implementation evaluation is to determine
whether the system is meeting users' expectations and the needs
that justified its development.  It assesses whether the system
produces accurate, timely, and reliable information.  This
evaluation normally is the first evaluation of an application
system after the system has been operating and reached  a
substantial level of stability.  The evaluation provides
significant information for further modifications to  the system.

                   FINDINGS AND RECOMMENDATIONS

 1.  Certain Post-Implementation Evaluation Findings Are Suspect

 Sufficient independent testing and verification were not
 performed on four tasks.  These tasks involved data management,
 change controls, data base integrity, and security.   The
 contract team performing the study relied solely on interviews,
 surveys and whatever documentation was made available in their
 evaluation of the system.  Without independent verification and
 testing of procedures, there can be no assurance that controls
 are actually in place and effective.  As a result, any
 conclusions drawn by the study for these tasks must be considered
 suspect and the evaluations must be attempted again at additional
 cost to the Government.  Weaknesses that are overlooked in any of
 these four critical areas could adversely affect the life of the
 system because users would not be able to rely on it for their
 information needs.  The need to expand the scope of these tasks
 to include independent testing and verification was discussed in
 our critique of the work plan which we issued to the Project
 Officer in July 1989.  Had testing and verification guidelines
 for system evaluations been included in OSWER System Life Cycle
 Management Guidance, it is conceivable that this issue would not
 have surfaced.

 OSWER Directive 9028.00 on System Life Cycle Management Guidance
 includes a section on the Evaluation stage.  One such evaluation
 is the post-implementation study, which is described as a one-time-
 only review that addresses "all aspects of the system—support of
 functional and data requirements for the system, system technical
 performance, and effectiveness of system management."  The topics
 to be included in the post-implementation evaluation report are
 listed.  The type and depth of testing and verification methods
 to be employed by those assigned to perform the evaluation are
 not discussed, however.

 We identified four of the planned work tasks as lacking
 sufficient testing and verification procedures:

          Data Management  Procedures Assessment
          Configuration/Change Management Procedures Review
          Processing Controls and Database Integrity Assessment
          Security Assessment

The Data Management task was concerned primarily  with the
effectiveness of data quality  control features.   The  contract
team performing the review relied solely upon  interviews,
questionnaires, and so-called  CERCLIS "Audit"  reports.   The team
utilized these "Audit" reports  without testing their accuracy.

 Change Control Management has to  do with assuring that all
 changes are authorized,  adequately documented, tested, and
 accomplished in a timely manner.  Those performing this review
 limited their scope to that part  of the change control process
 involving processing of user requests.  It was these requests and
 interviews regarding them that provided all the evidence in
 support of the team's conclusions and  recommendations.
 Consideration was not given to such areas of change control as
 assuring that only tested and authorized programming is entered
 into production, records are maintained of all changes to
 production programming and of the names of the most current
 versions, and changes made to one program or system module
 are also made to  other related or dependent programs or modules.
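
 To illustrate the kind of independent verification that was not
 attempted, the following is a minimal, hypothetical sketch, written
 in Python for readability.  CERCLIS was not written in Python, and
 the record layouts shown are assumptions made for the example, not
 actual CERCLIS structures.  The sketch cross-checks the program
 versions running in production against the change requests that are
 supposed to authorize them.

     # Hypothetical sketch of an independent change control check.
     # The record layouts are illustrative assumptions, not actual
     # CERCLIS structures.
     from dataclasses import dataclass

     @dataclass
     class ProductionProgram:
         name: str
         version: str          # version currently running in production

     @dataclass
     class ChangeRequest:
         program: str
         version: str          # version the change request produced
         authorized: bool      # management sign-off on record
         tested: bool          # test results on record

     def unverified_changes(inventory, change_log):
         """Return production programs whose current version is not
         backed by an authorized and tested change request."""
         approved = {(c.program, c.version) for c in change_log
                     if c.authorized and c.tested}
         return [p for p in inventory
                 if (p.name, p.version) not in approved]

     # Example: one production program has no authorized, tested change
     # request on file and is therefore flagged for follow-up.
     inventory = [ProductionProgram("SITE_RPT", "3.2"),
                  ProductionProgram("FUND_SUM", "1.7")]
     change_log = [ChangeRequest("SITE_RPT", "3.2",
                                 authorized=True, tested=True)]
     for program in unverified_changes(inventory, change_log):
         print(program.name, program.version,
               "- no authorized, tested change request on file")

 A review that performed even this level of matching would have
 produced evidence, rather than assertions, about whether only
 authorized and tested programming reaches production.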

 The Processing Controls  and Database Integrity Assessment
 involved an evaluation of (1)  the ability of the computer
 processes to detect improper data when computing totals or other
 fields and (2)  whether reports have information which allows
 clear identification of  the data  and its period of relevance.
 Information was obtained through  surveys and a review of existing
 documentation.  The team performed no  testing through the
 computer of  the accuracy and reliability of system processing.
 In  fact,  one of the recommendations was the use of CERCLIS audit
 reports,  the accuracy of which the team did not test.
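
 As an illustration of what testing through the computer might have
 involved, the following is a minimal, hypothetical sketch in Python.
 The field names and figures are invented for the example and are not
 actual CERCLIS data.  The sketch independently recomputes a report
 total from the underlying detail records and flags any disagreement;
 a check of this kind would also have tested the accuracy of the
 CERCLIS audit reports on which the team relied.

     # Hypothetical sketch of an independent recomputation test.
     # Field names and figures are illustrative, not actual CERCLIS data.
     def recompute_total(records, field):
         """Independently sum a field across the underlying detail records."""
         return sum(record[field] for record in records)

     def verify_report_total(records, field, reported_total, tolerance=0.0):
         """Compare a report's stated total against the recomputed total."""
         recomputed = recompute_total(records, field)
         difference = reported_total - recomputed
         return abs(difference) <= tolerance, recomputed, difference

     # Example: the report understates the total relative to the detail
     # records, so the check reports the size of the discrepancy.
     detail_records = [{"site": "A", "obligations": 1_200_000.0},
                       {"site": "B", "obligations": 800_000.0}]
     ok, recomputed, difference = verify_report_total(
         detail_records, "obligations", reported_total=1_500_000.0)
     if not ok:
         print("report total differs from recomputed total by", difference)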

 The Security Assessment  involved  evaluating procedures used to
 protect data from unauthorized access.  Managers were surveyed
 and certain  unidentified documents were reviewed.  No actual
 testing was  planned or carried out.

 In  July 1989, we  reviewed and commented upon the adequacy of the
 work plan, which  was issued June  26, 1989.  We had reservations
 concerning the  absence of testing and  verification procedures for
 some of the  proposed tasks.   The  four  cited here were included  in
 that group.  Our comments were sent to the Project Officer with a
 copy to the  Director,  OSWER Information Management Staff.  We did
 not receive  a response to those comments.

 Due to  the omission of independent testing and verification  for
 the four tasks  of the CERCLIS post-implementation evaluation
 discussed above,  any findings,  conclusions, or recommendations
 proceeding from them must be considered suspect.  Since the four
 tasks involved  the critical areas of data management, change
control,  data base integrity,  and security, we must conclude  that
the  entire evaluation was of questionable value.  Adequate
evaluations of  the four  areas should be attempted again because
an overlooked weakness in any of them could severely diminish the
accuracy  and reliability of the information generated by  CERCLIS
and the willingness of EPA management  and staff to use the
system.

 The principal factor contributing  to the absence of sound testing
 and verification procedures was  the omission of requirements for
 them from the evaluation work plan and  from the OSWER System Life
 Cycle Management Guidance.

 RECOMMENDATIONS

 We recommended that the Assistant  Administrator for OSWER
 require:

      1.    The Director of the OSWER Information Management Staff
           to include the requirement for independent testing and
           verification procedures  in the performance of system
           evaluations.

      2.    That evaluations be made of the  four areas discussed
           herein:   data management, change controls, data base
           integrity,  and security.

 SUMMARY OF OSWER COMMENTS:

 Recommendation 1:

 OSWER life cycle guidance does not address the question of "how"
 a  project manager ought to address an activity.  The decision to
 omit  such information was reached  because  (1) most such policies
 issued by Federal  organizations  do not  prescribe how a task is to
 be performed;  (2)  guidance which prescribed technical
 methodologies  tended to be too narrow to apply across a broad
 range of  applications and quickly  became outdated; and  (3) it is
 often the case that employment of  secondary sources of
 information is both appropriate  and cost effective.  This type of
 guidance  should be issued by OIRM.

 Recommendation 2:

 The Assistant Administrator said that the fact that the evaluation
 was not done in a certain way was not a sufficient reason to
 require the work to be redone.  Further, the OIG did not address
 whether the evaluation satisfied the OSWER guidance.

 OUR EVALUATION OF OSWER'S COMMENTS

 Recommendation 1:

The OIG did not recommend any technical methodologies.  Our
office simply pointed out that reliable  judgments on four crucial
areas could not possibly be made without independent verification
or testing.  That program officials  could allow contractors to
proceed to perform the review  without  testing is a clear
indication that more guidance is required.  A copy of this report
 will  be  sent to the Director, OIRM, to determine if any further
 Agency-wide guidance is warranted.

 Recommendation 2:

 The OIG  did not require that the review of the four areas be done
 in any special way.  Rather, as discussed above, we pointed out
 that  no  reliable determinations could be developed by the
 contractors because the work they performed was not adequate.

 2.  A Software Assessment Should Be Performed

 An assessment of software quality was not performed as originally
 planned.  The evaluation was to have included the adequacy of
 edits, the ability of data elements to support reporting
 requirements, and ease of maintenance.  Furthermore, responsible
 officials have stated that
 it will  not be done in the future because of the work on CERCLIS
 Reporting (draft report issued 12/13/89) performed by the Office
 of Inspector General (OIG).  The OIG limited its review to report
 programming.  Without a thorough review of software quality,
 there can be no assurance that information needs are being met,
 that  information is accurate, and that future maintainers of the
 system can be effective and efficient.

 The work plan for the CERCLIS post-implementation evaluation
 provided for a Software Quality Assessment that would ensure that
 the software met acceptable standards and CERCLIS requirements.
 The sufficiency of edits designed to prevent bad data from
 entering the system would also be reviewed.  In particular,  10
 percent  of the programs would be examined to develop an
 assessment of source code readability and modularity.  The more
 easily program code can be read, including its comments, and the
 more it is composed of small rather than large modules
 (programming routines or procedures), the easier it is to
 understand and maintain.  Software maintenance involves the
 modification of programming to make corrections or to reflect
 changing requirements.
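
 As an illustration of the edits referred to above, the following is
 a minimal, hypothetical sketch in Python.  The field names and rules
 are assumptions made for the example and are not actual CERCLIS
 edits.  The sketch rejects a record containing an invalid date, a
 negative dollar amount, or a missing site identifier before the
 record can enter the data base.

     # Hypothetical sketch of simple data entry edits.  The field names
     # and rules are illustrative assumptions, not actual CERCLIS edits.
     from datetime import datetime

     def edit_record(record):
         """Return a list of edit failures; an empty list means the
         record passes all edits."""
         errors = []
         try:
             datetime.strptime(record.get("completion_date", ""), "%m/%d/%Y")
         except ValueError:
             errors.append("completion_date is not a valid MM/DD/YYYY date")
         if record.get("obligations", 0) < 0:
             errors.append("obligations must not be negative")
         if not record.get("site_id"):
             errors.append("site_id is required")
         return errors

     # Example: a record with a bad date and a negative amount fails two
     # edits and would be rejected rather than stored.
     bad_record = {"site_id": "XY1234567890",
                   "completion_date": "13/45/1989",
                   "obligations": -500.0}
     for failure in edit_record(bad_record):
         print("edit failure:", failure)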

When programming is too complex to be readily understood and
lacks adequate commentary, error-free maintenance becomes
difficult, if not impossible.  A  further result is higher
programming costs.  A programmer will have to spend  a
considerably longer period of time on maintaining  complex code
than on  code that can be readily understood.  In fact,  programs
may have to be rewritten  from scratch.


The Software Quality Assessment was not done.   The reason given
 in the final post-implementation  evaluation report was that:

           Adequate documentation was not made available
           to the evaluation team to make a technical
           assessment of the software quality.

 The evaluation's Project Officer told us that this statement in
 the report was not true.  He said  that the necessary materials
 were made available to the evaluation team, but that the team
 simply ran out of time.   In fact,  he added, the contractor
 performing the evaluation requested additional funds to complete
 the Software Assessment.

 The Project Officer said that no Software Assessment was now
 planned because of the work already performed in this area by
 auditors of the OIG in fiscal year 1989.  He also said that
 procedures implemented as a result of that audit should address
 any deficiencies in the area of software quality.

 The scope of the OIG audit was limited to a review of CERCLIS
 Reporting.   Serious deficiencies in software quality for reports
 were revealed.   These included a high number of significant
 coding errors,  one of which was an understatement of $500 million
 in a report for a single region.   Documentation was so poor as to
 make maintenance difficult, if not impossible.  However, the
 audit did not encompass a review of software quality for other
 than report programming.  Since the quality of report programming
 was inadequate, we must conclude that it is likely that the other
 system software is inadequate as well and that an additional
 review of software quality is warranted.

 The decision by CERCLIS management not to plan for a complete
 assessment  of software quality resulted, we believe, from
 insufficient awareness of the complexity of system programming.
 Without thorough examination and testing, errors may go
 undetected  and overall performance may be unreliable.  For these
 same reasons,  there should be an awareness of the need to
 periodically perform thorough and  in-depth reviews of how
 software is being maintained and how it is performing.

RECOMMENDATION

We recommended that the Assistant Administrator for OSWER require
that a complete assessment  of  CERCLIS  software be performed as
soon as feasible.

SUMMARY OF OSWER COMMENTS:

The Assistant Administrator accepted this recommendation.

OUR EVALUATION OF OSWER'S COMMENTS

CERCLIS was developed and is maintained in System 2000 software.
National Computer Center memorandum #714, dated March 19,  1990,
stated that System 2000 would be completely removed from the IBM
mainframe environment by early Fiscal Year 1993 and that
applications should be converted to some other software sometime
before then.  As a result, it is now our opinion that a complete
assessment of CERCLIS software for the current system would no
longer be warranted.  It would be more cost effective to plan for
a careful reimplementation utilizing other software.  Plans
should emphasize the development of quality software.

                        REPORT DISTRIBUTION
Director,  Office of Emergency and Remedial Response (OS-200)
Director,  Office of Program Management and Technology (OS-110)
Director,  Office of Information Resources Management (PM-211)
Office of  the Comptroller (PM-225)
Associate  Administrator for Regional Operations  and
        State/Local Relations (A-101)
Agency Followup Official (PM-225), Attn:  Director, Resource
        Management Division
Inspector  General (A-109)