United States
Environmental Protection
Agency
Office of
Inspector General
401 M Street, SW
Washington, DC 20460
Report of Audit
REPORT ON CERCLIS REPORTING
AUDIT NO. E1SFF9-15-0023-0100187
MARCH 12, 1990
-------
TABLE OF CONTENTS

SCOPE AND OBJECTIVES .............................................. 1
SUMMARY OF FINDINGS ............................................... 2
ACTION REQUIRED ................................................... 4
BACKGROUND ........................................................ 5
FINDINGS AND RECOMMENDATIONS
  1. CERCLIS Report Program Documentation Needs
     Significant Improvement ...................................... 7
  2. Improvements in CERCLIS Report Library and Change
     Controls Are Needed .......................................... 11
  3. Formal Testing Procedures for CERCLIS Reports Are
     Needed ....................................................... 15
TECHNICAL EXHIBITS ................................................ 19
GLOSSARY .......................................................... 43
APPENDIX A  OSWER's Response to the Draft Report .................. 44
APPENDIX B  Distribution .......................................... 45
-------
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, DC 20460
OFFICE OF
THE INSPECTOR GENERAL
MEMORANDUM
SUBJECT: Report on CERCLIS Reporting
Audit Report No. E1SFF9-15-0023-0100187
FROM: Kenneth A. Konz
      Acting Assistant Inspector General for Audit

TO:   Don R. Clay, Assistant Administrator
      Office of Solid Waste and Emergency Response
SCOPE AND OBJECTIVES
We have completed an audit of the accuracy, reliability, and
consistency of standard CERCLIS (Comprehensive Environmental
Response, Compensation, and Liability Information System)
reports. The
objective of the audit was to determine whether reporting was
being performed accurately and reliably.
The audit was conducted primarily at EPA Headquarters. We also
visited Regions 2 and 3 to obtain background information and
users' views on working with CERCLIS. We interviewed
Headquarters Superfund program staff to determine how they
interacted with CERCLIS and whether they had encountered any
problems in working with the system. Our work included:
•  examination of documentation for report programs;
•  verification of controls over report program libraries;
•  comparison of related reports for consistency;
•  review of quality control procedures over report programming;
•  verification of the accuracy of five reports by duplicating
   report selection logic provided to us by CERCLIS officials; and
•  determination of whether data listed in the Superfund Progress
   Report was supportable (adequacy of its audit trail).
-------
Field work began in March 1989 and was completed in September 1989.
The audit reviewed CERCLIS reporting and related controls as they
existed from June through September 1989.
We performed the audit in accordance with Government Auditing
Standards (1988 Revision), issued by the Comptroller General of
the United States. In the course of our review, we identified
and reviewed internal controls related to the development and
modification of CERCLIS reports programming. The weaknesses we
found are included in this report along with our recommendations
to strengthen internal controls. No other issues came to our
attention which were significant enough to warrant expanding the
scope of our review.
SUMMARY OF FINDINGS
While our review found that the audit trail for the Superfund
Progress Report was adequate, we also found that management of
CERCLIS report development and modification was deficient due to
the absence of good controls over documentation, software
changes, and testing. Although strict procedures to ensure the
existence of these controls have long been an accepted practice
within data processing, in both government and private industry,
CERCLIS management did not put such procedures into effect. As a
result, material errors
have arisen within CERCLIS reports and any information currently
being reported by the system must be considered suspect and
employed cautiously.
Except for the SCAP (see background section) reports, which
report plans and accomplishments, CERCLIS reports were not
heavily utilized. In fact, the OIG audit of the Agency's
Superfund Annual Report for FY 1988 demonstrated the significant
differences between the information being reported at the
national level, based on CERCLIS, and the information being
maintained in the regions.
The reason for continued regional support for the CERCLIS SCAP
reporting was that users' budgets were tied to entry of strategic
data (see Background section) into the system. Regions were not
given their funds, via the Advice of Allowance process, unless
and until they fulfilled their SCAP data entry requirements.
Specifically, we found that CERCLIS report development and
modification practices were deficient in the following areas:
1. Documentation at the programming level was poor. Documenting
the specific steps of a program and the rationale behind these
steps is vital to the ability of future programmers to be able to
correctly modify existing programs. With only one exception, all
-------
of the program source code we examined contained no documenting
comments and did not agree with descriptions of report logic
prepared by contractors. It will certainly be very difficult for
future programmers to make changes to the reports. It may even
be necessary to completely rewrite programs rather than simply
change them due to a lack of knowledge of how they work. In
addition, without a good understanding of how these programs
select and process records, it may become impossible to resolve
questions regarding what the reports mean and how they relate to
other reports. In fact, it may become impossible to rely on them
at all.
2. Control over report programs was not adequate. Programmers
could change reports without approval. Control over programming
was so weak that the correct source code for one production
report could not be found. The result is that reports produced
by these programs cannot be relied upon because users cannot be
certain that the correct and error-free version of a program is
the one being executed. If users cannot rely on these reports,
they will either not be used or be used with considerable risk.
In general, as noted above, with the exception of the SCAP
reports, CERCLIS reporting is not being utilized for day to day
management decision making.
Information listed in related reports was inconsistent because of
the same lack of control. A comparison of reports which would
appear to users to list identical items showed significant
differences. Whether the reason, in each instance, was error or
inadequate labeling of the reports, the result is the same: loss
of faith in the reliability of the system and, whenever possible,
avoidance of use of information produced by it.
3. Formal testing and acceptance procedures for new report
programming or changes to old reports did not exist. If CERCLIS
officials did not receive negative comments on a report from
field users, it was assumed that all was well. The number of
coding errors found in the five reports reviewed demonstrates
that this methodology was seriously flawed. In fact, one report,
the SCAP-18 (FY89 FINANCIAL SUMMARY - TOTAL FUNDING), understated
prior year obligations for one region by $500 million.
We provided a draft report, dated December 13, 1989, to the
Assistant Administrator for the Office of Solid Waste and
Emergency Response. The AA's complete response to that draft is
included in this report as Appendix A. According to the
response, all recommendations were implemented by December 29,
1989. The response included a general comment to which we would
like to respond.
See Glossary at the end of the report.
-------
The AA stated that "There is a recurring theme in the draft
report that CERCLIS reports are not heavily utilized." He
pointed out that CERCLIS reports were used for strategic decision
making through use of the SCAP reports, for FOIA requests, and
for day to day monitoring by regional program managers using
CERCLIS data on their local area networks.
The statement was made in the SUMMARY OF FINDINGS that, except
for the SCAP reports, CERCLIS reports were not heavily utilized.
We reached this conclusion after having spoken to users in the
regions and Headquarters. In fact, only two of the SCAP reports
were used consistently: 14B and 16. They have to do with
reporting on accomplishments and plans. We also pointed out that
the only reason the SCAP reports were used to any great extent
was that Regions were not given their funds via the Advice of
Allowance process, unless and until they fulfilled their SCAP
data entry requirements.
With regard to FOIA requests, while they are important, it is not
cost-effective to build a large mainframe system like CERCLIS
just to process them.
In discussing CERCLIS with regional program managers, we found
that they did not use CERCLIS to perform their duties. The local
area networks referred to in the AA's response are completely
separate systems, known collectively as WASTELAN. Their primary
link with CERCLIS is the uploading of new data to mainframe
CERCLIS. Even in the case of WASTELAN, not all program managers
said that they would use it for day to day monitoring of
activities. It should be kept in mind that CERCLIS reporting, and
not WASTELAN reporting, was being audited.
ACTION REQUIRED
As all recommendations have been implemented and there are no
unresolved issues, no additional responses to this report are
necessary. The audit is considered closed. A Special Review to
verify that all recommendations have been implemented adequately
will be performed approximately one year from the date of this
report.
We appreciated the cooperation and assistance of your staff.
Should you or your staff have any questions, please have them
contact Sheldon Kantrowitz, Acting Chief, ADP and Statistics
Unit, on 382-7603.
-------
BACKGROUND
The Comprehensive Environmental Response, Compensation, and
Liability Information System (CERCLIS) is a data base that was
developed to aid EPA Headquarters and regional personnel with
Superfund site, program, and project management. CERCLIS is the
required and sole source of Superfund planning and accomplishment
data. As such, information reported from it serves as the
primary basis for strategic decision-making for the multi-billion
dollar Superfund program. A region will not be issued monies
through the Advice of Allowance process unless it satisfies all
of its CERCLIS data entry responsibilities.
Data entry responsibilities are at the regional level. Any user
can access any of the standard reports from the on-line menu.
Headquarters relies on CERCLIS as the repository of data on plans
and accomplishments.
CERCLIS has been designed to meet the reporting requirements of
the Superfund Comprehensive Accomplishments Plan (SCAP), the
Strategic Planning and Management System (SPMS), and the Superfund
Progress Report (SPR). Site managers monitor activities at assigned sites
and develop quarterly SCAP plans in cooperation with other staff.
Regional Information Management Coordinators (IMCs) review the
quarterly SCAP plans. The site managers identify and report
changes in site plans as they occur and coordinate changes with
the IMCs to ensure compliance with each region's overall SCAP.
Regional managers also ensure that changes have been recorded
correctly in CERCLIS.
The SCAP is a quarterly report of each region's plans and
accomplishments. The Office of Emergency and Remedial Response
(OERR) and the Office of Waste Programs Enforcement (OWPE) use
SCAP to allocate resources to regions and to evaluate regional
performance.
The SPMS is a quarterly report of each region's accomplishments
with respect to certain measures and targets. The Administrator
and others use SPMS to assess progress in each EPA program.
Superfund SPMS targets and measures are incorporated in the SCAP.
The SPR is a monthly report of accomplishments to date at
individual sites and for the Superfund program as a whole. The
information reported in the SPR is very similar to, and in many
cases overlaps with, SCAP information. SPR is used for general
information on the status to date of the Superfund program.
CERCLIS has three main groups of data elements: Headquarters
required data elements with national definitions, non-required
data elements with national definitions, and non-required data
elements with definitions that each region is free to set, such
as intermediate subevents and milestones. CERCLIS consists of
-------
two primary data bases: one site-specific data base (also named
CERCLIS) and a second non-site-specific data base (CERHELP). A
third data base, CERTRAN, is used for logging transactions. The
Update reports (UPDT) provide information drawn from the CERTRAN
data base.
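This division of data can be pictured as follows. The sketch is
purely illustrative; the record layouts and field names are
hypothetical simplifications, not actual CERCLIS definitions.

    # Hypothetical sketch of the CERCLIS data organization; layouts
    # are simplifications, not actual CERCLIS record definitions.
    from dataclasses import dataclass, field

    @dataclass
    class SiteRecord:            # site-specific data base (CERCLIS proper)
        epa_id: str              # site identifier
        required: dict = field(default_factory=dict)   # HQ-required elements, national definitions
        national: dict = field(default_factory=dict)   # non-required elements, national definitions
        regional: dict = field(default_factory=dict)   # non-required, region-defined (subevents, milestones)

    @dataclass
    class HelpRecord:            # non-site-specific data base (CERHELP)
        key: str
        data: dict = field(default_factory=dict)

    @dataclass
    class Transaction:           # CERTRAN logs each update; the UPDT
        timestamp: str           # reports draw their information from
        user_id: str             # this transaction data base
        change: str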
-------
FINDINGS AND RECOMMENDATIONS
1. CERCLIS Report Program Documentation Needs Significant
Improvement
Our review of documentation for six reports currently available
to users on the CERCLIS Reports Menu showed that documentation at
the computer program level needed significant improvement. We
found discrepancies between write-ups of programming logic
(Report Specification Forms), write-ups contained within the
official CERCLIS Reports Library, and the source code itself.
The source code and the Report Specification Forms were used by
programmers to understand and maintain (correct and modify) the
reports. The Reports Library write-ups were prepared for and
employed by users as a tool for understanding the information
provided by each report. The source code contained no comments
explaining what information each of the program variables
contained or how processing proceeded. CERCLIS management was not
ensuring that the contractors who were developing report
programming provide and maintain adequate documentation. Lack of
adequate documentation can lead to difficulty in learning how a
particular program works and/or the inability to modify it. It
may be necessary to completely rewrite a program rather than
simply modify it. In fact, when documentation does not match
output, both programmers and users tend to lose confidence in a
system.
National Bureau of Standards (NBS) Special Publication 500-106,
Guidance on Software Maintenance (December 1983), section 8.1.5,
addresses the necessity for comments in computer source code.
It emphasizes the importance of making programs readable;
providing information on program history, purpose, input/output
requirements, formats, etc.; and helping the programmers who have
to maintain programming understand aspects of the code that are
not clear. Sections 4.2 and 10.2.4 discuss the necessity for
good documentation and enforcement of documentation standards.
Documentation is described as communication which "should include
design specifications, code comments, programmer notebooks, and
other documentation." Coding and documentation ". . . standards
must be continually enforced via technical review and examination
of all work performed by the software maintenance staff."
Without adequate program documentation, computer program
maintenance becomes more difficult and programmer error more
frequent.
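As an illustration of the kind of commenting the NBS guidance
describes, consider the following generic fragment. It is a
hypothetical example in a modern notation, not CERCLIS code, and
every name in it is an assumption.

    # PROGRAM : site_summary (hypothetical example, not a CERCLIS report)
    # PURPOSE : count completed events by region
    # HISTORY : 01/90  J. Doe  original version
    #           03/90  R. Roe  added enforcement events (change request 12)
    # INPUT   : event records with a region code and an actual-completion date
    # OUTPUT  : dictionary of region -> number of completed events

    def site_summary(events):
        totals = {}                        # region -> completed-event count
        for e in events:
            if e["actual_completion"]:     # select only events actually completed
                region = e["region"]
                totals[region] = totals.get(region, 0) + 1
        return totals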
We found discrepancies and inconsistencies between computer
program logic (Report Specification Forms), write-ups contained
See Glossary at the end of the report.
7
-------
in the official CERCLIS Reports Library Manual, and the computer
program source code. The source code contained no comments and did
not explain what information each of the program variables
contained or how processing proceeded. One exception was
Enforcement Report 22 (ENFR-22). This program had adequate
comments. The technical details of these findings are presented
in exhibits 1-1 through 1-5.
Although SCAP (Superfund Comprehensive Accomplishments Plan)
reports are the most critical and the most widely used within
CERCLIS, documentation for them was as bad as for any of the
other reports reviewed. The preparation of write-ups (Report
Specification Forms) for SCAP reports was not initiated until FY
1989. Not all were completed as of June 1989. None of the
source code for these reports had comments describing the fields
used and the flow of the logic. This, we were told, was planned
for a later time. We were also told that the rush to produce a
multitude of reports during the two prior fiscal years resulted
in documentation being a secondary priority.
The other reports reviewed, although secondary to the strategic
SCAP reports, represented some of the most widely used and relied
upon CERCLIS products. They included the financial, remedial,
removal, and enforcement areas. The same poor documentation
practices and discrepancies were found in each instance.
Due to the lack of documentation that adequately describes how
report programming works, it will be very difficult for future
programmers to make changes to the reports. It is likely that
programmers are experiencing difficulties at this time because of
the poor quality of the program documentation. Inconsistent
documentation exacerbates the problem. It may even be necessary
to completely rewrite programs rather than simply change them due
to a lack of knowledge of how they work. In addition, without a
good understanding of how these programs select and process
records, it may become impossible to resolve questions regarding
what the reports mean and how they relate to other reports. In
fact, it may become impossible to rely on them at all.
User documentation was also inadequate. Write-ups in the Reports
Library were developed so that users might understand what
information was being listed in each of the reports. To the
extent that this information was not reliable, the potential for
error increased.
We found that management was not ensuring that contractors
provide and maintain sufficient and accurate documentation for
report programming. In this regard, CERCLIS officials were
relying exclusively on their contractors without any effort at
verification.
8
-------
CONCLUSIONS
CERCLIS management had not given adequate attention to program
documentation. Accurate and timely documentation of computer
programs is essential for efficient system maintenance. It is
too late to develop this type of documentation after the original
programmers have left or so much time has passed that they no
longer have an adequate grasp of how programs work. Given that
most of the life of a program or system is dedicated to
maintenance, good programming documentation is essential as the
means by which the mechanics of a program are communicated to
future maintainers. It is less expensive to write documentation
than to have to completely rewrite programs at a later date. In
addition, the integrity of CERCLIS will be in question as long as
its documentation is insufficient or inaccurate.
DRAFT RECOMMENDATIONS
In response to our position paper detailing the findings
discussed herein, the Director, OERR, stated that the following
steps were being considered for implementation:
When reports are examined for inclusion in
the national library, comments will be
inserted into the code and the CERCLIS
Reports Library will be modified to reflect
the changes. A three way check will be
instituted between Report Specification Form,
the CERCLIS Reports Library, and the
documented code.
Signoff from the EPA program office (report
owner) will be required for all report
specifications, reports library
documentation, and test/sample reports.
In the future all report documentation will
undergo comprehensive review where it will be
evaluated for adherence to selection criteria
standards as well as documentation standards
prescribed in the National Bureau of Standards
Guidance on Software Maintenance and
Documentation and FIPS PUB 38.
We recommended that the Assistant Administrator for OSWER
implement and monitor the actions listed in the OERR response to
our position paper on CERCLIS documentation.
-------
SUMMARY OF OSWER'S COMMENTS:
OERR implemented each of the three steps outlined in its response
to our position paper on CERCLIS documentation by December 29,
1989. OERR additionally requested that the OSWER Information
Management Staff develop a monitoring plan or procedures to
implement the monitoring portion of our recommendation.
OUR EVALUATION OF OSWER'S COMMENTS:
The decision to implement the three courses of action should help
to resolve CERCLIS documentation problems. We do not see the
need, however, to create any special procedures to support those
actions and to finalize the recommendation. The three steps
presented by OERR are a normal part of system maintenance. As
long as CERCLIS is active, they must be performed on a daily
basis. No further recommendations are being made.
10
-------
2. Improvements in CERCLIS Report Library and Change Controls
Are Needed
Our review found that CERCLIS report library and change controls
were not adequate. Inconsistencies and inaccuracies arose in
CERCLIS reports because its management did not ensure that
controls over program report libraries and changes to programs
within those libraries were adequate. Procedures for providing
these controls were never formalized. Reliance was placed on
verbal communications and meetings between CERCLIS managers and
contractor representatives. The contractor who was given the
responsibility of controlling access and changes to those files
was found not to be performing that function. Documentation
tracking the disposition of programming changes and identifying
individual responsibilities was not complete; it ended at the
point at which changes were submitted to a contractor for
implementation. Contractors should have maintained documentation
describing each step in the change process, naming each
individual involved, annotating pertinent dates, and detailing
the testing methodology and results of testing. Poor library
controls result in the dissemination of erroneous and
inconsistent data and, potentially, in loss of confidence in both
the system and Superfund program.
Adequate library controls ensure that only authorized, tested,
and correct versions of programs are executed.
Lack of control severely impacts the integrity of a system.
Procedures should exist whereby changes to production
programs must be approved formally, and the process of
designing, coding, testing, and implementing modifications is
fully documented and controlled.
Reliable reporting also requires that reports purporting to
provide similar data be consistent and reconcilable.
If there are valid reasons for the existence of reports which
appear to contain related data and yet are materially different,
then their differences should be made obvious to readers by means
of such tools as headings, labels and footnotes.
In the course of our audit work, we examined and tested change
control procedures, reviewed controls over program libraries, and
compared apparently related reports for consistency. As a result
of this effort and information gathered during performance of
other audit steps, we concluded that CERCLIS librarian and change
control functions needed improvement.
See Glossary at the end of the report.
11
-------
Changes to the production program library can be and have been made
by programmers without approval because the responsible
contractor was not providing the necessary controls and
oversight. In fact, this contractor did not even have copies of
the source code for two programs which had been in production
since the beginning of FY 1989. Likewise, when a change to one
report was made, no attention was given to changing related
reports. The responsibility for this type of error was shared by
both the contractor performing the coding changes and the first
contractor, responsible for controlling the production library
itself. There were also no controls to ensure that a particular
version of the source code for a production report was the
correct one and that all changes to the report over a period of
time had been documented. In one instance, our auditors were
given access to the wrong source code for a production report.
Contract staff did not know where the correct version of the
source code was.
We found it difficult to verify the implementation of changes
because the paper trail ended once requested changes were
submitted to contractors. Furthermore, there were no formal
procedures for informing users that requested changes had been
made. In the case of emergency fixes, no attempt was made to
notify users at all.
Our work on reporting inconsistency revealed material
discrepancies in selected financial, SCAP, and enforcement
reports. In total, twelve CERCLIS reports were reviewed.
An examination of two reports, SCAP-13 and SCAP-14A, led us to
the conclusion that certain items listed in them were related and
that they could be reconciled to each other. In response to a
position paper on this subject, which had been issued to the
Director, OERR, we were informed that our comparison of the two
reports was not valid because the ". . .reports calculate
mutually exclusive SCAP categories and address separate
functions."
It is our opinion that reports which appear to be related and
show apparently inconsistent information do as much damage to the
credibility of a system as reports which are truly related but
which do not reconcile due to error. Users had the same reaction
to both apparent and actual inconsistencies. That reaction is
lack of trust in the reports and abandonment of them. SCAP-13 and
SCAP-14A are a case in point: two categories appearing on the
reports, planning estimates (targets) for Preliminary Assessment
Completions and Screening Site Inspection Completions, reconciled
to the same numbers. All other
categories, for both plans and accomplishments, differed. The
See Glossary at the end of the report.
12
-------
logical conclusion would be that one, or both, of the reports was
in error and that it would be imprudent to use either report.
A detailed technical discussion of these items is contained in
exhibits 2-1, Control over the Production Reports Library; 2-2,
Control over the Source Code Library; and 2-3, Reporting
Inconsistencies.
As a result of the findings discussed above, there could be no
assurance that the correct and authorized version of a program
was in production or that a program was reporting results
consistent with what was reported in other programs. During our
audit, a number of users told us that errors and inconsistent
reporting were the principal problems they faced in using
CERCLIS. As a result, they did not rely on CERCLIS data. Except
for the SCAP (see background section) reports, which report plans
and accomplishments, CERCLIS reports were not heavily utilized.
The reason for continued regional support for the CERCLIS SCAP
reporting was that users' budgets were tied to entry of strategic
data into the system. Regions were not given their funds, via
the Advice of Allowance process, unless and until they fulfilled
their SCAP data entry requirements. The data reported by the
SCAP reports were also used as measures of success or failure.
The confusion and lack of faith that are engendered cause users
to abandon a system and, at times, create their own personal
ones. Most significantly, inconsistencies and errors could
potentially cause loss of confidence in the Superfund program
itself. In the instance of the production program whose source
code could not be found, modifications will have to be made by
completely rewriting the program. This represents a significant
waste of computing resources.
Reliance placed by CERCLIS management on informal procedures and
verbal communication was the primary cause for the conditions
discussed here. Management did not feel the need to closely
monitor and control the development and change control process
for report programming. Thus, CERCLIS officials never became
aware that their contractors were not ensuring the integrity of the
reporting function and that many of the complaints received from
users concerning errors and inconsistencies could be laid at the
door of the system's own contractors.
DRAFT RECOMMENDATIONS
In response to a position paper which we issued detailing the
findings discussed herein, the Director, OERR, stated that the
following steps were being considered for implementation:
13
-------
Signoff from the EPA program office (report
owner) will be required for all report
modifications prior to reinstatement to the
production reports menu.
Source programs and load modules will reside
in a centralized library.
Program offices and the reports librarian must
approve changes to the production program
library. The librarian will be responsible for
making sure the library contains the latest,
approved copy of each program. The librarian
will use the time and date stamp of the
source code to verify the latest version.
Any changes to source code will be recorded
within the program in the form of comments.
Additionally, a report change log must be
updated each time a report is modified.
When a change is made to one program, report
developers will consult the reports librarian
to see if related or affected programs need
to be changed.
Users will be notified of modifications to a
report via electronic mail.
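For illustration, the report change log called for in the third
step might take the following minimal form. The fields shown are
our assumptions; OERR's actual log format was not specified.

    import datetime

    def log_report_change(change_log, report_id, requestor, approver,
                          description):
        """Record one report modification, as the OERR steps above
        contemplate. Field names are illustrative assumptions."""
        change_log.append({
            "report":      report_id,                          # e.g., "SCAP-14B"
            "date":        datetime.date.today().isoformat(),  # when changed
            "requestor":   requestor,                          # who asked for the change
            "approver":    approver,                           # program office / librarian signoff
            "description": description,                        # what was changed and why
        })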
We recommended that the AA for OSWER implement the steps listed
in the OERR response to our position paper on CERCLIS library and
change controls.
SUMMARY OF OSWER'S COMMENTS:
OERR implemented the six steps outlined in its response to our
position paper on CERCLIS library and change controls by December
29, 1989.
OUR EVALUATION OF OSWER'S COMMENTS:
The decision to implement the six courses of action should help
to resolve the CERCLIS system development and maintenance
problems discussed in this section of the report. No further
recommendations are being made.
See Glossary at the end of the report.
14
-------
3. Formal Testing Procedures for CERCLIS Reports Are Needed
Our examination of five CERCLIS reports disclosed that formal
testing procedures, including the fixing of responsibility,
adequate authorization, and the use of test transactions,
were not in effect. This absence of procedures had given rise to
a significant number of coding errors. CERCLIS managers had not
ensured that report program testing was adequate because they
believed that report content was exclusively the responsibility
of users. If users did not complain, then it was assumed that
the report in question was acceptable. Lack of program quality
assurance procedures leads to unreliable reports which, in turn,
cause users to abandon use of a management information system.
Effective testing is necessary to ensure that programming is
reliable. Without an adequate level of reliability, users lose
confidence in and eventually abandon an information system. NBS
Special Publication 500-106, Guidance on Software Maintenance
(pages 11 and 43), describes the place of testing within the
system life cycle process and provides guidelines for its
performance.
Configuration management is a term which broadly refers to the
identification of the items that make up a system and formal
controls over any changes to it. One of the goals of
configuration management is to help maintain the integrity of a
system throughout its life. The OSWER System Life Cycle
Management Guidance, Part 3: Practice Paper, Configuration
Management states:
A rigorous and disciplined configuration
management function will maintain system
integrity for these systems, and all other
systems as well, in the following ways. . .:
Helps ensure that an adequate review of
requested changes to the system, and approval
by an authorized organization and individual,
takes place before the system is changed. . .
Helps ensure effective control over changes
to the software and release of changes to
users....
We uncovered the reporting errors being discussed here by
developing our own programming, which duplicated the select logic
purportedly being used for the selected reports. We limited our
review to only five reports because of the time-consuming nature
of our methodology and the need to perform a substantial amount of
additional work in order to confirm the
15
-------
discrepancies that were revealed. The reports selected represent
a broad spectrum: financial, removal, SCAP, and enforcement.
Although all of the errors were serious, we were especially
concerned with the financial and SCAP reports because so much of
the planning is dependent on the information contained within
them. The financial report (FINC-4A) did not list all applicable
sites and did not process negative transactions (e.g.,
deobligations) properly. The resulting variances for Region 2
amounted to over $2.5 million. One of the SCAP reports (SCAP-18)
understated prior year obligations for one region by $500
million. Detailed discussions of our findings are in exhibits
3-1 through 3-4. A complete discussion of the problems
concerning SCAP-18 is included in exhibit 2-3B.
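Conceptually, our verification consisted of recomputing report
figures directly from the data base and comparing the results with
the report output, along the lines of the sketch below. The field
names are hypothetical; this is not the programming actually used.

    def verify_obligations(transactions, reported_total, region):
        """Recompute a region's obligations independently of the
        report program and flag any variance. Field names are
        hypothetical."""
        computed = sum(t["amount"] for t in transactions
                       if t["region"] == region and t["type"] == "obligation")
        variance = computed - reported_total
        if variance != 0:
            print(f"Region {region}: report differs by ${variance:,}")
        return variance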
We found that formal testing and acceptance procedures for new
report programming or changes to old reports did not exist. If
CERCLIS officials did not receive negative comments on a report
from the field, the assumption was that all was well. A data
base of test transactions was not being used. We were informed
that a test data base was being developed and would be
implemented. Furthermore, responsibility for SCAP and
enforcement (OWPE) report testing was relegated to the program
(user) offices responsible for developing report requirements and
not to CERCLIS management.
The errors that were revealed by our review of just five of the
reports available gave ample indication of the dangers inherent
in running systems that have not been subjected to adequate
testing procedures. Users cannot rely on reports which have not
been tested and which contain so many errors. Furthermore,
errors known to exist in some reports can cause users to suspect
all reports and to completely abandon a system. When that
happens, people usually start developing their own personal
information systems. These private systems represent wasted
resources and duplication of efforts. Furthermore, should EPA
management or Congress rely on these reports, there is a
likelihood, given the high error rate, that any decisions made
would be erroneous ones.
CONCLUSIONS
Testing procedures for CERCLIS report programs must be improved
if users are to gain confidence in the system and the data it
provides. There must be a review and approval process put in
place that certifies that programming meets requirements and is
reliable before it is entered into production. CERCLIS
management must be directly involved in the approval process and
must continually monitor it if it is to succeed.
16
-------
DRAFT RECOMMENDATIONS
In response to the position paper which we issued on Reporting
Consistency, the Director, OERR, stated that the following
actions would be taken immediately:
OERR/MSDS will review reports usage analysis
and solicit user comments to identify all
reports critical to end of the year reporting
and FY90 planning. These reports will be
given highest priority for review and
correction. As problems are identified,
users will be informed.
Each of these critical reports will be placed
on a fast track for systems analysis, program
confirmation and sign-off, reprogramming,
verification of code against select logic,
testing, approval by program office and
submission to the National Library.
All reports not identified as critical to end
of the year reporting or FY90 planning will
be removed from the CERCLIS National Reports
Menu and made available only through a
special menu on the production system until
such time as each report can be verified,
tested and released onto the National Reports
Menu. As part of this complete audit of
CERCLIS reports, reports will also be
identified for deletion or combination.
In response to a position paper which we issued detailing the
findings discussed herein, the Director, OERR, stated that the
following steps were being considered for implementation:
A report change request form and a new report
request form will be the only mechanisms by
which existing reports can be modified or new
reports requested, respectively. Report
testers will use these documents to ensure
that modifications have been implemented
correctly and that new reports meet original
specifications.
For new reports, report testers will use
report documentation, specifically,
specification forms and report layouts, to
validate the successful development of the
report. The report will be executed using
test criteria and the results will be
compared against the initial documentation.
17
-------
As each new report is developed, it will be
added to the CEREVAL and CERTEST system
directly after programming and initial
testing of the report has been completed.
CEREVAL will be accessible to all CERCLIS
users while CERTEST will be limited to
independent testers only.
At the same time independent report
developers are testing a report, the program
office or report owner will be evaluating the
report to verify validity of the report.
Reports librarian members will solicit
comments from report requestors/users
concerning the acceptance of the report.
Report users can also use the CERCLIS Hotline
to report any problems they have encountered
with the report output.
The response also indicated that the reports we identified as
requiring modification in our position paper on reporting
inconsistency would be removed from the production menu and added
to the CEREVAL and CERTEST systems. CEREVAL will
contain reports currently being used by report requesters but
that are still undergoing testing. CERTEST will contain reports
currently undergoing testing by independent testers. At some
later date, all remaining production reports would be evaluated
for selection criteria consistency and mathematical accuracy.
We recommended that the AA for OSWER:
1. Implement the three actions quoted above which were
included in the response to our position paper on
CERCLIS report testing.
2. Immediately evaluate the five reports cited in this
discussion in the same way the reports cited in
Finding 2 were evaluated.
3. Abandon the concept of allowing users to access reports
for which testing has not been completed.
SUMMARY OF OSWER'S COMMENTS:
OERR implemented each of the recommendations by December 29,
1989.
OUR EVALUATION OF OSWER'S COMMENTS:
The decision to implement all recommendations should help to
resolve the CERCLIS report quality assurance problems discussed
herein. No further recommendations are being made.
18
-------
TECHNICAL EXHIBITS
TABLE OF CONTENTS
PROGRAM DOCUMENTATION
  EXHIBIT 1-1  SCAP-14B ........................................ 20
  EXHIBIT 1-2  RMDL-21 ......................................... 22
  EXHIBIT 1-3  RMVL-18 ......................................... 24
  EXHIBIT 1-4  FINC-4A ......................................... 25
  EXHIBIT 1-5  ENFR-22 AND ENFR-13 ............................. 26
LIBRARY AND CHANGE CONTROLS
  EXHIBIT 2-1  Control over the Production Reports Library ..... 27
  EXHIBIT 2-2  Control over the Source Code Library ............ 29
  EXHIBIT 2-3  Reporting Inconsistencies ....................... 30
TESTING
  EXHIBIT 3-1  FINC-4A ......................................... 35
  EXHIBIT 3-2  RMVL-18 ......................................... 38
  EXHIBIT 3-3  SCAP-14B ........................................ 39
  EXHIBIT 3-4  ENFR-22 ......................................... 41
19
-------
EXHIBIT 1-1
SCAP-14B: FY89 Targets and Accomplishments Site Summary Report
The Report Specification Form was to complement the information
listed in the CERCLIS Reports Library. In the Reports Library
listing there was a summary of "Data Elements Used." The
following CERCLIS fields were included in the summary of "Data
Elements Used" for the SCAP-14B in the CERCLIS Reports Library,
but were not referred to in the Report Specification Form:
•  C111  ENTRY-CITY
•  C137  ENTRY-SITE-INCIDENT-CATEGORY
•  C226  ENTRY-CLASSIFICATION
•  C309  ENTRY-FINAL-NPL-UPDATE-NO
•  C315  ENTRY-FMS-SS-ID
•  C1719 ENF-ACT-NEG-LIT-OUTCOME
•  C3124 SUBEVENT-ACTUAL-START
We also noted that the Report Specification Form included
reference in several instances to CERHELP field CP210, "T-A-FUND-
FINANCED-CEILING." This field was not referred to in the listing
for SCAP-14B in the reports library. Similarly, CERCLIS fields
C2141 (EVENT-ACTUAL-COMPLETION), C2903 (ENF-FIN-TYPE), and C2907
(ENF-FIN-AMOUNT) were not referred to in the reports library.
A review of the source code for the SCAP-14B disclosed that other
than a general and brief four line program description, there
were no other comments. The variables, their meaning and use
within the program, were not explained. The general flow of the
program was not supported by comments either.
The first page of the source code included a listing of changes
made in FY 1989. There was no evidence based on this listing
alone that the listing was all inclusive; that is, that no other
changes had been made. While the name of the original program's
author was noted, no indication of who wrote and compiled the
changes was made. The CSC managers informed us that comments
would be added after the next round of changes to the SCAP.
The following fields were used by the source program, but were
not listed in the reports library:
CERCLIS
C2141 EVENT-ACTUAL-COMPLETION
C2801 ENF-MS-MILESTONE
C2903 ENF-FIN-TYPE
CERHELP
CP209  T-A-SCAP-SPMS-FLAG
20
-------
CP1204 TARGET-NUMBER
CP2203 TARGET-SITE-EPA-ID
CP2204 TARGET-SITE-NAME
CP2205 TARGET-SITE-OPERABLE-UNIT
CP2206 TARGET-SITE-EVENT-CODE
CP2207 TARGET-SITE-ENF-ACTIVITY
CP2209 T-A-SITE-EVENT-LEAD
C2801, CP1204, and CP209 were not found in the Report
Specification Form either.
There was no information at all in the Report Specification Form
on how CERHELP and CERCLIS records for enforcement were to be
matched. A description of such a matching was provided for the
Event (fund) side only.
21
-------
EXHIBIT 1-2
RMDL-21: Sites on the Proposed 175RA List
We examined the write-up of report logic (Report Specification
Form) provided to us by the CSC contractors. Our examination
disclosed two inaccuracies. The SUBEVENT field, C3101, was
described as having two characters while it actually had three.
The fields listed as C2128 and C2129 were C3128 and C3129 in the
data base. In addition, we found the document to be inadequate
as a source of programming specifications. What information was
to go in what column or row of the report was not specified. It
was not clear from the write-up whether the report listed all
regions and had regional break logic or if it was run for one
region at a time.
We found it extremely difficult to understand, from the Report
Specification Form, the flow of the program and how records were
to be selected. After describing a match of a subset of records
between CERHELP and CERCLIS, it then described additional fields
to be selected from CERCLIS. Only after trial and error testing
and time-consuming examination of source code (which lacked
comments describing variables and processing) did it become
apparent that the processing logic was actually more complex and
additional records, not fields, were required to be selected. In
duplicating the logic, we found that we selected more records
than were shown on the report, namely, multiple occurrences of
the same event for the same site that were in the data base.
This was particularly true when codes RA and RD were involved
because their selection was not limited by the first start
indicator code. Omission of these records was not documented in
the write-up.
Examination of the source code showed the logic to be
significantly different from that indicated by the Report
Specification Form. According to the code, after an initial
match on a key (EPA ID, OP UNIT, and EVENT) between CERHELP and
CERCLIS, additional CERCLIS records were selected for specific
types of generic EVENTS for those previously matched records
(same EPA ID and OP UNIT).
Processing, per the source code, involved multiple passes through
the data base. The additional records selected had to match the
EPA ID and OP UNIT of previously matched records. Secondly,
records were selected for certain generic EVENTS (generic meaning
that the sequence number, the third character, was not
considered). These were CO, RI, FS, RA and RD. For CO and RD,
second passes through the data base were made if the initial
selections did not result in the finding of any CO or RD records
that were first starts (first start codes of A or B); records of
any start status were then selected. In the case of an RA EVENT,
should a first start code of A or B not be found, a subevent of
22
-------
SV was sought. If there were multiple occurrences of a single
generic event, such as RD1, RD2, etc., only the first, in
ascending order of magnitude, would appear in the report. None
of this special processing was clearly delineated in the write-
up.
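The sketch below restates the selection logic as we reconstructed
it. The record layout and field names are simplified assumptions;
only the control flow is drawn from the source code.

    # Selection logic for RMDL-21 as reconstructed from the source
    # code; field names are simplified assumptions.
    GENERIC_EVENTS = ("CO", "RI", "FS", "RA", "RD")

    def select_additional_records(cerclis, matched_keys):
        """matched_keys: (EPA ID, OP UNIT) pairs already matched
        between CERHELP and CERCLIS on the initial key."""
        picked = []
        for epa_id, op_unit in matched_keys:
            recs = [r for r in cerclis
                    if (r["epa_id"], r["op_unit"]) == (epa_id, op_unit)]
            for generic in GENERIC_EVENTS:
                # generic: the sequence number (third character) is ignored
                cands = sorted((r for r in recs if r["event"][:2] == generic),
                               key=lambda r: r["event"])
                hits = [r for r in cands if r["first_start"] in ("A", "B")]
                if not hits and generic in ("CO", "RD"):
                    hits = cands               # second pass: any start status
                if not hits and generic == "RA":
                    hits = [r for r in cands if r.get("subevent") == "SV"]
                if hits:
                    picked.append(hits[0])     # first occurrence only (RD1 before RD2)
        return picked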
Examination of the description for RMDL-21 in the Reports Library
disclosed that the listing of data elements used in the report
included eleven elements that were not used in the report based
upon the Report Specification Form and our own experience in
duplicating the logic. The data elements so listed were:
C0110 ENTRY-STREET
C0204 ENTRY-RPM-OSC-NAME
C0111 ENTRY-CITY
C0112 ENTRY-ZIP
C0305 ENTRY-STATUS
C2108 EVENT-TYPE-SORT
C2130 EVENT-CURRENT-PLAN-START
C2131 EVENT-CURRENT-PLAN-COMPLETION
C3202 FINANCIAL-TYPE
C3230 FINANCIAL-AMOUNT
C3206 FINANCIAL-FMS-FLAG
On the other hand, there were data elements not listed in the
reports library description that were listed in the Report
Specification Form. These were:
C2101 EVENT-TYPE
C2132 EVENT-CURRENT-PLAN-START-FYQ
C2133 EVENT-CURRENT-PLAN-COMPLETION-FYQ
C3101 SUBEVENT-ID
C3124 SUBEVENT-ACTUAL-START
C3125 SUBEVENT-ACTUAL-COMPLETION
C3128 SUBEVENT-CURRENT-START-FYQ
C3129 SUBEVENT-CURR-COMPLETION-DATE-FYQ
The documentation did not include a listing of the applicable
CERHELP data elements.
23
-------
EXHIBIT 1-3
RMVL-18: Planned Starts, Completions, Ongoings, and
Obligations for Region
In the write-up of report logic (Report Specification Form)
provided to us, there were two locations in which instructions
indicated that CERCLIS records were to be selected for C305, NPL
indicator, equal to 'D', 'P', 'F', 'N', or 'R'. In two other
places, C305 was to be selected for any value not equal to '0'.
The results would not be the same because there is a value of 'S'
for which records would not be selected in the first case, but
would be selected in the second.
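The divergence can be made concrete with a short sketch. The rule
names are ours; the values come from the Form.

    # The two selection rules in the Form are not equivalent: a
    # record with C305 = 'S' fails the first test but passes the second.
    def rule_one(c305):
        return c305 in ("D", "P", "F", "N", "R")

    def rule_two(c305):
        return c305 != "0"

    assert rule_one("S") != rule_two("S")    # 'S' records diverge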
The Form also included a listing of the CERCLIS fields used in
the report. That listing included fields whose use was neither
described in the write-up nor evident in the actual report:
C3229 FINANCIAL-EVENT-BUDGET-SOURCE
C3201 FINANCIAL-ID
C3218 FINANCIAL-PLANNED-OBL-FYQ
C3202 FINANCIAL-TYPE
C3225 FINANCIAL-FUND-PRIORITY-STATUS
It could not be discerned from the Report Specification Form how
records selected from CERHELP were to be used or how they related
to records selected from CERCLIS.
The Form provided no explanation of how records were segregated
by category within the report.
A comparison of the Report Specification Form with the
description for RMVL-18 in the CERCLIS reports library disclosed
that while the reports library showed that data elements C2122
and C2123, EVENT-ORIGINAL-PLAN-START-FYQ and EVENT-ORIG-PLAN-
COMPLETION-FYQ, respectively, were used, the Form indicated that
C2132 and C2133, EVENT-CURRENT-PLAN-START-FYQ and EVENT-CURRENT-
PLAN-COMPLETION-FYQ, respectively, were used instead.
We then examined the source code. As we noted before in
discussing other reports, necessary comments were omitted. There
were no comments describing variables or processing. After
further examination, we found that this source code could not be
for the executable program, load module, now in production. This
matter is discussed in more detail in exhibit 2-2.
24
-------
EXHIBIT 1-4
FINC-4A: Remedial/Removal Site Specific Funding Report by Region
A review of the Report Specification Form for the FINC-4A report
revealed these problems:
Source program, load program and JCL libraries were not
listed in the specifications.
Under the heading "SELECT CRITERIA" and in the "EVENTS"
select Iqgic it read ". . .AND C3203 CONTAINS 'FA1. . ."
when it should have read "AND POSITIONS 3 AND 4 OF C3204
CONTAIN 'FA1. . .." C3203 is the Document Control Number
while C3204 is the FHS Account Number.
Under the section entitled "BREAKS" the LINE specification
showed C3204 and then DOCUMENT CONTROL NUMBER. C3204 is the
FMS ACCOUNT NUMBER field. It was unclear whether the report
broke on account number or document control number.
Under the heading "SELECT CRITERIA", in the "EVENTS" select
logic, it read ". . .and C305 NE 0. . .." Under the heading
"PROCESSING LOGIC" in section B subsection 1 the last
sentence read "IF NATION REPORT ALSO C305 NE O". The select
logic implied that the logic "C305 NE 0" occurred in all
cases and the processing logic implied it occurred when
running only a national report, i.e., all regions.
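The difference between the two 'FA' criteria can be sketched as
follows. The function names are ours, and the sample account
number is the one cited in exhibit 2-3.

    # "C3203 CONTAINS 'FA'" scans the Document Control Number
    # anywhere, while the intended rule tests positions 3 and 4 of
    # the FMS Account Number (C3204).
    def rule_as_written(c3203):
        return "FA" in c3203            # matches 'FA' anywhere in the DCN

    def rule_as_intended(c3204):
        return c3204[2:4] == "FA"       # positions 3 and 4 (1-based) of C3204

    print(rule_as_intended("8TFA02K986"))   # True: 'FA' occupies positions 3-4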
The FINC-4A source program code was not adequately documented.
There was no explanation of the program's purpose, no comments
explaining the purpose of data groups and individual variables,
and no comment lines explaining the processes that manipulate and
transform data.
25
-------
EXHIBIT 1-5
ENFR-22: FY89 Enforcement Planned Dollars
Select logic in the ENFR-22 source program did not match the
select logic for ENFR-22 in the CERCLIS Reports Library manual.
The program source code indicated that fewer records would be
selected than indicated in the Reports Library select logic
description.
ENFR-13: FY89 Enforcement Obligations and Approved Planned
Dollars
Documentation in the Reports Library Manual had an error in the
select logic for planned obligations. The documentation read ".
. .and C2903 EQ D. . ." when it should have read ". . .and C2903
EQ G. . .." C2903 is the Enforcement Activity Financial Type
field.
26
-------
EXHIBIT 2-1
Control over the Production Reports Library
CERCLIS contractor Sycom was supposed to be in charge of the
reports library. Its staff was supposed to have the exclusive
ability to write source code to the production load library.
Report programming was being developed by CERCLIS contractor CSC.
Source code developed or modified by CSC was to be turned over to
Sycom for entry into the appropriate production load library. We
found, however, through both discussion and personal observation,
that CSC staff could and did make changes to source code and
immediately update the production load library. Such a change
was made when one of our auditors referred an error on report
FINC-4A to one of the OERR staff members. That staff member
referred the problem to a CSC programmer who made the change to
both the source and load modules. Sycom was not involved at all.
It should also be noted that when the change was made to report
FINC-4A, no similar change was made to related FINC reports.
This type of process can and did lead to report inconsistency;
see exhibit 2-3, section A.
During the course of the audit, we requested access from an OERR
staff member to CERCLIS source code libraries. At that time, we
were particularly interested in examining the source code for
report RMDL-21, which is actually two programs: RA175F and
RA175RP, and RMVL-18. A Sycom staff member thereupon contacted
us and told us we would find the source code we were seeking in
one of the following source libraries:
NTSH.NATL.PROD.PLEX.COBOL
NTSH.NATL.PROD.COPY.COBOL
NTSH.NATL.PROD.SCHM.COBOL
Examination of the contents of the files revealed that the files
we were looking for were not there.
We then contacted the Sycom staff member who said that he guessed
CSC had not turned over the programs to Sycom as yet. We then
verified that the two programs had been in the production reports
library since at least the beginning of the fiscal year.
RMVL-18 was entered into the production load library in September
1988 and RMDL-21 in October 1988.
We thereafter received a call from a CSC supervisor who said that
the programs we wanted were in two files his staff controlled:
GCMNTSH.SIW.SOURCE.COBOL
NTSH.ROAR.COBOL
The source programs we were looking for were in these two files.
27
-------
(We later found that the version of the source code for RMVL-18
to which we were given access was not even the correct version
for the RMVL-18 report in the production load library.)
28
-------
EXHIBIT 2-2
Control over the Source Code Library
Examination of the source library control procedures employed by
CERCLIS contractor CSC disclosed that there were no such
procedures in effect. There could be no assurance that all
program modifications were recorded in the source code and there
was no log, manual or automated, in which all changes and the
current correct version of the source code could be identified.
We were informed by a CSC supervisor that librarian software was
being sought that would provide the necessary controls.
While examining the source code for report RMVL-18, we found that
this source code could not be for the executable program, load
module, now in production. First, according to the working
storage section of the code provided to us, the fourth line of
the report (the heading) could be one of the following three:
RMVL-18: QUARTERLY REMOVAL APPROVED PLANS
RMVL-18: QUARTERLY REMOVAL ALTERNATE PLANS
RMVL-18: QUARTERLY REMOVAL APR/ALT PLANS
The actual heading of the report, at line 4, was:
RMVL-18: PLANNED STARTS, COMPLETIONS, ONGOINGS, AND
Line 5 was different as well.
As further verification, we compared the date the executable
program in the production library (load module) was compiled with
the date of compilation noted on the source code. The date on the
source code was February 1988; that of the load module, September
8, 1988.
It should be noted that a CSC supervisor thought this was the
correct version of the code and that Sycom did not have a copy of
it at all.
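The cross-check described above reduces to comparing two
compilation dates. A minimal sketch follows; note that equal dates
would be necessary, but not sufficient, evidence of a match.

    def could_match(source_compile_date, load_compile_date):
        """A load module compiled on a different date than the source
        on file cannot have been built from that source. Equal dates
        are necessary but not sufficient evidence of a match."""
        return source_compile_date == load_compile_date

    print(could_match("1988-02", "1988-09-08"))   # False, as we found for RMVL-18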
29
-------
EXHIBIT 2-3
Reporting Inconsistencies (Region 2 data only)
A. Financial Reports
We reviewed the following reports:
FINC-4A - RMDL/RMVL SITE SPECIFIC FUNDING REPORT (Region
Level)
FINC-4B - RMDL/RMVL SITE SPECIFIC FUNDING REPORT (State
Level) for New York, New Jersey, Puerto Rico, and the Virgin
Islands (i.e., states in Region 2)
FINC-5A - REMEDIAL SITE SPECIFIC FUNDING REPORT (Region
Level)
FINC-6A - REMOVAL SITE SPECIFIC FUNDING REPORT (Region
Level)
We found that the grand total for the FINC-4A Report was not
correct and that negative financial transactions were not being
deducted from obligation and outlay amounts, which in turn
affected site totals. We also found that sites with NPL
indicators of "S" were not being selected. See Finding 3.
These problems were brought to the attention of CERCLIS program
personnel. They had a contract programmer correct the FINC-4A
Report and move it into production.
Because the FINC-4A was corrected without regard to correcting
any like reports, the problems originally discovered in the
FINC-4A report still existed in FINC-4B, FINC-5A and FINC-6A and
thus these related reports were no longer reporting information
consistently. There had been no follow-up by Superfund or
contractor personnel to correct reports which used the same logic
as FINC-4A.
The following problems resulted:
Total region obligations varied by $2,631,204 and region
outlays were incorrect by $1,872,300 when a total was
computed from the four state level reports (FINC-4B) for
Region 2. The combined totals from the remedial report
(FINC-5A) and the removal report (FINC-6A) showed the same
results. The reason the totals differed was that Albert
Steel Drum (SSID=02L4, EPAID=NJD000525154) and Chemical
Insecticides (SSID=0294, EPAID=NJD98048653) were not on the
FINC-4B, FINC-5A AND FINC-6A reports. The sites were
missing because the code value "S" which represents "HAS
SCAP PLAN REMEDIAL ACTIVITIES" was omitted for the "S/I NPL
INDICATOR" field (C305) in the report programs.
30
-------
Site obligation totals for Region 2 on the FINC-5A and
FINC-4B were incorrect for two sites because negative
amounts were not subtracted from the site totals.
The General Motors/Central Foundry Division site
(SSID=02A6, EPAID=NYD091972554) showed a zero
obligation amount for DCN KJ0029, Account 8TFA02K986,
but the correct amount was a negative $367. The site
total was correct.
The Pomona Oak Wells site (SSID=02D3,
EPAID=NJD980769350), showed a zero obligation amount
for DCN KCS389, Account 7QFA02KLD3; the correct amount
is a negative $45,000. The negative $45,000 amount was
not reflected in the site total. The total reported
for obligations for this site on the two reports was
$1,200,000, but should have been $1,155,000.
The site total for obligations for the Pomona Oak Wells site
was still incorrect on the FINC-4A Report, although the
program had been modified based on our earlier findings.
The site total reported was $480,000. The correct total
should have been $1,155,000.
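The arithmetic of the error is straightforward: the report
programs in effect dropped negative (deobligation) amounts rather
than summing signed values. The sketch below uses the Pomona Oak
Wells figures cited above.

    # Correct handling sums signed amounts; the erroneous programs
    # in effect discarded the negative deobligation.
    amounts = [1_200_000, -45_000]               # obligation, then deobligation

    erroneous_total = sum(a for a in amounts if a > 0)   # negatives dropped
    correct_total = sum(amounts)                         # signed sum

    print(erroneous_total, correct_total)        # 1200000 1155000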
B. SCAP-15 and SCAP-18
We compared SCAP-15, "FY89 FINANCIAL SUMMARY - APPROVED FUNDING",
and SCAP-18, "FY89 FINANCIAL SUMMARY - TOTAL FUNDING", to
verify the consistency of reported funding amounts. As implied
by the titles, the major difference between the reports was that
SCAP-15 showed Approved Funds and SCAP-18 showed Total Funds
(Approved and Alternate Funding). The reports had identical
column and row names (labels) with each having nine columns of
funding data.
The SCAP-15 report column totals and row totals were verified to
be mathematically correct. The SCAP-18 report output, however,
was found not to be consistent with SCAP-15 and to have errors.
These problems were relayed to EPA staff responsible for SCAP
reports. We suggested that the SCAP-18 Report be taken off the
CERCLIS Report Menu until the errors could be corrected. The
SCAP-18 report was, for a time thereafter, listed as unavailable
on the CERCLIS menu, it has since been returned to the menu and
its problems resolved. The problems found when we compared the
two reports were:
Prior Years Obligation Column - Logically, prior year obligation totals should have been the same on both reports.
Remedial obligations were identical on the SCAP-15 and
SCAP-18 reports but the subtotal amount was $100,000,000
less on SCAP-18. The Remedial RA amount was $300,000,000
less than the same amount on the SCAP-15 report and the
Regional Grand Total amount was $500,000,000 less. These
errors were caused by truncation of numbers.
Total Remaining Plan Obligation Column - In the SCAP-15
Report (Approved Funding), $25,000 was reported for
remaining headquarters enforcement funding. The SCAP-18
report showed zero dollars for this amount. If the approved funding report correctly showed $25,000, then the total funding report (a report of all funding) should logically have shown $25,000 or more.
Unexpended Obligations As Of 89/4 Column - The Remedial
subtotal amount on the SCAP-18 Report did not agree with the
SCAP-15 subtotal amount although both reports had identical
remedial line item amounts from which the subtotals were
calculated. The SCAP-15 amount showed the correct sum,
$67,804.9 (dollars in thousands). SCAP-18 reported a sum of
$32,195.0 thousand, which is a difference of $35,609.9
thousand. The SCAP-18 Remedial RA line item amount was
$300,000,000 less than the SCAP-15 Remedial RA amount. The
Regional Grand Total line amount was $400,000,000 less on
the SCAP-18 report. The Remedial RA and Regional Grand
Total differences resulted from truncation errors.
FY 89 Planned/Actual Obligations and Commitments, 89/3
Column - The Enforcement line item amount was minus $28.5
thousand on the SCAP-15 report while the same total on the
SCAP-18 report was a positive $28.5 thousand. However, the
Regional Grand Total was $31,793.9 thousand on both reports.
FY 89 Planned/Actual Obligations and Commitments, 89/4
Column - This column exhibited the same problem found in the Total Remaining Plan Obligation Column on the Headquarters Enforcement line. The Approved
Funding Report (SCAP-15) showed $25.0 thousand while the
Total Funding Report (SCAP-18) showed zero dollars. The
total funding amount should have been equal to or greater
than the approved funding amount. This problem was carried
over to the Annual Funding Column, which is a sum of the
quarterly amounts under the FY 1989 Planned/Actual
Obligations and Commitments title.
Regional Allowance (TFAY9A EXTRAMURAL) Cumulative Obligation Authority By Quarter - The 89/3, 89/4, and FY 1989 funding columns reflected truncation (see Glossary) of the leftmost digits on the Total Remedial line and the Regional Grand Total line on the SCAP-
18 Report. This caused a discrepancy of $100,000,000 in
each of six cells of information.
Prior Years Outlays - On the Regional Response Outlays row,
the dollar amount on SCAP-18 was truncated, causing the reported amount to be in error by $100,000,000.
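Truncation of this kind arises when a value is forced through a display field narrower than the number itself, so the leftmost (most significant) digits are lost. A minimal SAS sketch of the failure mode follows; the amount and field widths are illustrative only and are not taken from the SCAP-18 source code:

   /* A nine-digit amount written through an eight-character cell */
   /* loses its leading digit, an even $100,000,000 understatement */
   data trunc_demo;
      amount = 167804900;             /* $167,804,900                    */
      cell   = put(amount, 9.);       /* '167804900' - full value        */
      short  = substr(cell, 2);       /* '67804900' - leading digit lost */
      shown  = input(short, 8.);      /* off by exactly $100,000,000     */
   run;

Every SCAP-18 cell that was off by an even $100,000,000 (or by $300,000,000 or $500,000,000, where the dropped digit was a 3 or a 5) is consistent with leading digits being dropped in this way.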
C. ENFR-13 AND ENFR-22
We reviewed two enforcement reports:
ENFR-13: FY89 Enforcement Obligations and Approved Planned Dollars, By Region, State, and Sitename
ENFR-22: FY89 Enforcement Planned Dollars, By Region, Activity/Event Type, and Funding Status
We found that FY 1989 planned approved obligation dollars for
Region 2 were reported differently on enforcement reports ENFR-13
and ENFR-22. Report ENFR-13 showed the Region 2 grand total as
$6,297,000. Report ENFR-22 showed a grand total of $4,779,000,
which is a difference of $1,518,000. Several sites listed on
ENFR-13 were missing from ENFR-22.
It should also be noted that the select logic in the ENFR-22
source program did not match the select logic for ENFR-22 in the
CERCLIS Reports Library manual. The program source code
indicated that fewer records would be selected than indicated in
the Reports Library description.
We discussed the findings with the Chief, Regional Planning
Section, Enforcement Branch, OWPE. He had no immediate
explanation of why the totals in question were different. He
stated that ENFR-22 and ENFR-11 were reports used for showing
planned obligation dollars. ENFR-13 was not used by headquarters
but might be used in the regions. He had been planning to
propose that several reports including ENFR-13 be taken off the
CERCLIS menu because they were no longer needed.
In a later conversation, the Chief, Regional Planning Section,
Enforcement Branch, OWPE, added that the select logic in the
ENFR-22 source program was developed for selecting acceptable
planned obligations. Records with planned approved obligation
dollars that did not meet the select criteria were not considered
valid records. He said that changes in report procedures for FY
1990 would cause records that did not meet the established select
criteria to be printed on a separate error report.
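That FY 1990 approach, routing records which fail the select criteria to a separate error report rather than silently dropping them, can be sketched in a few lines of SAS. The data set names, the status code, and the sample values below are all hypothetical:

   /* Records failing the select criteria go to an error data set */
   data report_recs error_recs;
      input epaid : $12. status $ planned;
      if status = 'A' then output report_recs;   /* meets select criteria */
      else output error_recs;                    /* held for error report */
      datalines;
   NJD000000001 A 100000
   NJD000000002 X 50000
   ;
   run;

   proc print data=error_recs;   /* the separate error report */
   run;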
D. SCAP-1 AND FINC-4A
The site obligation and outlay totals for SCAP-1 and FINC-4A
reports were checked for consistency and mathematical accuracy.
In one instance, EPA ID NY002249563 and SSID 02S7, we found that
the SCAP-1 showed a rounded amount of $1,739 thousand for outlays to date (August 1, 1989). The FINC-4A showed a listing
totaling $1,697,518 for the same site.
Our own duplication of the FINC-4A logic revealed two negative transactions totaling $41,929 that, while not listed on the FINC-4A, apparently entered into the calculation of total outlays for the site on that report. They played no part, however, in arriving at the total shown on the SCAP-1.
E. SCAP-13 and SCAP-14A
Comparisons were made between SCAP-13 (SCAP/SPMS Planning
Estimates and Measures Summary Report) and SCAP-14A (FY 1989
Targets and Accomplishments Summary Report) for consistency in counts. Planning estimates on SCAP-13 were compared against
SCAP/SPMS Targets on SCAP-14A. Accomplishments on SCAP-13 were
compared against accomplishments on SCAP-14A. Planning estimates
(targets) for Preliminary Assessment (PA) Completions and
Screening Site Inspection (SSI) Completions on the Activity
Measure/SCAP SPMS Target side of the reports reconciled to the
same numbers.
The like appearance of the two reports, and what they appeared to be showing, led us to conclude that a comparison of data between them would be appropriate. Our finding that two categories actually matched supported that assumption. Further
discussion with CERCLIS officials revealed that the two reports
should not have been compared because the record selection
criteria differed.
Reports which appear to be related and show apparently
inconsistent information do as much damage to the credibility of
a system as reports which are truly related but which do not
reconcile due to error. The reports did not have detailed
titles, labels or formulas to explain what was being counted as
an accomplishment on each report. It could only be assumed,
because the accomplishment counts were different for a number of
comparable categories, that the selection criteria for accomplishments were different for each report. For example, it
was unclear on SCAP-13 whether the accomplishment counts for FY
89 included sites already started or completed during the fiscal
year or if a projection based on planning estimates was used.
We found two problems with SCAP-13 that might have also
contributed to differences. In the total column under
accomplishments for SSI Completions for State, we found a count
of 29. The related quarterly columns when added together totaled
30. In a number of other instances, quarterly columns showed
zeros while the FY 1989 Totals had numbers greater than zero,
i.e., the FY total could not be verified by the quarterly counts.
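A consistency edit of this kind is simple to mechanize. In the SAS sketch below, the variable names and the quarterly split are hypothetical; only the 29-versus-30 discrepancy reflects the SSI Completions count noted above:

   /* Flag any row whose quarterly counts do not add up to the FY total */
   data fy_check;
      input category : $12. q1 q2 q3 q4 fy_total;
      qtr_sum  = sum(q1, q2, q3, q4);
      mismatch = (qtr_sum ne fy_total);   /* 1 marks a failing row */
      datalines;
   SSI_STATE 10 8 7 5 29
   ;
   run;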
EXHIBIT 3-1
FINC-4A: Remedial/Removal Site Specific Funding Report by Region
Several significant problems were identified in the FINC-4A
report for Region 2 when it was compared with a duplicate report
developed within the OIG:
Two sites on the OIG duplicate report did not appear on the FINC-
4A production report. The sites were:
Albert Steel Drum, SSID=02L4, EPAID=NJD000525154
Chemical Insecticide, SSID=0294, EPAID=NJD980484653
After we referred this problem to an OERR staff member who in
turn referred it to a CSC programmer, the programmer discovered
that the problem was caused by the code "S" missing as a value
for the NPL INDICATOR field in the production report code logic.
Code "S" was used for a site that "HAS SCAP PLAN REMEDIAL
ACTIVITIES" (CERCLIS Data Dictionary, Page 31, Dated 11/28/88).
Region 2 grand totals were, as a result, understated by the
following dollar amounts: Obligations - $2,631,204, Outlays -
$1,872,300. The CSC programmer corrected the program source
code, recompiled the program and placed it into production.
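The nature of the flaw and its fix can be illustrated with a short SAS sketch. The data set layout and the other indicator values ('F', 'P') are our assumptions; only the omitted value "S" is documented:

   data sites;
      input ssid : $4. npl_ind : $1. oblig;
      datalines;
   02L4 S 2631204
   02A6 F 100000
   ;
   run;

   /* Before the fix: 'S' absent from the value list, so sites with */
   /* SCAP plan remedial activities silently dropped off the report */
   data before_fix;
      set sites;
      where npl_ind in ('F', 'P');
   run;

   /* After the fix: 'S' added, and the missing sites reappear */
   data after_fix;
      set sites;
      where npl_ind in ('F', 'P', 'S');
   run;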
In the obligation column two separate negative obligation amounts
were reported in our duplicate program, but not in the FINC-4A
production program. In the case of General Motors/Central
Foundry Div. (SSID=02A6, EPAID=NYD091972554), the negative
amount, although not appearing, was correctly subtracted from the
site total. In the case of Pomona Wells (SSID=02D3,
EPAID=NJD980769350), a negative transaction of $45,000 not only
did not appear on the report, but was not subtracted from the
site total.
The FINC-4A production report for Region 3 showed no major
problems. We observed that two sites had the same SSID (SSID =
03Z3), Colwell Lane Site and Whitemarsh Twp Drum Dump.
Region 4 had sites missing from the FINC-4A report due to the
same problem that caused missing sites on the Region 2 copy of
the report; see the discussion above. The missing sites for
Region 4 caused the Region's grand total to be understated by the
following amounts: Commitments - $285,942, Obligations -
$1,289,430 and Outlays - $207,056. The missing sites were:
Bay Drum, SSID=04L5, EPAID=FLD088783865
B M F Industries Petroleum Products, SSID=0410, EPAID=ALD990831968
Some sites that had obligations and outlays with negative amounts
did not show correct site totals. The sites involved were:
The 62nd Street Dump (SSID=0452) site showed an outlay site
total that was $22,044 less than the OIG duplicate report
outlay site total. The OIG duplicate report showed an
outlay transaction of $22,044 subtracted from the site
total. We concluded that the CERCLIS FINC-4A program
doubled the negative transaction amount in arriving at the
site total.
The Martin Marietta-SODY (SSID=0467) site showed an outlay site total that was $1,734 less than the duplicate report site
total. The CERCLIS FINC-4A report program was tripling the
amount of the only negative transaction of $578.
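One plausible reconstruction of such doubling, offered only as a sketch since we did not trace the production source code, is a negative amount entering the running sum once and then being applied again in a separate adjustment step:

   /* Hypothetical illustration of the doubling defect */
   data doubling_demo;
      total = 0;
      do amt = 100000, -22044;
         total + amt;                   /* correct: negative included once */
         if amt < 0 then total + amt;   /* defect: applied a second time   */
      end;
      /* total = 55,912; the correct figure is 77,956 - $22,044 low */
   run;

A second such adjustment would triple the amount, as with the $578 Martin Marietta-SODY transaction.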
Three sites on the CERCLIS FINC-4A report had negative
transaction amounts that were not reflected in site totals:
Canton Plating and Bumper Works, Inc. (SSID=04E6) - a
$452.00 negative outlay.
Coleman Evans Wood Preserving Co. (SSID=0441) - a
$968.00 negative outlay.
Zenith Chemical (SSID=04Z5) - a negative obligation of
$1,273.00.
The Golden Strip Septic Tank site (SSID=04Z4) showed a site
obligation total of zero on the FINC-4A report. The OIG
duplicate program showed a site total of minus $300,000. This
total was composed of one positive and two negative transactions
of $300,000. Superfund finance staff expressed the opinion that
one of the negative obligation amounts was a duplicate entered by
the region.
By developing transaction listings, we found records that had
incomplete information or questionable field values. While these could, in some instances, have been examples of poor quality control over entered data, we discuss them here because of their
impact on the quality of reporting and because they may indicate
the need for improved data entry edits. The first two items are
applicable to the entire country (all regions); the balance
concerns only Region 2:
We found a number of financial records with missing SSID's
(CERCLIS field C315). We had been told that this field was
critical to a good deal of processing. We submitted a national
listing of these records to the OERR Finance and Administrative
Management Section.
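An input edit of the kind these records suggest is sketched below. This is a hypothetical illustration, not an existing CERCLIS edit; the record layout is ours, with SSID standing in for CERCLIS field C315:

   /* Route financial records with a missing SSID to a review file */
   /* instead of letting them flow into reporting                  */
   data finance bad_ssid;
      input ssid : $4. amount;
      if missing(ssid) then output bad_ssid;
      else output finance;
      datalines;
   02A6 100000
   . 34000
   ;
   run;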
We found several account numbers with a character "1" in the
first position and the remainder of the field blank. The
obligation dates on these records were for fiscal years 1981 and
1982. The obligations were not reported on the FINC-4A report.
We turned over a national list of these account numbers to the
OERR Finance and Administrative Management Section.
In examining Region 2 transactions, we found that:
Ringwood Mines site (SSID=0262) showed an outlay of
$34,000 with no Account Number/DCN, no obligating date and
no obligating document number. A copy of this transaction
was given to the OERR Financial and Administrative
Management Section.
Montclair/West Orange Radium Site showed an event as "DAI" (design assistance) for commitment and "RD2" for obligation.
SSID '0216' showed two different site names: 'Olean Well Field' and 'Suffolk County ANG Base'.
EXHIBIT 3-2
RMVL-18: Planned Starts, Completions, Ongoings, and Obligations for Region
A comparison of the results of our duplication of the select
logic contained in available documentation with the actual report
disclosed significant discrepancies. The comparison was made for
Region 2 only. In the ongoing sites category, the duplication
listed 18 event transactions not contained in the CERCLIS report
and the CERCLIS report had 7 not contained in the duplication.
In the planned completions category, the standard report had
three more sites than the duplication. Further research revealed
that the additional records that appeared in the duplication did
so properly, according to the select logic contained in the
Report Specification Form and that the questionable transactions
appearing in the CERCLIS report should not have been so listed.
Listings for the 28 event transactions in question were developed
to verify whether, according to the select criteria, these
entries should or should not have appeared on the standard
CERCLIS report. In each case, we verified that, based on the
select criteria presented to us, our results were more accurate.
The events that did not appear on the official CERCLIS report
were all of the proper event type, NPL type, fund priority, and
financial type that would warrant their inclusion. Our examination showed that the ones our duplication did not select generally lacked the correct fund priority and/or financial type.
Finally, perusal of the official CERCLIS version of RMVL-18
disclosed that monetary totals did not accurately reflect the sum
of the amounts in each of the three categories. These totals
were for approved spending. There were no totals for alternate
spending.
The summary of planned quarterly approved obligations on the
final page showed an amount of $500,000 for "CONTINGENCY." There
was no explanation anywhere of what this amount represented or
where it came from. In addition, the number of sites reported in
this summary did not reflect the detailed listings and was,
therefore, confusing.
EXHIBIT 3-3
SCAP-14B: FY89 Targets and Accomplishments Site Summary Report
Due to the complexity of this report, we limited our review to
only two sections: RI/FS FIRST STARTS and COST RECOVERY REFERRAL
ACT, SECT 107 RMVLS. Our findings were referred to an OERR staff
member working on the SCAP reports.
(1) RI/FS First Starts.
We found that if a FY 1988 planned activity was
accomplished in FY 1989, it would be included in RI/FS
FIRST STARTS for FY 1989 (this fact was not noted in
the documentation), but would not be included in the
total for planned starts. CSC contractors informed us
that they thought this bug was being worked on.
We found, in the case of Region 3, that there were
seven instances in which the activity LEAD codes did
not print on the SCAP-14B although they were in the
data base. We developed a special report of LEAD codes
to verify this fact. In the case of Region 4, one of
the actual start dates did not print although it was in
the data base.
(2) COST RECOVERY REFERRAL ACT, SECT 107 RMVLS
Comparison of the official SCAP-14B and a duplicate
report developed by OIG staff, disclosed that there
were two Region 2 sites that were not listed on the
official version, although they apparently should have
been. These were Bridgeport Rental, EPA ID
NJD053292652, and Lone Pine LF, EPA ID NJD980S055425.
Both had records with SV activities that included a
SPMS target status of P, financial codes of F, remedy
types that were either VM, VO, or VP (both applicable
records were remedy types VD), and financial amounts
over $200,000. As such, they should have appeared on
the CERCLIS SCAP-14B report.
We also noted, for Region 2, that a "plan start"
scheduled for the first quarter of the next fiscal year
may have appeared because a current year accomplishment
date existed. According to the Report Specification
Form, the record with the FY 1990 plan date should not
have been selected. In addition, its selection caused
the total of plan dates to be inaccurate because the FY
'90 date was not added into the total.
In the case of Region 3, we found that an FY 1990 plan date had been inaccurately selected. This time there
was no actual start date that might occasion the
selection of the 1990 date. In addition, no "Totals"
line was printed for the section.
For Region 4, we again found that FY 1990 plan dates
were selected, although such selection was not
indicated by the Report Specification Form, and that
the plan start date total was inaccurate due to failure
to add FY 1990 plan dates into that total.
As an additional test, we interactively subjected the six Region 4 sites that had only target dates (only CERHELP planning records were selected; no matching CERCLIS records were selected) to additional processing to determine why there were no matching
records for them. We found that there were no SV
transactions for these sites that amounted to over
$200,000, even when combining multiple like records.
The question must then be raised as to why there would
be any reason to print these records if one of the
requirements for this section is that planned or actual
financial amounts be over $200,000. Their presence
could confuse or mislead.
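The screen described here, combining like SV transactions per site and keeping only sites whose combined amount exceeds $200,000, can be sketched as follows. The data set and variable names are hypothetical, as are the amounts; the EPA IDs are taken from the sites discussed in this exhibit:

   data sv_trans;
      input epaid : $12. amount;
      datalines;
   NJD053292652 150000
   NJD053292652 125000
   GAD981929748 180000
   ;
   run;

   proc summary data=sv_trans nway;
      class epaid;
      var amount;
      output out=sv_totals sum=sv_total;   /* combine like records */
   run;

   data qualifying;                        /* only combined totals */
      set sv_totals;                       /* over $200,000 qualify */
      where sv_total > 200000;
   run;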
In reviewing Region 4 data, we noticed that an actual start date of November 30, 1988, for McDonald Farm (EPA ID 'GAD981929748') was printed on the CERCLIS SCAP-14B. It was not printed on the OIG duplication. Further
examination of the records for this EPA ID within the
data base revealed that there was no record or
combination of records that met all of the code and
date selection criteria and had financial value over
$200,000. The actual start date should not, per the
documentation, have appeared on the report.
EXHIBIT 3-4
ENFR-22: FY89 Enforcement Planned Dollars
In examining the ENFR-22 program, which used a SAS data base
version of CERCLIS data and SAS programming, we found coding
errors in a program segment that calculated fiscal year and FY
quarter variables from, in one instance, the current date (the
SAS system variable TODAY) and, in another, the Events and
Enforcement Activity Completion Dates. The fiscal year and
quarter variables were then concatenated (see Glossary), resulting in combined fiscal year/quarter variables (Example: "894"). The variable "FYQA" (based on the completion dates) was then used in the
select logic to compare against the Event or Enforcement Planned
Financial FY/QTR Dates and the other variable, "FYQ" (based on
the current date), was used in comparison with Event or
Enforcement Planned Start Dates. The results developed when the
fiscal year/quarter variables were incorrectly calculated were
unpredictable.
Errors arose in the way in which quarters were determined. If
the month was July, the program logic would set the FY quarter to
"3," when it should have been set to "4." This placed four
months in the third quarter (April, May, June and July). If the
month was October, the quarter was set to "4," when it should
have been set to one. For example, October, 1989 would normally
be represented as "901," i.e., FY 90, quarter 1 (the program
increased the fiscal year by 1 if the month was greater than or
equal to 10). The erroneous coding, however, resulted in the
fiscal year/quarter "904," FY 90, quarter 4. The effect of this
error would depend on the time of year the report was being run
and/or the Actual Completion dates contained in the file. The
erroneous section of the source code read as follows:
IF Y >=4 AND Y <= 7 THEN
QTR=3;
ELSE
IF Y >=7 AND Y <= 10 THEN
QTR=4;
That section should have read:
IF Y >= 4 AND Y <= 6 THEN
QTR=3;
ELSE
IF Y >= 7 AND Y <= 9 THEN
QTR=4;
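Read as a whole, the corrected logic amounts to the following self-contained sketch. This is our reconstruction, with Y holding the calendar month as in the program; the October-through-December handling and the fiscal year increment follow the narrative above:

   /* Derive the combined fiscal year/quarter value, e.g. '901' */
   data fyq_demo;
      d  = today();                   /* current date, per the program */
      y  = month(d);                  /* Y: calendar month             */
      fy = mod(year(d), 100);
      if y >= 10 then do;             /* Oct-Dec open the next FY      */
         fy + 1;
         qtr = 1;
      end;
      else if y <= 3 then qtr = 2;    /* Jan-Mar */
      else if y <= 6 then qtr = 3;    /* Apr-Jun */
      else qtr = 4;                   /* Jul-Sep */
      fyq = put(fy, z2.) || put(qtr, 1.);   /* concatenated, '894' etc. */
   run;

With this derivation, October 1989 yields "901" (FY 90, quarter 1), as the narrative requires.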
When we executed an OIG duplication of the program with the
corrected code, we found that a site was selected that had not
been selected on previous executions. The site, Chemical Control
Corporation (EPAID=NJD000607481), had an alternate planned
obligation amount of $300,000. The grand total planned
obligation dollar amount for Region 2 changed from $5,736,000 to
$6,036,000.
A discussion with officials of the contract firm of Booz, Allen
and Hamilton confirmed the errors and we received assurance that
the necessary corrections would be made.
Glossary

1. source code - A computer program written in a high-level (English-like) language. It must be changed to a machine language format (load module) before it can be processed by a computer.

2. program library - A special file which is composed of one or more individual programs. It has its own directory which records the programs contained within it. In IBM mainframe systems, load modules must be in a program library to be executed. Source code does not have to be in a program library, but often is in a large multiuser system.

3. production library - This is a program library which is composed of fully tested programs that perform the day-to-day processing and reporting for a system. It may be differentiated from a development or test library, which is composed of programs not fully tested.

4. load module - A computer program which has been translated into a form which enables it to be processed by a computer. Programs in this form cannot be read. They also cannot be converted back to source code format so that they can be read and modified.

5. truncation - The failure to include in calculations and/or show in computer output the significant digits of a number. The significant digits are the leftmost ones in numbers greater than zero.

6. concatenated - Two strings of text or two variable values combined end-to-end to form a single string.
APPENDIX A
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460
FEB 9 1990
MEMORANDUM
SUBJECT: OIG Draft Report, "Report on CERCLIS Reporting,"
Audit Number E1SFF9-15-0023
FROM: Don R. Clay
      Assistant Administrator
TO:   Kenneth A. Konz
      Acting Assistant Inspector General for Audit
      Office of the Inspector General (A-109)
The purpose of this memorandum is to respond to the findings
and recommendations contained in the subject draft report. While
we generally concur with most of the draft report, we have restated
the findings and recommendations which we feel require our
comments.
Summary of Findings (page 2)
Except for the SCAP reports, which report plans and
accomplishments, CERCLIS reports were not heavily
utilized.
Response
There is a recurring theme in the draft report that CERCLIS
reports are not heavily utilized. We feel that is not an accurate
reflection of CERCLIS based upon our experience. CERCLIS has been
designed to meet the reporting requirements of the Superfund
Comprehensive Accomplishments Plan (SCAP), the Strategic Planning
and Management System (SPMS), and the Superfund Progress Report
(SPR). CERCLIS also is used to fulfill hundreds of Freedom of
Information Act (FOIA) requests received annually by Headquarters
and Regional Offices. Additionally, CERCLIS data residing on the
Headquarters and Regional Office local area networks are used by
program managers for the in-depth analysis required for planning
and understanding the overall progress of the Superfund program.
We believe these are not inconsequential uses of CERCLIS
information and reports.
Summary of Findings (page 2)
In fact, the OIG audit of the Agency's Superfund Annual
Report for FY 1988 demonstrated the significant
differences between the information being reported at the
national level, based on CERCLIS, and the information
being maintained by the regions.
Response
We believe the wording of this sentence is inconsistent with
the information in the audit and implies that a serious problem
exists with data quality between Headquarters and the Regions. The
summary finding addresses the utilization of CERCLIS reports. We
suggest that this statement be reworded or eliminated in the final
report.
Recommendation (page 9)
We recommend that the Assistant Administrator for OSWER
implement and monitor the actions listed in the OERR response
to our position paper on CERCLIS documentation.
Response
We accept this recommendation. The Director, OERR, previously
submitted three steps that were under consideration to improve the
CERCLIS documentation in his response to the position papers.
These steps were included in the draft report with an evaluation
by auditors that if implemented, these steps should improve CERCLIS
documentation to a satisfactory level. OERR adopted the
implementation portion of this recommendation before December 29,
1989, and we request that this be included in the final report.
We also request that this recommendation be reworded in the
final report to request that OSWER develop a monitoring plan or
procedures. The monitoring of this recommendation will be provided
by the OSWER Information Management staff (IMS). Although the new
procedure will stay in place, we do not want to keep this
recommendation open indefinitely. We will submit a time frame to
implement the monitoring portion of this recommendation in our
response to the final report.
Recommendation (page 14)
We recommend that the AA for OSWER implement the steps listed
in the OERR response to our position paper on CERCLIS library
and change controls.
Response
We accept this recommendation. Again, the Director, OERR,
previously submitted six steps that were being considered for
implementation to improve the integrity of CERCLIS. These steps
were included in the draft report with an evaluation by your office
that if implemented, the integrity of CERCLIS reporting and the
reliance by users on the data reported should materially improve.
We implemented these recommendations by December 29, 1989, and we
request that this be included in the final report.
Recommendation (page 18)
We recommend that the AA for OSWER:
1. Implement the three actions quoted above which were
included in the response to our position paper on CERCLIS
report testing.
Response
We accept this recommendation. As in the previous
recommendations, the Director, OERR, previously submitted these
three steps, along with several others, that were being considered
for implementation to improve CERCLIS report testing. These steps
were included in the draft report. Since December 29, 1989, these
steps have been implemented and we request that this be included
in the final report.
Recommendation (page 18)
We recommend that the AA for OSWER:
2. Immediately evaluate the five reports cited in this
discussion in the same way the reports cited in Finding
2 were evaluated.
Response
The five reports listed in the draft report were evaluated by
December 1989, and we request that this be included in the final
report.
Recommendation (page 18)
We recommend that the AA for OSWER:
3. Abandon the concept of allowing users to access reports
for which testing has not been completed.
Response
By December 29, 1989, we implemented this recommendation and thus we request that this be included in the final report. We
would like to clarify a point concerning this recommendation. By
placing untested reports in the CEREVAL data base, only the
requestor or testers have access to these reports, not all users.
Thank you for the opportunity to provide our comments to this
draft report. If you have any questions, please contact Jose
Acevedo, OSWER Audit Follow-up Coordinator, at 382-4510.
cc: Christian Holmes
Mary Gade
Henry Longest
Asa Frost
Thad Juszczak
Sonya Stelmack
Jose Acevedo
Jim Maas
APPENDIX B
REPORT DISTRIBUTION
Director, Office of Emergency and Remedial Response (OS-200)
Director, Office of Program Management and Technology (OS-110)
Director, Office of Information Resources Management (PM-211)
Office of the Comptroller (PM-225)
Associate Administrator for Regional Operations and
State/Local Relations (A-101)
Agency Followup Official (PM-225), Attn: Director, Resource
Management Division
Inspector General (A-109)