Environmental Protection Agency
Office of Pesticide Programs

Detailed Design for an Information
Processing System for the Pesticides
Monitoring Program

I. System Overview and
II. Implementation Plan

December, 1978
Contract No. 68-01-3833

ARTHUR YOUNG & COMPANY
ARTHUR YOUNG & COMPANY
1025 CONNECTICUT AVENUE N.W.
WASHINGTON, D.C. 20036
December 20, 1978
Mr. Elgin Fry, Project Officer
Environmental Protection Agency
Office of Pesticides Programs
Technical Services Division
Waterside Mall East, Room B17
401 M Street, S.W.
Washington, D.C. 20460
Reference: Contract No. 68-01-3833
Subject: Office of Pesticides Programs Feasibility Study
and System Design Project Phase II Final Report
Dear Mr. Fry:
Arthur Young & Company is pleased to submit the Detailed
System Design Final Report as specified in the referenced
contract. The final report is presented in six separate volumes.
Volume 1 contains the System Overview and Implementation Plan,
Volume 2 the Data Base Design, Volume 3 the Input/Output
Specifications, Volume 4 the Functional Module Specifications,
Volume 5 the Operations Manual, and finally Volume 6 the Users
Manual. The final report has been modified in response to
comments and clarifications offered by EMB and SSB personnel as
a result of a draft circulated in the previous month.
If you have any questions regarding this final report,
please contact Ms. Ellen DuPuy or me at (202) 785-4747.
Very truly yours,
ARTHUR YOUNG & COMPANY
Gerald Mendenhall
Partner
OPP MONITORING SYSTEM
DETAILED DESIGN DOCUMENTATION
The Detailed Design Documentation for the EPA/OPP Pesticides
Monitoring System consists of seven chapters which are bound in
six volumes. These chapters are:
Chapter   Title
   I      System Overview              )
  II      Implementation Plan          ) Bound Together
 III      Data Base Design
  IV      Input/Output Specifications
   V      Functional Module Specifications
  VI      Operations Manual
 VII      User's Manual
Each of these chapters is largely self-contained but may refer to
information contained in other chapters as necessary.
I. SYSTEM OVERVIEW
1. BACKGROUND
The Office of Pesticide Programs (OPP) has the responsibility
for monitoring pesticide residues in the environment. The basis for
this responsibility is the amended Federal Insecticide, Fungicide, and
Rodenticide Act (FIFRA). The primary Monitoring Program objectives
include:
Provide a framework for intensified monitoring of potential
or perceived problem areas
Detect and quantify trends in pesticide residue levels in
environmental components (e.g., air, soil, water, man, plants,
and animals)
Support an alert level system to predict environmental
problems.
To achieve these objectives, the Ecological Monitoring Branch of
OPP collects samples from the environment, analyzes them, and records
the levels of pesticide residues found. These samples are taken from
five modes of transport: air, water, human tissue, estuaries, and
soils. A variety of manual and automated systems have evolved to
support the retention and analysis of the laboratory results. However,
the systems have not been adequate in meeting evolving needs of the
Monitoring Program.
A feasibility study was conducted in 1977 to address automated
support for the Program. This study was comprised of a requirements
analysis, feasibility and cost benefit analysis, and development of a
recommended system concept. The study recommended use of a DBMS for
an integrated monitoring system. A subsequent study compared a
conventional system, STORET, and System 2000 DBMS in detail and
concluded that System 2000 was more effective.
Following that study, the Office of Pesticide Programs underwent
a major reorganization which exercised a significant impact on the
participants in the recommended monitoring system. In addition, a
greater level of detail with regard to data elements and specific
reports was desired to assist the Office in the decision making process.
Subsequently, a supplementary study was conducted in 1978 to:
Confirm previously defined information requirements
Detail data elements and anticipated changes
Define and quantify report requirements.
Following the Confirmation of Requirements study, Arthur Young &
Company conducted a comparative evaluation of the System 2000 and IDMS
data base management systems. Based on the results of that evaluation,
Arthur Young & Company recommended that System 2000 be employed for
the Monitoring Program.
Concurrent with the DBMS evaluation study, Arthur Young & Company
began the detailed design of an information system to support the OPP
Monitoring Program. Following the selection by OPP of IDMS as the
DBMS to be employed, Arthur Young & Company completed the detailed
design. This design is presented in six separately bound documents.
The design is based on the requirements analysis and conceptual design
developed in the earlier documents. We briefly review, in this chapter,
the highlights of that analysis and design activity.
A definition of requirements for the OPP Monitoring System is
depicted in Exhibit 1-1. The requirement is comprised of data from
five modes of transport, a variety of report types including ad hoc
requests for information, and a variety of users.
The information requirements were studied in detail to develop
a list of data elements and to identify specific reports. Since the
Monitoring Program is characterized by ad hoc requests for information,
these were also quantified when possible. Over 20 annual reports (14
for Human Tissues, 3 for Soils, and 4 for Water System) were identified.
This number is many times larger when all the different reports of
the Soils System, which are variations on basic reports, are considered.
An estimated 250 ad hoc requests for information per year (20 per
month for Human Tissues and 10-12 per year for each of Soils and Water)
were also identified. Data element analysis revealed a common and
basic structure among modes of transport consisting of geographical
identification data, sampling context, and concentration measures. In
addition, there are description data elements unique to each mode of
transport.
2. ISSUES AND RECOMMENDATIONS
The additional requirements analysis and the data elements and
reports definition confirmed the earlier study's conclusion that an
integrated system is needed to support the Monitoring Program. The
organizational changes within OPP have centralized the risk analysis
in the Hazard Evaluation Division. This has led to the need for summary
reports on pesticide residues outside EMB rather than the
interpretations that EMB previously provided. There was some increased
emphasis on obtaining somewhat more detailed monitoring data. In
addition, there was increased interest in analysis across modes of
transport. Therefore, the original recommendation for an integrated
monitoring information system remained valid. Additionally, the
projected increase in manual activity has resulted in an accumulation
of analyzed data from the HANES II program (conducted as a cooperative
effort between EMB and the Public Health Service).
EXHIBIT 1-1
OVERALL SYSTEM REQUIREMENTS

[Flow chart, legible only in part in this copy, relating: source
data (requested input data parameters such as kind of analysis,
mode of transport, pesticide, location, result, diagnosis, etc.);
processing (validate input data, update data base, summarize data
by parameters, interpret results, support information requests);
outputs; recipients (OPP, the public); and uses (perform risk
analysis, feedback on past administrative decisions, use in
scientific research, use in testimony for litigation).]
(1) Issues
In the course of developing the recommendation for the
Monitoring Program, several key issues emerged. These issues,
presented in Exhibit 1-2, are important areas for management
consideration and are significant factors in the effective
implementation of the monitoring system.
These issues, in terms of impact and actions required, are:
Misunderstanding within OPP as to the role of the monitoring
program is reflected in different levels of information
requirements expressed and different values placed on the
monitoring data. It is necessary that the Program be clearly
defined, the data availability identified, and dissemination
procedures established. The integration of the five modes
of transport into a single data base with a central point
of contact can assist in developing a strong, favorable image
for the Program. Effective access and data quality control
can help to assure that monitoring data is valued for its
contribution to risk analysis.
Administrative control over data use and access is currently
inconsistent across all files in the Program. Uncontrolled
data handling, coding conventions, and edit criteria can lead
to incompatible data bases. This incompatibility affects
the timeliness with which data may be accessed, inhibits
analyses across modes of transport or across years, and can
contribute to an unfavorable image for the data. The
establishment of a Data Base Administrator who has
responsibility for the integrity of the data base is key to
effective data management.
The direction of the Monitoring Program indicates an
increasing number of short-term studies requiring increased
data access, reduced response time, and changes in data
characteristics. These projections point to a need for a
more responsive data analysis capability than is currently
available. Although an integrated system will facilitate
data retrieval, more formal data management and planning is
also required. This planning, which should include
coordination with users of the monitoring data such as SPRD,
must be directed at ensuring that available data is timely,
accurate, and pertinent to current and anticipated studies.
(2) Recommendations
The recommended system concept for OPP monitoring data was
a blend of a decision support system and a management information
system. The decision support system may be contrasted with a
traditional management information system in that:
EXHIBIT 1-2
SYSTEM REQUIREMENTS

FINDINGS: Lack of Data Standards Between Systems Causes
Incompatible Data Bases
ACTION REQUIRED:
- Develop Common Data Formats
- Standardize Data Codes
- Develop Consistent Data Element Names
- Develop a Consistent Data Hierarchy

FINDINGS: Response to Information Requests Not Timely Enough
ACTION REQUIRED:
- Design Standards Describing Level of Summaries and Kinds of
  Aggregations to be Reported
- Allow Analysis Across Modes of Transport
- Make Available to Users of Data a List of the Currently
  Available Reports
- Develop Queries Representative of Anticipated Retrievals and
  Reports

FINDINGS: Sometimes There is Insufficient Correlation Between
Data Requested and Data Provided
ACTION REQUIRED:
- Develop a Policy as to What Monitoring Data Will be Provided
  in Response to a Request
- Clarify Dissemination Procedures
- Improve Communication Among OPP Divisions

FINDINGS: Increasing Emphasis is Being Placed on Short-Term
Projects; There is a Need for More Statistical Analysis and
Longitudinal Data Presentation
ACTION REQUIRED:
- Formalize Data Management and Planning
- Provide the Flexibility to Accommodate the Evolving Nature of
  the Pesticide Monitoring Program
- Develop an Interface with Statistical and Graphics Software
  Packages
- Provide the Capability for Having All Data Needs Satisfied
The decision support system is driven by independent data
accesses rather than scheduled reports
The system must respond to rapid changes in its output
requirements
It generally supports a multi-user environment comprised of
multiple functions
Data access is varied, stressing the need for flexibility
and capability in the system rather than refinement of
predictable access patterns.
The frequency and predominance of ad hoc requests for data, either
to satisfy the public or to support special studies, give the
Monitoring System the decision support character. On the other
hand, there are to be a substantial number of periodic reports
which will provide much of the information required by the several
divisions of OPP. These periodic reports give the system an MIS
flavor.
A decision support system is often best supported by a data
base management system which emphasizes data access. We have
recommended an integrated system utilizing a DBMS. For the
pesticides monitoring data, a DBMS which accommodates the
hierarchical nature of the data and which provides an easy-to-use
query language and report-generation capability is needed. IDMS,
a DBMS from the CODASYL family marketed by Cullinane Corporation,
was selected by OPP for managing the Monitoring System Data Base.
3. SYSTEM DESIGN DOCUMENTATION
In accordance with our contract, Arthur Young & Company has
prepared several documents, bound separately, which collectively
comprise the documentation of our system design. These documents are:
System Overview and Implementation Plan
Data Base Design
Input/Output Specifications
Functional Module Specifications
Operations Manual
Users Manual.
In this system overview document we describe each of the other
documents and present highlights of the system design. We also present
a plan for implementing the system. The individual design documents
are self-explanatory and require little narrative explanation.
(1) System Overview and Implementation Plan
As we have mentioned, the basic conceptual design of the OPP
Monitoring System is a decision support system with a substantial
number of periodic reports to be generated. Central to the system
design is an integrated data base of monitoring data from the
five transport modes of:
Human Tissues
Soils
Water and Bottom Sediment
Estuaries and Ocean Organisms
Air.
From this data base a number of reports are to be generated on
a regular basis. In addition, ad hoc requests for information
and aperiodic generation of reports must also be handled.
A detailed Implementation Plan is presented in Chapter II
which is bound with this System Overview document. The
Implementation Plan consists of detailed steps and estimated
resource allocations required for the following phases:
Organization and Administration
System Development
System Test
Conversion.
The Organization and Administration phase spans the duration
of the other three phases. It is designed to provide continuity
over the entire implementation process and to assure a proper
post-implementation review. The tasks focus on control and
quality review points, thus allowing OPP management to exercise
control over the entire life of the implementation. It is at the
Organization and Administration level where progress can be
monitored and schedules revised.
The System Development phase embodies the construction and
testing tasks of implementation. Priority is given to
implementing and testing the programs which load the data base.
This allows portions of the system test, in particular the data
base test, to run concurrently with the remainder of the
development cycle. The use of the IDMS data base in the program
and system tests requires standardized procedures which are
detailed in the Implementation Plan.
Finally, Conversion tasks outline the steps that need to be
addressed prior to the actual cutover to the new system. These
steps include the issues of interfacing the computer system with
operations personnel and user groups, distribution of new input
forms and report requests, and the scheduling of computer
resources.
(2) Data Base Design
The Data Base Design Document, Chapter III, addresses
Data Element Descriptions
Record Contents
Data Base Structure
Schema and Subschema Specifications
by supplying a data element dictionary, a data structure diagram,
and appropriate Data Definition Language (DDL) and Device Media
Control Language (DMCL) statements.
The data base management system (DBMS) selected by OPP for
implementing the Pesticides Monitoring System is IDMS, marketed
by Cullinane Corporation of Wellesley, Massachusetts. IDMS, a
member of the CODASYL family of DBMS, provides a network data
structure which relates logical record types via sets. A set
consists of an owner record type and one or more member record
types. A given record type may be the owner of more than one
set, and record types may be members of some sets and owners of
others.
Access to an IDMS data base is generally accomplished via
programs written in a host language (usually COBOL) and extended
by statements from the Data Manipulation Language (DML). Access
is made to a single record at a time by entering the data structure
at selected points and following pointers which link record
occurrences together into set occurrences. Access to data may
also be achieved through the use of an On-Line Query Language
(OLQ) facility for direct end-user access without programmer
intervention.
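The navigational access described above can be modeled in a few lines of Python. This is an illustrative analog only, not IDMS code: the record and set names are borrowed from the design, but the classes, methods, and sample values are invented for the sketch.

```python
# Illustrative model (not IDMS itself) of CODASYL-style navigation:
# a set links one owner record occurrence to a chain of member
# occurrences, and access proceeds one record at a time by
# following "next" pointers, much as a host program would issue
# successive finds within a set.

class Record:
    def __init__(self, rtype, fields):
        self.rtype = rtype          # e.g. "STATE", "SITE"
        self.fields = fields
        self.next_in_set = {}       # set name -> next member occurrence

class SetOccurrence:
    """One owner record and a pointer chain of member records."""
    def __init__(self, name, owner):
        self.name = name
        self.owner = owner
        self.first = None

    def connect(self, member):
        # insert at the front of the chain
        member.next_in_set[self.name] = self.first
        self.first = member

    def walk(self):
        # sequential navigation: follow pointers from the first member
        cur = self.first
        while cur is not None:
            yield cur
            cur = cur.next_in_set[self.name]

# Enter the structure at a chosen owner and walk its members:
maryland = Record("STATE", {"FIPS-STATE-NUMBER": 24})
state_site = SetOccurrence("STATE-SITE", maryland)
for n in (3, 2, 1):
    state_site.connect(Record("SITE", {"SITE-NUMBER": n}))

print([s.fields["SITE-NUMBER"] for s in state_site.walk()])  # [1, 2, 3]
```

The point of the sketch is that a member record is reachable only by entering at an owner and following pointers, which is why access patterns must be planned into the set structure.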
Output of data may be accomplished through the (COBOL)
application programs, the OLQ interface, or the CULPRIT report
generator software package. CULPRIT is a commercial product of
Cullinane Corporation and interfaces directly to IDMS to generate
reports. The reports are specified through user-supplied input,
output, work, and process definition parameters.
Data security is accomplished via the specification of
subschemas which define user-specific views of the contents of
the data base. The complete data base is specified in the schema.
Individual users access a given subschema which limits the user's
view concerning the data elements, record contents, and set
structure of the data base. Security may also be enforced through
the use of special installation-written data base procedures
which are called automatically whenever IDMS accesses particular
data elements.
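The subschema idea can be illustrated with a small Python sketch: the schema lists every data element of a record type, a subschema lists the subset a given user class may see, and each retrieval is filtered through that subset. All element names and values below are invented for illustration and are not the system's actual record layouts.

```python
# Hypothetical sketch of how a subschema restricts a user's view:
# the schema describes every data element, while each subschema
# names only the elements a given user may see. A read through
# the subschema filters the stored record down to that subset.

SCHEMA = {
    "PATIENT-SAMPLE": ["SAMPLE-NUMBER", "AGE", "SEX", "RACE",
                       "DIAGNOSIS", "PATIENT-NAME"],
}

# A reporting user's subschema omits the identifying element.
REPORTING_SUBSCHEMA = {
    "PATIENT-SAMPLE": ["SAMPLE-NUMBER", "AGE", "SEX", "RACE",
                       "DIAGNOSIS"],
}

def obtain(record_type, stored, subschema):
    """Return only the elements the subschema exposes."""
    visible = subschema[record_type]
    return {k: v for k, v in stored.items() if k in visible}

stored = {"SAMPLE-NUMBER": 101, "AGE": 44, "SEX": "F", "RACE": "W",
          "DIAGNOSIS": "NONE", "PATIENT-NAME": "DOE, JANE"}
view = obtain("PATIENT-SAMPLE", stored, REPORTING_SUBSCHEMA)
assert "PATIENT-NAME" not in view
```

Because the filtering happens at the data base interface rather than in each application, every program bound to the reporting subschema is prevented from seeing the omitted elements.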
The schema (and subschemas) are written in a special Data
Definition Language (DDL) which defines
Data element types (names, formats, etc.)
Record types (collections of data element types)
Set types (collections of record types).
Certain characteristics of the data base, such as number of
records per page, number of pages per area, etc., are specified
in another language, the Device Media Control Language (DMCL).
Statements in these two languages are processed by IDMS system
compilers to produce an object version definition of the data
base.
The data structure is presented in the diagram of Exhibit
1-3. There are 13 record types identified:
State Information
Site Information
Soils Site Description
Air Site Description
Crops Grown Information
Pesticide Usage Information
Sample Identification Information
Air Sample Description
Patient Sample Description
Analysis Criteria
Residue Information
Residue Index
EXHIBIT 1-3
OPP Data Base
Data Structure Diagram

[Data structure diagram showing the 13 record types: State (200),
Site (210), Soils Site Description (211), Air Site Description
(212), Crops (216), Pesticide (217), Sample (220), Air Sample
Description (221), Patient Sample Description (222), Analysis
(230), Residue Info (240), Residue (250), and Code (300). The
record types are linked by owner-member sets such as State-Site,
Site-Sample, Sample-Analysis, Analysis-Residue, Site-Crops,
Crop-Pesticide, and Residue-Index. State and Residue records are
stored CALC on FIPS-State-Number and Residue-Code respectively;
alternate indices cover Census Region, Census Division, EPA
Region, Site Number, and Residue Class.]
Code Information.
These record types are grouped into 10 sets which relate the
various pesticide, residue, crop, and sample records to the sites
and states where the samples were collected. CALC fields provide
direct access (via hashing) to selected records through the FIPS
State Number and the Residue Code. Alternate indices are provided
for direct access to EPA Census Region, EPA Region, Site Number,
and Residue Class using the Sequential Processing Facility of
IDMS.
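The CALC mechanism amounts to hashing a record key directly to a data base page, so a retrieval by key goes straight to the right page instead of scanning the file. A minimal Python analog follows; the page count, hash function, keys, and values are invented for illustration and do not reflect IDMS's actual hashing or page layout.

```python
# Minimal analog of CALC direct access: hash a key to one of a
# fixed number of data base pages, then search only that page.

NUM_PAGES = 97                      # page count fixed by the DMCL (illustrative)

def calc_page(key):
    # any stable hash of the key serves for illustration
    return hash(key) % NUM_PAGES

pages = [[] for _ in range(NUM_PAGES)]   # each page holds (key, record) pairs

def store(key, record):
    pages[calc_page(key)].append((key, record))

def fetch(key):
    # direct access: only one page is ever searched
    for k, rec in pages[calc_page(key)]:
        if k == key:
            return rec
    return None

store(("STATE", 24), {"name": "Maryland"})
store(("RESIDUE", "XX-01"), {"name": "example residue"})
assert fetch(("STATE", 24))["name"] == "Maryland"
```

The alternate indices mentioned above serve the complementary case: direct entry by attributes (region, site number, residue class) that are not the hashed key.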
(3) Input/Output Specification
Chapter IV incorporates specifications for data entry and
report formats. Data enters the Pesticides Monitoring System
from 8 laboratory analysis report form types as well as from the
HANES II survey data file supplied by the U.S. Public Health
Service. The input report form types and the number of forms of
each type are:
Type                                       Number
Human: Urine                               4
Human: Serum, Adipose, Organochlorine      1
Human: Patient                             1
Soils: Application                         1
Soils, Water, Estuarine, Air               1 (a single form serves
                                              all four)
Information is output by the system through 13 generic report
types which can be used to generate over 200 specific reports
depending upon user-supplied parameters. These reports can be
generated on a regularly scheduled basis or they can be generated
on demand by the end-users. The output report formats are
presented in Chapter IV. The generic report types are:
Air Geographic
Humans Geographic
Soils Geographic
Water Geographic
Air Site by Site
Soils Site by Site
Water Site by Site
Water Pesticide
Estuarine Exception
Estuarine State Summary
Soils Application Summary
Humans Demographic
Human Patient Summary.
(4) Functional Module Specification
Six functional subsystems have been identified for the
Pesticides Monitoring System. These subsystems are specified in
Chapter V in the form of HIPO charts (Hierarchy plus Input-
Process-Output). Exhibit 1-4 shows an overview of the system
modules in a hierarchical representation. Subsequent charts in
Chapter V carry the design to increasing depths of detail. At
each stage, the HIPO chart depicts the input data, the output
data, and the intermediate processing steps which occur at that
stage. Solid arrows indicate processing flows, and hollow arrows
indicate data flows. Each processing step is labeled with a
hierarchical code which can be used to relate that step to
subsequent more detailed HIPO charts.
The six functional subsystems are:
Data Conversion and Loading
Data Edit/Update
Ad Hoc Report Generation
Standard Report Generation
Data Base Unloading
Data Base Recovery, Restructuring, and other maintenance.
The Edit/Update function validates input data, updates the data
base with valid data, and generates an error report of input data
to be corrected and resubmitted. The Data Conversion function
develops tables of valid states, sites, sample types, analysis
methods, residue codes, etc. and then converts the existing Humans,
Soils, Water, Air, and Estuarine files to the new integrated data
base environment. The Report Generation functions verify the
input report parameters, initiate appropriate data base
retrievals, and compile the results into an output report. The
Data Base Maintenance function deals with recovery/restart,
backup, and restructuring of the data base. The Data Base
Unloading function deals with Data Base archival.
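The Edit/Update flow described above (validate each input transaction, post valid records to the data base, and route invalid ones to an error report for correction and resubmission) can be sketched as follows. The edit criteria and field names are invented examples, not the system's actual edit rules.

```python
# Schematic of the Edit/Update flow, not the delivered module:
# each input transaction is checked against simple edit criteria;
# valid records update the data base, invalid ones go to an error
# report for correction and resubmission.

VALID_MEDIA = {"AIR", "SOIL", "WATER", "HUMAN", "ESTUARY"}

def edit_update(transactions, data_base):
    errors = []
    for t in transactions:
        problems = []
        if t.get("medium") not in VALID_MEDIA:
            problems.append("unknown medium")
        if (not isinstance(t.get("residue_ppm"), (int, float))
                or t["residue_ppm"] < 0):
            problems.append("bad residue value")
        if problems:
            errors.append((t, problems))    # goes to the error report
        else:
            data_base.append(t)             # valid: update the data base
    return errors

db = []
batch = [
    {"medium": "SOIL", "residue_ppm": 0.04},
    {"medium": "ROCK", "residue_ppm": 0.01},   # fails the medium edit
]
errs = edit_update(batch, db)
assert len(db) == 1 and len(errs) == 1
```

Separating validation from posting in this way is what allows rejected transactions to be corrected and resubmitted without disturbing the records already loaded.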
(5) Operations Manual
The Operations Manual in Chapter VI specifies general
procedures for initiating, operating, and maintaining the
Pesticides Monitoring Data Base. It covers such topics as:
Data Base Generation
Application Program Support
Data Base Loading
Data Base Update
Backup/Recovery Procedures
Data Base Maintenance
Security Procedures
Archived Data.
(6) User's Manual
Chapter VII contains the User's Manual for the OPP Pesticides
Monitoring System (PMS). The User's Manual presents general
guidance for accessing the PMS, causing new data to be entered
into the system, requesting specific reports to be generated,
obtaining information from archived data bases, and making queries
of the data base. Three sample queries are presented as examples
of the use of the query facility.
II. IMPLEMENTATION PLAN
INTRODUCTION
Initial exposure to the data base environment impacts many facets
of the EPA/OPP data processing organization as well as the users. It
is critical, then, that the implementation be planned in such a way as
to emphasize the data base aspects. A major task in a data base
environment is the testing of the DBMS itself to ensure its
reliability when used in conjunction with the other elements of the
system. This is best handled by a standard test data base, utilized
by all programs in their various test situations, together with
standard data base test procedures.
The Implementation Plan (see Exhibit II-1) designed for the
EPA/OPP Pesticide Monitoring System encompasses the following
philosophy:
The conversion programs are given implementation priority
over all other programs
A data base test procedure begins immediately following the
completion of the conversion subsystem
The test data base will be comprised of actual historical
data enhanced as necessary
The data base will be tested (with respect to initial design
aspects) prior to inception of the remainder of program test
procedures and certainly prior to system test.
The following procedures should be adhered to during program and
system test:
(1) The same data base will be constructed for use in all program
and system test activities.
(2) During program test, as each program completes processing
(either successfully or abortively) and test results have been
ascertained, the data base should be rolled back to its original
state. This can be accomplished through the IDMS system rollback
procedures which are described in Chapter VI, the Operations
Manual.
(3) During system test, as each cycle completes processing, a
similar rollback procedure as in (2) should be followed.
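The roll-back discipline in steps (2) and (3) can be illustrated with a snapshot-and-restore toy. IDMS accomplishes this with its own journaling and rollback facilities described in the Operations Manual; the Python below only conveys the idea of returning the test data base to its original state after every run, and its class and field names are invented.

```python
# Toy illustration of the test discipline above: snapshot the test
# data base before each program test, run the test, then roll the
# data base back to its original state regardless of outcome.
# (IDMS does this with journaled images; this sketch keeps a copy.)

import copy

class TestDataBase:
    def __init__(self, records):
        self.records = records
        self._snapshot = None

    def begin_test(self):
        # capture the pristine state before the program under test runs
        self._snapshot = copy.deepcopy(self.records)

    def rollback(self):
        # discard whatever the test wrote and restore the snapshot
        self.records = self._snapshot

db = TestDataBase([{"site": 1, "residue_ppm": 0.02}])
db.begin_test()
db.records.append({"site": 2, "residue_ppm": 9.99})  # test writes data
db.rollback()                                        # original state restored
assert db.records == [{"site": 1, "residue_ppm": 0.02}]
```

Restoring the same starting state before every program test is what lets a single standard test data base serve all programs without their results contaminating one another.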
EXHIBIT II-1
OPP Pesticides Monitoring System
Implementation Plan

PHASE 1: Organization & Administration
1.1 Establish Milestones for System Development
1.2 Make Programmer Assignments
1.3 Review Test Plans and Data for Completeness
1.4 Revise Time Plans
1.5 Coordinate Systems Test with Users/Operations
1.6 Review all Test Results
1.7 Authorize System Readiness
1.8 Monitor Post Installation

PHASE 2: System Development
2.1 Review all Detailed Documentation
2.2 Develop Program Test Plans
2.3 Construct Conversion Subsystem
2.4 Evaluate Results
2.5 Construct Maintenance, Edit/Update, Archival, Reports Subsystems
2.6 Evaluate Results

PHASE 3: System Test
3.1 Review System Test Plan
3.2 Prepare System Test Files
3.3 Perform Data Base Test
3.4 Perform System Test

PHASE 4: Conversion
4.1 Conversion Preparation
4.2 Convert Files
4.3 Cutover to New System
(4) If during the test (system or program) the necessity for
change to the physical or logical design of the data base becomes
apparent, strict modification procedures must be followed and
complete documentation made of all changes. Since the system
programs communicate through the data base, all affected programs
must be inspected for possible ramifications prior to the
institution of the changes. Particular attention must be paid
to the restart of the relevant test plans.
The Implementation Plan has been divided into four phases:
Organization and Administration
System Development
System Test
Conversion.
The Implementation Schedule of Tasks and Staffing (Exhibit II-2)
supplies the relative sequencing of phases and tasks and the resource
allocations necessary for the successful completion of the
implementation. The narrative which follows has been written to
supplement the chart, not to replace it. The reader is cautioned that
the chart contains pertinent information which is not present elsewhere
in this document.
PHASE 1 ORGANIZATION AND ADMINISTRATION
The Organization and Administration phase has been created as an
ongoing phase lasting throughout implementation. The tasks in this
phase span the other three phases (Exhibit II-2). These tasks are
assigned to the project leader in order to maintain adequate control
over the implementation process. One person must have the
responsibility of tying together the results of the various tasks by
reviewing the outputs and monitoring the overall implementation
schedule, thus offering a specific focal point for quality review. The
project leader is charged with the final responsibility in the form
of the following tasks.
Task 1.1 Establish Milestones for System Development Phase
The milestone review point consists of a structured review
at certain crucial milestones in the project. At each such
milestone, specific accomplishments and supporting documentation
are expected. The analyst must review and approve the
documentation before activity toward the next milestone is
initiated. Standard milestones include the following:
Completion of review of file and report definitions
EXHIBIT II-2
OPP Pesticides Monitoring System
Schedule of Tasks and Staffing

[Bar chart, largely illegible in this copy, plotting each task
against the weeks of the implementation (roughly 40 weeks), with
staff-day estimates and subtotals by phase. Phase 1 (Organization
and Administration) tasks 1.1-1.8 run throughout as an ongoing
effort; Phase 2 (System Development) tasks 2.1-2.6, Phase 3
(System Test) tasks 3.1-3.4, and Phase 4 (Conversion) tasks
4.1-4.3 follow in sequence. Staffing assumes 2-3 programmers,
1 analyst, and 1 project leader.]
Completion of review of HIPO charts
Completion of program coding
Completion of testing
Completion of documentation.
In addition, the project leader may wish to establish a
periodic reporting system which will consist of informal reports
on a weekly basis to brief the project leader on tasks completed,
problems encountered, and estimated time to completion. The
project leader will use these periodic briefings to obtain an
early detection of problems and to assure that assigned personnel
are maintaining their commitment to the project.
Task 1.2 Make Programmer Assignments
The Program Inventory (Exhibit II-3) incorporates critical
information about those attributes of a program which describe
its difficulty level such as:
Input/Output
IDMS Usage
Length
Priority
Estimated Time to Complete.
The project leader should evaluate programmer skills
available and make assignments accordingly.
Task 1.3 Review Test Plans and Data for Completeness
The project leader should check the program test data for
completeness. In cases where the user has provided test material,
the project leader assures that the programmer has supplemented
this material with the additional cases which may be required to
test each routine in the program.
The programmer's test plan, developed in Task 2.2, is reviewed
to determine if it will give the program a thorough test. Test
data must be available for each element in the test plan. Test
data is constructed independently for each program in the system;
the output from one test run should not be used as input to
subsequent runs to be tested.
Task 1.4 Revise Time Plans
EXHIBIT II-3
OPP Pesticides Monitoring System
Program Inventory

DDL (0), DMCL (0)
  Input Files: -                  Output Reports: -
  IDMS Usage: Heavy               Difficulty: (No Coding Involved)
  Length: -                       Estimated Resource: 2 Days

Conversion Load/Unload Subsystem (1)
  Input Files: Humans, Soils, Water M/F; Trans; Data Base
  Output Reports: Control Report
  IDMS Usage: Heavy               Difficulty: Difficult
  Length: Short                   Estimated Resource: 60 Days

Edit/Update (Soils, Water) (2)
  Input Files: Monitoring Data Trans; Data Base
  Output Reports: Update Trail; Error Report
  IDMS Usage: Medium              Difficulty: Difficult
  Length: Long                    Estimated Resource: 100 Days

Edit/Update (Humans) (2)
  Input Files: Monitoring Data Trans; Data Base
  Output Reports: Update Trail; Error Report
  IDMS Usage: Medium              Difficulty: Difficult
  Length: Long                    Estimated Resource: 100 Days

Report Requests (3)
  Input Files: Data Base; Report Request
  Output Reports: Requested Report(s)
  IDMS Usage: Medium              Difficulty: Med-Difficult
  Length: Medium                  Estimated Resource: 175 Days

Maintenance Utilities (5)
  Input Files: Data Base
  Output Reports: Control
  IDMS Usage: Light               Difficulty: Easy
  Length: Short                   Estimated Resource: (not legible)

Note: Difficulty is based on IDMS usage and file criteria.
Based on the progress reports received, the project leader
must anticipate problem areas and project delays. Ramifications
for Systems Test and Conversion must be considered and revisions
made to the Implementation Schedule of Tasks and Staffing (Exhibit
II-2).
Specific revisions are scheduled during the data base test,
construction of the edit/update, maintenance, and report
subsystems, and the closing portions of the development phase.
These represent critical areas where problems may develop which
may delay succeeding task initiation. The project leader may
come to recognize other critical paths as the system
implementation proceeds, and additional revision points may be
scheduled.
Task 1.5 Coordinate Systems Test with Users/Operations
Additional strain will be placed on both hardware and people
resources during systems test and conversion. Estimated people
resources are shown in the Implementation Schedule. Revision of
this schedule may take place during initial phases of the
implementation effort. Computer loading must be anticipated in
advance and the proper arrangements made with operations
personnel. Because the OPP Monitoring System will be tested in
parallel, the computer must carry the load of both the old and the
new system in addition to any current processing.
The user groups from Humans, Water and Soils must be
introduced to the Users Manual (Chapter VII) and a training
schedule established followed by a review procedure where
necessary revisions are made to the Users Manual.
The Operations group must be introduced to the Operations
Manual (Chapter VI). Of particular importance are the data base
backup/recovery procedures for the IDMS data base. These will
be tested in the Systems Test phase, and the operations personnel
need to be trained in system crash procedures. Additionally, the
new input procedures must be reviewed by the data entry personnel.
Task 1.6 Review All Test Results
The project leader must review the final test results to
determine if the program system specifications have been met.
The project leader will also view any test results, prior to the
final test, which resulted in planned abnormal end of job
conditions (e.g., out of sequence, wrong input). Evidence
(in the form of printed or typed test results) must be available
for all of these conditions. The project leader will assure that
all cycles have been run successfully.
Task 1.7 Authorize System Readiness
It is the project leader's responsibility to initiate
conversion procedures based on the examination of system test
results and feedback from users groups. Serious problems must
be identified, and conversion may need to be delayed. It is
important that conversion not be initiated until the system test
has been completed.
Task 1.8 Monitor Post-Installation
Post-installation monitoring provides important feedback
for system designers, programmers, and users. A formal procedure
should be developed which will record complaints from users,
system failures, and operations statistics. Since the system is
no longer under development, the maintenance function takes
control of this procedure. In a new system, all feedback must be
answered as swiftly as possible, thus promoting good user
relations.
PHASE 2 SYSTEM DEVELOPMENT
Task 2.1 Review All Detailed Design Documentation
Prior to the initiation of the actual construction of
programs, the entire development team (analyst and programmers)
should review the detailed design documentation. This
documentation includes:
Implementation Philosophy Chapter II
Data Base Design Documentation Chapter III
Data Element Dictionary Chapter III
Input Specifications Chapter IV
Report Specifications Chapter IV
HIPO Charts Chapter V
Checks should be made for consistency and completeness through
the use of formal or informal design walkthroughs. Any questions
should be cleared up prior to actual construction.
Additions/changes must be well documented, and implications of
those modifications should be considered.
Task 2.2 Develop Program Test Plans
A Program Test Plan must be developed for each unique
functional program. This implies that a test plan for one
conversion module may very well serve for all other conversion
modules. The programming team is responsible for this task. It
is recommended that top down testing be performed. The HIPO
charts can be utilized as the blueprint for the test plans.
Coupled with top down coding, the test procedure can be run in
parallel with other coding efforts. The test plan must be reviewed
by the project leader in Task 1.3. Each test plan will identify
the following information:
All routines and combinations of routines to be tested
For each condition:
input data needed for test
output results to be checked (e.g., control reports,
file dumps)
Estimate of hardware resources necessary.
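The plan elements listed above lend themselves to a simple checklist structure. The sketch below is illustrative only; the field and program names are hypothetical, and the actual plan format is left to the programming team:

```python
from dataclasses import dataclass, field

@dataclass
class TestCondition:
    """One condition from a program test plan (hypothetical layout)."""
    routines: list          # routines or combinations of routines exercised
    input_data: str         # test input needed for this condition
    expected_outputs: list  # outputs to check, e.g. control reports, file dumps

@dataclass
class ProgramTestPlan:
    program: str
    conditions: list = field(default_factory=list)
    hardware_estimate: str = ""

    def untested_routines(self, all_routines):
        """Routines named in the program but covered by no condition."""
        covered = {r for c in self.conditions for r in c.routines}
        return sorted(set(all_routines) - covered)

# Example: a plan for a hypothetical edit/update program.
plan = ProgramTestPlan("EDIT-UPDATE-SOILS", hardware_estimate="2 runs")
plan.conditions.append(TestCondition(["VALIDATE", "UPDATE"],
                                     "valid soils transactions",
                                     ["control report", "update trail"]))
print(plan.untested_routines(["VALIDATE", "UPDATE", "ERROR-REPORT"]))
# ['ERROR-REPORT'] -- a routine the plan does not yet cover
```

A structure of this kind makes the project leader's completeness review in Task 1.3 mechanical: any routine reported by `untested_routines` needs an additional condition.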
Tasks 2.3-2.6 Coding, Compiling, Testing, and Evaluation
These tasks can be discussed together since they cover the
same generic functions, namely, coding, compiling, testing, and
evaluation. In general, all coding activities will follow those
standards in the EPA/OPP Documentation/Standards Manual.
A specific area of concern is data base programming strategy,
and particular attention should be paid to subschema definition,
currency indicators, and error recovery techniques. Each
programmer should be familiar with the navigational paths
available in a particular subschema. The navigation of those
paths will account for a majority of the data base related code,
particularly in the Edit/Update programs. Therefore, it is
important to design an effective access strategy. Coupled with
the access strategy is the concept of currency. Currency
indicators define the status of all records in a subschema at
any point during execution of an object program. Complete
awareness of the currency status is essential to the effective
use of the DML. Status checking must occur after every DML
statement in order to adequately ascertain the result of that
particular call to the data base. Status codes returned by the
IDMS system are used to signal end-of-set or record-not-found
conditions which may indicate specific control paths to be taken
by the program as well as critical errors which may cause a
program abend. The programmer must include specific checks for
possible codes as well as a call to the IDMS Status-Check routine
which will abort and rollback, if necessary. These procedures
are described in detail in the IDMS Programmer's Reference Guide.
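The check-after-every-DML-call discipline described above can be sketched in outline. All names and status values below are hypothetical stand-ins, not the actual IDMS interface; the real calls and codes are defined in the IDMS Programmer's Reference Guide:

```python
# Hypothetical status values standing in for IDMS error-status codes.
OK = "0000"
END_OF_SET = "0307"        # stand-in value, not the documented IDMS code
RECORD_NOT_FOUND = "0326"  # stand-in value

class DmlAbort(Exception):
    """Raised when a DML call returns an unexpected status."""

def status_check(status, expected=(OK,)):
    """Mimics a Status-Check routine: abort on any status the caller
    has not explicitly planned for (a real routine would also roll
    back the run unit)."""
    if status not in expected:
        raise DmlAbort(f"unexpected DML status {status}")
    return status

def walk_set(obtain_next):
    """Navigate one set, checking status after every DML call."""
    records = []
    while True:
        record, status = obtain_next()
        if status == END_OF_SET:   # a normal control path, not an error
            break
        status_check(status)       # any other unexpected status aborts
        records.append(record)
    return records

# Usage with a stubbed set of two member records:
members = iter([("REC1", OK), ("REC2", OK), (None, END_OF_SET)])
print(walk_set(lambda: next(members)))
```

The point of the sketch is the shape of the loop: expected signal codes select control paths, and everything else funnels through a single abort-and-rollback check.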
Coding activities are to include JCL creation. Test data
should be created, using the program test plans (Task 2.2) as a
guide. Corrections to the test plan will correspond to
modifications of program specifications discovered during coding.
Milestones for programming activities are developed in Task 1.1
of the Organization and Administration phase. Progress reports
will be made on a timely basis to ensure the reliability of the
Implementation Schedule of Tasks and Staffing (Exhibit II-2).
The Conversion Subsystem (Tasks 2.3 and 2.4) is separated
from the other subsystems because of its importance to future
tasks. In order to program test the remaining subsystems, a data
base must be built. Therefore, it is mandatory that the data base
specifications be tested prior to their use in the remaining test
phases. The Conversion programs perform the loading of the
initial data base and are used in Task 3.2.
During the programming process, care must be taken to update
all detailed design documentation (Task 2.1) as necessary.
All test results are to be indexed to the Program Test Plan
(Task 2.2) and a log is to be kept recording those results for
use in Task 1.6. A standard procedure for the use of data bases
in testing situations has been developed (see the Introduction
to this Implementation Plan) and should be instituted by all
programmers.
PHASE 3 SYSTEM TEST
Task 3.1 Review System Test Plan
A System Test Plan (Exhibit II-4) has been developed. Both
string and full-scale system testing have been incorporated into
the plan. Nine cycles have been identified as potential test
cases. For each cycle the following information is presented:
Programs involved
Reference to HIPO charts for I/O specifications
Test conditions
Parallel results in old system.
In all cases (excluding the water system) prior reports exist
with which to do comparison checks. Finally, the system is run
in its entirety using a characteristic volume of input and data
base records.
During the review of the plan, any chronic trouble spots in
the program test should be noted to ensure that they are corrected.
Task 3.2 Prepare System Test Files
-------
Exhibit II-4
OPP Pesticides Monitoring System
System Test Plan

Cycle 1
    Programs (HIPO XREF):
        Humans Report Requests (4.0); Soils Report Requests (4.0)
    Major Test Conditions:
        No Invalid Requests
        Volume Test Report Results
        Check Control Reports
        Use Data Base (Version 0)
    Old System Parallel Run:
        Run Old Systems to Produce Humans Geographies, Humans
            Demographics, Humans Patient Summary, Soils Residue
            Summary, and Soils Application Summary
        Use Expected Results from Task 3.2

Cycle 2
    Programs (HIPO XREF):
        Edit & Update Soils (2.0); Report Requests Soils (4.0)
    Major Test Conditions:
        Use Valid Update Data
        Updated Data Base (Version 1) Reflected in Reports
        Check Control Reports
        Exception Error Reports May Occur
    Old System Parallel Run:
        Run Soils Update Procedure
        Run Soils Requests
        Compare Error Reports Against Expected Results from Task 3.2

Cycle 3
    Programs (HIPO XREF):
        Edit & Update Soils (2.0); Report Request Soils (4.0)
    Major Test Conditions:
        Large Amounts of Invalid Data to Verify Validation Criteria
        Use Data Base (Version 0)
        Control Reports Checked
        Error Reports
    Old System Parallel Run:
        Run Old Soils System to Produce Updated File and Soils
            Reports to Check Effect of Large Amounts of Invalid Data
        Compare Error Reports to Expected Results from Task 3.2

Cycle 4
    Programs (HIPO XREF):
        Edit & Update Humans (2.0); Report Request Humans (4.0)
    Major Test Conditions:
        Use Invalid Data & Valid Data
        Use Data Base (Version 0)
        Run Reports Only After Errors Have Been Corrected and
            Resubmitted
        Procedures for Re-Running Monitoring Data Checked
        Control Incorporation of HANES Data
    Old System Parallel Run:
        Run Old Humans System to Update Files and Produce Humans
            Demographics and Humans Patient Summary

Cycle 5
    Programs (HIPO XREF):
        Edit & Update Water (2.0); Report Requests Water (4.0)
    Major Test Conditions:
        Use Invalid & Valid Data
        Use Data Base (Version 0)
        Control Reports Checked
        Error Reports
    Old System Parallel Run:
        No Parallel in Old System
        Use Expected Results, Error & Control Reports from Task 3.2

Cycle 6
    Programs (HIPO XREF):
        Edit & Update Water & Soils (2.0); Report Requests Water &
            Soils (4.0)
    Major Test Conditions:
        Large Volume Combination Test
        Use Data Base (Version 0)
        Timing of Report Runs
    Old System Parallel Run:
        Use Results from Soils in Cycle 2 and Expected Results from
            Cycle 5

Cycle 7
    Programs (HIPO XREF):
        Code Update (2.0); Edit & Update Soils (2.0); Report Request
            Soils (4.0)
    Major Test Conditions:
        Use Invalid & Valid Data as Input to Update of Codes
        Use Monitoring Data That Will Use Updated Code Information
        Run Limited Report Requests to Test Results
        Include All Invalid Update Conditions to Test Delete and
            Modify Capability
        Use Data Base (Version 0)
    Old System Parallel Run:
        Compare Against Expected Results from Task 3.2

Cycle 8
    Programs (HIPO XREF):
        Edit & Update (2.0); Report Requests (4.0)
    Major Test Conditions:
        Use Valid Monitoring Data
        Test System Crash Procedures for Data Base Rollback, Etc.
        After Recovery, Run Test Cycle 1 in Order to Test Success
    Old System Parallel Run:
        None; Use Test Plan Cycle 1 Data for Check

Cycle 9
    Programs (HIPO XREF):
        Archival (5.0); Report Request (4.0)
    Major Test Conditions:
        Simulate Archival Procedure
        Run Report Request That Spans Years
    Old System Parallel Run:
        Use Test Data Expected Results from Task 3.2
-------
The System Test Files are categorized as follows:
Files for creation of the data base
Input transaction data
Expected results for reports not available in the old system.
These files are to be created in parallel with the development
cycle if the resource allocations are available. The System Test
Files are logged carefully to document the exact function of the
data. These files, once created, will be used to run the new
system as well as the old. The use of historical data in the
system test does not preclude the necessity of constructing test
cases that do not occur in that data. The following procedure
is followed to create the System Test Files:
Freeze humans and soils files as of the last update in order
to provide candidate data for the system test
Capture water data by instituting the use of new input forms
For the time period beginning at xx/xx/xx and continuing
until system test inception, capture all monitoring data
input
Create sample type information, state information, and site
information for water
Select 1/3 of all candidate sites, providing criteria for
the selection of a reasonable subset of humans, soils and
water site data
Prepare files for use in the systems test (Exhibit II-4)
using selection criteria determined above
Log all system test files
Prepare water data for water subsystem conversion program
Code and punch report requests necessary to test all water,
humans, and soils reports (See Report Documentation Chapter
IV)
Evaluate monitoring data input and report requests to ensure
that specific errors are present. Monitoring data must
correspond to site selection in Step 4 above
Create those errors not present in historical data
Prepare valid and invalid data base update transactions
Review test data isolated in previous steps
For each system test cycle (System Test Plan, Exhibit II-4),
prepare expected results for:
Water Pesticide Report
Water Geographic Report
Site-by-Site Water
Central Reports
Error Reports
Archival Report
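The one-third site selection above should be reproducible, so that the same subset drives both the old-system parallel runs and the new system. One way to make the draw deterministic is a seeded sample; this is a sketch only, and the actual selection criteria are set by the project team:

```python
import random

def select_test_sites(candidate_sites, fraction=1/3, seed=1978):
    """Draw a reproducible subset of candidate sites for the system
    test. Fixing the seed lets the identical subset be regenerated
    for the old-system parallel run."""
    rng = random.Random(seed)
    count = max(1, round(len(candidate_sites) * fraction))
    return sorted(rng.sample(sorted(candidate_sites), count))

# Usage with 30 hypothetical site identifiers:
sites = [f"SITE-{n:03d}" for n in range(1, 31)]
subset = select_test_sites(sites)
print(len(subset))  # 10 of the 30 candidates
```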
Task 3.3 Perform Data Base Test
The objective of the Data Base Test is to ensure the valid
loading, DDL design and DMCL design of the data base. The data
base created can be used during the remaining program and system
test procedures; however, it may be desirable to use a subset of
the data base during program test. The procedure outline is:
Build the Data Dictionary
Load the DDL and the DMCL
Initiate the subschemas for all application programs
Run IDMSCLUC to build the data base protocols
Build the System Test Data Base using the Conversion
subsystem with the System Test Files prepared in Task 3.2
as input
Monitor the control and error reports produced by each run
to ensure that all data has been loaded and validated
Unload the data base to the original files using the data
base unload program
Verify the results by comparing the files used in creating
the data base and those created in the unload process
Troubleshoot the results, if necessary, by checking the DDL
and DMCL for the data base load and the load procedure for
the data base
Distribute printed copies of the program test data base to
all programmers for use in determining their test data
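The unload-and-compare step above is a round-trip check: every record used to build the data base should come back out, and nothing else. A minimal sketch, assuming the original and unloaded files can both be read as comparable record sequences:

```python
def round_trip_check(original_records, unloaded_records):
    """Compare the files used to build the data base against the
    files produced by the unload program; report what was lost in
    the load/unload cycle and what appeared spuriously."""
    original = sorted(original_records)
    unloaded = sorted(unloaded_records)
    missing = [r for r in original if r not in unloaded]  # dropped
    extra = [r for r in unloaded if r not in original]    # spurious
    return missing, extra

# Usage with toy record keys: one record was lost in the round trip.
missing, extra = round_trip_check(["S001", "S002", "S003"],
                                  ["S001", "S003"])
print(missing, extra)
```

If both lists come back empty, the load, DDL, and DMCL have handled every record; any entry in either list points at the load procedure or the data base definitions for troubleshooting.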
II-9
-------
Task 3.4 Perform System Test
The System Test will be run using test files created in Task
3.2. The data base design has been tested in Task 3.3. The primary
purpose of the system test is to:
Ensure proper communication between programs and between
programs and data base
Prove the logic of the system in a controlled environment
Test all unusual conditions
Introduce data in volume
Collect timings on programs
Provide training to personnel in humans, water, soils,
operations, and data entry divisions of OPP.
The System Test output is compared to corresponding output
in the old system except in the case of the water data where
expected results have been prepared in Task 3.2.
In addition to verifying system outputs, the System Test
interfaces the user procedures documented in the User's Manual
as well as the data entry and operations procedures documented
in the Operations Manual. Of particular interest are the
restart/back-up procedures pertaining to the data base.
The timings collected can be used to extrapolate full system
volume statistics. The review of these statistics may result in
the fine tuning or reorganization of the data base. The important
steps in the process are:
Run all cycle tests identified in the System Test Plan
(Exhibit II-4) in the old and new system where possible
For each test, document
input
output
timing statistics
Evaluate test results using expected results developed in
Task 3.2 and results achieved in the old system parallel run
Document all deviations from expected results and refer to
programmer assigned
Use the Operations Manual and data entry procedures during
all system tests
Document problems and/or changes made to procedures.
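The evaluation steps above reduce to a line-level comparison of each new-system report against its expected or old-system counterpart, with deviations logged for referral to the assigned programmer. A sketch using a standard differ; the report and program names are hypothetical:

```python
import difflib

def report_deviations(expected_lines, actual_lines, cycle, program):
    """Return the documented deviations between an expected report
    and the report produced by the new system, tagged with the test
    cycle and program for referral to the assigned programmer."""
    diff = list(difflib.unified_diff(expected_lines, actual_lines,
                                     fromfile=f"expected/{program}",
                                     tofile=f"actual/{program}",
                                     lineterm=""))
    return {"cycle": cycle, "program": program, "deviations": diff}

# Usage: a one-line disagreement between old and new system totals.
entry = report_deviations(["HEADER", "TOTAL 100"],
                          ["HEADER", "TOTAL 101"],
                          cycle=2, program="SOILS-RESIDUE-SUMMARY")
print(len(entry["deviations"]) > 0)  # True: the totals disagree
```

An empty deviation list documents a clean cycle; a non-empty one is the evidence package handed back with the problem report.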
PHASE 4 CONVERSION
Task 4.1 Conversion Preparation
The objective of conversion preparation is to ready the
organization in anticipation of the cutover to the new system.
While the system is undergoing final stages of system test, the
Implementation Schedule of Tasks and Staffing (Exhibit II-2)
should be reviewed for continued validity. Each cutover event
described below must be assigned and timed. The conversion
schedule should be published and distributed to the user groups,
operations, and systems personnel. During this task, the PMS
User's Manual and PMS Operations Manual must be in final form,
reflecting any changes made during system test. All training in
both operation and use of the system must be completed.
Additionally, all physical site preparation should be completed
as follows:
Assure that data entry and report request forms are printed
and distributed to monitoring groups
Confirm the availability of computer time requirements
Prepare HANES data for input to the conversion procedure as
well as other files needed for the Conversion subsystem
Task 4.2 Convert Files
The procedure to convert the files is outlined in the
Conversion subsystem. This procedure has been tested in the Data
Base Test (Task 3.3). The Conversion subsystem focuses on the
creation of the Pesticides Monitoring System Data Base. The
discussion which follows should be reviewed in conjunction with
the Data Base Documentation (Chapter III). There are four major
functions that conversion must accomplish:
Initialize the data base
Create the sample type and other code records
Create residue records
Convert Humans and Soils Master Files.
The initialization of the data base involves the creation
of the data base logical and physical structure through the
building of the Data Dictionary, the loading of the DDL and DMCL,
the initialization of the application program subschemas, and the
building of the data base protocols.
The sample type records contain information about each
sample type as well as the code used in the remaining records in
the data base. Other code records include formulation codes,
application codes, etc. The information is stored LOCATION MODE
DIRECT and is not a member of any data base set.
The next step is to load the entry point records in the data
base. The input required for this is:
Residue code information
State information
Site information.
Residue and state information must be collected and created. The
site information exists in machine readable form for Humans and
Soils, but the Water data must be collected. The residue and
water site information is loaded first. The state and site records
will be built in combination with the last function: the
conversion of Humans and Soils Master Files.
The actual conversion of the old system master files to the
data base is accomplished through an iterative navigation through
the data base structure. The reader should consult the data base
schema found in Chapter III while reviewing the following steps.
(1) Load the next state
(2) Load the next site within that state
(3) Load, using the old master files, the remaining records
on the data base path for that site to include:
Crops (soils only)
Pesticides (soils only)
Soils Site (soils only)
Sample (all)
Patient Sample (humans only)
Analyses Criteria (all)
Residue Information (all)
(4) Repeat (3) for each site loaded in (2)
(5) Restart procedure at (1) when all sites for a particular
state are loaded
(6) Repeat (1) through (5) until all states have
been loaded.
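Steps (1) through (6) amount to a nested traversal: states, then sites within each state, then the record path for each site. The sketch below shows only the control structure; the record loader is a placeholder for the actual conversion code, and the record types follow the list above:

```python
# Record types on the data base path for each site, per the list above.
SOILS_ONLY = ["CROPS", "PESTICIDES", "SOILS-SITE"]
ALL_SYSTEMS = ["SAMPLE", "ANALYSES-CRITERIA", "RESIDUE-INFORMATION"]
HUMANS_ONLY = ["PATIENT-SAMPLE"]  # loaded between Sample and Analyses

def convert_master_files(states, load_record):
    """Drive the conversion: for every state, load each of its sites
    and then the remaining records on that site's data base path."""
    for state in states:                              # step (1)
        load_record("STATE", state["name"])
        for site in state["sites"]:                   # step (2)
            load_record("SITE", site["id"])
            kinds = list(ALL_SYSTEMS)                 # step (3)
            if site["system"] == "soils":
                kinds = SOILS_ONLY + kinds
            elif site["system"] == "humans":
                kinds = kinds[:1] + HUMANS_ONLY + kinds[1:]
            for kind in kinds:
                load_record(kind, site["id"])
        # steps (4)-(6) are the repetitions of the loops above

# Usage: one state with a single humans site, recording load order.
loaded = []
convert_master_files(
    [{"name": "VA", "sites": [{"id": "S1", "system": "humans"}]}],
    lambda kind, key: loaded.append(kind))
print(loaded)
```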
The ordering of the load process affects data base performance;
therefore, the Humans Master File should be converted first, since
it contains the most frequently used data.
The most recent file status is established by processing
all available updates on the old system. The files are frozen
and all monitoring data received after that time is captured on
the new input forms. The conversion subsystem is used for the
creation of the PMS data base. All error and control reports
must be monitored. Errors are investigated and reconciled prior
to cutover (Task 4.3).
Task 4.3 Cutover to New System
The conversion of the files is discussed in Task 4.2. The
actual conversion requires coordination between user groups,
operations, and systems personnel. The completion of conversion
is signaled by the following:
All data is being collected on the new input forms
The data base is operational as are all programs in the
system
The old system is retained for archival purposes only and no
longer has production status.
A post-implementation procedure must be implemented that
records feedback and problem areas in the system. Once the new
system has entered production, the system has attained a
maintenance status and must be monitored accordingly. The data
base operation is then monitored for a post-implementation period
(Task 1.8).