Logistics Management Institute
Quality Assurance Project Plan for the
Interim Central Data Exchange System
EP005T7
September 2001
Don Egan
Kim Harris
Daniel Jackson
Jodi Narel
John Kupiec
The views, opinions, and findings contained in this report are those of LMI and should not be construed
as an official agency position, policy, or decision, unless so designated by other official documentation.
Logistics Management Institute
2000 Corporate Ridge
McLean, VA 22102-7805
Printed on 30% recycled paper.

Contents
Section 1 Introduction
1.1	Background of CDX
1.2	Purpose
1.3	Scope
1.4	Quality Assurance/Quality Control Activities
1.5	Report Organization
Section 2 Quality Overview
2.1	Quality Commitment
2.2	Management
2.3	Professional Training and Development
2.4	CDX Quality Assurance Areas
Section 3 Software Development
3.1	Objective
3.2	Definition
3.3	Quality Measures
3.4	Metrics
3.5	QA Process Description
Section 4 Customer Service and Technical Support
4.1	Objective
4.2	Definition
4.3	Quality Measures
4.4	Metrics
4.5	QA Process Description
Section 5 System Operations
5.1	Objective
5.2	Definition
5.3	Quality Measures
5.4	Metrics
5.5	QA Process Description
Section 6 Security and Risk Management
6.1	Objective
6.2	Definition
6.3	Quality Measures
6.4	Metrics
6.5	QA Process Description
6.6	Risk Management
Section 7 Project Management
7.1	Objective
7.2	Definition
7.3	Quality Measures
7.4	Metrics
7.5	QA Process Description
7.6	Non-Software Document Review
7.7	Supplier and Subcontractor Controls
7.8	Quality Records
Appendix A Software Requirements Checklist
Appendix B Software Design Review Checklist
Appendix C Release Notes Template
Appendix D Bill of Materials Template
Appendix E Project Planning Checklist
Appendix F References
Appendix G Tools and Techniques
Appendix H Configuration Management
Appendix I Data Quality
Appendix J Abbreviations, Acronyms, and Definitions
Figures
Figure 2-1. EPA CDX Task Organization Chart
Figure 3-1. CDX Development Phase Activities
Figure 3-2. System Change Request Form
Figure 3-3. System Change Request Blank Form
Figure 3-4. Product Tracking Link
Figure 3-5. Product Detail Screen
Figure 3-6. Product Review Summary Form
Figure 4-1. Customer Activity Tickets and SCR Process Flow
Figure 5-1. Details of CDX Functions and Data Flows
Figure 5-2. CDX System Components
Tables
Table 2-1. Quality Assurance Responsibilities
Table 3-1. Software Development and Maintenance Quality Measure and Performance Metrics—Identify and Track Trends in the Software Change Request Process
Table 3-2. Software Development and Maintenance Quality Measure and Performance Metrics—Identify and Track Trends in the Requirements Management Process
Table 3-3. Software Development and Maintenance Quality Measure and Performance Metrics—Identify and Track Code Modification Trends
Table 4-1. Customer Service Quality Measure and Performance Metrics
Table 5-1. System Operations Measures and Metrics
Table 5-2. Archive Administration
Table 6-1. System Security Metrics
Table 7-1. Quality Measure and Performance Metrics
Section 1
Introduction
1.1 Background of CDX
The Environmental Protection Agency (EPA) is responsible for collecting,
verifying, and monitoring compliance reports from stakeholders and making them
available to the public. The federal government requires that stakeholders convey
their interactions with the environment to the EPA through compliance reporting.
Historically, stakeholders have met this requirement by periodically submitting
paper reports to EPA systems, which have multiple independent databases. The
databases are diverse and generally do not conform to common data definitions
for like data elements.
Recent mandates of the Government Paperwork Elimination Act of 1998 (GPEA)
and the EPA's Reinventing Environmental Information initiative require that the
EPA provide convenient electronic options for stakeholders to submit these reports.
These mandates stipulate that the EPA must make progress in implementing
electronic business, taking into consideration the following factors:
~	Reduction of reporting burden
~	Data integration
~	Establishment of consistent procedures for electronic signatures
~	Decreased public access time
~	Improvement in data quality.
The Central Data Exchange (CDX) is a system that facilitates electronic data
exchanges for EPA stakeholders and is a key component of EPA's strategy for
addressing these mandates. As a single receiving point for all reports, the CDX
ensures a baseline for standardization and compatibility of incoming data. In
addition, the CDX provides electronic forms that are pre-filled (or pre-populated)
with data that do not change or change infrequently (e.g., permit number or
address)—thereby reducing the stakeholder's "burden" of filling in redundant in-
formation. The CDX also can allow a smooth transition to integrated compliance
reporting and an integrated EPA database.
1.2	Purpose
This CDX Quality Assurance Plan (QAP) describes the quality assurance stan-
dards, guidelines, procedures, and activities used to support the development and
enhancement of the EPA's CDX system and EPA applications developed and
hosted at LMI. This plan outlines current and future quality assurance activities
for the CDX system. This QAP does not address the details of any specific CDX
program.
1.3	Scope
This CDX Quality Assurance Plan addresses the methods and processes LMI uses
to support the design and development of the CDX system. This plan proceeds
from the following assumptions:
~	Quality is the concern of every individual working in support of EPA.
~	A team effort with EPA promotes cooperative actions and common goals.
~	Quality improvement rests on the principle that every individual can
make improvements over time.
~	Aspiring to excellence promotes better products and services for the EPA,
its partners, the regulated community, and the public. It also provides per-
sonal rewards and challenges of achievement for each individual working
in support of this effort.
While this plan is based upon standard quality assurance practices, it is a dynamic
document that must respond to particular demands. Therefore, as the development
and operation of the CDX system continues, additions, deletions, and modifica-
tions will occur.
1.4	Quality Assurance/Quality Control Activities
This CDX QAP includes a number of standards, guidelines, procedures, and
activities derived from International Organization for Standardization (ISO) stan-
dard procedures. The quality assurance/quality control (QA/QC) activities for the
CDX team include the following:
~	Metrics collection
~	Tools and techniques
~	Reviews and quality checks.
While the CDX team strives to maintain a standard set of quality assurance prac-
tices for all of its projects, certain practices may require tailoring for a particular
project. Thus, while the standards, guidelines, procedures, and activities described
in this document generally apply, the scope, depth, and formality of some elements,
and the details of some procedures and activities, may be project-specific.
To support these standards, subordinate procedures and documents may be
developed in accordance with the CDX team standards as project requirements
and circumstances dictate.
1.5	Report Organization
This Quality Assurance Plan contains 7 sections and 10 appendixes and is organ-
ized as follows:
~	Section 1, Introduction—This section includes the purpose, scope, back-
ground, and overview of this plan.
~	Section 2, Quality Overview—This section describes the quality objectives
for the quality assurance program. Section 2 describes management and
quality assurance responsibilities, professional training and development,
and the CDX quality assurance areas.
~	Section 3, Software Development—This section describes the develop-
ment activities for the CDX system. It also describes the components of
the software development used for the CDX system, as well as the stan-
dards to measure against and the metrics to evaluate by.
~	Section 4, Customer Service and Technical Support—This section
describes the customer support process for the project. It also describes
standards, metrics, customer activity tickets, and the SCR process.
~	Section 5, System Operations—This section describes the systems opera-
tion activities for the CDX system. Section 5 also describes the compo-
nents, standards, and metrics for systems operation.
~	Section 6, Security and Risk Management—This section describes the
security for the CDX system. It also describes how risk is managed for the
project and lists the major components of the risk management approach.
~	Section 7, Project Management—This section describes the project man-
agement activities for CDX. This section also describes the supplier and
subcontractor controls and the QA records kept for the project.
~	Appendix A, Software Requirements Checklist—This appendix describes
the elements reviewed for a requirements review.
~	Appendix B, Software Design Review Checklist—This appendix describes
the elements reviewed for a software design review.
~	Appendix C, Release Notes Template—This appendix describes the
contents and format for the Release Notes.
~	Appendix D, Bill of Materials Template—This appendix describes the
contents for the Bill of Materials.
~	Appendix E, Project Planning Checklist—This appendix describes the
elements that must be reviewed for a project plan.
~	Appendix F, References—This appendix lists the materials used to
produce the QAP and documents that are of specific relevance.
~	Appendix G, Tools and Techniques—This appendix describes the tools
and techniques used in the development and management of the CDX
system.
~	Appendix H, Configuration Management—This appendix details the
configuration management process for the project. This appendix also
includes the directory structure for the different environments.
~	Appendix I, Data Quality—This appendix addresses the relationship of
data quality and CDX.
~	Appendix J, Abbreviations, Acronyms, and Definitions—This appendix
lists and describes the abbreviations, acronyms, and definitions used
within this QAP.
Section 2
Quality Overview
The mission of the CDX quality assurance program is to support continuous
process improvement and the development of high-quality deliverables that are
timely and cost-effective. This is accomplished by establishing software development
life-cycle (SDLC) conventions, procedures, standards, and processes that promote
the development of quality work products, and by providing training, support, and
assistance to project staff in the effective use of quality-related tools.
The objectives of the quality assurance program for CDX are as follows:
~	Institute processes that result in the early detection of defects or problems.
~	Implement formal and informal review and audit processes.
~	Conduct scheduled reviews and audits.
~	Ensure work is performed in accordance with contract requirements.
~	Enforce ISO 9000 standards and procedures for software development.
~	Implement metrics, where appropriate.
~	Encourage project staff to obtain proper training for developing effective
work processes and producing quality work products.
2.1 Quality Commitment
The CDX team is committed to strict adherence to practices that produce quality
products. The CDX team achieves this by documenting selected activities in
accordance with ISO guidelines; abiding by ISO development, system operation,
and project management practices related to standards, methods, and procedures;
passing reviews, quality checks, and audits; and performing testing and evaluation
activities for all CDX products.
The accomplishment of these activities is measured in several ways. The most
obvious metrics show whether the project or specific task is completed on time
and within budget and is accepted by the customer. Additional metrics include quantitative
information related to specified development, operations, customer service, and
project management of CDX activities. The CDX team members responsible for
evaluating and measuring the success of CDX are project leaders, program manag-
ers, program directors, and—ultimately—EPA. Specific quality measures, metrics,
and responsible officials for these activities are described in Sections 3-7.
2-1

-------
2.2 Management
The Program and System Life-Cycle Support and Systems Development and Technical
Support business units provide overall project leadership and direction. The systems devel-
opment and technical support staff conduct application development and manage
system operation. The CDX QA staff provide support and QA/QC oversight to all
levels of project staff.
2.2.1 Organization
Figure 2-1 shows the CDX organization.
Figure 2-1. EPA CDX Task Organization Chart
[Organization chart. Task areas: Air Program; Support Services; Water and Water Compliance; Other; CDX. Component boxes include UCMR, STORET, NEI, SDWIS, RCRA, OMR, TRI, IDEF, TSCA, Transborder, AQS, Radionuclide, User Support, Registration and FRS, Documentation, State Nodes, Security, XML, and Infrastructure.]
2.2.2	Responsibilities and Tasks
Quality assurance responsibilities of the CDX team are based upon three impor-
tant premises:
~	Project requirements are the foundation from which software quality is
measured.
~	Specified standards define a set of development criteria that guide how
software is engineered.
~	The primary intention of producing a quality product is to ensure that it is
reliable and conforms to standards.
2.2.3	Quality Assurance Responsibilities
The CDX staff fill a variety of positions, each providing a specific role in the
management, development, process or product review, and support, as shown in
Table 2-1.
Table 2-1. Quality Assurance Responsibilities

Program Director for Program and System Life-Cycle Support business unit (PD-PSLSCS)
•	Sets the strategies and goals for the group
•	Ensures high-quality work
•	Ensures completion of projects on time and within budgets
•	Makes efficient, effective use of staff and oversees other resources
•	Briefs LMI management and clients on project

Program Director for Systems Development and Technical Support business unit (PD-SDTS)
•	Sets the strategies and goals for the group
•	Ensures high-quality work, procedures, and standards
•	Ensures completion of projects on time and within budgets
•	Prepares proposals
•	Makes efficient, effective use of staff and other resources

EPA Program Manager (PM-EPA)
•	Ensures complete project plans (e.g., approach, analysis, data collection, reporting, budget, schedule, team assignments, and quality controls)
•	Coordinates and directs the effort of research staff
•	Briefs LMI management and clients on project
•	Serves as primary point of contact for assigned project
•	Prepares proposals
•	Coordinates completion of project reports and other deliverables

Functional/technical leaders
•	Manage schedule, budget, and deliverables
•	Provide subject area knowledge
•	Provide guidance and direction to developers
•	Obtain requirements from the client
•	Prepare project-related documents
•	Attend the client status meetings
•	Track and monitor problems and resolutions
•	Provide status reports to EPA program managers
•	Prioritize system change requests
•	Prepare project plans
•	Prepare project reports and deliverables

EPA Program Technology Manager (PM-Technology)
•	Contributes to team building
•	Coordinates and directs the effort of technical research staff and subcontractors
•	Briefs LMI management and clients on project
•	Prepares technical portions of proposals
•	Prepares project reports or other deliverables

CDX User-Support Leader
•	Provides customer support
•	Sets team standards
•	Tracks and monitors problems and resolutions
•	Provides status reports to the EPA program technical managers and functional/technical leaders

CDX Architecture Manager
•	Provides oversight and direction for the development and maintenance of the CDX
•	Ensures requested changes are documented and implemented
•	Performs coordination across CDX projects using best practices
•	Reviews and evaluates technology

CDX Configuration Manager
•	Provides an environment for directory structure for all versions of software
•	Maintains bookkeeping to record in which machines the different software versions (development, test, and production) reside
•	Manages hardware and telecom communications
•	Maintains hardware logs
•	Maintains security logs

CDX Database Manager
•	Provides the development and maintenance of the database structure and data
•	Oversees security
•	Establishes standards and procedures

Quality Assurance Lead
•	Ensures high-quality work
•	Conducts and documents quality reviews
•	Conducts quality checks
•	Provides QA status report
•	Tracks and monitors SCRs
•	Tracks and monitors problems and resolutions
2.2.4 Tasks
QA involves systematic activities that provide evidence of the fitness-for-use of
the total product. The CDX team achieves this through an assessment of the speci-
fications and by following specific procedures.
The QA staff supports the CDX team by focusing on four major areas:
1.	Quality assurance determines whether the software or software process con-
forms to established standards and identifies software or software processes
that do not conform to standards. To determine conformity, the QA staff per-
forms a series of checks against project deliverables, including verifying spe-
cific checklists against the actual artifacts. These checklists provide a baseline
for the standards to which deliverables must adhere.
2.	Verification and validation identifies any oversights or deviations from customer
requirements and their predecessors. The QA staff identifies deviations by using
a series of checklists to assess the soundness of project deliverables against the
requirements.
3.	Test and evaluation checks for shortfalls in the requirements and design
documents by exercising the coded form of the software and identifies any
resulting deficits in deliverables. The QA staff determines the quality of the
code by ensuring the customer requirements have been addressed.
4.	Configuration management addresses the need to make changes visible and
traceable, and supplies a formal way to control those changes. The need for
change arises primarily from the application of the other three task areas. The
QA staff ensures a baseline is established and deployed to the production en-
vironment. They then provide a software migration checklist to the technical
lead to ensure deployment of the correct version of the software.
2.3 Professional Training and Development
All members of the QA staff have a background in quality assurance, testing, and
training and an understanding of the software development processes. Training
and professional development activities play an important role in helping the pro-
ject staff use the current software development tools and techniques. Staff mem-
bers are encouraged to attend professional training to remain current on the latest
methods, procedures, and tools of the industry.
The CDX team is dedicated to ensuring employees receive the training necessary
to perform their duties. Training is an integral part of ensuring quality of the
products produced. For example, CDX team members attended GENTRAN,
XML, and Rational training courses in the previous year because they were rele-
vant to specific tasks.
2.4 CDX Quality Assurance Areas
Quality assurance is measured across five major areas of CDX:
~	Software development
~	Customer service and technical support
~	System operations
~	Security and risk management
~	Project management.
The following sections discuss the objectives, definitions, quality measures,
metrics, and process descriptions for each of these areas.
Section 3
Software Development
3.1	Objective
The objective of QA/QC is to ensure the CDX products produced and delivered in
the development phase are in accordance with the SDLC model and are of high
quality. QA activities for this phase include product reviews and process
recommendations.
3.2	Definition
CDX development focuses on the requirements gathering, analysis, and design of
CDX data collection software and systems, file formats, and implementation
guidance.
3.3 Quality Measures
Quality measures indicate how progress toward a project's goals and objectives is
captured. They also help focus project efforts on achieving priority goals and
objectives. CDX quality measures are the basis from which quality metrics are
produced.
The CDX team uses proven development practices to produce quality software
applications for the CDX system. These development practices are in accordance
with ISO 9000 standards and recommendations.
During the software development phase, the CDX team applies quality measures
in the following areas:
~	Documentation
~	Deliverables
~	Standards and guidelines
~	Reviews, quality checks, and audits
~	Test and evaluation
~	Problem resolution and corrective action.
3.4 Metrics
Metrics provide a quantifiable basis for evaluating the quality of software and
other project deliverables. Qualitative judgments are more frequently used during
the early SDLC phases, while quantitative metrics are used during the latter
phases of development. In its implementation of ISO 9000 standards, the CDX
team incorporates metrics into both its general project management practices and
its software development processes.
The CDX team works closely with functional and technical project leaders to
complete software development tasks. These tasks generate a number of metrics,
as shown in Tables 3-1 through 3-3.
Table 3-1. Software Development and Maintenance Quality Measure and Performance
Metrics—Identify and Track Trends in the Software Change Request Process

Performance metric: Number of SCRs identifying critical deficiencies
Goal: Number of SCRs identifying critical deficiencies is less than or equal to 5 percent of the total number of SCRs
Indicator: Number of SCRs identifying critical deficiencies is greater than or equal to 10 percent of the total number of SCRs

Performance metric: Total number of SCRs (excluding enhancement requests)
Goal: Total number of SCRs (excluding enhancement requests) is less than or equal to 50 percent of the total number of requirements
Indicator: Total number of SCRs (excluding enhancement requests) is greater than or equal to 65 percent of the total number of requirements

Performance metric: Number of SCRs identifying critical deficiencies in pre-deployment versus post-deployment phases of project
Goal: Number of SCRs identifying critical deficiencies in the post-deployment phase of the project is less than or equal to 25 percent of the number of SCRs identifying critical deficiencies in the pre-deployment phases of the project
Indicator: Number of SCRs identifying critical deficiencies in the post-deployment phase of the project is greater than or equal to 40 percent of the number of SCRs identifying critical deficiencies in the pre-deployment phases of the project

Performance metric: Number of SCRs (excluding enhancement requests) that originated internally versus number of SCRs that originated externally
Goal: Number of SCRs originated internally (by either LMI or OEI) should represent 75 percent or more of the total number of SCRs (excluding enhancement requests)
Indicator: Number of SCRs originated internally (by either LMI or OEI) represents 65 percent or less of the total number of SCRs (excluding enhancement requests)

Performance metric: Number of SCRs (excluding enhancement requests) for front end, middle area, and back end of the system
Goal: Percentage of total SCRs should be equal (33 percent) for each area: front end, middle area, and back end
Indicator: Percentage of total SCRs for any one area (front, middle, back) exceeds that of the other areas by 25 percent or more
Table 3-2. Software Development and Maintenance Quality Measure and Performance
Metrics—Identify and Track Trends in the Requirements Management Process

Performance metrics: Patterns of change in requirements, including the cumulative number of requirement modifications and the number of requirement modifications by phase of SDLC

Goals:
•	Track trends, timing, and patterns of requirement modifications
•	Shift the distribution of requirement modifications to earlier phases of the SDLC
•	Minimize the number of requirement modifications
•	Track trends in code rewrites

Indicators:
•	Cumulative number of requirement modifications exceeds the number of original requirements
•	Number of requirement modifications for any of the later SDLC phases (testing, deployment, operations) exceeds the mean number of requirement modifications per phase by more than 1 standard deviation
•	Cumulative number of requirement modifications for later SDLC phases exceeds the cumulative number of requirement modifications for the early SDLC phases (business case, requirements, development)
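
The one-standard-deviation indicator in Table 3-2 can be computed directly from per-phase modification counts. The following Python sketch illustrates the arithmetic; the phase names and counts are hypothetical, and the script is not part of the documented CDX tooling.

from statistics import mean, stdev

# Hypothetical per-phase requirement-modification counts; illustrative only.
mods_by_phase = {
    "Business Case": 4,
    "Requirements": 9,
    "Development": 6,
    "Testing": 14,
    "Deployment": 3,
    "Operations": 2,
}

avg = mean(mods_by_phase.values())
sd = stdev(mods_by_phase.values())

# The Table 3-2 indicator: a later-SDLC phase whose count exceeds the
# mean across all phases by more than one standard deviation.
for phase in ("Testing", "Deployment", "Operations"):
    count = mods_by_phase[phase]
    if count > avg + sd:
        print(f"Indicator tripped: {phase} has {count} modifications "
              f"(mean {avg:.1f}, standard deviation {sd:.1f})")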
Table 3-3. Software Development and Maintenance Quality Measure and Performance
Metrics—Identify and Track Code Modification Trends

Performance metric: Number of units of code significantly modified as a result of peer review
Goal: 10 percent or less of the total number of units of code
Indicator: Greater than or equal to 25 percent of the total number of units of code

Performance metric: Number of units of code significantly modified as a result of SCRs
Goal: 10 percent or less of the total number of units of code
Indicator: Greater than or equal to 25 percent of the total number of units of code
The CDX Quality Assurance Lead is responsible for compiling information re-
lated to the metrics in this and following sections and for providing a monthly QA
status report to EPA. Any anomalies that occur during the reporting period are
also described in the status report.
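
To illustrate how such a threshold check might be automated, the short Python sketch below evaluates the first Table 3-1 metric against its goal and indicator bounds. The counts are hypothetical; only the 5 and 10 percent bounds come from the table.

# Hypothetical SCR counts; the 5/10 percent bounds are from Table 3-1.
total_scrs = 120      # total SCRs, excluding enhancement requests
critical_scrs = 9     # SCRs identifying critical deficiencies

critical_pct = 100.0 * critical_scrs / total_scrs
if critical_pct <= 5.0:
    print(f"Goal met: {critical_pct:.1f}% of SCRs identify critical deficiencies")
elif critical_pct >= 10.0:
    print(f"Indicator tripped: {critical_pct:.1f}% of SCRs identify critical deficiencies")
else:
    print(f"Between goal and indicator: {critical_pct:.1f}%")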
3.5	QA Process Description
The QA/QC activities for the development phase ensure the development staff
follows standardized software development processes for the CDX system. The
QA/QC process includes the following activities:
~	Document and track requirements.
~	Conduct and document design reviews.
~	Conduct and document product peer reviews.
Accomplishment of these activities is measured in several ways. The most appar-
ent metrics show if a project is completed on time and within budget according to
the latest available requirements, specifications, schedule, and budget. The ulti-
mate standard is whether the project deliverables are accepted by the customer.
This information is provided by project leaders via Microsoft Project Gantt
charts and is described in Section 7.
Other methods used by the CDX team to gauge success include manual review of
Gantt chart status reports, documents, and deliverables by the QA Lead, program
managers, and program directors, as well as internal "lessons learned"
sessions with CDX team members and external sessions with EPA.
Figure 3-1 displays the process flow for activities occurring during the develop-
ment phase.
Figure 3-1. CDX Development Phase Activities

Project Planning
•	New work survey
•	Quality review assessment
•	Product tracking form
•	Project plan
•	Customer acceptance

Business Analysis and Requirements
•	Conduct stakeholder meetings
•	Collect requirements
•	Analyze information
•	Develop system requirements specification document
•	Deliver system requirements specification
•	Customer acceptance
•	Prioritize requirements with customer

Development
•	Design data model
•	Develop design model
•	System design document
•	Design user interface
•	Create database
•	Code and unit test
•	Testing and validation
•	Customer acceptance
3.5.1	Documentation Process
Documentation produced for CDX projects passes through several review proc-
esses to ensure it meets CDX project requirements and quality standards. QA staff
review CDX documentation to verify it meets project requirements and SDLC
standards. The technical consultant's editorial staff review and edit project reports
to ensure they meet ISO 9000 standards for documentation.
The quality assurance staff review project documents both on a set schedule and
as needed throughout the SDLC. The results of the documentation reviews are
captured on the Software Product Review Checklist. One QA person serves as
coordinator for the review, documenting any notes as attachments to the checklist.
This person also coordinates completion of the checklist. The results are tracked
in the ISO information management system on the product review summary (see
Figure 3-6).
3.5.2	Deliverables
The QA staff review deliverables produced by other project staff. These may
include the following:
~	The Requirements Document—using a software requirements checklist to
determine if the requirements are testable, verifiable, consistent, complete,
unambiguous, and traceable (see Appendix A).
~	The Software Design Document—utilizing the Software Design Review
Checklist to ensure the design is congruent with the requirements
document (see Appendix B).
~	User support documentation (e.g., tutorials, instruction guides, and user
guides)—ensuring they are representative of the system to be delivered.
This review requires the QA staff employ the support documentation to
use the system, then report any discrepancies.
~	Release notes—utilizing the Release Notes Template (see Appendix C).
~	Bill of Materials against the Bill of Materials Template—The bill of
materials is the list of all hardware, software, and other assets used on the
project. QA/QC verifies whether the bill of materials includes an
inventory of materials, software contents, changes, installation
instructions, and known errors and problematic features (see Appendix D).
~	Software product tested against CDX requirements—using a Test Plan.
The practice of using these various checklists was instituted in May 2001; less
formal methods of QA were used before that date (see Appendixes A and B).
QA procedures extend beyond the reviews conducted by the QA staff. All reports
produced for the EPA CDX system are routed through a formal editorial review to
ensure the proper use of grammar, templates, and graphics.
All deliverables are reviewed and signed by
~	the program director for the Program and System Life-Cycle Support
business unit, and
~	LMI's EPA program manager.
For some technical documentation, the documents are also reviewed by
~	the program director for the Systems Development and Technical Support
business unit, and
~	LMI's EPA program technical manager.
3.5.3 Standards and Guidelines
CDX software development adheres to standards and guidelines derived from
ISO 9000 operating procedures for software development. These guidelines are
implemented in two ways:
~	Through the implementation of LMI's standards, methods, and
procedures, which are certified as ISO 9000 compliant.
~	Through the use of the Rational software development suite of tools,
which are ISO 9000 compliant.
The ISO 9000 software development standards and the Rational software devel-
opment suite complement one another. Our standards and procedures are followed
by project personnel. Additionally, the Rational suite provides the tools that im-
plement and automate software development.
The quality assurance staff uses a number of guidelines for creating and imple-
menting QA standards, procedures, and tools. These guidelines, derived from
SDLC best practices, may include
~	maintainability,
~	interoperability,
~	testability,
~	reliability,
~	flexibility,
~	correctness,
~	integrity,
~	usability,
~	reusability,
~	efficiency, and
~	portability.
These guidelines form the basis for the checklists used by QA staff to review
project documents and deliverables.
For the management of software development, the CDX team uses Rational
ClearQuest™, an ISO 9000-compliant system change request (SCR)-tracking
tool, to enter and track all SCRs. Using this tool, the CDX team summarizes and
reports on a variety of SCR data, which also help identify different trends.
SCR data available for summary and analysis may include a subset of the
following characteristics:
~	Cumulative number of SCRs opened on the project
~	Percentage of SCRs that have been closed
~	Percentage of SCRs that remain open
~	Date each SCR was opened
~	Severity of each SCR
~	Status of each SCR
~	Average age of open SCRs
~	Number of SCRs listed by severity, status, or keyword term.
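
As a concrete illustration of two of these summaries (percentage of SCRs closed and average age of open SCRs), the following Python sketch operates on a few hypothetical SCR records; the field names and dates are invented and do not reflect the ClearQuest schema.

from datetime import date

# Hypothetical SCR records; fields are invented for illustration.
scrs = [
    {"id": "SCR-001", "opened": date(2001, 6, 4), "status": "open"},
    {"id": "SCR-002", "opened": date(2001, 7, 16), "status": "closed"},
    {"id": "SCR-003", "opened": date(2001, 8, 1), "status": "open"},
]

today = date(2001, 9, 1)
open_scrs = [s for s in scrs if s["status"] == "open"]

pct_closed = 100.0 * (len(scrs) - len(open_scrs)) / len(scrs)
avg_age = sum((today - s["opened"]).days for s in open_scrs) / len(open_scrs)
print(f"{pct_closed:.0f}% of SCRs closed; "
      f"average age of open SCRs: {avg_age:.0f} days")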
The CDX team is currently researching the compatibility of Rational ClearQuest
and the McAfee SQL™ tool. If these products are compatible, an interface will
be developed, allowing the two tools to interact. This solution will potentially
consolidate the SCR tools and allow more efficient use and tracking of SCRs.
Figure 3-2 and Figure 3-3 show sample SCR forms.
Figure 3-2. System Change Request Form
[Screen capture of a completed system change request in the SCR-tracking tool.]

Figure 3-3. System Change Request Blank Form
[Screen capture of a blank system change request entry form.]
De;cription
3.5.4 Reviews, Quality Checks, and Audits
As part of the ISO 9000 processes, CDX employs several quality planning and
review requirements for the design and development of its software products.
Although project leads and reviewers are responsible for completing the processes
described below, the QA Lead is responsible for monitoring and documenting
overall adherence to QA processes related to reviews, quality checks, and audits.
3.5.4.1 Reviews
To determine the level of technical review for the CDX project, the project leader
or QA staff completes the Quality Review Assessment (QRA) form during project
initiation. The QRA solicits project identification as well as information intended
to match the project with appropriate quality assurance tools and procedures.
The form is submitted to the program director (PD) or the program manager (PM)
for review and approval.
The reviews are tracked and documented in the ISO 9000 Information Manage-
ment System using the following link and forms:
~ Product Tracking Link. Project leaders select this link to identify products
that require a technical quality review. Project leaders assign a reviewer
for each product. The Information Management System automatically
notifies the assigned reviewer of the product(s) they are responsible for
reviewing. This link also shows the schedule for review completion (see
Figure 3-4).
Figure 3-4. Product Tracking Link
[Screen capture of the product tracking link in the ISO Information Management System.]
~ Product Detail Screen. This form provides information about the product:
product title, product type, quality reviewer, and an estimated review due
date (see Figure 3-5).
Figure 3-5. Product Detail Screen
[Screen capture of the product detail screen.]
~ Product Review Summary Form. This form is a summary of the technical
quality review and actions taken based on the reviewer's recommendations.
Figure 3-6 below represents a product review summary form.
Figure 3-6. Product Review Summary Form

[Screen capture of a product review summary form.]
The QA reviewer or designated quality reviewer ensures the forms are complete
and available within the ISO 9000 Information Management System for the
product under review.
3.5.4.2	Quality Checks
Procedural quality checks by the QA staff ensure processes are followed. Proce-
dural quality checks and evaluations also provide quality measurements of CDX
system development, maintenance, and support processes. The schedule of deliv-
erables determines the quality check schedule. The QA staff will perform a qual-
ity check after each major deliverable. Problem reporting and corrective action for
the quality checks are reported to the technical lead and documented. The team is
then given the opportunity to correct any non-conformities, and a follow-up
quality check is scheduled.
3.5.4.3	Formal Audits
As part of the ISO 9000 recertification, formal audits of the processes are
required. These audits are performed by an external group.
3.5.5 Test and Evaluation
Designated project personnel serve as reviewers in the test and evaluation process.
Reviewers perform various testing activities, including the development of test
plans, procedures, and scripts; execution of the tests; and sign-off on the
software and documentation.
The test execution for CDX is both internal and external:
~	Internal testing—Internal testing is conducted by the CDX staff. This
testing includes unit, functional, integration, regression, and system
testing. Internal testers include any CDX team member or a QA/QC
member. Typically, a test plan and test scripts are developed and executed
for this process. The test scripts usually include a checklist for
documenting success or failure of specific requirements or functionality.
This documentation is then included in the quality records associated with
the CDX system. The QA/QC team ensures the software is functional and
meets the requirements.
~	External testing—External testing is conducted by the EPA and any entity
external to EPA, which may include a state or a facility. This testing
assures the client the software is acceptable and meets their requirements.
Issues or problems that arise from external testing are typically reported to
the project leader as well as the Customer Support team. Project leaders
document these test results, which may also be entered into the SCR
system, if warranted. The Customer Support team enters customer calls
into the McAfee database. A more detailed description of the customer
support process is described in Section 4.
There are four levels of testing performed against the CDX system:
~	Unit testing verifies the smallest piece of a program (module) to determine
the actual structure is correct and if the function of the code operates
correctly and reliably. The CDX application developer(s) performs this
test.
~	Integration testing evaluates a group of units (modules) that have been
implemented together. The CDX application developer(s) performs this
test.
~	System testing verifies the product by testing the application in the
integrated system environment. The purpose of system testing is to ensure
the CDX functional requirements are satisfied by the system. The CDX
application developer(s) performs this test.
~	Acceptance testing verifies the application is fit for deployment. This may
include verifying that the application is reliable, meets the requirements
for business, performs well, and has a consistent look and feel. This
testing is performed by the CDX application developer(s). At this point,
the functionality of the application is typically demonstrated at the CDX
program management level.
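
As a small illustration of the unit-testing level described above, the sketch below exercises a hypothetical validation routine of the kind a CDX module might contain. The function, its length rule, and the test cases are invented for the example and are not taken from the CDX code base.

import unittest

def valid_facility_name(name: str, max_length: int = 40) -> bool:
    """Accept a non-empty facility name of at most max_length characters."""
    return 0 < len(name.strip()) <= max_length

class ValidFacilityNameTest(unittest.TestCase):
    def test_accepts_typical_name(self):
        self.assertTrue(valid_facility_name("Springfield Water Works"))

    def test_rejects_blank_name(self):
        self.assertFalse(valid_facility_name("   "))

    def test_rejects_overlong_name(self):
        self.assertFalse(valid_facility_name("x" * 41))

if __name__ == "__main__":
    unittest.main()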
The CDX program testing activities are both internal and external (by EPA and
others). The testing efforts may include the following types of testing:
~	Performance testing verifies that all software modules, both individually
and collectively, meet specified performance objectives, including
maximum load and throughput.
~	Reliability testing verifies that all of the possible operations in the
application work without causing the software or the system to hang or
crash.
~	Business-function testing verifies that the crucial business functions are
working in the application. The business rules are documented in a
requirements document.
~	User-interface testing verifies that the application under test is consistent
throughout and meets the objectives of the user-interface design
specification.
~	Installation testing verifies that the application can be installed correctly.
~	Configuration testing is performed using specific hardware and software
combinations.
~	Documentation testing verifies the accuracy and completeness of user
documentation.
~	Regression testing is the re-execution of tests after a fix, change, or
enhancement has been made to the code and a new build has been
delivered to QA. Regression testing verifies that previously identified
problems have been fixed and changes to another part of the application
have not introduced new problems.
The requirements and design documents are used to develop test plans, test pro-
cedures, and test scripts for evaluation. Interim and post-test documentation typi-
cally includes one or more of the following: status reports, problem reports, test
summary reports, and recommendations.
Project leaders and the QA staff maintain testing results records.
3.5.6 Problem Resolution and Corrective Action
This EPA QAP incorporates review, testing, and documentation into all applica-
ble phases of software development. These measures ensure problems can be
identified, analyzed, and addressed quickly.
The following elements are at the core of our problem-resolution processes:
~	Process and code reviews, as appropriate, within the project.
~	Software test plan and test cases, as appropriate, within the
project.
~	Software change requests and their corresponding management system.
~	Customer support and the corresponding trouble ticketing system.
Code reviews and software tests ensure anomalies are detected and identified ex-
peditiously. Customer support incidents may also identify system anomalies. The
SCR management system and the customer support trouble-ticketing system en-
sure problems are tracked, resolved, and analyzed for trend identification.
The SCR management process handles all hardware and software-related requests
and problems. Input to the process may come from code reviews, software tests, or
customer service incidents. Generally, SCRs originate from one of the following:
~	Software testers and associated CDX team members, who identify
problems and enhancements that merit SCRs
~	Customer support personnel, who qualify customer incident reports and
report system problems that have not already been identified; these
problems may merit SCRs
~	Technical personnel, who may identify software problems or
enhancements that merit SCRs.
The resolution of SCRs is a cooperative process among team members. The man-
agement of SCRs is the joint responsibility of the project's technical leader and
functional leader. These two team leaders designate one or more individuals to
enter and track SCRs and to produce and circulate SCR reports. Periodically, the
project technical leader and the project functional leader review all open SCRs,
prioritize SCRs for resolution, assign new SCRs to appropriate team members for
investigation and resolution, and identify SCRs that may require consultation with
the customer. Team members who are assigned SCRs for investigation and reso-
lution report their findings to the SCR trackers.
These change request trackers use the findings to identify SCRs that are provi-
sionally resolved. An SCR is not formally resolved and closed until the provi-
sional resolution can be independently verified by a team member other than the
one initially assigned the SCR.
The QA staff uses Rational's ClearQuest to track all SCRs and generate SCR re-
ports, which can be used to detect trends that may indicate systemic weaknesses
in the project processes. If any such trend is detected, the project team identifies
which elements or project procedures may be responsible for the trend, and
recommends appropriate remedial action to the customer. After instituting appro-
priate corrective actions, the project team assesses the effect of the corrective
actions. If there is no reversal in the trend, corrective actions are modified to
increase their effectiveness.
SCRs are evaluated to determine potential cost or schedule consequences. The
CDX project lead and EPA jointly evaluate enhancements to determine effects on
cost and schedule before implementation.
Section 4
Customer Service and Technical Support
4.1	Objective
The objective of QA/QC is to ensure the CDX customer service and technical
support activities are conducted and documented according to a set of standard-
ized procedures.
4.2	Definition
Customer service and technical support help external customers who require
assistance when using a CDX service, typically when a user cannot connect,
fails to receive an expected confirmation, or encounters a similar problem.
User support also assists with performing CDX operations when documentation
is unclear to the user.
User support does extend to program-specific assistance as it relates to CDX. For
example, CDX assists a user in confirming their TRI submission was processed
and sent to the EPA TRI reporting center. CDX user support would not assist us-
ers with the completion of the TRI form itself. User support also manages the
CDX web page and assists in user registration.
4.3 Quality Measures
The CDX Help Desk adheres to proven customer service practices to ensure
excellent customer service and technical support. These practices are easily trans-
lated to quality measures:
~	Help Desk operations
~	Problem resolution and corrective action
~	Customer evaluation
~	Documentation
~	Performance
~	Testing and validation
~	System operations.
The following sections describe these quality measures in more detail.
4.4 Metrics
The Help Desk staff uses McAfee SQL trouble-ticketing software, an automated
application that logs, classifies, tracks, and reports contact with or requests to the
Help Desk. Metrics maintained through this application are listed in Table 4-1.
Table 4-1. Customer Service Quality Measure and Performance Metrics

Performance metric: Number of McAfee tickets opened (daily, weekly, monthly)
Goal: 95 percent or more of McAfee tickets are closed within one business day of origination
Indicator: 5 percent or more of McAfee tickets remain open for more than one business day

Performance metric: Number of McAfee tickets opened for each CDX application within a reporting period
Goal: Number of McAfee tickets opened for each application represents 5 percent or less of the total number of users registered for that application
Indicator: Number of McAfee tickets opened for each application represents 10 percent or more of the total number of users registered for that application
The help desk currently operates according to the following service levels:
~	"How to" questions about EPA CDX resolved within 4 hours
~	Browser problems resolved within 1 day
~	Software support problems corrected within 1.5 days
~	Isolated on-site hardware problems resolved within 1 day
~	Isolated on-site Internet connectivity problems resolved within 1 day.
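
The Table 4-1 goal that 95 percent of tickets close within one business day lends itself to a simple automated check. The Python sketch below is illustrative only: the ticket data are invented, and a business day is approximated as 24 elapsed hours (weekends and holidays are ignored).

from datetime import datetime, timedelta

ONE_BUSINESS_DAY = timedelta(hours=24)  # simplification for this sketch

# Hypothetical tickets as (opened, closed); closed is None while open.
tickets = [
    (datetime(2001, 9, 4, 9, 0), datetime(2001, 9, 4, 15, 30)),
    (datetime(2001, 9, 4, 10, 0), datetime(2001, 9, 6, 11, 0)),
    (datetime(2001, 9, 5, 8, 15), None),
]

closed_in_time = sum(
    1 for opened, closed in tickets
    if closed is not None and closed - opened <= ONE_BUSINESS_DAY
)
pct = 100.0 * closed_in_time / len(tickets)
print(f"{pct:.0f}% of tickets closed within one business day (goal: 95%)")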
4.5 QA Process Description
The QA/QC activities for the customer service and technical support staff follow
a set of procedures developed for the CDX system. The QA/QC process includes
the following activities:
~	Logging daily CDX activities
~	Documenting problems and issues
~	Notifying users of resolutions to their problems:
>	Communicating promptly with users
>	Resolving problems in a timely manner
~	Notifying users of system availability.
4.5.1	Help Desk Operations
The CDX Help Desk provides customer support through e-mail, fax, and a
toll-free number from 8:00 a.m. to 6:00 p.m. (EDT), 5 days per week, 52 weeks
per year (except for federal holidays). Both the e-mail and voice mail offer 24-
hour-a-day coverage for users to place their call and provide initial documenta-
tion.
4.5.2	Problem Resolution and Corrective Action
Customer support constitutes the initial level of support on all EPA CDX prob-
lems and requests.
Every issue submitted to customer support for resolution results in a unique activ-
ity ticket number, which is issued to the user (users leaving voice mail or sending
e-mail or a fax will be promptly contacted and provided with their activity num-
ber). The activity ticket is assigned to a technical support analyst for action. Users
may refer to the ticket number when inquiring about the status of their support
request.
CDX customer support personnel staffing the help desk use the McAfee SQL
trouble-ticketing application to document, manage, track, and analyze technical
support requests. The McAfee SQL trouble-ticketing application is used to log,
classify, track, and report on all customer contacts to the EPA Help Desk. The
Technical Project Leader responsible for the Help Desk also reviews trouble tick-
ets in the McAfee system to ensure consistency and thorough customer service
problem resolution. This process is documented in Figure 4-1.
If technical support is unable to resolve or is having difficulty resolving a
problem, they escalate it to the technical project leader, software developer,
network engineer, and/or functional project leader. Depending on the severity of
the problem, the technical support team may request initiation of an SCR.

The technical support staff uses an escalation process that allows them to resolve
problems within their team. If the problems are not resolved, they inform their
technical or functional lead.
Figure 4-1. Customer Activity Tickets and SCR Process Flow

[Flowchart; the flow it depicts is summarized below.]
1.	User requests support.
2.	Technical Support Analyst opens activity ticket within McAfee.
3.	Technical Support Analyst assesses the problem and documents and categorizes it accordingly within McAfee.
4.	Technical Support Analyst troubleshoots to find a solution to the problem.
5.	If the problem is resolved, the Technical Support Analyst assists the user in applying the solution, documents the resolution in the McAfee activity ticket, and closes the ticket.
6.	If not, the Technical Support Analyst escalates the problem to the Technical Project Leader (PL) and/or Functional PL by completing Rational ClearQuest System Change Request (SCR) form(s).
7.	The Technical PL and/or Functional PL reviews the SCR.
8.	If immediate changes to the software are possible, the Technical Team resolves the problem; otherwise, the Technical Team provides an intermediate solution and a permanent solution is scheduled for the next software release.
9.	QA/QC retests; unsuccessful tests return to the Technical Team.
10.	Once testing succeeds, the resolution is communicated to the Technical Support Analyst and the Technical PL changes the status of the SCR to closed.
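
The two paths through this flow (resolution by the analyst versus escalation through an SCR) can be sketched in a few lines of Python. The function below is a simplified model for illustration; its names and statuses are invented and do not mirror the McAfee or ClearQuest data models.

def handle_ticket(problem: str, analyst_can_resolve: bool) -> str:
    """Walk one support request through the Figure 4-1 flow."""
    print(f"Activity ticket opened in McAfee for: {problem}")
    if analyst_can_resolve:
        # Direct path: apply the fix, document it, close the ticket.
        print("Analyst applies solution, documents it, closes the ticket.")
        return "closed by analyst"
    # Escalation path: file an SCR and hand off to the project leaders.
    print("Analyst escalates via a ClearQuest SCR; PL reviews it.")
    print("Technical team resolves; QA/QC retests until successful.")
    print("Resolution relayed to the analyst; Technical PL closes the SCR.")
    return "closed via SCR"

handle_ticket("cannot connect to CDX", analyst_can_resolve=True)
handle_ticket("expected confirmation never arrived", analyst_can_resolve=False)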
4.5.3	Additional QA/QC Procedures
The Technical Project Leader tracks and monitors technical support activities.
Additionally, the Technical Project Leader ensures service levels by spot-
checking McAfee tickets to guarantee quality service to customers.
The Technical Project Leader meets with the technical support staff at least
weekly, although meetings typically occur daily. These meetings are opportunities
to discuss the status of open ticket activities, problems, issues or concerns. These
meetings may also stimulate ideas on further enhancements or refinements to
Help Desk operation procedures.
In addition to the weekly meeting, the Help Desk staff and the Technical Project
Leader also attend a weekly meeting with the CDX EPA staff. This is typically a
conference call.
4.5.4	Documentation
The technical support staff document all activities. They document and track
problems daily. Weekly, the technical support staff provide the following docu-
mentation:
~	Weekly CDX Report—This report includes metrics and support calls. The
support calls are classified as (1) How To, (2) Browser, (3) User Account,
(4) Problem with Software, and (5) Problem with Hardware during the
reporting period. The report is distributed to the CDX EPA staff and
functional and technical leads.
~	Weekly TRI Log—This report lists all of the submissions received. This
report is available to the CDX EPA staff, functional and technical leads,
and the Emergency Planning and Community Right-to-Know Act
(EPCRA) Reporting Center.
~	Weekly Registrants List—This report lists all users who registered, per
program. This report is produced on request only and is available to the
CDX EPA staff and functional and technical leads.
4.5.5	Customer Evaluation of Performance
EPA stakeholders are the best gauge for measuring customer and technical sup-
port. Given that CDX is a public system that offers technical support as one of its
features, EPA program offices and the CDX team are the first to learn of issues
related to poor performance. To date, there have been no such reports regarding
the technical support staff. The CDX team is committed to maintaining this level
of service now and in the future.
Currently, there is no formal way to receive feedback from the users unless the
Help Desk staff specifically requests this information during a phone call or it is
4-5

-------
included in e-mail sent to the Help Desk. The CDX team is in the process of
developing an online survey to obtain feedback on EPA Help Desk support. This
evaluation will assist in determining where improvements can be made in the
Help Desk processes.
Additionally, the Technical Project Lead overseeing the Help Desk works very
closely with the Help Desk staff, ensuring performance and activities are accom-
plished according to service-level agreements.
Any of the following techniques may be used to evaluate customer response.
Given EPA and government concerns regarding client contact, none of the
following have been implemented within CDX:
~	Periodic reviews
~	Online surveys
~	Website e-mail
~	Verbal comments from callers
~	Gathering of customer feedback during trouble call
~	Keeping CDX EPA staff and technical leads aware of user comments.
4.5.6 Cross-Team Support
The Help Desk has its own specific responsibilities which were described above
along with its procedures and QA/QC steps conducted to measure performance.
However, the Help Desk itself performs QA/QC functions in support of the CDX
application programming staff.
The Help Desk staff provides testing for CDX applications. Because the Help
Desk team is trained in the use of all CDX applications, the team is well suited to
perform testing activities that are integral to the CDX QA/QC framework. The
Help Desk staff assists technical project staff and QA staff in QA testing and vali-
dation procedures described in Section 3. This provides the added benefit of giv-
ing the Help Desk staff hands-on experience using the applications from both the
technical and user perspective.
The Help Desk staff also performs QA/QC activities related to CDX system
operations. Daily, the Help Desk staff checks applications running on CDX using
the Web interface. Any anomalies discovered during this check are reported to the
CDX lead system engineer. If this discussion with the CDX lead system engineer
establishes the need to initiate a software change request, an SCR is entered
into ClearQuest to track the problem and its resolution.
4-6

-------
Section 5
System Operations
5.1 Objective
The objective of QA/QC is to ensure the quality and operation of CDX are
maintained systematically according to a set of documented procedures.
5.2 Definition
CDX system operations include how the system is managed and maintained on a
daily basis. The core CDX services are the underlying functions used by specific
program area data flows. Figure 5-1 illustrates the 10 core functions.
Figure 5-1. Details of CDX Functions and Data Flows
[Figure reconstructed as text. Data flows in from industry, programs, and states; data flows out to EPA programs, states, and integrated databases. The 10 core functions and their activities are:]
1.0 Maintain Web Site—maintain forms for downloading; maintain accessible compliance data; support user registration.
2.0 Manage Security—issue/maintain IDs and security; authenticate inbound data; operate firewalls, check viruses, and manage other security threats.
3.0 Submit Data—accept data in from industry, programs, and states.
4.0 Receive Data—maintain mail boxes (paper and electronic); apply date/time stamp; conduct initial verification; log transaction; archive data.
5.0 Archive Data—archive electronic and paper copies; maintain logs of electronic transactions; maintain statistics.
6.0 Translate Data—convert formats (includes data entry for paper); acknowledge receipt; conduct QA/QC (syntax checks, completeness).
7.0 Distribute Data—archive distribution copy of data; distribute data to EPA and other stakeholder systems.
8.0 Support Customers—troubleshoot; provide documentation; maintain Help Desk; train; plan and coordinate with program areas.
9.0 Manage CDX Operations—test and evaluate new EC tools; manage and review processes and procedures; manage facility: contracts, equipment, staff.
10.0 Register Users—register/unregister users; link users to programs; validate access.
5-1

-------
5.3 Quality Measures
System operations involve several members of the CDX team. It is the responsi-
bility of the systems engineers and the developers to ensure the CDX system is
operational by monitoring the system's daily activity. The quality measures asso-
ciated with monitoring and operating the CDX system include the following:
~	Operations monitoring and reporting
~	Problem resolution and corrective action
~	Scheduled system maintenance
~	Independent reviews.
5.4 Metrics
The aforementioned measures represent the many dimensions of quality necessary
to determine the effectiveness of the CDX system and its supporting functions.
Because of the complexities of the CDX system, it is not appropriate to tie indi-
vidual metrics to specific quality measures—many interrelated components make
up the whole. Rather, the CDX team ensures quality measures are monitored by
conducting automatic and repeatable tests to demonstrate required properties are
achieved and website behavior and performance meets expectations. Table 5-1
describes the metrics the CDX team uses to validate the quality measures. The
operational definitions of and standards for many of these metrics are defined in
the standard hosting agreement between the CDX team and EPA. The hosting
agreement may contain additional performance standards and metrics, which are
incorporated into this document by reference.
Table 5-1. System Operations Measures and Metrics
(Each entry lists the relevant CDX component, its quality measure, the performance metrics for the standard measurement period, and the goal and warning thresholds.)

1.0 Manage Websites
Quality measure: Track CDX usage patterns.
Metrics: Number of pages; number of changes to pages requested by program areas or users due to (a) deficiencies in work or (b) new features or flows; annual review of the website for compliance with EPA web page standards.
Goal: Less than 5 percent of changes due to deficiencies; website meets all EPA web page criteria.
Warning: More than 10 percent of changes due to deficiencies; website achieves less than a "good" rating in review.

2.0 Manage Security
See Section 6.

3.0 Submit Data
Quality measure: Track CDX usage patterns.
Metrics: Number of direct file uploads; number of files submitted through web forms.
Goal/Warning: User problems measured through user support.

4.0 Receive Data and 7.0 Distribute Data
Quality measure: Track CDX processing patterns.
Metrics: Number of transactions received in in-box; number of transactions made available for distribution; number of transactions remaining available for distribution for one week.
Goal: Each inbound transaction is either placed on the distribution server or rejected; no transactions remain unclaimed on the distribution server for more than 1 week.
Warning: Identification of a single transaction not leaving the system; any transaction remains unaccessed for more than 2 weeks.

5.0 Archive Data
Quality measure: Track CDX performance patterns.
Metrics: Number of files archived; number of archive attempts failing; amount of disk space used.
Goal: No archive failures.
Warning: Any archive failures; disk space reaches 75 percent of capacity.

6.0 Translate Data
Quality measure: Track CDX processing patterns.
Metrics: Number of files translated; number of transactions rejected by translator; number of pass-through files.
Goal: Less than 5 percent of files are rejected for applications in production status; less than 30 percent of files are rejected in test status.
Warning: More than 10 percent of files are rejected for applications in production status; more than 50 percent of files are rejected in test status.

8.0 Support Customers
See Section 4.

9.0 Manage CDX Operations
Quality measure: Track CDX system availability patterns.
Metrics: Number of times the system was down for 1 hour or more due to CDX system failure; percent of system downtime due to CDX system failure; number of times the system was down for scheduled maintenance.
Goal: No unscheduled downtime; 100 percent system uptime (discounting scheduled downtime); no more than 2 scheduled downtimes per month.
Warning: More than 2 unscheduled downtimes per month; system uptime falls below 95 percent (discounting scheduled downtime); scheduled downtimes exceed 3 per month.
Quality measure: Track CDX performance patterns.
Metrics: Average throughput by CDX application after assessment of requirements; overall average throughput (from receipt at inbox to placement on distribution server).
Goal: No more than 5 percent of transactions for that application exceed the expected average.
Warning: More than 10 percent of transactions for that application exceed the expected average.

10.0 Register Users
Quality measure: Track CDX registration patterns.
Metrics: Number of registered users; number of status changes; number of errors recorded in key fields.
Goal: Less than 5 percent of key fields contain erroneous data.
Warning: More than 10 percent of key fields contain erroneous data.
5-3

-------
5.4.1 Performance Standards
Performance standards for operations (e.g., goals for a minimal level of system
uptime) are defined in the Service Level Agreement and the Continuity of
Operations Plan (COOP). A brief sketch of how such thresholds can be applied
follows.
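To make the goal and warning bands concrete, the following minimal Python sketch classifies a collected metric against the thresholds from the 1.0 Manage Websites row of Table 5-1. The function name and return labels are illustrative only, not part of the CDX software.

    # Classify the website-change metric against the Table 5-1 thresholds:
    # goal is less than 5 percent of changes due to deficiencies; warning
    # is more than 10 percent. Names here are illustrative, not CDX code.
    def band_change_deficiency_rate(total_changes, deficiency_changes):
        if total_changes == 0:
            return "goal"  # no changes recorded in the measurement period
        rate = deficiency_changes / total_changes
        if rate < 0.05:
            return "goal"
        if rate > 0.10:
            return "warning"
        return "between goal and warning"

    print(band_change_deficiency_rate(40, 1))  # 2.5 percent -> goal
    print(band_change_deficiency_rate(40, 5))  # 12.5 percent -> warning

The same banding pattern applies to the other rows of the table; only the metric and the two thresholds change.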
5.5 QA Process Description
The QA/QC activities for system operations ensure the CDX system is operating
and maintained according to a set of guidelines defined by the project. Figure 5-2
displays the 10 core components associated with the CDX system.
Figure 5-2. CDX System Components
[Figure reconstructed as text. The CDX System (0.0) comprises the following components:]
1.0 Maintain Website
2.0 Manage Information Security
3.0 Submit Data
4.0 Receive Data
5.0 Archive Data
6.0 Translate Data
7.0 Distribute Data
8.0 Support Customers
9.0 Manage CDX Operations
10.0 Register Users
5.5.1 Components of System Operation
The following sections describe the basic components of the CDX system.
5.5.1.1	Maintain Website
This component provides requirements for developing and maintaining the con-
tent for CDX websites, including a homepage, registration pages, and submission
pages. This functionality also contains a subprocess that provides capabilities for
maintaining the sites (including the capability to add, modify, and delete forms
and post announcements).
5.5.1.2	Manage Information Security
Information security requirements address perimeter defense, authentication pro-
cedures, access controls, system software controls, virus scanning, and backup
procedures. As technology progresses, additional functionality may be added to
enhance the system.
This function establishes techniques for registering and validating users, authenti-
cating incoming data, and protecting legacy data. Specific protocols are delineated
and methods are identified for handling secured data.
5-4

-------
This function contains four main subprocess areas. The first, establishing and
maintaining user registration, manages the user registration database for
authorized users. The second subprocess, user verification, includes requirements
for monitoring access to the system by creating procedures for user log-on,
establishing the log-on passwords, and verifying the security of inbound messages.
The third subprocess, data authentication, covers acquiring digital certificates and
facilities for handling electronic signatures. The final subprocess, protection,
contains requirements for specific protocols and distinctions between types of data.
5.5.1.3	Submit Data
This function provides general capabilities for users to access, view, download,
and update information through direct transmission over the Internet or through
the CDX website. This function also enables the user's profile information
(i.e., name and address) to pre-populate forms and send information to the CDX.
Four subprocesses fall within the submit data functional area. The manage forms
subprocess provides capabilities for reviewing, printing, editing, and signing
forms. The pre-populate forms subprocess supplies functionality to fill out forms
automatically with information that is based on the user and previously supplied
data. The other two subprocesses—send electronic data and send paper—
establish techniques for users to send information electronically and on paper.
5.5.1.4	Receive Data
This function enables the CDX system to handle incoming data. This functional
area has four subprocesses. Two of these subprocesses—establish communication
techniques and establish data formats—delineate how data will be retrieved. The
receive transaction subprocess details the complete flow of information through
the function. The receive paper subprocess handles paper submissions.
5.5.1.5	Archive Data
This function delineates policy and procedures for the CDX to capture, store, and
maintain data as well as control access to the archives. This function includes
methods for storing data, logging transactions, archiving system backups,
retrieving archived data, and measuring statistics and storage, along with general
archiving information. The archive function provides the ability to maintain the
submission in its original format and, in conjunction with the transaction log, to
provide legal traceability of the document.
The archiving process for CDX electronically holds original transaction files re-
ceived from a client, resulting files from a translation of the original submission,
files signed by CDX to be sent to the client, and reports generated by EPA to be
sent to the client. These are all stored in a secure database.
5-5

-------
The archiving process is divided into four steps. Each program (AEI, TRI,
UCMR, IDEF, etc.) uses one or more steps:
~	Archive 1 stores the original submission file in a secure database.
~	Archive 2 stores any digital signature file, after it has been validated or
verified, in the secure database.
~	Archive 3 stores any resulting file from a translation in the secure data-
base.
~	Archive 4 stores files to be sent to the client after being digitally signed by
CDX and report files from EPA to be sent to the client in the secure
database.
All programs use Archive 1. Some programs require a digitally signed letter with
their submission. Other programs require a digitally signed document to be sent
back to the submitter as receipt acknowledgement. All programs require a positive
or negative acknowledgement. Some programs require that the submitter be
notified through the CDX inbox at the same time as SMTP mail.
Multiple files in a single submission are bound in a single zip file. If the file re-
ceived is larger than 32 megabytes, the file is saved in a secure directory and a
pointer is set in the archive table.
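As an illustration of the size rule just described, the Python sketch below stores small submissions directly in the database and writes large ones to a secure directory with only a pointer recorded in the archive table. The paths, table, and column names are assumptions for illustration, not the actual CDX schema.

    import shutil
    from pathlib import Path

    SIZE_LIMIT = 32 * 1024 * 1024           # 32 megabytes, per the rule above
    SECURE_DIR = Path("/secure/archive")    # assumed secure directory

    def archive_submission(zip_path, db):
        """Store a zipped submission in the database, or save it to the
        secure directory and record only a pointer when it exceeds 32 MB."""
        src = Path(zip_path)
        if src.stat().st_size > SIZE_LIMIT:
            dest = SECURE_DIR / src.name
            shutil.copy2(src, dest)         # keep the original file on disk
            db.execute("INSERT INTO archive (file_pointer) VALUES (?)",
                       (str(dest),))
        else:
            db.execute("INSERT INTO archive (file_blob) VALUES (?)",
                       (src.read_bytes(),))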
The CDX Help Desk is notified of every error sent to CDX via SMTP mail
(see Table 5-2).
5.5.1.6	Translate Data
This function establishes mechanisms for transforming incoming stakeholder data
and mapping or parsing the data to the appropriate EPA legacy database. The re-
quirements relate to file structures expected as input, data setup and manipulation,
data verification, standards, and multiple mapping solutions.
5.5.1.7	Distribute Data
This function establishes mechanisms for disseminating copies of received data to
appropriate data systems, as well as providing a means for archiving and logging
all transactions. Two initial subprocesses determine the recipient profile, verify
security, and send the transaction.
5-6

-------
Table 5-2. Archive Administration
(Each entry lists a function and its description.)

Archive security: The application is a secure NT, password-protected process. Connection to the database is established using SQLNET password protection.

Archive process management: The application(s) is on NT Scheduler and is activated every five minutes (the interval varies depending on the program). It creates a log file that can be retrieved for debugging purposes.

Archived information: The application(s) read the necessary information from an INI file. The INI file is located in the same directory as the application(s).

Distribution of information: The information is distributed through SMTP mail and/or the CDX inbox in the register database.

Error detection for archive process: The application(s) check for and log the following types of errors:
•	An INI file was not found
•	Connection to the database was not established
•	The submitted file does not meet the naming convention requirement for that program
•	The sender and client are not valid
•	An e-mail address was not found in the Register for that sender
•	The file was not properly saved in the database or moved to the archive directory (for large files)
•	Acknowledgements and notifications are not sent
•	Zip, unzip, move, and copy file operations were not successful
All errors are logged in the log file or the Transaction Log table, or communicated to the Help Desk via SMTP mail.
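The following Python sketch illustrates the first two error checks in Table 5-2: the archive application reads its settings from an INI file located in its own directory and logs an error if the file or the database connect string is missing. The file, section, and key names are assumptions, not the actual CDX configuration.

    import configparser
    import logging
    from pathlib import Path

    logging.basicConfig(filename="archive.log", level=logging.INFO)

    # The INI file is expected beside the application, per Table 5-2.
    ini_path = Path(__file__).with_name("archive.ini")
    if not ini_path.exists():
        logging.error("An INI file was not found: %s", ini_path)
    else:
        config = configparser.ConfigParser()
        config.read(ini_path)
        connect_string = config.get("database", "connect", fallback=None)
        if connect_string is None:
            logging.error("Connection to the database was not established")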
5.5.1.8 Support Customers
The support customers function identifies user assistance requirements. The re-
quirements for support are organized according to the following subprocesses:
~	Troubleshooting
~	Security management assistance
~	Training
5-7

-------
~	Online documentation
~	Maintenance of documentation.
Other subprocesses include tracking customer requests and responses, and the
functionality needed to query status. See Section 4 for more details.
5.5.1.9	Manage CDX Operations
This function specifies requirements for managing and maintaining the CDX sys-
tem, monitoring costs, and evaluating trend and performance data. This function
has five subprocesses. The first subprocess provides administrative tools and
specifies requirements for administering, collecting, and tracking system data.
The maintain mailboxes subprocess relates to requirements for managing system
mailboxes. The monitor costs and maintain documentation subprocesses refer to
CDX requirements for cost reporting and general document maintenance. The
facility subprocess outlines requirements for continuity of operations planning
and shadow systems.
5.5.1.10	CDX Register Users
The register users function is an automated method of registering and tracking
users who are permitted to use the CDX system and its specific programs. There
are two types of registration: government pre-registration and user registration.
Government pre-registration allows the users to obtain their information using the
following:
~	EPA authorization
~	A customer retrieval key provided in a letter sent by EPA
~	Creation of a username and password.
User registration allows the user to enter personal business data using an online
data entry system.
Both the government pre-registration and the user registration typically follow the
same process after the initial logon:
~	Verify or edit organization information.
~	Verify or edit role information.
~	Complete registration.
5-8

-------
5.5.2	Reviews, Quality Checks, and Audits
The QA Lead performs reviews, quality checks, and audits to ensure system
operations are being monitored and maintained as required. Many of the automated
system monitoring applications generate logs, which are reviewed by the QA Lead,
the Lead Systems Engineers, and other CDX team members.
5.5.3	Operations Monitoring and Reporting
The CDX system is monitored by the Systems Engineer using the following tools:
~	Compaq Insight Manager—the Compaq application for easily managing
network desktops and servers. Insight Manager delivers intelligent moni-
toring and alerts as well as visual control of Compaq servers and desktops.
~	Computer Associates ArcServe IT—the tool used to perform backups. The
Lead Systems Engineer, a systems engineer, and the Help Desk all moni-
tor this tool. Full backups are performed weekly. Incremental backups are
performed daily.
~	Cyber Cop—a security assessment tool intended for one-time use.
~	Lockit—an EPA one-time tool used at build time to check several servers
against a set of EPA standards and guidelines. The staff utilized a manual
checklist before the use of this tool. This tool captures approximately
90 percent of what is required from the checklist.
~	Norton Antivirus Corporate—virus protection software used for the CDX
project. It is installed on all servers within the EPA domain. It protects
servers 24 hours a day and alerts the system engineers when a virus is
discovered. The servers are monitored and checked daily for any issues
related to the Norton application.
~	Rdisk—a Microsoft utility used to update the repair information saved
when installing the operating system. This utility creates an Emergency
Repair Disk. The repair information is used to recover a bootable system
in case of failure. Rdisk is an operations tool that operates via manual
process. A disk is created for each server in the EPA domain upon being
built. The disks are stored in a locked cabinet and are updated on a
monthly basis.
~	Stat Scanner—monitors the servers and performs diagnostics on each one.
This tool assists the staff in finding security leaks and indicates a solution.
It is used on a monthly basis.
~	What's Up—this tool continuously monitors the status of the network
connections.
5-9

-------
The system activity is tracked with the following logs:
~	Checkpoint Firewall Log captures source and destination of users logging
into the CDX programs. This log is checked daily.
~	IIS Logs track public web server usage. The information captured for our
purposes includes source IP, resource accessed, time, and action (get or
post).
~	NT Application Log captures information that indicates the application
activity. This log is checked daily.
~	NT Security Log captures the server activity; it records which users have
logged on to which specific server. When an invalid user is detected, the
LAN Administrators investigate; this may require some research and/or a
phone call. This log is checked daily to ensure that there were no unexpected
users logging on to the servers (a minimal sketch of such a check appears
after this list).
~	NT System Log—This log captures information related to any system files.
This log is checked daily.
~	What's Up Log captures information about the network. It is set up to
automatically page the Lead Systems Engineers for specific problems.
This log is checked daily.
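The sketch below, referenced in the NT Security Log item above, shows one way such a daily check could be scripted: scan a day's log for logon entries from accounts that are not on an expected list. The log format, file name, and account names are assumptions for illustration, not the actual log layout.

    EXPECTED_USERS = {"cdxadmin", "backupsvc", "websvc"}  # hypothetical accounts

    def find_unexpected_logons(log_path):
        """Return log lines whose logon account is not on the expected list,
        assuming lines of the form '<timestamp> LOGON <user> <server>'."""
        anomalies = []
        with open(log_path) as log:
            for line in log:
                parts = line.split()
                if (len(parts) >= 4 and parts[1] == "LOGON"
                        and parts[2] not in EXPECTED_USERS):
                    anomalies.append(line.strip())
        return anomalies

    for entry in find_unexpected_logons("security_2001-09-14.log"):
        print("investigate:", entry)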
These logs are checked periodically by the Network Administrator. When the
Network Administrator detects an anomaly or is alerted by network monitoring
tools, the anomaly is qualified and reported to appropriate CDX team members.
The CDX team members investigate the anomaly and identify, troubleshoot, and
resolve any underlying problems. Depending on the CDX application and staff
assigned to specific applications, team members will vary accordingly. Typically,
the application developer(s) are responsible for products during the entire life
cycle. During troubleshooting and problem resolution, team members working the
problem keep other CDX/EPA staff informed:
~	The Lead Systems Engineer interprets the error message and determines its
severity.
~	The Lead Systems Engineer contacts the appropriate CDX/EPA team
member(s) and determines whether escalation procedures are necessary.
~	Team members investigate and troubleshoot the problem, keeping the
appropriate CDX/EPA team member(s) informed.
~	Team members solve the problem and notify the appropriate CDX/EPA
team member(s).
5-10

-------
System Operations
There is a specific security-alert phone tree of persons contacted for specific prob-
lems. This information is detailed in the EPA Security Plan, dated March 5, 2001.
5.5.4	Problem Resolution and Corrective Action
The CDX problem resolution and corrective action process begins with the CDX
Help Desk. Every issue submitted to customer support for resolution results in a
unique activity ticket number, which is passed back to the user—users leaving
voice mail or sending e-mail or a fax are promptly contacted and provided with
their activity number. The activity ticket is assigned to a customer-support analyst
for action. Users may refer to the ticket number when inquiring about the status of
their support request.
Customer support constitutes the initial level of support on all EPA CDX problems
and requests. If customer support is unable to resolve or is having difficulty resolv-
ing problems, they receive direction and support from the technical project leader,
software developer, network engineer, or functional project leader. Depending upon
the severity of the problem, the customer support team may initiate an SCR. Refer
to Section 4 for more details.
5.5.5	Scheduled Maintenance
The CDX system requires downtime for the CDX team to perform software
migrations or hardware changes. These activities are performed according to the
following guidelines:
~	All major test migrations are coordinated and scheduled with EPA and the
CDX development staff at least one week before migration.
~	Scheduled downtime to accommodate migrations from the test to the
production environment occurs on the 2nd and 4th Friday of each month,
beginning at 7:00 p.m. and continuing through 8:00 a.m. the following
Monday (see the sketch following this list).
~	For minor migrations, if a team member needs to migrate to test or
production during 12:00-1:00 p.m. on a business day, an e-mail is sent from
the technical project lead to the EPA development staff by 11:00 a.m. on
the day of migration.
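The window rule in the second guideline can be checked mechanically. The following Python sketch, with an illustrative function name, tests whether a timestamp falls between 7:00 p.m. on the 2nd or 4th Friday of a month and 8:00 a.m. the following Monday.

    from datetime import datetime, timedelta

    def in_maintenance_window(dt):
        """True if dt falls in a scheduled migration window: 7:00 p.m. on the
        2nd or 4th Friday of the month through 8:00 a.m. the next Monday."""
        # Walk back to the Friday that would start the window containing dt.
        friday = dt - timedelta(days=(dt.weekday() - 4) % 7)
        start = friday.replace(hour=19, minute=0, second=0, microsecond=0)
        end = start + timedelta(days=2, hours=13)   # Monday, 8:00 a.m.
        if not (start <= dt < end):
            return False
        nth_friday = (friday.day - 1) // 7 + 1      # which Friday of the month
        return nth_friday in (2, 4)

    print(in_maintenance_window(datetime(2001, 9, 15, 3, 0)))  # True: after 2nd Friday
    print(in_maintenance_window(datetime(2001, 9, 19, 3, 0)))  # False: midweek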
5.5.6	Additional QA Measures
The CDX team conducts automatic, repeatable internal tests to ensure quality
measures are attained. However, additional external measures are necessary to
ensure that the CDX system (as a whole) is meeting the expectations of its stake-
holders. Local execution of tests is fine for quality control, but not for
performance measurement work, where response-time measurements must include
the variable Web delays that reflect real-world usage.
5-11

-------
The CDX system requires additional monitoring from entities not normally asso-
ciated with the CDX team, thus providing the degree of independence necessary
to ensure an outsider's perspective. These independent reviews should include
internal and external risk assessments as well as security penetration tests.
Examples of independent reviews and assessments that have already taken place
include the following:
~	LMI staff running STAT Scanner, NMAP, and NESSUS
~	Sytex Security risk assessment
~	SAIC Security Assessment penetration test.
5-12

-------
Section 6
Security and Risk Management
6.1 Objective
The objective of QA/QC is to ensure security practices are implemented and
followed in a systematic manner according to a set of documented procedures.
6.2 Definition
Security policies and procedures proceed from the requirements of each project
and can be assessed only within the context of the project. Security is properly
considered a major component of any risk management plan.
6.3 Quality Measures
As noted in Section 5 of this document, the QA staff plays no direct role in devel-
oping or implementing security policies and procedures. As part of their checks of
any risk management plan, the QA staff ascertains the following:
~	Are security policies and procedures required as part of any Risk
Management Plan?
~	If required, are they implemented in accordance with standard documenta-
tion review processes (described in Section 3 of this document)?
~	Are records of security tools and procedures maintained and available for
inspection and audit?
6-1

-------
6.4 Metrics
Several security metrics are collected as part of the system operations process
(see Table 6-1). See Section 5 for specific details.
Table 6-1. System Security Metrics
(Each entry lists the quality measure, its performance metrics, and the goal and warning thresholds.)

External security measures
~	Annual vulnerability review. Goal: all risks assessed to be very low. Warning: any risk assessed at high, or more than 3 at moderate.
~	Number of identified O/S vulnerabilities that have been addressed by service packs or patches. Goal: monthly survey of each host shows the O/S includes all current stable, relevant system patches or service packs. Warning: monthly survey of each host shows an O/S lacking one or more current stable, relevant system patches or service packs.
~	Number of viruses detected. Warning: "significant" increases in the number detected.
~	Number of viruses penetrating. Goal: 0 penetrations. Warning: any penetration.
~	Number of intrusions attempted. Warning: "significant" increases in the number detected.
~	Number of successful intrusions. Goal: 0 intrusions. Warning: any intrusion.

User logon processes
~	Number of logons; number of accounts locked due to login failures. Goal: less than 1 percent of logons rejected. Warning: more than 5 percent of logons rejected.
~	Number of valid signatures; number of invalid signature attempts. Goal: less than 1 percent of signatures rejected. Warning: more than 5 percent of signatures rejected.
~	Number of files with validated certificates; number of files with certificates not validated. Goal: less than 3 percent of signatures failing validation. Warning: more than 10 percent of signatures failing validation.
6.4.1 Performance Standards
There are no hard performance standards for these items; however, the first
successful virus penetration or intrusion is cause for serious concern. The best
means for both preventing and measuring these risks is to conduct intrusion
detection tests. Currently, the CDX is funded to conduct one such test a year.
Additionally, EPA conducted an independent test in June 2001, and the CDX was
rated a LOW security risk.
6-2

-------
6.5 QA Process Description
CDX security has two levels: basic security and procedural security. Basic
security consists of logon security, which controls which users may enter the
system and which programs they can access. Procedural security determines which
users are allowed to perform functions against specific programs. Security
precautions and implementations are maintained throughout the life of a project,
just as all aspects of the CDX infrastructure security are strictly enforced.
Additional security detail is provided in the EPA Security Plan, dated
March 5, 2001.
6.5.1	Intrusion Detection
The CDX intrusion detection process is currently manual. It includes examining
firewall logs and virus alerts, monitoring machine failures, and looking for
patterns of malicious activity. The only automated element is detection of port
scans against the CDX internet protocol (IP) range; when this type of intrusion
occurs, the systems engineers are automatically paged.
This manual process will be replaced by an automated tool in the near future. The
tool selected is Internet Security Systems' RealSecure. This is a powerful, auto-
mated, real-time intrusion protection system for computer networks and hosts.
RealSecure provides unobtrusive, continuous surveillance.
6.5.2	User Validation
All CDX users are validated and classified into two categories: NT users and Oracle
users. For each application, the user must log on using a valid user name and
password. If an invalid user is detected, the systems engineers are notified and a
follow-up determines whether or not the user is valid.
6.5.3	Disaster Recovery
The Continuity of Operations Plan (COOP) provides policy and guidance for
personnel assigned to the Central Data Exchange (CDX) to ensure that essential
operations are continued in the event of an emergency or threat of an emergency.
It is EPA policy that EPA employees be prepared to respond efficiently and effec-
tively to the full range of emergencies, so the agency can continue performing its
essential functions. A more detailed statement of this policy is found in the EPA
policy statement, EPA Continuity of Operations Plan Policy, dated December 20,
1996.
The CDX team is likewise committed to protecting its own employees and
essential functions.
6-3

-------
The EPA COOP details the procedures and steps required in emergency
situations. The COOP is based on the following assumptions:
~	Emergencies, or threatened emergencies, will adversely affect CDX's abil-
ity to continue to perform essential operations.
~	The event will be limited to a 30-day emergency period.
~	The event will require physical relocation from current facilities.
~	Resources from EPA will be made available to LMI, if required, to con-
tinue essential operations.
~	LMI corporate support (i.e., corporate network, the contracts department,
etc.) is available to aid in executing the COOP.
EPA CDX interim system-production servers are the essential point of service for
operational continuance.
The following personnel are responsible for the continuity of operations planning
as it pertains to CDX:
~	The CDX Project Officer directs the activation of the COOP. Activation
consists of initiating the alert roster, or contact tree, as illustrated in
Appendix B of the COOP.
~	The LMI CDX Program Manager contacts (voice-to-voice) the EPA CDX
Project Officer and the LMI Technical Program Manager. If voice contact
is not made, the LMI CDX program manager will leave a message (voice-
or e-mail) and then call the next contact on the alert roster. The person
contacted calls the next person in the chain on the alert roster. This succes-
sion of calls continues until all personnel on the alert roster are contacted.
~	The LMI System Manager:
>	directs the development of the COOP;
>	ensures that the COOP is maintained and updated annually;
>	directs the development, conduct, and evaluation of COOP exercises
twice per year; and
>	provides ongoing training in COOP procedures for personnel assigned
to emergency staff positions and continually evaluates the effective-
ness of the training program.
6-4

-------
6.5.4	Routine Backup
The interim CDX system has a dedicated local tape backup for the application
software and database. An incremental backup is performed daily, and a full
backup is done weekly. These backup tapes are stored off-site to ensure data re-
covery in the event of a catastrophe.
The CDX Oracle database is also backed up nightly to another hard drive. The
backups are not accessed directly during normal operation of the CDX applica-
tion. The backups are cumulative incremental backups that occur every evening at
8:00 p.m. This nightly backup is achieved through a command line interface to
Oracle's Recovery Manager utility (a sketch of such a run follows the list below).
The Oracle DBA uses the Recovery Manager to
~	configure frequently executed backup operations;
~	generate a printable log of all backup and recovery procedures;
~	use the recovery catalog to automate both media restore and recovery
operations;
~	perform automatic parallelization of backups and restores;
~	find data files that require a backup based upon user-specified limits on
the amount of redo that must be applied; and
~	back up the database, individual table spaces, or data files.
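The following sketch shows how such a nightly run might be driven, assuming the rman executable is on the path and a recovery catalog is in use. The connect strings are placeholders, not actual CDX credentials, and the embedded commands are a plausible RMAN run block for a cumulative incremental backup, not the team's actual script.

    import subprocess

    RMAN_SCRIPT = """
    run {
      allocate channel ch1 type disk;
      backup incremental level 1 cumulative database;
      release channel ch1;
    }
    """

    # Feed the run block to RMAN over standard input; the target and
    # catalog connect strings below are placeholders only.
    subprocess.run(
        ["rman", "target", "dba/password@cdx", "catalog", "rman/password@rcat"],
        input=RMAN_SCRIPT,
        text=True,
        check=True,
    )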
6.5.5	Data Protection
Data protection is controlled by using the following:
~	Firewalls
~	User level security
~	Tape backups
~	Physical security (access control); room is locked
~	Virus protection.
6.6 Risk Management
Risk management determines the risk level of the CDX project or any of its
components. Component-level assessments may include risk determination for a
software or hardware product or for unplanned requirement changes. The PD has
the responsibility to determine the project's risk level.
6-5

-------
LMI's risk management approach may include the following components:
~	Assess program or project plans
~	Establish critical path
~	Identify and rank key risk areas
~	Quantify risks
~	Develop risk mitigation strategies
~	Conduct risk reviews.
Risk management is not a function of the QA staff; however, the QA staff has the
responsibility of ensuring the risks have been determined and documented in a
risk management plan via internal quality checks.
6-6

-------
Section 7
Project Management
7.1	Objective
The objective of QA/QC is to ensure the CDX activities for each phase of the sys-
tem development life cycle are performed and delivered with the highest quality.
7.2	Definition
Project management is a means of tracking a project through the project life-cycle
process. This involves the ability to assess the risk associated with meeting cus-
tomer requirements. It also involves the ability to define resources and track the
budget.
7.3	Quality Measures
Project management quality measures indicate how progress toward deadlines,
timely deliverables, and project oversight are managed. The CDX project leaders
manage requirements, project plans, project documentation and other responsibili-
ties in accordance with ISO 9000 standards and recommendations. CDX project
leaders are, therefore, integral to the success of the project.
7.4	Metrics
For general project management, the CDX team uses standard project management
and tracking tools, such as Microsoft Project, which generate a number of
standard metrics. These metrics, shown in Table 7-1, help gauge future tasks by
providing better estimates for assignments.
7-1

-------
Table 7-1. Quality Measure and Performance Metrics
Quality measure: Track CDX project requirements, milestones, and deliverables.
Performance metrics, goals, and warnings (in order to measure quality, all metrics must differentiate between customer-initiated changes to scope and requirements versus meeting or not meeting stable requirements):
~	Number and percentage of tasks completed; number of client-requested changes in deliverables, cost, and schedule. Goal: provide all deliverables. Warning: miss 1 major deliverable.
~	Estimated cost vs. actual cost. Goal: complete all tasks and be within budget. Warning: exceed budget by 10 percent.
~	Number of days project is ahead of or behind schedule. Goal: complete all tasks and be within schedule. Warning: exceed schedule by 10 percent.
With the exception of the last metric, all metrics listed in the table are tracked and
reported to the customer on a monthly basis. The last metric is collected and used
internally for CDX project management purposes.
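As a small illustration of the budget metric in Table 7-1, the Python sketch below compares actual with estimated cost and flags the 10 percent overrun warning. The figures and function name are made up for illustration.

    def budget_status(estimated_cost, actual_cost):
        """Band the cost metric from Table 7-1: within budget is the goal,
        and exceeding the budget by 10 percent trips the warning."""
        if actual_cost <= estimated_cost:
            return "goal: within budget"
        overrun = (actual_cost - estimated_cost) / estimated_cost
        return "warning" if overrun > 0.10 else "over budget, within warning band"

    print(budget_status(100000, 98500))    # goal: within budget
    print(budget_status(100000, 112000))   # warning (12 percent overrun)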
As described in Section 3, project leaders also use the Rational suite for docu-
menting requirements, developing use cases, and tracking SCRs, to name just a
few high-level responsibilities. Additionally, project leaders use the tools and
techniques described in Appendix G to facilitate QA. These include the real-time
task status and labor information systems, which provide data for tracking and
monitoring the level of effort of LMI staff and contractors.
7.5	QA Process Description
The QA/QC activities for project management ensure that updates to the project
plan have been incorporated and that requirements are determined, documented,
and tracked. QA ensures the project leader is documenting and tracking
requirements (i.e., QA itself does not track requirements). Project leaders also
work closely with the
LMI subcontracting staff to issue task orders and purchase orders for additional
technical support.
7.6	Non-Software Document Review
Functional project leaders are responsible for overseeing production of a number
of documents that can be related either to project management or to a program area
(e.g., As-Is process flows). These documents can range from short (one- to
two-page) documents to large reports. They can also be in briefing,
project-schedule, or document formats. The extent of review can vary by document
size and its
7-2

-------
importance to the project. In general, the following steps are performed with all
reports:
~	A document outline or table of contents is agreed upon between the LMI
and EPA project leaders.
~	An initial draft is developed. This draft is typically reviewed by peers
and the LMI project manager.
~	The draft is reviewed and approved by EPA.
~	The final version is developed by the author.
~	The final version is edited by an LMI editor.
~	The final version is proofread.
~	The final version is reviewed by the author.
~	The final version is reviewed by the project manager or program director.
~	The final version is reviewed and accepted by EPA.
7.7 Supplier and Subcontractor Controls
Subcontractor staff members generally perform many of the CDX development
tasks. The CDX staff uses subcontractors whenever there is a need for a specific
skill not available in-house.
The research staff is responsible for monitoring the subcontract technical per-
formance related to the specific task, as well as monitoring the number of hours
used for tasks. It is the subcontractor's responsibility to monitor the expended
authorized hours and costs incurred on task orders to ensure that hours worked are
not in excess of the contracted limit.
The Subcontract Administrator performs the following functions related to the
labor information system:
~	Input the purchase order (PO) number, task number, subcontractor, period
of performance, and total number of hours authorized for all new purchase
orders and task orders.
~	Add hours and extend the period of performance after issuing modifications
to existing orders.
Real-time software tracks subcontractor's invoices paid to date.
The process for the Subcontract Administrator is described below.
7-3

-------
7.7.1	Subcontract Task Ordering Procedure
When the CDX functional leader or technical project leader wants to procure the
services of a subcontractor, a task order request is sent, via e-mail, to the subcon-
tract administrator. The request must contain the following:
~	Task number
~	Subcontractor name and address
~	Statement of work
~	List of deliverables
~	Period of performance
~	Price proposal or quote.
The Subcontract Administrator then determines whether
~	subcontracting is allowable under the task,
~	there are sufficient funds to support the task order, and
~	the period of performance falls within the prime contract.
If all three conditions are met, the task ordering procedure will continue. If not,
the requestor will be contacted and informed of the problem.
If the CDX staff has not received a quote from the subcontractor, the Subcontract
Administrator may issue a request for quote (RFQ) to the subcontractor, who will
then provide a quote containing labor categories and the number of hours to be
worked by personnel in each labor category, as well as any other direct costs
(ODCs) that may be incurred.
If the subcontractor's quote is determined to be fair and reasonable and sufficient
funding exists, a task order is issued. Copies are distributed to accounting and the
CDX Technical Project Leader. A copy is filed with Contracts.
7.7.2	Subcontract Task Ordering Procedure (Non-Schedule)/
Miscellaneous Purchasing
When a CDX project leader wants to procure the services of a subcontractor who
is not on the schedule, a request to subcontract is sent via e-mail to the Subcon-
tract Administrator. The request must contain the following:
~	Task number
~	Subcontractor name and contact information
7-4

-------
Project Management
~	Statement of work
~	List of deliverables
~	Period of performance
~	Proposal or quote.
The Subcontract Administrator then determines whether
~	subcontracting is allowable under the task,
~	there are sufficient funds to support the task order, and
~	the period of performance falls within the prime contract.
If all three conditions are met, the task ordering procedure will continue. If not,
the requestor will be contacted and informed of the problem.
If the CDX staff has not received a proposal from the subcontractor, the Subcon-
tract Administrator may issue a request for proposal (RFP) to the subcontractor
who then provides a technical and price proposal. The price proposal must include
information about labor categories and rates, the number of hours to be worked by
personnel in each labor category, as well as any ODCs that may be incurred.
If the subcontractor's proposal is determined to be fair and reasonable and suffi-
cient funding exists, a task order is issued. Copies are distributed to accounting
and the CDX Technical Project Leader. A copy is filed with Contracts.
7.7.3 Modifications to Existing Orders
When the CDX technical project leader deems it necessary to modify an existing
task or purchase order, a request for modification is sent via e-mail to the subcon-
tract administrator. The request must contain the following:
1.	Purchase order number or subcontract and task order number
2.	Description of the modification requested
3.	Price proposal or quote (if necessary).
Depending on the nature of the modification, it may or may not be necessary to
request a proposal or quote from the subcontractor. If it is necessary to obtain a
proposal or quote and the research staff has not already received one, the Subcon-
tract Administrator will issue an RFP to the subcontractor who then provides a
technical or price proposal. The price proposal must include information about
labor categories and rates and the number of hours to be worked by personnel in
each labor category as well as any ODCs that may be incurred.
7-5

-------
Following receipt of a request for modification and—if necessary—a subcontrac-
tor proposal, the Subcontract Administrator determines whether
1.	additional subcontracting is allowable under the task;
2.	there are sufficient funds to support the purchase order;
3.	the period of performance falls within the period of performance of the
prime contract; and, if necessary,
4.	the subcontractor's proposal is technically acceptable and cost reasonable.
If all four conditions are met, a modification is issued. Copies are distributed to
accounting and the Project Leader. A copy is filed with Contracts.
7.8 Quality Records
The QA staff performs periodic checks to ensure the following quality records are
being kept for CDX and other applications associated with CDX, and evaluates the
records to ensure they are ISO 9000 compliant. The QA staff reports to management
when these items are not available for the project:
~	Project Plan—describes the tasks to be performed. It lists the deliverables
and the resources required to perform the tasks.
~	Quality Review Assessment Form—determines the type (Approved
Category 1, Approved Category 2, Draft, Under Review, and Returned) of
technical quality review that a particular project warrants based on the
sensitivity or visibility of the project. Project leaders complete this form
and submit it to their program director (PD) or program manager (PM) for
review and approval. The rating that a project receives is automatically
calculated by the application.
~	Product Tracking Form—includes the product review summary form for
each product.
~	Product Review Summary Form—completed after the final review of the
software.
7-6

-------
Appendix A
Software Requirements Checklist
Software Requirements Checklist
Last updated 4/20/01

Yes
No
N/A
A. The following basic issues are addressed:
Functionality—
A1. The application/system behavior has been operationally defined.



A2. The data, meta-data, input, output and processes are described for
each module.



External interfaces—
A3. Application/system interaction with external entities and processes is
identified and operationally described.



Performance—
A4. Performance parameters such as throughput, processing speed,
resource availability, response time, recovery time and other
parameters have been identified and operationally described.



Attributes—
A5. The portability, correctness, maintainability, security, etc. considerations
have been identified and described.



Design constraints imposed on an implementation—
A6. The required organization standards, implementation language, policies
for database integrity, resource limits, operating environments, etc. are
in effect.



B. Requirements that are outside the bounds of the System Requirements Specification
(SRS) are not specified:
B1. The SRS correctly defines all of the software requirements.



B2. The SRS does not define any design or implementation details.



B3. The SRS does not impose additional constraints on the software.



C. The SRS properly limits the range of valid designs without
specifying any particular design.



D. The SRS exhibits the following characteristics:
Correctness—
D1. Every requirement stated in the SRS is a requirement that the software
should meet.



Unambiguous—
D2. Each requirement has one, and only one, interpretation.



D3. The customer's language has been used.



D4. Diagrams are used to augment the natural language descriptions.



A-1

-------
Completeness—
D5. The SRS includes all significant requirements, whether related to
functionality, performance design constraints, attributes, or external
interfaces.



D6. The expected ranges of input values in all possible scenarios have been
identified and addressed.



D7. Responses have been included for both valid and invalid input values.



D8. Types and ranges of input have been described.



D9. All figures, tables, and diagrams include full labels and references and
definitions of all terms and units of measure.



D10. All TBD's have been resolved or addressed.



Consistency—
D11. The SRS agrees with the Vision document, the use-case model and the
Supplementary Specifications if applicable.



D12. The SRS agrees with other higher level specifications.



D13. The SRS is internally consistent, with no subset of individual conflicting
requirements.



Ability to Rank requirements—
D14. Each requirement is tagged with an identifier to indicate either the
importance or stability of that particular requirement.



D15. Significant attributes for properly determining priority have been
identified.



Verifiable—
D16. Every requirement stated in the SRS is verifiable.



D17. A finite, cost-effective process with which a person or machine can
check that the software product meets the requirement exists.



Modifiable—
D18. The structure and style of the SRS is such that any changes to the
requirements can be made easily, completely, and consistently while
retaining the structure and style.



D19. Redundancy has been identified, minimized and cross-referenced.



Traceable—
D20. Each requirement has a clear identifier.



D21. The origin of each requirement is clear.



D22. Backward traceability is maintained by explicitly referencing earlier
artifacts.



D23. A reasonable amount of forward traceability is maintained to artifacts
spawned by the SRS.



D24. Comments:
Reviewer:
Date:
A-2

-------
Appendix B
Software Design Review Checklist
This checklist is to be used in a design review of application software. Each
checklist item is to be given a rating in terms of the compliance of the design with
the item.
Rating Definitions
1 = Total Compliance
2 = Minor Deviations
3 = Moderate Deviations
4 = Major Deviations
5 = Critical Deviations
Evaluation Items (Ratings)
1
2
3
4
5
1. Is the design described in a document?





1a. Does the design meet the criteria established for the
application?





1b. Is the design consistent with the approach described for
the application development project?





1c. Does the design conform to the standards described for
the application development project?





1d. Are all of the application elements traceable to functional
requirements?





2. Does the design appear to meet the performance
requirements for the application?





3. Do the assumptions made in the design appear to be
relevant and correct?





4. Does the design convey a consistency of symbols and
terms?





5. Does the list of inputs/outputs, source/destination, units
and range appear to be correct?





6. Does the functional breakdown of the design appear
valid?





7. Does the relationship and hierarchy of the units described
in the design appear correct?





8. Does any flow chart logic shown appear to be incorrect?





9. Does the design appear to correctly handle abnormal
inputs to avert fatal errors to the system?





10. Are the computer storage and processing estimates
correct?





B-1

-------
11. Are other real-time processing considerations valid?





12. Are all of the modules of the application accounted for?
Do all of the possible control paths end (that is, "no
endless loops")?





13. Are all logic statements syntactically and semantically
valid?





14. Are all of the calls made by each module shown in a
calling hierarchy?





15. Do all of the calls made by a module follow the calling
conventions of the programming guidelines?





16. Do all names defined in modules follow the conventions
of the programming guidelines?





17. How would you characterize the adherence of the design
to the following software quality factors:
—
—



17a. Maintainability?





17b. Interoperability?





17c. Testability?





17d. Reliability?





17e. Flexibility?





17f. Correctness?





17g. Integrity?





17h. Usability?





17i. Reusability?





17j. Efficiency?





17k. Portability?





B-2

-------
Appendix C
Release Notes Template1
Introduction
[The introduction of the Release Notes should provide an overview of the
entire document. It should include the disclaimer of warranty, purpose,
scope, definitions, acronyms, abbreviations, references, and an overview
of this Release Notes.]
Disclaimer of Warranty
<Company Name> makes no representations or warranties, either express
or implied, by or with respect to anything in this document, and
shall not be liable for any implied warranties of merchantability or fit-
ness for a particular purpose or for any indirect, special or consequential
damages.
Copyright © 2001, <Company Name>
All rights reserved.
GOVERNMENT RIGHTS LEGEND: Use, duplication or disclosure by
the U.S. Government is subject to restrictions set forth in the applicable
<Company Name> license agreement and as provided in DFARS
227.7202-1(a) and 227.7202-3(a) (1995), DFARS 252.227-7013(c)(1)(ii)
(Oct 1988), FAR 12.212(a) (1995), FAR 52.227-19, or FAR 52.227-14,
as applicable.
"<Company Name>" and <Company Name>'s products are trademarks
of <Company Name>. References to other companies and their products
use trademarks owned by the respective companies and are for reference
purpose only.
Purpose
The purpose of the Release Notes is to communicate the major new features and
changes in this release of the <Project Name>. It also documents known problems
and work-arounds.
Scope
This document describes the <Project Name>
[Click to enter the release identifier here].
1 The Release Notes template is extracted from the Rational Software Corporation tool suite.
C-1

-------
Definitions, Acronyms, and Abbreviations
[This subsection should provide the definitions of all terms, acronyms, and
abbreviations required to properly interpret the xxx. This information
can be provided by reference to the project Glossary.]
References
[Any external references are presented here. This may include references
to user manuals, policies and procedures, external web sites, or the like.]
Overview
[This subsection should describe what the remaining Release Notes con-
tain and explain how the document is organized.]
About This Release
[A description of the release is presented here, including release-defining
characteristics or features. The description should be brief and simply
clarify the release definition.]
Compatible Products
This product has been tested on the following platforms (or with the following
products):
~	[Click to enter a product or platform name here] [Also list any product
operating environment requirements here.]
Upgrading
[Describe the process for upgrading from previous product releases.]
C-2

-------
Appendix D
Bill of Materials Template1
Introduction
[Provide an overview of the entire document.]
Purpose
[Describe the purpose of the software to which this document applies.]
Scope
[Identify the recipients for the items identified in the Bill of Materials, for
example, the source code is typically not released to all recipients]
Definitions, Acronyms, and Abbreviations
[This subsection should provide the definitions of all terms, acronyms, and
abbreviations required to properly interpret the Bill of Materials. This
information may be provided by reference to the project Glossary.]
References
[This subsection should provide a complete list of all documents
referenced elsewhere in the Bill of Materials. Each document should be
identified by title, report number (if applicable), date, and publishing
organization. Specify the sources from which the references can be
obtained. This information may be provided by reference to an appendix
or to another document.]
Overview
[This subsection should describe what the rest of the Bill of Materials
contains and explain how the document is organized.]
1 The Bill of Materials template is extracted from the Rational Software Corporation tool suite.
D-1

-------
Version Description
[Version description includes the following:
~	Identification of the version: number, date, and name;
~	Summary of changes from the previous version of the product,
including additions, updates, and deletions; and
~	Where appropriate, disposition instructions for the new version and
the previous version.]
Inventory of Materials
[List all the physical media (CDs, floppies, etc.) and associated
documentation that make up the software version being released. Identify
numbers, titles, abbreviations, dates, versions, and release numbers as
applicable.]
Handling Considerations
[Describe safeguards for handling the material, such as concerns for
static and magnetic fields, and instructions and restrictions regarding
duplication and licensing.]
Inventory of Software Contents
[List all the files that make up the software version being released.
Identify numbers, titles, abbreviations, dates, versions, and release
numbers as applicable.]
Changes
[List all the changes incorporated into the software version since the
previous version. Identify, as applicable, the problem reports and Change
Requests associated with each change. Describe the effect of each change
on software use or operation as applicable.]
Adaptation Data
[Identify any site-unique data contained in the software.]
Installation Instructions
[Provide or reference the following information: instructions for installing
the software; procedures for determining whether the version has been
installed properly; and known errors and problematic features. Identify any
possible problems or known errors with the software at the time of release.
Describe steps that can be taken to recognize, avoid, correct, or handle the
problematic features.]
-------
Appendix E
Project Planning Checklist
Project Planning Checklist (mark each item Yes, No, or N/A)

A. Project Plan
A1. Project Plan is current.
A2. Project Plan includes modifications.
A3. Kickoff meeting is scheduled.
A4. Project Plan is in ISO-IMS.
A5. Project Plan is ISO-9000 compliant.

B. Statement of Work
B1. The SOW is defined.
B2. Scope of the work is defined.
B3. Specifications/approach is defined.
B4. Assumptions are defined.
B5. Period of performance is defined.
B6. Staff are defined.
B7. Security issues are defined.
B8. The budget has been defined.

C. Risks
C1. Risks have been defined.
C2. Risk mitigation is defined.
C3. Risk avoidance is defined.
C4. Contingency plan has been defined.

D. Deliverables
D1. Task deliverables have been defined.

Comments:
Reviewer:	Date:
Last update 4/17/01
-------
Appendix F
References
The following materials were used to produce this QAP:
~	Logistics Management Institute Quality System Manual (November 16,
2000).
~	Central Data Exchange for Electronic Reporting Prototype System
Requirements, Version 3 (December 2000).
~	The Rational Unified Process.
Documents of specific relevance to this QAP are:
~	EPA CDX Continuity of Operations Plan (April 2001).
~	Environmental Protection Agency Government Paperwork Elimination
Act Risk Assessment, EP005T5 (March 2001).
~	EPA CDX Security Plan (March 5, 2001).
~	EPA Network Blueprint (October 30, 2001).
~	OMB Report: M-97-12, Evaluation of Agency Implementation of Capital
Planning and Investment Control Processes (April 25, 1997).
-------
Appendix G
Tools and Techniques
The EPA CDX system team uses a number of tools to facilitate QA activities.
These tools consist of specialized software for application system testing and
change reporting, a version management system, and a physical library of
application software and related documents:
~	The ISO 9000 information management system (IMS) is one of the major
tools used for documenting and monitoring projects and products. This
tool tracks the quality records for projects from initiation through closeout.
The LMI SDTS group is responsible for maintaining this system, which
operates from the LMI intranet.
~	Rational RequisitePro is a requirements repository that organizes
requirements and provides traceability and change management throughout the
project lifecycle. A RequisitePro project includes a requirements database
and related documents. CDX requirements reside in RequisitePro, which
maintains and tracks requirements as they evolve. Project leaders are
responsible for updating the requirements as necessary. The Rational suite
of products is installed on each project leader's and program manager's
desktop. Project leaders save RequisitePro files to a shared directory that
is accessed by all CDX team members.
~	Rational ClearQuest is an automated change reporting system for tracking
software defects and requests for software enhancements. It operates as an
integrated unit within the Rational development suite of software. The QA
staff, project leaders, and the Help Desk staff enter software defects or
enhancements into the ClearQuest tracker. The QA staff is responsible for
monitoring the status of the SCRs. The Rational suite of products is
installed on each project leader's and program manager's desktop. ClearQuest
files are saved to a shared directory that is accessed by all CDX team
members.
~	Real-time Task Status tracks and monitors the budget for contracts, tasks,
subtasks, and sub-subtasks. Project leaders and program managers typically
use this application. This application operates from the LMI network and is
maintained by the LMI Corporate Information Systems staff.
~	McAfee is a SQL database tool used by the Help Desk to track trouble
calls. Each submission is tracked using a unique trouble ticket. This
application operates from the LMI network (protected by a firewall) and is
maintained by the CDX Lead Systems Engineer.
~	Microsoft Project produces timelines for deliverables and other
project-related materials. This application is installed on each project
leader's and program manager's desktop. Project leaders are responsible for
updating project plans for CDX as necessary. MS Project files are typically
saved to a shared directory that is accessed by all CDX team members.
~	Labor Hour Information System (a Lotus Notes/Domino application) is used
to track subcontractor hours. This web-enabled tool allows subcontractors
and consultants to enter their hours worked on projects. This system does
not replace the subcontractor and consultant invoicing process. It is
intended to give project leaders and program managers a "real time" view of
hours used on projects for these labor categories. The LMI SDTS group is
responsible for this application, which is maintained on the LMI network.
~	Microsoft Visio 2000 is a tool designed to develop:
>	flowcharts,
>	organizational charts, and
>	basic network diagrams.
Project leaders and developers use this tool to diagram CDX information
flows and processes. Visio is installed on individual PCs. Visio files are
typically saved to a shared directory that is accessed by all CDX team
members.
~	Project templates and checklists include, but are not limited to:
>	Project Planning Checklist (Appendix E)
>	Software Requirements Checklist (Appendix A)
>	Bill of Materials Template (Appendix D)
>	Release Notes Template (Appendix C)
Project templates are stored in a shared directory and are accessed by CDX QA
staff, developers, project leaders, and program managers. Several checklists,
including a test and evaluation checklist and a code peer-review checklist,
are currently under development or review.
-------
Appendix H
Configuration Management
The CDX staff employs a directory structure for tracking and managing the
different versions of the software for the CDX system. This configuration
management process is currently manual. The structure consists of three
environments: (1) new development, (2) test, and (3) production. Separating
these environments minimizes the possibility of promoting incorrect versions
of the software into the production environment. Figure H-1 and Figure H-2
display the configuration management directory structures.
The folders shown in Figure H-1 exist on the D drive of the epa_web_devl
server.
Figure H-1. epa_web_devl Directory Structure
[Screenshot of the Internet Information Server console on the epa_web_devl
server, showing the web site's virtual directories (IISADMIN, IISHELP,
SCRIPTS, IISADMPWD, msadc, Mail, SSLCERT, AspUpload, IMAGE, PreRegistration,
STYLE, Tri, Trids) and top-level ASP pages such as ABOUT.ASP, default.asp,
EPA_HOME.asp, FAQ.ASP, FOOTER.ASP, HELP.ASP, MENU.ASP, NEW.ASP, SECTION.ASP,
and TERMS.ASP.]
Figure H-2. Configuration Management Directory Structures
[Screenshot of the D drive in Windows Explorer, showing the folders Apps,
DataBase, DevelopmentObjects, http, https, httpsCerts, Inetpub, Oracle,
ProductionObjects, Program Files, and Recycler. The figure's annotations read
as follows:
~	Each of these folders has the following subfolders, with the exception of
the HTTP folder: Aei, Cdx, Dmr, Idef, Tri, Ucmr. The HTTP folder contains
only a HOME folder.
~	DevelopmentObjects: all code resides here.
~	http: all non-SSL CDX asp/html documents reside here.
~	https: contains all asp/html documents for any project that does not
require digital certificates.
~	httpsCerts: contains only asp/html documents that require digital
certificates.
~	ProductionObjects: contains all objects that are registered on the
server, including third-party objects and objects commonly used by multiple
CDX programs. The ProductionObjects folder contains a subfolder, called
General, that is the home for any third-party or commonly used DLLs/EXEs.]
In the test and production environments, the directory structure under the
D drive uses the same set of folders, as represented in Figure H-2.
The CDX staff utilizes the Rational RequisitePro tool to document the CDX re-
quirements. This tool is also used to track and monitor the requirements as they
change.
The CDX staff is working toward automating the configuration management
process by utilizing a configuration management tool such as Rational
ClearCase LT. Such tools allow a designated staff member to access, distribute,
and control different versions of source code and documentation.
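Until such a tool is adopted, promotion of code between the environments remains a manual copy step. The following minimal sketch (Python) illustrates how that step could be scripted; the environment and DevelopmentObjects folder names mirror Figure H-2, but the script itself, the root path, and the promote function are illustrative assumptions, not part of the current CDX toolset.

# Hypothetical sketch: promote a project's code one environment forward
# (development -> test -> production). Folder names follow Figure H-2;
# the root path and the function are illustrative assumptions.
import shutil
from pathlib import Path

ENVIRONMENTS = ("development", "test", "production")

def promote(project, source_env, target_env, root=Path("D:/")):
    """Copy a project's DevelopmentObjects tree to the next environment."""
    if ENVIRONMENTS.index(target_env) - ENVIRONMENTS.index(source_env) != 1:
        raise ValueError("promotion must move exactly one environment forward")
    src = root / source_env / "DevelopmentObjects" / project
    dst = root / target_env / "DevelopmentObjects" / project
    if not src.is_dir():
        raise FileNotFoundError("no such project folder: %s" % src)
    shutil.copytree(src, dst, dirs_exist_ok=True)  # overwrite the target copy
    return dst

# Example: move the Tri project from test into production.
# promote("Tri", "test", "production")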
-------
Appendix I
Data Quality
The EPA takes the issue of data quality very seriously. With the creation of the
Office of Environmental Information, added focus has been placed on the
implementation of quality systems within the EPA. In support of this top agency
priority, the Quality and Information Council has requested that specific system
information on data quality be collected as part of the IT Capital Planning and
Investment Control process. This appendix addresses the following questions
from this process:
~	Does this project involve the production, collection, storage, analysis,
or presentation of environmental measurement or facility-related
information? If so, does the project have a documented "Quality Assurance
Project Plan" or equivalent document that includes easily identifiable
planning procedures addressing IT and information/data quality indicators,
planning processes, implementation processes, and assessment processes?
~	What features of your system or application ensure an appropriate level of
quality of data?
~	Does this project include specific procedures for the identification and
correction of errors in existing data systems by internal or external
customers? Is this system covered by the Integrated Error Correction
Process, etc.? If yes, describe the process.
~	What validation checks are present in the system to verify the integrity of
transmitted data?
Overview of the Quality Assurance Project Plan
CDX presents EPA with the opportunity to establish a new baseline of data
quality that is standardized across program data collections. The
implementation of data quality in CDX is addressed through the CDX Quality
Assurance Program. In general terms, the CDX quality assurance program
consists of establishing and maintaining a standard set of objectives, quality
indicators, performance standards, planning, implementation, and assessment
processes, and problem resolution activities across the broad range of CDX
activities.
The general objective of our quality assurance program is to establish and
maintain QA standards and procedures that lead to achieving quality products,
in particular timeliness and "freedom from defect." The quality assurance
program addresses the development of CDX in four general areas:
~	Data collection, design and implementation
~	Operations and maintenance
~	Security
~	Customer service.
Project management applies across the four general functional areas. Specific
instances of CDX components that contribute to data quality include the following:
~	Error reduction: Reducing common data submission errors is critical to
ensuring quality data entering our systems. Detecting errors and
reconciling data post-receipt can involve costly and complex rework by both
the submitter and CDX and should be avoided. In the course of supporting
EPA data collections, CDX will strive for error reduction by leveraging
electronic data collection technologies, such as
>	designing and testing new data collections through an incremental and
highly structured quality assurance process to ensure that newly
deployed collections are user friendly,
>	pre-populating data from previous submissions and providing
intelligent tips to guide reporters on the proper format for the data
being submitted,
>	instituting data standards and establishing standard data exchange
templates and user guidance, thereby reducing the opportunity for
confusion and incorrect data entering EPA systems, and
>	maintaining CDX customer service, online help systems, and related
program help desks to coach submitters on the proper entry of data
into our systems through CDX.
~	Error detection/data validation: Although the primary responsibility for
quality data rests with the submitter, CDX can minimize data entry errors
through early-detection edit checks. All data collections submitted
through CDX shall be subjected to a series of standard edit checks focused
on the structure, syntax, and formatting of the data received and, where
appropriate, more rigorous data quality checks of the content of the data
(a minimal sketch of such checks follows this list).
~	Data reconciliation: An opportunity exists for leveraging CDX
capabilities to facilitate reconciliation of the data with our external
community. In the past, programs relied on a long and often arduous
process of printing data received from the submitter and providing hard
copies of these data back to the submitter for review and reconciliation.
CDX can instead leverage our customers' "online" access through CDX to
view and revise data almost in real time. While a number of procedural and
policy issues need to be resolved, CDX is currently developing this
approach with the new UCMR reporting requirements for the drinking water
program. In this scenario, laboratories submit data to CDX for review,
reconciliation, and approval online by local and state regulatory agencies
before the data are loaded into production UCMR databases. This provides a
new and efficient approach for the identification and reconciliation of
data errors and holds a great deal of promise for other collections
supported by CDX.
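To make the standard edit checks concrete, the sketch below (Python) validates the structure, syntax, and formatting of a single submitted record. The field names, length limits, and patterns are hypothetical stand-ins; each CDX collection defines its own program-specific rules.

# Hypothetical sketch of pre-load edit checks on a submitted record.
# The rules (field names, lengths, formats) are illustrative only.
import re

RULES = {
    "permit_number": {"required": True, "max_len": 9, "pattern": r"^[A-Z]{2}\d{7}$"},
    "facility_name": {"required": True, "max_len": 80, "pattern": None},
    "report_date": {"required": True, "max_len": 10, "pattern": r"^\d{4}-\d{2}-\d{2}$"},
}

def edit_check(record):
    """Return a list of error messages; an empty list means the record passes."""
    errors = []
    for field, rule in RULES.items():
        value = record.get(field, "")
        if rule["required"] and not value:
            errors.append(field + ": required field is missing")
            continue
        if len(value) > rule["max_len"]:
            errors.append(field + ": exceeds maximum length")
        if rule["pattern"] and not re.match(rule["pattern"], value):
            errors.append(field + ": does not match the expected format")
    return errors

# Example: a malformed permit number yields one error.
print(edit_check({"permit_number": "XX123", "facility_name": "Acme Corp.",
                  "report_date": "2001-09-15"}))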
CDX Data Quality Features
In a broad sense, data quality may be defined both in terms of the quality of
the "content" of the data (precision of estimates, method of determination,
etc.) and in terms of the quality of its "delivery" (timeliness, degree of
freedom from defect). CDX primarily focuses on new, high-quality approaches to
delivering data to EPA's program systems. Therefore, CDX is primarily
interested in measuring attributes of data quality associated with data
delivery, such as processing time, data entry errors, data quality checks,
error notifications, and valid authentications of submitters. Because CDX is
currently in prototype, the current data quality effort focuses on
establishing and testing a baseline of data quality attributes with which to
track performance of the "final" CDX system. The data quality attributes for
which measurements are under development and testing include the following
(a brief computation sketch follows this list):
~	percentage of data elements checked for compliance with CDX basic data
quality parameters (field length, field type, completeness of transaction,
etc.) across agency data exchanges processed through CDX (presently under
the prototype system; the data quality goal is 100 percent basic data
quality checks),
~	frequency of error reports from CDX provided to agency systems on those
exchanges,
~	time required to access the electronic archive to aid reconciliation of
all transactions occurring between EPA and its partners,
~	processing time of data entering our systems by automating data entry
processes that are currently manual,
~	number of customer service calls received on CDX application deployment,
and
~	number of successes and failures of registration/installation of clients
through CDX.
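By way of illustration, the first attribute (basic-check coverage) reduces to a simple percentage computation such as the following; the function and the sample counts are invented for the example and do not reflect measured CDX data.

# Hypothetical sketch: percentage of data elements subjected to basic
# data quality checks in an exchange. The sample counts are invented.
def basic_check_coverage(elements_checked, elements_total):
    """Return the coverage percentage; the stated CDX goal is 100 percent."""
    if elements_total == 0:
        return 0.0
    return 100.0 * elements_checked / elements_total

# Example: 450 of 500 elements checked -> 90.0 percent coverage.
print(basic_check_coverage(450, 500))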
Because CDX is in the prototype phase and will not reach full "production"
until FY 2003, the attributes of the CDX quality control features are still
under development and the "metrics" associated with these attributes are not
yet finalized. A preliminary set of metrics on these attributes is provided in
more detail as part of this quality assurance plan.
CDX and the Error Correction Process
CDX is the data exchange portal for EPA. If, in the course of registering or
submitting data through CDX, the submitter discovers an error, our technical
support can direct them to the error correction process. Additionally, CDX
provides "viewing features" by allowing states and industry access to select
reports for review. CDX is also working with FRS and agency program systems to
determine the long-term relationship between CDX and the Agency's error
correction process.
CDX Data Validation
CDX is attempting to introduce intelligent checks to the webforms, and highly
specific attributes to the XML and EDI files, that will reduce errors in
transmission. Once data are transmitted, however, CDX at a minimum performs
general conformance checks on specific data elements (e.g., field length,
alphanumeric format). In some cases (e.g., PCS/IDEF), highly sophisticated
edits are performed to assess content (e.g., code values) against accepted
values, as sketched below. In the latter case, these checks are tied to
"intelligent" middleware currently under development with guidance from the
program.
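A minimal sketch of such a content edit follows (Python), checking coded values in an XML submission against a set of accepted values. The element name, the accepted codes, and the function are illustrative assumptions and are not the CDX middleware itself.

# Hypothetical sketch of a post-transmission content check: flag coded
# values that are not in the accepted set. Element names and codes are
# illustrative assumptions.
import xml.etree.ElementTree as ET

ACCEPTED_METHOD_CODES = {"EPA-524.2", "EPA-525.2", "EPA-515.1"}

def invalid_method_codes(xml_text):
    """Return the method codes in the submission that are not accepted."""
    root = ET.fromstring(xml_text)
    return [e.text for e in root.iter("MethodCode")
            if e.text not in ACCEPTED_METHOD_CODES]

sample = """<Submission>
  <Result><MethodCode>EPA-524.2</MethodCode></Result>
  <Result><MethodCode>EPA-999.9</MethodCode></Result>
</Submission>"""

print(invalid_method_codes(sample))  # -> ['EPA-999.9']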
-------
Appendix J
Abbreviations, Acronyms, and Definitions
Bill of Materials	A Rational Unified Process template which describes the version description. This document describes the inventory of materials, installation instructions, software changes, known errors, and problematic features.
CDX	Central Data Exchange
COR	Copy of Record, one type of data flow through CDX
COTS	Commercial Off The Shelf software
EPCRA	Emergency Planning and Community Right-to-Know Act
GOTS	Government Off The Shelf software
IMS	Information Management System
ISO	International Organization for Standardization
New Work Survey	The initial form to be completed for all IT development projects. It is an overview assessment of the project. This information is used to determine whether the work will be performed.
Project Files	Collections of files related to or associated with the CDX project and stored in a central repository in a shared workspace.
PD	Program director
PM	Program manager
Prototype	Software and supporting hardware and network components that are designed to evaluate the feasibility of a product. The prototype may not include the full functionality of the system.
Quality Assurance/Quality Control (QA/QC)	The process of performing checks and balances against a software process or product. This process requires the use of various reviews, audits and evaluations, and testing steps to ensure best practices are being conducted.
Quality Review	An assessment of all relevant project deliverables, documentation, and procedures to ascertain their degree of compliance with project requirements. A Quality Review may be scheduled for certain phases of the Software Development Life Cycle, or it may be conducted on a periodic, calendar basis to ensure continued compliance with project requirements. A Quality Review may vary in scope, depth, and formality, depending on project requirements and available resources.
Quality Review Assessment (QRA) Form	A form used to determine the type of quality review that a particular project warrants based on the sensitivity or visibility of the project.
Rational RequisitePro™	A requirements repository, which organizes requirements and provides traceability and change management throughout the project lifecycle. A RequisitePro project includes a requirements database and its related documents.
Rational ClearCase LT™	A configuration management system that manages multiple variants of evolving software systems. It enforces site-specific development policies, offers multiple developer workspaces, and provides advanced support for parallel development.
Rational ClearQuest™	An automated change reporting system for tracking software defects and requests for software enhancements. It operates as an integrated unit within the Rational development suite of software.
Rational Software	Suite of software tools being used to support the software development process. Included in the suite are tool mentors, guidelines, checklists, best practices, and suggested workflows for software development activities.
Release Notes	Release notes identify changes and known bugs in a version of a build, or deployment unit, that has been made available for use.
SCR: System Change Request	An SCR is used during the test and validation phase of the SDLC to identify requested changes and/or problems related to the design and development of software tools and to track corrective actions. Specifically, the SCR records a description of the problem or suggested enhancement, the date submitted, the priority and severity of the issue, and other pertinent information. SCR information is maintained electronically in the project files for each software development project.
SDLC	Software Development Life Cycle. The LMI SDLC consists of phases for software development, including project planning, business analysis and requirements, design, development, testing and validation, customer acceptance, and operations and maintenance. The LMI SDLC follows an iterative approach to the development of software products. Some projects may not include all phases of the life cycle, in accordance with the project's Statement of Work (SOW).
Software Product	Complete set of computer programs, procedures, and associated documentation and data developed for delivery to a customer. Includes, where applicable, databases, web pages, and other programs developed using COTS or GOTS application development tools. A product can contain multiple deployment units and may be accessible as a downloadable commodity, in shrink wrap, or on any digital storage media format.
SRS: Software Requirements Specification	The SRS is a technical document that identifies all the known functional and technical requirements of the software product being developed. This document assists the software developer in determining specific, detailed requirements for the development and implementation of the product. The technical project leader is responsible for initiating and maintaining the SRS as a part of the project files.
Test Case	A scenario that assesses the software product's compliance with software requirements, use cases, and other relevant requirements. A test case is derived from a Test Plan.
Test Plan	A document that describes how a software product will be tested to ensure its compliance with the SRS, Use Cases, and other relevant requirements documents.
Test Scripts	Instructions for the execution of a test case. They are
created by programming or recording, or are written
manually. Test scripts are usually forms of software, and
therefore, need to be designed, created, tested, and
maintained like any other software artifact.
Use Case	Use Cases are a method of capturing and documenting
requirements. A use case describes user interactions with a
system. Use cases define "what" a system should do, not
"how" a system works. A use case defines a sequence of
actions the system performs that yields an observable
outcome of value to the user.
XML	Extensible Markup Language
-------
REPORT DOCUMENTATION PAGE
Form Approved
OMB No. 0704-0188
Public reporting burden for this collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing this collection of information. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden, to Department of Defense, Washington Headquarters Services, Directorate for Information Operations and Reports (0704-0188), 1215 Jefferson Davis Highway, Suite 1204, Arlington, VA 22202-4302. Respondents should be aware that notwithstanding any other provision of law, no person shall be subject to any penalty for failing to comply with a collection of information if it does not display a currently valid OMB control number. PLEASE DO NOT RETURN YOUR FORM TO THE ABOVE ADDRESS.
1. REPORT DATE (DD-MM-YYYY): XX-09-2001
2. REPORT TYPE: Final
3. DATES COVERED (From - To):
4. TITLE AND SUBTITLE: Quality Assurance Project Plan for the Interim Central Data Exchange System
5a. CONTRACT NUMBER: GS-35F-4041G
5b. GRANT NUMBER:
5c. PROGRAM ELEMENT NUMBER:
6. AUTHOR(S): Kim Harris, Daniel Jackson, Jodi Narel, John Kupiec, Donald F. Egan
5d. PROJECT NUMBER:
5e. TASK NUMBER: EP005.14
5f. WORK UNIT NUMBER:
7. PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES): Logistics Management Institute, 2000 Corporate Ridge, McLean, VA 22102-7805
8. PERFORMING ORGANIZATION REPORT NUMBER: EP005T7
9. SPONSORING / MONITORING AGENCY NAME(S) AND ADDRESS(ES): U.S. Environmental Protection Agency, Attn: Matthew Leopard, Ariel Rios Building, Mail Code 2823, 1200 Pennsylvania Ave, NW, Washington, DC 20460
10. SPONSOR/MONITOR'S ACRONYM(S):
11. SPONSOR/MONITOR'S REPORT NUMBER(S):
12. DISTRIBUTION / AVAILABILITY STATEMENT: F: Further dissemination only as directed by EPA, 24 September 2001.
13. SUPPLEMENTARY NOTES:
14. ABSTRACT: The Central Data Exchange (CDX) is a system that facilitates electronic data exchanges for EPA stakeholders and is a key component of EPA's strategy for addressing federal mandates. As a single receiving point for all reports, the CDX ensures a baseline for standardization and compatibility of incoming data. In addition, the CDX provides electronic forms that are pre-filled (or pre-populated) with data that do not change or change infrequently (e.g., permit number or address), thereby reducing the stakeholder's "burden" of filling in redundant information. This CDX Quality Assurance Plan (QAP) describes the quality assurance standards, guidelines, procedures, and activities used to support the development and enhancement of the EPA's CDX system and EPA applications developed and hosted at LMI. This plan outlines current and future quality assurance activities for the CDX system. This QAP does not address the details of any specific CDX program.
15. SUBJECT TERMS: Central Data Exchange (CDX), electronic data exchanges, Quality Assurance Plan (QAP), U.S. Environmental Protection Agency (EPA).
16. SECURITY CLASSIFICATION OF: a. REPORT: UNCLASSIFIED; b. ABSTRACT: UNCLASSIFIED; c. THIS PAGE: UNCLASSIFIED
17. LIMITATION OF ABSTRACT: Unclassified Unlimited
18. NUMBER OF PAGES: 84
19a. NAME OF RESPONSIBLE PERSON: Nancy E. Handy
19b. TELEPHONE NUMBER (include area code): 703-917-7249
Standard Form 298 (Rev. 8-98)
Prescribed by ANSI Std. Z39.18