Permit Compliance System
(PCS)
Quality Assurance
Guidance Manual

Office of Wastewater Enforcement and
Compliance (OWEC)
Office of Water
U.S. Environmental Protection Agency
August 1992
American Management Systems

-------
Permit Compliance System (PCS)
Quality Assurance Guidance Manual
INTRODUCTION	1-1
1.1	Purpose of Manual	1-1
1.2	Role of PCS in the Environmental Protection Agency	1-2
1.3	PCS Policy Statement	1-3
1.4	PCS Data Quality	1-3
1.5	Organization and Suggested Use of this Document	1-4
PCS QUALITY ASSURANCE PROGRAM	2-1
2.0	Overview	2-1
2.1	PCS Data Quality Standards	2-1
2.1.1	Timeliness	2-2
2.1.2	Accuracy	2-2
2.1.3	Completeness	2-2
2.1.4	Consistency	2-4
2.1.5	Summary	2-5
2.2	Components of a PCS QA Program	2-6
2.3	Benefits of a Documented QA Program	2-7
OPERATION AND MANAGEMENT OF A QA PROGRAM FOR PCS
3.0	Overview
3.1	Data Capture
3.2	Data Transfer
3.3	Data Edit and Update
3.3.1	"Optional" Features
3.3.2	Standard Review Procedures
3.3.3	Standard Error Correction
3.4	PCS Data Base Quality Control
MANAGEMENT INFRASTRUCTURE AND GENERAL PRACTICES	4-1
4.0	Overview	4-1
4.1	Assign Staff Responsibilities	4-2
4.2	Establish Attainable Goals and Targets	4-3
4.3	Track Performance Against Goals	4-4
4.4	Assess Your Quality Assurance Program	4-4
4.5	Management of Data Input Personnel	4-5
4.6	Show Consistent Commitment to Data Quality	4-6
4.7	Promote Communication	4-7
APPENDIX A. PCS POLICY STATEMENT	A-1
APPENDIX B. PCS QA PROGRAM SELF-ASSESSMENT	B-1
APPENDIX C. SAMPLE PERMIT COMPLIANCE SYSTEM (PCS) QUALITY ASSURANCE MANUAL	C-1

-------
LIST OF FIGURES
Figure
2-1.	Products of PCS Quality Assurance Program
3-1.	Major steps required to enter DMR data into PCS
LIST OF TABLES
Table
1.	Management Functions Supported by PCS
2.	PCS Targets for Timeliness
3.	Summary of PCS Data Quality Targets
4.	An 8-point Program for PCS Quality Assurance

-------
SECTION 1
INTRODUCTION
1.1 Purpose of Manual
This Permit Compliance System (PCS) Quality Assurance Guidance Manual provides
guidance to Environmental Protection Agency (EPA) Regional Offices and states in developing
and documenting quality assurance procedures for PCS. If you already have established
documented quality assurance (QA) procedures, this manual can help you assess and improve
them.
The manual addresses three major questions that should be of concern to all Regional and
state PCS users:
¦	How is "PCS data quality" defined and how is it measured (see Section 1.4 and
Section 2)?
¦	What practices and tools should users employ to ensure that their PCS data meets these
quality standards consistently (see Section 3, Section 4, and Appendix C)?
¦	What steps can users take to assess their current practices and determine how to
improve their quality assurance program (see Appendix B)?
A sample QA manual, included in Appendix C of this document, is intended as an example
for you to use in documenting your QA procedures. It is based on guidelines presented here and
gives you detailed examples of procedures useful in providing for PCS data quality.
1-1

-------
1.2 Role of PCS in the Environmental Protection Agency
PCS is the information management system that supports the National Pollutant Discharge
Elimination System (NPDES) program, the Congressionally-mandated program for issuing
permits to facilities discharging wastewater into navigable waterways. EPA has delegated
authority for the NPDES program to environmental agencies in 38 states, while EPA Regional
Offices implement the program in the remaining states.
The Office of Wastewater Enforcement and Compliance (OWEC), which oversees the
NPDES program nationally, relies on PCS as the primary source of information on state and
Regional Office activities in the program. The Regions and delegated states use PCS as a central
repository of information on regulated NPDES facilities and to track permit, compliance, and
enforcement activities of these facilities. Table 1 describes the management functions supported
by PCS. Because the system plays such a central role in program management, it is very
important that the data it contains be complete, accurate, and up-to-date and that all users be
consistent in the way they define and use various PCS data elements.
Table 1.
Management Functions Supported by PCS
¦	Track permit issuance and reissuance
¦	Support oversight of NPDES programs by identifying automatically:
	-	Compliance Schedule Violations
	-	Discharge Monitoring Report (DMR) Effluent Limit Violations
	-	Discharge Monitoring Report (DMR) Reporting Violations
	-	Reportable Noncompliance Violations (RNC)
¦	Determine compliance statistics at a State or national level
¦	Track enforcement actions and the resulting compliance schedules and interim limits
¦	Respond to requests for information from Congress, state legislatures, and the general
public
1-2

-------
1.3 PCS Policy Statement
The PCS Policy Statement, issued in 1985 (Appendix A), designated PCS as the national
data base for the NPDES program and established the minimum required standard for data entry
-- the Water Enforcement National Data Base (WENDB). The integrity of these elements is
essential because they form the core of data used to generate statistics for the NPDES program.
In this policy statement, OWEC required all direct PCS users to take steps to ensure the quality
of their PCS data. The Policy Statement mandates that each user's QA program shall include:
¦	monthly measurement of the level of data entered
¦	appropriate time frames to ensure that data are entered in PCS in a timely manner
¦	nationally consistent standards of known data quality based on proven statistical
methods of quality assurance
¦	targets for the completeness (for assurance of full data entry) and accuracy of the data
entered into PCS.
1.4 PCS Data Quality
The development of your PCS QA program should be guided by a clear definition of what
is meant by the term "data quality." It is especially important to recognize that the following
key principles underlie the development of quality in PCS data:
¦	Quality must be built in from the very beginning of the data collection, preparation,
and entry process. It is often costly in terms of resources, timeliness, and efficiency
to make corrections after the data are entered into PCS.
¦	Quality is achieved through effective management and commitment. Responsive
1-3

-------
management should provide the support and training that is necessary to achieve good
data quality and recognize, reward, and encourage quality service and performance.
¦ Quality must be tracked and performance of the program evaluated at regular intervals.
Data quality must be measurable so that the causes of poor data quality can be
identified and corrected.
1.5 Organization and Suggested Use of this Document
Following this introduction, Section 2 discusses the components and benefits of a PCS QA
program. Section 3 provides an overview of procedures that should be included and Section 4
suggests management practices that will help you administer your program. The PCS Policy
Statement is reproduced in Appendix A. Appendix B provides a method to assess your current
QA program and Appendix C provides a Sample PCS Quality Assurance Manual.
PCS managers and their staff are encouraged to use this guidance document, in conjunction
with Appendix C, when they develop or modify their own PCS QA manual. Appendix B should
be used as a reference when you evaluate your PCS QA program.
1-4

-------
SECTION 2
PCS QUALITY ASSURANCE PROGRAM
2.0	Overview
Your QA program should be based on a thorough understanding of the national PCS data
quality standards for timeliness, accuracy, completeness, and consistency. Combining this
knowledge with well-documented procedures and effective management commitment will
produce a successful QA program resulting in better PCS data, smoother PCS operations, and
improved operation and management of the NPDES program.
2.1	PCS Data Quality Standards
How well your PCS data meet the definition of quality can be evaluated based on an
objective assessment of each of the following four measures:
¦	Timeliness — the extent to which the data covering a specific interval of NPDES
program activity are promptly entered into PCS
¦	Accuracy — the extent to which the data recorded in PCS reflect the correct,
true, or reported values
¦	Completeness — the extent to which the required data are reported and recorded in
the system
¦	Consistency — the extent to which the values of the data elements use the standard
definitions or codes and the extent to which these definitions and
codes are used in the same way by all users.
2-1

-------
2.1.1 Timeliness
Timeliness refers to the "punctuality" of information in the data base — as measured by the
length of time between the actual event (or receipt of information about the event) and its
appearance in the data base. PCS targets for timeliness vary by the type of data being entered
into the system. Table 2 defines the target for timeliness for each PCS data type. For example,
Discharge Monitoring Report (DMR) data should be entered into PCS within 10 working days
of the receipt of the DMR.
¦	Information from DMRs should be entered into PCS within 10 working days of
receipt; most other data types should be entered into PCS within five working days
of the receipt of the information.
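To make the timeliness measure concrete, the calculation described above can be sketched in code. The fragment below is purely illustrative — the function names and the record layout are not part of PCS — and simply counts the working days between receipt of a DMR and its entry into the system, comparing the count against the 10-working-day target.

```python
from datetime import date, timedelta

DMR_TARGET_WORKING_DAYS = 10

def working_days_between(received: date, entered: date) -> int:
    """Count Mon-Fri days after `received` up to and including `entered`."""
    days = 0
    current = received
    while current < entered:
        current += timedelta(days=1)
        if current.weekday() < 5:  # 0-4 are Monday-Friday
            days += 1
    return days

def is_timely(received: date, entered: date,
              target: int = DMR_TARGET_WORKING_DAYS) -> bool:
    """True if the data were entered within the working-day target."""
    return working_days_between(received, entered) <= target

# A DMR received Friday 1992-08-07 and entered 1992-08-20 falls within
# the 10-working-day target; one entered 1992-08-28 does not.
```

The same helper can be reused for the five-working-day targets by passing a different `target` value.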
2.1.2	Accuracy
Accuracy refers to the absence of erroneous data resulting from mistakes during any point
in the data preparation, entry, or transmission process. Errors sometimes result from mistakes
by key-entry personnel, but they can also be introduced by program or facility personnel who
prepare the source documents used for data entry. Data entry errors are usually misspellings
and incorrectly entered values, while transmission errors often result in transpositions of
characters and dropped digits.
¦	95% of the WENDB elements entered into PCS should be identical to those
reported on the DMR, permit, or other input document.
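As an illustration of how the 95% accuracy target might be measured, the sketch below compares entered values against the source document field by field. The field names are invented for the example and are not actual WENDB element codes.

```python
def accuracy_rate(source_values: dict, entered_values: dict) -> float:
    """Fraction of fields whose entered value is identical to the source."""
    matches = sum(
        1 for field, value in source_values.items()
        if entered_values.get(field) == value
    )
    return matches / len(source_values)

# A transposition error ("300" entered for "30") drops the match rate
# well below the 95% target.
source = {"flow": "1.2", "ph": "7.1", "tss": "30"}
entered = {"flow": "1.2", "ph": "7.1", "tss": "300"}
print(f"{accuracy_rate(source, entered):.0%}")  # → 67%
```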
2.1.3	Completeness
Completeness refers to the amount of required data present in the data base at a specific
point in time. Completeness is important to assure that all pertinent information is available for

-------
Table 2.
PCS Targets for Timeliness
¦	Permit Facility Data — To be entered within five working days of date of receipt of permit application and permit issuance.
¦	Pipe-Schedule Data — To be entered within 10 working days of permit issuance.
¦	Parameter-Limits Data — To be entered within 10 working days of permit issuance or issuance of enforcement action.
¦	Measurement/Violation Data — Measurement data to be entered within 10 working days of receipt of DMR.
¦	Compliance Schedule Data — To be entered within five working days of schedule establishment or notification of completion.
¦	Compliance Schedule Violation Data — System generated.
¦	Enforcement Action Data / Enforcement Action Key Data — To be entered within five working days of the enforcement action.
¦	Single Event Violation Data — To be entered within five working days of notification of the violation event (e.g. fish kill).
¦	Pretreatment Performance Summary Data — To be entered within thirty working days of receipt of pretreatment annual report.
¦	Inspection Data — To be entered within 10 working days of receipt of inspection report.
¦	Pretreatment Audit/PCI Data — To be entered within 10 working days of receipt of audit or inspection report.
¦	Permit Events Data — To be entered within five working days of date of receipt of permit application and permit issuance.
¦	Evidentiary Hearing Data — To be entered within five working days of the appropriate hearing date.
2-3

-------
use when it is needed. The PCS Policy Statement has designated the WENDB elements as the
minimum set of data elements required for PCS.
Timeliness and completeness are closely interrelated and are often considered together. In
particular, completeness must always be evaluated with respect to time — that is, data are
complete (or incomplete) as of today. However, in some instances it is easy to separate these
two factors. For example, a report may be received and entered into PCS in a timely manner,
but lacks a few WENDB elements; in this case information was received and entered in a timely
way, but is not entirely complete.
¦ 95% of the WENDB elements should be entered for each facility.
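A completeness check of this kind can be sketched as follows. The required-element list and record fields below are placeholders for illustration, not the actual WENDB element set.

```python
def completeness_rate(required_fields: list, record: dict) -> float:
    """Fraction of required elements present (non-empty) in a facility record."""
    present = sum(
        1 for f in required_fields if record.get(f) not in (None, "")
    )
    return present / len(required_fields)

# Two of four required elements are present, so this record is 50%
# complete as of today — below the 95% target.
required = ["npdes_id", "permit_issued", "flow_limit", "compliance_status"]
record = {"npdes_id": "XX0000001", "permit_issued": "1992-01-15", "flow_limit": ""}
```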
2.1.4 Consistency
Consistency refers to the extent to which appropriate values are used for a data element as
defined nationally (in the case of WENDB data elements). For management reports to be the
most effective, data must be comparable over time within the area of interest. Comparable
codes or values must be used for the same data elements over time and in different geographical
areas (state or Region), if valid comparisons are to be made by PCS managers. For example,
program managers may want to identify facilities authorized to discharge large amounts of a
certain pollutant (e.g., cadmium); this data can be readily compiled from PCS, however, only
if a consistent code has been used for the pollutant.
Consistency errors usually originate with the program or facility staff who prepare source
documents. They differ from accuracy errors in that the value is correct as far as the coder is
concerned, but the selection of the code is not consistent with the approved definition of the data
element, the data element value, or with the way the code is used by others. An example
illustrating this problem is the difficulty of identifying measurement data at the end-of-pipe to
be used for loadings. Due to the multiple values in the monitoring location (MLOC) data
element used by Regions and states, it is extremely difficult to identify the end-of-pipe. National
2-4

-------
data consistency standards will be developed for the WENDB elements.
¦ 100% of the WENDB elements should use appropriate values as defined nationally.
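The 100% consistency target amounts to checking every code used against the nationally defined value set. A minimal sketch follows; the code table is an invented stand-in, not an actual national definition.

```python
# Stand-in for a nationally defined set of parameter codes;
# the values here are illustrative only.
NATIONAL_PARAMETER_CODES = {"01027", "00400", "00530"}

def inconsistent_codes(used_codes) -> list:
    """Return the codes that do not appear in the national definition."""
    return sorted(set(used_codes) - NATIONAL_PARAMETER_CODES)
```

Running such a check against all codes in use would surface, for example, a locally invented pollutant code that prevents valid national comparisons.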
2.1.5 Summary — the Four Measures of Quality
Problems in any of these four measures can degrade overall data quality and undermine the
confidence users have in PCS. Therefore, in order for the quality of PCS data to be "good",
the data must meet each of the four data quality targets for timeliness, accuracy, completeness,
and consistency. Failure to meet these targets is especially serious when the WENDB
elements are affected, because these data are used for program decisions and public access.
Uncorrected errors in the optional data elements can also pose significant problems to users.
The four measures of data quality and their suggested targets should be incorporated into
your objectives for your QA program. While your data quality objectives should be set at an
attainable and realistic level, they should be at least as stringent as the national standards
summarized below (Table 3). Further, it is important to track your performance against the
targets.

Table 3.
Summary of PCS Data Quality Targets
¦	Timeliness — Based on data type. See Table 2 for targets.
¦	Accuracy — 95% of the WENDB elements in data base identical to original source document.
¦	Completeness — 95% of the WENDB elements entered for each facility.
¦	Consistency — 100% of the WENDB elements use appropriate value defined nationally.
2-5

-------
2.2 Components of a PCS QA Program
The PCS Policy Statement requires all direct users to develop a QA .program that includes
monthly tracking of the level-of-data entered, appropriate time frames-for data entry, and
nationally consistent standards for PCS data completeness and accuracy. .Within these broad
guidelines, PCS users have the flexibility to configure their individual program in a manner that
best suits their own resources and requirements. Some factors influencing this decision include
the extent of optional data elements used, available staff, severity of past data problems, and the
cost of developing or implementing new procedures,
OWEC strongly recommends a "prevention oriented" approach for your QA program. You
should try to prevent problems through clear procedures, well-defined responsibilities, and
understandable instructions — then catch and correct the few problems that will inevitably occur.
This economical approach saves substantial amounts of rework and uses your limited PCS staff
and resources most efficiently. Simply put, it is far easier and cheaper to do the work right the
first time than it is to redo it.
¦ A successful PCS QA program uses eight simple ingredients. You will need to draw on
each of these points to meet your individual needs (see Table 4 on the following page).
If you currently have a formal QA program, you may want to evaluate it to determine
where and how it may be improved. If so, Appendix B contains a check-list to help you assess
your current program. By using this check-list you will be able to measure your program and
identify which of the ingredients listed above might be improved upon or incorporated into your
existing program.
2-6

-------
Table 4.
An 8-point Program for PCS Quality Assurance
Set Your Quality Objectives
¦	Measurable, well-defined data quality objectives
Document Your Procedures — and Follow Them
¦	Well-documented data collection and handling procedures
¦	Procedures for detecting and correcting errors
Manage for Quality
¦	Procedures to measure and track performance against goals
¦	Clearly assigned staff responsibilities and oversight
¦	Adequate documentation, training, and communication
¦	Consistent management commitment to data quality
¦	Periodic review and evaluation of your QA program
2.3 Benefits of a Documented QA Program
A vigorous QA program using the eight points listed above forms the foundation for
supporting the many tiers of interrelated functions of PCS (Figure 2.1). In particular, a good
PCS QA program will pay off in four areas:
¦ Better PCS Data
The most immediate and observable effect of a QA program is the improvement in quality
of PCS Data. The procedures outlined here, in conjunction with management commitment to
the PCS quality targets, will produce PCS data that is timely, accurate, complete, and consistent.
High quality data underlies all other system and programmatic benefits.
2-7

-------
Figure 2.1
Products of PCS Quality Assurance Program
[Figure: a tiered diagram showing the PCS Quality Assurance Program (set your quality objectives, document your procedures, manage for quality) and the PCS Policy Statement producing timely, accurate, complete, and consistent PCS data. That data quality in turn supports PCS operations, NPDES operations, and the NPDES program: public access to quality PCS data, valid and responsive public policy, consistent permit criteria, demonstrated program effectiveness, valid compliance and enforcement actions, protection against staff turnover, and more efficient budget use.]
2-8

-------
¦ Smoother PCS Operations
Benefits to the PCS operations go far beyond satisfying the PCS Policy Statement
requirement of establishing a QA program. First, a well-documented QA program will increase
the efficiency of PCS data entry by providing clearly defined staff assignments, QA procedures,
and ready references to resolve problems. PCS will also more effectively perform program
functions, such as identifying limit violations and printing the Quarterly Non-Compliance Report
(QNCR). Second, written documentation of the procedures to be followed is the best insurance
against staff turnover. With good documentation, you can bring new employees up to speed
quickly enough to maintain the program's existing performance. Third, once the key ingredients
of the QA program are established and functioning, fewer resources will need to be invested in
operations and management, freeing resources to be allocated elsewhere in the office.
¦	Sounder NPDES Program Operations
Substantial benefits will accrue to NPDES program operations. Improvements to the quality
of PCS data will yield timely, reliable data for analysis. By using this data, permit writers will
develop more consistent permit criteria and compliance engineers will be able to determine valid
compliance and enforcement actions.
Your QA program will also yield benefits to the overall NPDES program. By using quality
data, OWEC, Regions, and States will be able to work together to establish valid and responsive
public policy. Further, PCS managers will have the ability to demonstrate the validity of their
decisions to the regulated community, Congress, courts of law, and other organizations.
¦	Improved NPDES Program Management
In addition, benefits from your QA program will reach the general public. Hundreds of
Freedom of Information Act (FOIA) requests are now received every year. OWEC, Regional,

-------
and State staff respond to these requests by providing access to all requested PCS data, except
that which is enforcement sensitive. QA programs will provide the general public with
increasingly better information. Quality data will become more and more important as complete
public access to PCS data becomes a reality in the future.
2-10

-------
SECTION 3
OPERATION AND MANAGEMENT OF A QA PROGRAM FOR PCS
3.0 Overview
This section describes the areas in the PCS data handling process where quality assurance
is necessary and identifies the procedures which should be developed in each of these areas in
order to implement a successful QA program. A comprehensive QA program should include
well-documented procedures for data capture, data transfer, edit and update error correction, and
data base quality control. By paying proper attention to data integrity in these four areas, you
can dramatically improve the quality of your PCS data.
¦	Data capture	— procedures relating to document handling prior to its
entry into PCS. An example of data capture is the
receipt, logging, sorting, and transfer to data entry
personnel of Discharge Monitoring Reports (DMR)
from NPDES facilities.
¦	Data transfer	— procedures relating to the entry of information from
the input documents, such as the DMR, into PCS.
Examples include the detailed screening of the input
documents, coding the input, resolution of any
obvious problems (for example, missing data), and
entering the data into PCS using PCS-ADE, PC-
Entry, or batch entry.
¦	Edit & update error correction - procedures to correct data errors resulting from the
PCS Edit or Update process. Examples include
running the PCS Edit to produce PCS Edit Audit
reports, running PCS Update to produce the Update
Audit report, and correcting any errors that are
detected on these reports.
¦	Data base quality control — procedures to correct data errors identified on
reports from the Generalized Retrieval or Inquiry.
An example is running the PCS QA Retrieval to
identify missing or invalid data elements.
3-1

-------
Figure 3.1 examines the process of entering DMR data into PCS to illustrate the major
events in each of the four QA areas. The DMR originates in the permitted facility and is sent
to the EPA Regional Office (or delegated state office) where it is received, logged, sorted, and
given to data entry personnel (data capture). Next, the DMR is screened, any obvious problems
are resolved, and the DMR data are entered into PCS (data transfer). The PCS Edit program
may then be run to check the new data, after which the PCS Update is run to place the new data
into the PCS data base. Each of these programs produces a report that identifies errors to be
corrected (edit and update error correction). After the PCS data base is updated, reports can
be produced to help identify and correct existing errors (data base quality control). All
information flowing into PCS may be quality checked in these four areas, although the details
will vary by the type of input being processed.
The remainder of this section provides the principles and procedures that can be applied to
each of these areas to improve data quality. Appendix B, PCS Quality Assurance Program Self-
Assessment, provides guidance on the evaluation of the quality of PCS data and on the
assessment of a quality assurance program. Appendix C, Sample PCS Quality Assurance
Manual, includes specific examples of the procedures suggested here and should be referenced
for more detailed information.
3.1 Data Capture
Data capture includes the receipt and sorting of input documents, their preliminary review
for missing or inaccurate data, and their submission for data entry. Major events pertaining to
data capture include recording the receipt of a particular document, reviewing it for
completeness, and routing it to its proper destination for entry into PCS. To ensure that data
are properly captured you should:
¦	Keep a list of input due dates.
¦	Log in reports when received.
3-2

-------
Figure 3.1
Major Steps Required to Enter DMR Data Into PCS
[Figure: flow diagram of DMR processing. The DMR originates at the facility; during data capture, personnel log and sort DMRs and route reviewed documents to the program office; during data transfer, staff screen and code the DMR, resolve problems, and enter the data into PCS; during edit/update error correction, the PCS Edit produces the Edit Audit Report used to correct errors.]

-------
¦	Follow up quickly on non-receipt or missing data.
¦	Ensure input document is properly routed for data transfer.
Knowing when input documents are due will help you anticipate your expected work load
for the short term (e.g. monthly), easily identify facilities that are late submitting their reports,
and prepare appropriate follow-up action against tardy permittees (if required). Much of the data
is submitted on a regular schedule (for example, the submission of DMRs) or based on a known
schedule (for example, permit expiration dates and compliance schedule deadlines). If needed,
the PCS Generalized Retrieval can produce several management reports (for example, the DMR
Administrative Report and the Compliance Forecast Report) that can help you compile a list of
the due dates.
As reports are received, they should be logged-in. This allows you to document their
receipt and provides a checkpoint if problems occur in the future. A standard log-in form should
be developed (see page 12, Appendix C) for each of the input documents that are routinely
handled. These forms will also aid you in identifying the facilities that are late in submitting
their required reports.
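The kind of information a standard log-in form captures can be sketched as a simple record. The field names below are invented for illustration and do not reflect any official EPA log-in format.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class LogInRecord:
    npdes_id: str        # permit number of the submitting facility
    document_type: str   # e.g. "DMR", "permit application"
    date_due: date
    date_received: date
    routed_to: str = ""  # data entry destination, noted at transfer

    @property
    def days_late(self) -> int:
        """Calendar days past the due date (0 if on time)."""
        return max(0, (self.date_received - self.date_due).days)

# Scanning the log immediately yields the facilities that submitted late.
log = [
    LogInRecord("XX0000001", "DMR", date(1992, 7, 28), date(1992, 7, 27)),
    LogInRecord("XX0000002", "DMR", date(1992, 7, 28), date(1992, 8, 4)),
]
late_facilities = [r.npdes_id for r in log if r.days_late > 0]
```

Whether kept on paper or in a simple file like this, the log documents receipt, supports follow-up on late permittees, and provides a checkpoint if problems occur later.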
You should follow up on the non-receipt of required documents as soon as possible. In
some cases, a phone call reminding the appropriate individual that their documents are due will
prompt them to be more punctual. A list of common phone numbers may be kept at hand so
they may be contacted easily (see page 16, Appendix C). Those problems that cannot be solved
quickly should be flagged so they can be pursued more thoroughly later. Care must be taken,
however, to ensure that data is NEVER entered or corrected in PCS based on a telephone
conversation. For all data entered into PCS there must exist a valid source document to
substantiate the data. Moreover, since phone calls are not valid enforcement actions, written
requests must be used in cases of recurrent problems so the escalation of enforcement actions
may be documented.
3-4

-------
You should also develop procedures to verify that input documents are properly routed for
data entry after they are logged in and screened. One method of doing this is to note the
transfer of input documents to data entry on the log-in form or to develop and use a separate
routing form. This is especially important for PCS input documents that are on a long or
irregular submission cycle.
3.2 Data Transfer
Data transfer includes events taking place after the input document has been received and
logged-in until the data has been keyed into the system. If pursued carefully and diligently, the
screening of PCS data and the timely resolution of the identified problems before data entry will
have a significant effect on PCS data quality. During data capture, the input documents were
examined to make sure that they were complete. In this step, the examination is more detailed
and focuses on the actual data. Major events pertaining to the transfer of PCS data include the
detailed screening of reports, resolving obvious problems, coding forms for data entry, and
entering data.
Your Quality Assurance program should focus on establishing and documenting standard
procedures for each major data transfer event so that data are correctly entered into PCS and
any necessary corrections are properly entered. The following procedures should increase the
probability that data are correctly taken from the input documents and transferred into PCS.
¦	Use check-lists to screen incoming documents and
to code documents for data entry (e.g. Sample DMR).
¦	Keep a common problem / solution log.
¦	Develop a step-by-step procedure for all data entry methods.
The detailed screening of each input document is difficult and time consuming because
many documents are very complex. One way to speed up their review, and to increase the
3-5

-------
accuracy of the screening at the same time, is to develop a check-list of items to be examined
(see Section 2.2, Appendix C). These check-lists will focus the attention of the reviewer on the
most important data in an orderly way. This method can also significantly improve the quality
and increase the speed of coding data input documents. For example, the Sample Discharge
Monitoring Report employs a checklist approach to ensure the quality of permit limits data in
PCS by improving communication between the permit writer and the PCS coder (see Section
2.2.3.1, Appendix C). In addition to Sample DMRs, checklists may also be implemented for
coding data from Permits, Compliance Schedules, and Enforcement Actions.
Three methods of data entry are employed in PCS — PCS-ADE, PC-Entry, and Batch.
Each method meets a different need in the PCS user community. PCS-ADE is an on-line data
system specifically designed to provide extensive edit checking as the data is entered and to
enable the user to enter key data as efficiently as possible. PC-Entry is a micro-computer data
entry system with a similar format to that found in PCS-ADE. Data are entered using PC-Entry
and then the data set is batch submitted to the PCS Edit. PC-Entry is especially useful to users
entering large amounts of data because it allows rapid data entry. All PCS data base
transactions, other than those using the on-line PCS-ADE, are batch-entered. You should
establish and document instructions for coding and entering data for each of the data entry
methods in use in your Region or state.
3.3 Data Edit and Update
After PCS data has been entered into the system through PCS-ADE or "batch", PCS
processes the transactions via two methods. First, the PCS Edit Process evaluates the
transactions for correct syntax and stores them in a transaction file until they are updated into
the PCS data base. The PCS Edit may be run at any time. Next, the PCS Update Process
incorporates the stored transactions into the PCS data base. The PCS Update is run by
Headquarters staff every Monday and Thursday.
3-6

-------
After PCS processes the data through the Edit and Update, it is necessary to develop QA
procedures to correct transactions that are rejected in the Edit or Update process.
¦	Use "Optional" features to check your data.
¦	Develop standard review procedures for the Edit Audit
and the Update Audit report.
¦	Develop standard procedures to correct identified errors.
3.3.1 "Optional" Features
Two optional PCS features can be used to check the data, identify errors, and correct them
prior to using the PCS Edit or Update. These features are the PCS Dummy Edit and PCS Range
Checking.
The PCS Dummy Edit or test edit allows the user to have the Edit process evaluate each
transaction in the batch submittal for the correct syntax. This optional process produces only
a PCS Edit/Audit Report (Dummy) which can be reviewed, annotated, and used to correct the
data before running the "live" Edit. Data are not stored in the transaction file to be updated.
Range-Checking is another optional technique to improve the accuracy of measurement data.
It is based on the idea that each valid measurement falls within a particular "range" of
high/low values that are most likely to occur. It follows that any values outside this
range are quite possibly errors and should be investigated. OWEC is now undertaking the
development of range-checking capabilities for PCS. The range-checking capability will be
added to batch Edit and PCS-ADE as an optional feature that the user may select. The users
will have the option of selecting a national set of default ranges for primary pollutants, or of
developing and using their own individual tables covering any parameter. When implemented,
this feature will offer PCS users a powerful method of analysis and quality assurance.
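The range-checking idea described above can be sketched in a few lines. This is only an illustration of the concept, not the PCS implementation (which was still under development when this manual was written); the parameter names, default ranges, and function name below are all hypothetical.

```python
# Hypothetical sketch of range checking for measurement data.
# These default ranges are illustrative only, not EPA's national tables.
DEFAULT_RANGES = {
    "dissolved_oxygen_mg_l": (0.0, 20.0),
    "ph": (0.0, 14.0),
}

def flag_out_of_range(measurements, ranges=DEFAULT_RANGES):
    """Return (parameter, value) pairs falling outside their expected range."""
    suspect = []
    for parameter, value in measurements:
        low, high = ranges.get(parameter, (float("-inf"), float("inf")))
        if not (low <= value <= high):
            suspect.append((parameter, value))
    return suspect

readings = [("ph", 7.2), ("ph", 71.0), ("dissolved_oxygen_mg_l", 8.5)]
flag_out_of_range(readings)  # flags ("ph", 71.0), likely a keying error for 7.1
```

A flagged value is not necessarily wrong; as the text notes, it is "quite possibly" an error and should be investigated against the source document.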
3-7

3.3.2	Standard Review Procedures
Once the optional features are run, and the identified errors are corrected, the PCS Edit can
be run to complete the edit process for batch submission. The PCS Edit automatically rejects
transactions with serious errors and places those with no, or minor, errors in a transaction file
to be incorporated into the PCS data base when the Update program next executes. This process
produces an Edit Audit Report which can be reviewed, annotated, and used to correct the
rejected transactions and amend the minor errors.
PCS-ADE offers an efficient, on-line data entry method with immediate edit checking.
Data are checked as they are entered for validity and completeness. Transactions that
successfully pass the edit checking are placed in a holding file to be processed during the next
Update.
The PCS Update is the final step needed to place the transactions into the PCS data base.
Completion of this process will produce a new PCS data base. Normally updates to the data
base are made twice per week. A PCS Update Audit report is produced when the data are
incorporated into the PCS data base. The Rejected Transactions section of this report should
be reviewed and corrections made to the data, if necessary.
3.3.3	Standard Error Correction
Each of the PCS Edit and Update audit reports has three sections: an Audit Report of
Rejected Transactions, an Audit Report of Accepted Transactions, and an Audit Summary
Report. Your QA program can be most effective by developing complete review procedures to
examine these reports, and by following through and resolving any errors found (see Section
2.3, Appendix C). If the error rate is high (exceeding 10%), you should evaluate your current
input procedures and develop a plan for reducing the error rate.
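The 10% threshold above implies a simple calculation from the audit report counts of rejected and accepted transactions. A minimal sketch, with an illustrative function name and example counts of our own:

```python
# Minimal sketch: computing a batch error rate from Edit/Update audit counts.
# The 10% review threshold comes from this manual; the counts are examples.
def error_rate(rejected, accepted):
    """Fraction of submitted transactions that were rejected."""
    total = rejected + accepted
    return rejected / total if total else 0.0

rate = error_rate(rejected=37, accepted=463)  # 37 of 500, i.e. 7.4%
needs_review = rate > 0.10  # False here; above 10%, review input procedures
```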
3-8

3.4 PCS Data Base Quality Control
Once the data has been entered and uploaded into the PCS data base, the focus of quality
assurance shifts from preventing new errors to identifying and correcting existing errors in the
data base. You may produce reports through the PCS Generalized Retrieval subsystem to
determine the quality and quantity of data in PCS and to highlight existing errors in the data base
which require correction.
¦	Use the PCS Quality Assurance report.
¦	Use customized data quality reports to examine questionable data.
¦	Develop special techniques to meet your particular needs.
The PCS Quality Assurance Retrieval is a new report developed to provide quality
assurance of all WENDB elements on a permit-by-permit basis. This retrieval examines each
WENDB element and determines if it, or a related element, has a missing or invalid value. The
pre-formatted report produced by this retrieval displays all WENDB elements and their
associated error messages. The user can specify the data types to be displayed. The report will
also print a summary of each of the data types selected. The value of this report is that it
provides quality assurance for all WENDB elements in a single easy-to-read report and checks
for missing values in related data elements.
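The kind of completeness check the Quality Assurance Retrieval performs can be illustrated with a short sketch. The record layout and function are hypothetical illustrations (the real retrieval runs inside PCS); the element acronyms are taken from the WENDB list in Appendix C.

```python
# Hypothetical illustration of a WENDB completeness check on one permit record.
# NPID, DTSC, EVNT are WENDB acronyms from Appendix C; the record layout is ours.
REQUIRED_ELEMENTS = ["NPID", "DTSC", "EVNT"]

def missing_elements(record, required=REQUIRED_ELEMENTS):
    """List required elements that are absent or blank in a permit record."""
    return [e for e in required if not record.get(e)]

record = {"NPID": "AL0000001", "DTSC": "", "EVNT": "C1"}
missing_elements(record)  # the blank Compliance Schedule Date (DTSC) is reported
```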
You may develop customized data quality reports by using the PCS Generalized Retrieval
to examine a particular subset of data. This method is especially effective in reviewing data that
is not checked by the Quality Assurance Retrieval (i.e., non-WENDB elements). An example
is the Quick Look report which can check the validity of PCS data against input documents.
Customized reports could be used to:
¦	check for completeness and accuracy of DMR data
¦	check for accuracy of permit data
3-9

¦	check for inconsistent coding of Pipe Schedule Data
¦	perform limited QA checks to supplement the QA Report (such as checking for the
absence of a Permit Expiration Date when a Permit Issuance Date is present).
Special techniques are a third retrieval method which may also be developed to meet a
particular quality assurance need. One useful technique is trend analysis. Trend analysis
examines the values of the same data parameters at regular intervals over time, for example,
monthly values of dissolved oxygen levels over the course of a facility's permitted history. This
method is especially useful in highlighting consistency errors.
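The trend-analysis idea can be sketched as follows: flag any monthly value that departs sharply from a facility's preceding history. The threshold, data, and function name are illustrative assumptions, not part of PCS or the EDS software.

```python
# Illustrative trend check: flag a monthly value that deviates from the
# mean of the preceding months by more than a chosen multiple of their spread.
from statistics import mean, pstdev

def flag_trend_breaks(values, threshold=3.0):
    """Return indexes of values far outside the trend of the preceding months."""
    flags = []
    for i in range(2, len(values)):
        history = values[:i]
        spread = pstdev(history) or 1e-9  # avoid dividing a flat history by zero
        if abs(values[i] - mean(history)) > threshold * spread:
            flags.append(i)
    return flags

monthly_do = [8.1, 8.3, 7.9, 8.2, 0.82, 8.0]  # dissolved oxygen, mg/L
flag_trend_breaks(monthly_do)  # flags index 4 (0.82), a likely decimal-point error
```

As the surrounding text notes, a shift like this can also indicate an improper unit conversion or a wrong monitoring location code rather than a keying error.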
A technique useful for trend analysis of data is the Effluent Data Statistical (EDS) software.
The EDS software allows DMR effluent data to be either statistically analyzed or graphed over
time. In addition, mass loading reports and graphs may be produced. EDS may assist in trend
analysis and in identifying facilities with DMR data quality problems. In addition to data entry
problems, EDS will highlight improper unit conversions and the improper use of monitoring
location codes.
When errors are identified using any of the three methods above, they should be corrected
using the procedures described in the edit and update section (Section 3.3).
3-10

SECTION 4
MANAGEMENT INFRASTRUCTURE AND GENERAL PRACTICES
4.0 Overview
A QA program encompasses more than a set of well-documented procedures for avoiding
and correcting errors. PCS managers should also administer their programs in such a way as
to emphasize the importance of quality assurance and stress the need for continual improvement
of PCS data quality. In developing, or enhancing, a QA program the following management
activities should be addressed:
¦	Assign Staff Responsibilities: Formally establish your staff's duties to
demonstrate the importance of PCS data quality.
¦	Establish Attainable Goals: Set your PCS data quality targets for
timeliness, accuracy, completeness, and consistency.
¦	Track Performance Against Goals: Evaluate how well your current PCS
data quality meets your data quality goals.
¦	Assess Your QA Program: Evaluate how well your Quality Assurance
program functions and how to fine-tune it.
¦	Manage Your Data Input Personnel: Manage your staff to support and
enhance PCS data quality.
¦	Show Consistent Commitment: Provide full management commitment to PCS
data quality.
¦	Promote Communication: Provide adequate documentation and training.
Open channels for your PCS staff to talk to each other about common
problems.
4-1

Two appendices give additional detail of some of the procedures mentioned here. Appendix
B, PCS Quality Assurance Program Self-Assessment, provides a description of how to go about
evaluating your program. Appendix C, Sample PCS Quality Assurance Manual, can be used
to develop your own QA manual.
4.1	Assign Staff Responsibilities
¦	Add responsibility for PCS quality assurance to relevant job descriptions.
¦	Establish performance objectives for quality assurance.
¦	Evaluate quality assurance accomplishments during performance evaluations.
¦	Designate a PCS quality assurance overseer.
Each person's responsibility for their quality assurance activities must be explicitly assigned
for your program to be successful. Because ensuring good data quality usually requires extra
diligence and effort, it is often slighted unless the staff knows it is part of their duties. An easy
way to make this assignment is to include quality assurance as a required task in job
descriptions. This assignment should cover all positions involved with PCS, such as the PCS
coordinator, data-entry personnel, permit writers/coders, and compliance engineers. For
example, the PCS coordinator's QA responsibilities could include the oversight of the QA
program, the scheduling of staff training, and the monthly measurement of QA performance (see
page 88, Appendix C). Formally assigning responsibility demonstrates clearly that you have
made a serious commitment to quality assurance and consider it an important objective of the
position. An additional benefit of having assigned responsibilities is that new employees learn
from the beginning that their job requires attention to PCS data quality. Then, as they master
their tasks, they are more likely to use quality assurance procedures routinely.
Once formally assigned, staff will know their expected duties and should adjust their
performance accordingly. Their accomplishments in improving data quality should be evaluated
fairly and good performance should be rewarded. After quality assurance has been included in
4-2

the job description, performance pertaining to data quality can be evaluated during annual
performance reviews.
It is also important to designate someone to be responsible for day-to-day oversight of the
Quality Assurance program. This person can maintain a broad view of the program from a data
quality perspective, coordinate all data quality activities, and act as a resource for problem
solving.
4.2 Establish Attainable Goals and Targets
¦	Establish attainable goals and strive to meet them.
¦	Publicize your data quality targets and your performance.
An established set of data quality targets is the measure used to judge a QA program. If
you have an ongoing, well-established QA program then your goals should, at a minimum, be
the national data quality targets for timeliness, accuracy, consistency, and completeness included
in this guidance manual (see page 2-3). Concentrate your efforts on improving your
performance in the areas where your program is the weakest.
If you are just now establishing a new quality assurance program, or are altering an existing
program that failed to produce quality data, then the national data quality targets for timeliness,
accuracy, completeness, and consistency should be used as an ultimate goal for your evolving
program. During the development of your Quality Assurance program, set a series of realistic
goals (monthly or quarterly) that are both attainable and measurable. Each sequential goal
should be nearer the national data quality targets and should require a "stretch" for your staff
and resources. Endeavor to achieve these intermediate goals and strive for continual
improvement until your data quality targets are as stringent as the national targets.
You should publicize your data quality targets so that all PCS staff know their goals and
can judge their performance relative to them. Publicizing your objectives by posting them on
4-3

a bulletin board, or by circulating memos, makes them more concrete and provides necessary
feedback to your staff about their performance.
4.3	Track Performance Against Goals
¦	Measure your data quality regularly to assess your progress.
The quality of your PCS data should be measured at regular intervals and compared to your
program's targets and goals. This evaluation lets you know how well you are doing in terms
of your data quality. You should start this evaluation with a clear view of your objectives and
use the same, routine method of gauging data quality performance at regular intervals (each
month, quarter, or year). Techniques for you to use to gauge your performance are provided
in Appendix B. The routine measurement of achievements against your quality targets allows
you to track your data quality status over time, identify trends for appropriate managerial
oversight, and provides information to use in fine-tuning your program.
The interval between your data quality measurements will depend upon the type of data
being evaluated. Since DMRs are submitted monthly, it is appropriate to evaluate DMR data
on the same day every month. Other data types with a longer interval between reports should
be evaluated on a monthly to quarterly basis.
4.4	Assess Your Quality Assurance Program
¦	Periodically review, evaluate, and fine-tune your Quality Assurance program.
A periodic review of your Quality Assurance program is necessary for you to judge its
effectiveness and to determine how to continuously improve it. Your assessment should first
determine the current status of your QA program and measure where you stand relative to the
OWEC national standards. Then you should focus on problems that affect PCS data quality and
on identifying their causes. Once the cause of a problem is identified possible solutions may be

evaluated and the most appropriate solution implemented. Your review should address all areas
of your program. If your program has recently changed, you may want to examine more closely
those areas which have been modified. For example, if new staff have recently been hired,
examine their training; if changes have been instituted in your program, make sure new
problems have not arisen because of them and that they are helping to alleviate the original
problem.
The frequency of your review will depend on the features of your particular program, such
as the experience of your staff and the number and magnitude of recent changes made to the
program. Factors necessitating a frequent review include a high staff turnover and numerous
or severe changes made in quality assurance procedures. Less frequent review will be required
for established programs operated by an experienced staff. In general, established programs,
and those with modest changes, should be reviewed yearly while programs with more severe
changes may warrant quarterly reviews.
4.5 Management of Data Input Personnel
¦	Allow staff enough time to do a good job.
¦	Reprimand or reassign non-performing staff.
¦	Respond quickly to anticipated staff turnover.
¦	Cross-train existing staff.
A certain amount of time is required to "do the job right the first time." PCS managers
and their supervisors must understand this and allow their staff to devote the time necessary to
complete their data quality work and to follow up on problems promptly when they occur.
Once you have established data quality as a key requirement of an employee's performance
objective, you must follow through. As mentioned above, staff responsible for good quality data
should be rewarded. Conversely, you should also be willing to reprimand or reassign staff who
4-5

do not demonstrate a willingness or the ability to perform quality assurance functions adequately.
Staff turnover poses a serious threat to the success of your PCS Quality Assurance program.
PCS is a complicated system to learn, and it is difficult for new staff to become proficient using
the system quickly. Further damage to your program is caused by the long time often needed
to complete hiring of new staff. To mitigate damage from staff turnover, try to move quickly
to replace PCS staff who are leaving, ideally while the experienced staff member is still on the
job to orient the replacement.
Another method may be useful to mitigate the effects of personnel turnover in offices with
several PCS staff members. Each existing staff member can be "cross-trained" in a portion of
the duties of the other positions. Then, when staff leave unexpectedly, the remaining staff can
temporarily fill in. While no single person need know all aspects of PCS, knowing more than
just one's assigned duties can help immensely during staff transition periods.
4.6 Show Consistent Commitment to Data Quality
¦ Be consistent in your commitment to data quality.
Unflagging management support is crucial for the success of all quality assurance programs.
Understand that every program experiences problems and setbacks. However, through the
perseverance of the PCS managers and the dedication of the PCS staff most, if not all, of these
problems can be overcome. One method to accomplish this is to issue a statement establishing
data quality as a priority and outlining a course of action to achieve it. Be consistent and follow
your established policy even when difficulties arise.
4-6

4.7 Promote Communication
¦	Make sure adequate documentation is available.
¦	Provide PCS staff with helpful training.
¦	Provide resources to enhance communication among PCS staff.
Communication, including system documentation and staff training, is a vital part of a
quality program and should not be neglected. Once you have developed your QA manual, make
it accessible to your staff, both for their use as a reference and for them to annotate with their
solutions to common problems. Expand on the one-page "summary sheets" included in the
sample manual and post them in locations where they are needed. These "quick reference"
sheets will save considerable time.
The staff member responsible for the day-to-day operations of the quality assurance program
should also be responsible for coordinating PCS QA training. This person knows the problems
that could be solved with proper training, learns of changes in PCS first, and knows what
training resources are available. PCS QA training should be conducted as frequently as
necessary and should be targeted to the staff who can benefit most from it, for example, new
staff members. All relevant staff should be included in training programs as major PCS QA
procedures change.
Knowing who to contact for help with problems can save much time, effort, and rework.
Often solutions to a particular problem have been developed in other regional or state offices.
People within your own office or organization have particular expertise that can be of help. It
is especially important that all staff members involved with PCS communicate with each other.
You should try to open channels for people involved in all aspects of PCS, such as data entry,
permit writing, and enforcement, to meet to discuss problems with PCS data quality and their
potential solutions.
4-7

APPENDIX A
PCS POLICY STATEMENT

PERMIT COMPLIANCE SYSTEM POLICY STATEMENT
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
STATEMENT OF POLICY
It is EPA policy that the Permit Compliance System (PCS) shall
be the national data base for the National Pollutant Discharge
Elimination System (NPDES) program. All EPA Regions must use PCS
directly, and all NPDES States must either use PCS directly or
develop and maintain an interface.
As our primary data source, PCS will promote national consistency
and uniformity in permit and compliance evaluation. To achieve
national consistency and uniformity in the NPDES program, the
required data in PCS must be complete and accurate. Facility,
permit (i.e., events and limits), measurement, inspection,
compliance schedule, and enforcement action data are required.
These required data elements are further defined in Attachments 1
and 2. They comprise the Water Enforcement National Data Base
(WENDB), which has been redefined as the core of information
necessary to enable PCS to function as a useful operational and
management tool and so that PCS can be used to conduct oversight
of the effectiveness of the NPDES program.
All required data for NPDES and non-NPDES States must be
entered into PCS by September 30, 1986, and maintained regularly
thereafter. This will require Regions and States to start entering
data as early as possible, and not wait until late FY 1986.
By the end of FY 1986, direct users of PCS shall establish,
with Office of Water Enforcement and Permits (OWEP) assistance,
a Quality Assurance program for data in PCS. The program shall
define:
• monthly measurement of the level of data entered;
• appropriate time frames to ensure that data are entered
in PCS in a timely manner; and
• nationally consistent standards of known data quality
based on proven statistical methods of quality assurance.
PCS Quality Assurance shall address the completeness (for
assurance of full data entry) and accuracy of the data
entered into PCS.
Adoption of PCS by States should be formalized in each
State's §106 Program Plan, State/EPA Agreement, or in a separate
agreement. Each plan should clearly define EPA's and the NPDES
State's responsibilities regarding PCS. The Key Management
Practices in this Policy Statement should be incorporated into
the §106 Program Plan.

- 2 -
BACKGROUND
When the PCS Steering Committee met in March 1985, EPA
Regional representatives stressed the essential need for a positive
statement from EPA Headquarters management to Regional and State
management specifically requiring the support and use of PCS.
Lack of such support may result in an incomplete and unreliable
data base. With sufficient EPA Headquarters, Regional, and State
support, however, PCS will come to serve several major purposes
for the NPDES program:
*	PCS will provide the overall inventory for the NPDES program.
*	PCS will provide data for responding to Congress and the
public on the overall status of the NPDES program. As
such, it will serve as a valuable tool for evaluating the
effectiveness of the program and the need for any major
policy changes.
*	PCS will encourage a proper EPA/State oversight role by
identifying all major permittee violators.
*	PCS will offer all levels of government an operational and
management tool for tracking permit issuance, compliance,
and enforcement actions.
This PCS Policy Statement is a result of the Steering Committee
meeting. It is a clear message to Regional and State management
that PCS is the primary source of NPDES information and, as
such, it is to be supported wholeheartedly by all users of PCS.
The PCS Steering Committee meeting also resulted in a
redefinition of WENDB and ratification thereof. WENDB is the
minimum standard of data entry which will allow PCS to function
as a useful operational and management tool (see Attachments 1
and 2). EPA Regions agreed that all WENDB elements will be
entered into PCS by September 30, 1986, and maintained regularly
thereafter.
Once the required data are entered into and regularly
maintained in PCS, PCS will assist permits and compliance
personnel in many of their operational and management
responsibilities. PCS will greatly reduce reporting burdens for
such activities as the Strategic Planning and Management System
(SPMS), and it will reduce efforts needed for effective compliance
tracking at both Regional and State levels. Also, substantial
automation of the Quarterly Noncompliance Report (QNCR) will save
time and resources.

IMPLEMENTATION STRATEGY
Key Management Practices
To effectively implement and uphold this PCS Policy Statement
and enhance PCS' capabilities, there are certain key management
practices that must be implemented:
•	The following milestones have been established to facilitate
the entry of all required data by the end of FY 1986:
-	All required National Municipal Policy (NMP) data must be
entered into PCS by October 31, 1985 (see Attachment 1).
-	All required data for non-NPDES States must be entered
into PCS by March 31, 1986.
*	NPDES permits shall be enforceable and tracked for
compliance using PCS. The Office of Water Enforcement and
Permits (OWEP) recognizes there may be situations where
permit limits and monitoring conditions are not initially
compatible with PCS data entry and tracking. In these
cases, Regions should ensure that appropriate steps are
taken by the permit writer to identify difficult permits
to the PCS coder, and to mutually resolve any coding
issues. The Regions should work closely with their NPDES
States using PCS to address similar data entry problems
with State-issued NPDES permits.
•	WENDB is the minimum standard of data entry for PCS (see
the attached lists of data requirements). If States and
Regions wish to enter NPDES data beyond what has been
required, they may do so. For example, if States want to
enter Discharge Monitoring Report (DMR) data for minor
facilities, the option is available in PCS and the States
may use it as their resources allow. OWEP will ensure that
sufficient computer space is available for the currently
projected use of PCS.
*	All DMRs submitted to EPA Regional Offices (including DMRs
submitted by NPDES States for EPA entry into PCS) must be
preprinted using the Office of Management and Budget (OMB)
approved DMR form. NPDES States directly using PCS are
not required to use the OMB-approved form; however, its
use is strongly encouraged. With the continuing demand
for more complete information and with stable, if not
diminishing, data entry resources, it is to EPA's and
NPDES States' benefit to preprint DMRs. The use of
preprinted DMRs will greatly reduce PCS' data entry
burden, making available resources to be used in other
areas (e.g., PCS quality assurance, data entry for other
PCS records, etc.).

•	The frequency with which DMRs are submitted to the EPA or
NPDES State is important for ensuring timely entry of
data into PCS and timely review of permittees' compliance
status. Quarterly, semi-annual, or annual submission of
DMRs creates a major data entry burden and impedes the
compliance evaluation process. As a result, the usefulness
of DMR data for compliance evaluation decreases
substantially. Monthly submittal of DMRs alleviates this
problem and enhances PCS' effectiveness significantly. It
is recommended that monthly submittal of DMRs be
incorporated into major permits as they are reissued. With
approximately 20 percent of the permits reissued each
year, it will take five years to complete the transition
to monthly submittal for all major permittees.
•	EPA Regions should coordinate with their respective States
to develop strategies that describe each State's plans to
either use PCS directly or develop an interface. These
strategies should include the rationale for selecting one
of these methods of data entry into PCS, an outline of all
requirements necessary for implementing the selected
method, the mechanisms to be used to supply sufficient
resources, and a schedule for attainment not to exceed
September 30, 1986. If a State is a current user of PCS
via one of these methods, the strategy should describe its
needs for enhancing its PCS usage or improving its PCS
interface, the mechanisms to be used to supply sufficient
resources, and a schedule for attainment not to exceed
September 30, 1986.
•	When writing or revising a Memorandum of Agreement (MOA),
the Region and State should specify the State's intent to
use or interface with PCS. The MOA should address the
rationale for selecting one of these methods of data entry
into PCS, an outline of all requirements necessary for
implementing the selected method, the mechanisms to be
used to supply sufficient resources, and a schedule for
attainment.
Responsibilities
Office of Water Enforcement and Permits: It is OWEP's full
responsibility to maintain the structure (i.e., the computer
software) of PCS and to operate the system. OWEP will continue
to support time-sharing fund needs, training, and the necessary
resources to continue the operation of PCS. OWEP will work with
the EPA Regions and NPDES States to continually evaluate and
improve, where feasible, the system's software, time-share
funding, operation, and maintenance. OWEP will maintain a
Steering Committee and User Group, organize the national
meetings, and work closely with the Regional and State
representatives on major decisions related to PCS.
OWEP will oversee the Regions' and States' progress in
fulfilling this policy statement by assessing the quantity of
data entered each quarter.

EPA Regions and NPDES States: It is the EPA Regions' and
NPDES States' full responsibility to maintain the infrastructure
of PCS by accurately entering data in a timely manner. Also, EPA
Regions and NPDES States are responsible for participating in PCS
Workgroups and contributing to improvements to PCS.
Three national PCS meetings are held each year, one for the
Steering Committee and two for the PCS Users Group. EPA Regions
are expected to attend all three meetings; NPDES States directly
using PCS are invited to attend the State portions of these
meetings. More meetings may be scheduled during the year if
necessary.
Since consistent and objective compliance tracking is a
central component of an effective and credible enforcement
program, NPDES States are strongly urged to use PCS directly. We
realize, however, that there may be some cases where NPDES States
cannot use PCS directly. In these instances, in accordance with
§123.41 of the regulations, EPA requests from the States all
required information (as indicated in the attachments) for entry
into PCS. This can be achieved one of two ways:
•	A State Automated Data Processing (ADP) interface can be
developed. It is the EPA Region's responsibility to work
with the NPDES State to develop an effective State ADP
interface. The State, however, should take the lead in
developing the interface and work closely with the Region
to ensure the interface is effective. It should be realized
that system interfaces are often troublesome and unwieldy;
they are often ineffective and limit the States' flexibility
to change their systems quickly to meet management needs.
In the event a State ADP interface is developed, there
must be formal agreement that the State will operate the
interface, maintain the interface software, and be fully
responsible for making any changes to the interface based
on changes made to its automated data base. This will
ensure that the NPDES State will be held responsible for
system compatibility. If the State does not accept full
responsibility for system compatibility, then changes
must not be made to the State system without the prior
knowledge of EPA. The State is responsible for ensuring
that the data are transferred to PCS in a timely manner,
accurately, and completely. Interfaces must be developed
and maintained so that they operate with maximum efficiency
all of the time.
•	OWEP recognizes that FY 1986 will be a transition year for
PCS. NPDES States will begin using PCS or will develop
interfaces. In the event that neither of these alternatives
is accomplished by the end of FY 1986, in accordance with
the FY 1986 Guidance for the Oversight of NPDES Programs,
the State will be responsible for submitting all required
information (as indicated in the attachments) in hard
copy format. The data must be submitted either already

- 6 -
coded onto PCS coding sheets or in a format that can be
readily transferred onto PCS coding sheets. Also, the data
must be submitted at regular intervals to ensure timely
entry into PCS. Once the data are received by EPA, it is the
EPA Region's responsibility to enter the data into PCS in a
timely manner.
Funding
•	§106 grant funds may be used for interface software
development. However, they cannot be used for maintenance
of the interface software for State-initiated changes to a
State ADP system or for the operation and maintenance of a
separate State ADP system.
•	§106 grant funds may be used for State data entry if and
only if the State uses PCS directly or the State provides
data to PCS via an interface that meets the standards of
this policy.
•	If requested by a State, EPA will agree to pay for its
time-sharing costs to implement this policy, within given
resources.
•	Headquarters will continue to pursue alternative methods of
reducing the data entry burden on Regions and States.

APPENDIX 1
REVISED 1989
Required WENDB information types1, by facility class (the original
chart's MAJOR, MINOR, and OTHER MINOR columns):
Permit Facility Data
Permit Event Data
Inspection Data
Parameter Limits and Pipe Schedule Data
Significant Compliance Data
Compliance Schedule Data
DMR Measurement Data
Enforcement Action Data
(For majors: Enforcement Action Data, Compliance Schedule
Data, and Interim Limits Data from all active formal
enforcement actions, and Enforcement Action Data for all
active informal enforcement actions. For minors: Enforcement
Action Data from all active informal and formal enforcement
actions.)
Enforcement Action/Administrative Penalty Orders2
Pretreatment Approval3
National Municipal Policy Data4
Single Event Violation Data5
Pretreatment Compliance Inspection (PCI)/Audit5
Pretreatment Performance Summary5

-------
Appendix B
(continued)
1 For each of the categories listed in this chart, the
Information Type is the set of core data elements listed in
Appendix C.
2 These data elements are required specifically for
administrative penalty orders. Entry of these data elements
is only required for EPA actions.
4 Pretreatment Program Required Indicator, PRET; one data
element.
5 All required data as described in the May 16, 1985 memorandum on
National Municipal Policy Tracking in PCS. This includes NPFF,
NPSC, NPSQ, RDC2, Compliance Schedule, and Enforcement Action
Information.
6 The following information types are only for minor POTWs which
are pretreatment control authorities: pretreatment approval,
single event violation data, pretreatment compliance
inspection (PCI)/audit, and pretreatment performance summary.

-------
APPENDIX C
WATER ENFORCEMENT NATIONAL DATA BASE (WENDB) ELEMENTS
REVISED 1989
Data Element Name	Acronym
NPDES Number	NPID
COMPLIANCE SCHEDULE RECORD
Compliance Schedule Actual Date	DTAC
Compliance Schedule Date	DTSC
Compliance Schedule Event Code	EVNT
Compliance Schedule File Number	CSFN
Compliance Schedule Number	CSCH
Compliance Schedule Report Received Date	DTRC
Compliance Schedule User Data Element 2	RDC2
Data Source Code	DSCD
COMPLIANCE VIOLATION RECORD1
Compliance Schedule Violation Code	CVIO
Compliance Schedule Violation Date	CVDT
Compliance Schedule Violation Event Code	CVEV
Compliance Violation Compliance Schedule Number	VCSN
Compliance Schedule Violation Data Source Code	VDCD
QNCR Compliance Schedule Violation Detection Code	SNCC
QNCR Compliance Schedule Violation Detection Date	SNDC
QNCR Compliance Schedule Violation Resolution Code	SRCC
QNCR Compliance Schedule Violation Resolution Date	SRDC
ENFORCEMENT ACTION RECORD2
Enforcement Action Code	ENAC
Enforcement Action Comment	ECMT
Enforcement Action Compliance Schedule Violation	ECVC
Code
Enforcement Action Compliance Schedule Number	EVSN
Enforcement Action Compliance Schedule Violation	ECVD
Date
Enforcement Action Data Source Code	EVCD
Enforcement Action Date	ENDT
NOTE: See last page for listing of footnotes
- 1 -

-------
Enforcement Action Discharge Number	EVDS
Enforcement Action Event Code	EVEV
Enforcement Action File Number	ERFN
Enforcement Action Limit Type - Alphabetic	EVLM
Enforcement Action Modification Number	EMOD
Enforcement Action Monitoring Date	EVKD
Enforcement Action Monitoring Location	EVKL
Enforcement Action Parameter Code	EVPR
Enforcement Action Report Designator	EVRD
Enforcement Action Response Due Date	EVDT
Enforcement Action Season Number	ESEA
Enforcement Action Status Code	ENST
Enforcement Action Status Date	ESDT
Enforcement Action Violation Type	EVTP
Enforcement Action Code - Violation Key1	EKAC
Enforcement Action Date - Violation Key1	EKDT
Enforcement Action Type Order Issued EPA/State -	EKTP
Violation Key1
Enforcement Action Single Event Violation Code3	ESVC
Enforcement Action Single Event Violation Date3	ESVD
Enforcement Action Type Order Issued EPA/State2	EATP
ENFORCEMENT ACTION/ADMINISTRATIVE PENALTY ORDER
Date Proposed Order Issued	APDT
(generated from Enforcement Action Date)
Class I or II	APCL
Hearing Requested	APHR
Date of Final Order	APFO
Penalty Amount Assessed	APAM
Class II Appeal Filed	APAP
Date Judicial Appeal Filed	APAF
Date Penalty Collected	APPC
EVIDENTIARY HEARING RECORD
Evidentiary Hearing Event Code4	EHEV
Evidentiary Hearing Event Date	EHDT
INSPECTION RECORD
Inspection Date	DTIR
Inspector Code	INSP
Inspection Type	TYPI
Inspection Comments (First three characters for	ICOM
Industrial User pretreatment inspections)
- 2 -

-------
MEASUREMENT/VIOLATION RECORD
Measurement/Violation Concentration Average	MCAV
Measurement/Violation Concentration Minimum	MCMN
Measurement/Violation Concentration Maximum	MCMX
Measurement/Violation Quantity Average	MQAV
Measurement/Violation Quantity Maximum	MQMX
Measurement/Violation Discharge Number	VDSC
Measurement/Violation Monitoring Location	VKLO
Measurement/Violation Monitoring Period End Date	MVDT
Measurement/Violation Parameter	VPRM
Measurement/Violation Report Designator	VDRD
No Discharge Indicator	NODI
QNCR Measurement Violation Detection Code1	SNCE
QNCR Measurement Violation Detection Date1	SNDE
QNCR Measurement Violation Resolution Code1	SRCE
QNCR Measurement Violation Resolution Date1	SRDE
PARAMETER LIMITS RECORD
Change of Limit Status	COLS
Concentration Average Limit	LCAV
Concentration Maximum Limit	LCMX
Concentration Minimum Limit	LCMN
Concentration Unit Code	LCUC
Contested Parameter Indicator	COUP
Limit Discharge Number	PLDS
Limit File Number	PLFH
Limit Report Designator	PLED
Limit Type - Alphabetic	LTYP
Modification Number	MODN
Modification Period End Date	ELED
Modification Period Start Date	ELSD
Monitoring Location	MLOC
Parameter Code	PRAM
Quantity Average Limit	LQAV
Quantity Maximum Limit	LQMX
Quantity Unit Code	LQUC
Season Number	SEAN
Statistical Base Code	STAT
- 3 -

-------
PERMIT EVENT RECORD
Permit Tracking Actual Date	PTAC
Permit Tracking Event Code5	PTEV
PERMIT FACILITY RECORD
Average Design Flow	FLOW
City Code	CITY
County Code	COTY
Facility Inactive Code	IACC
Facility Inactive Date	IADT
Facility Name Long	FNML
Federal Grant Indicator	FDGR
Final Limits Indicator	FLIK
Major Discharge Indicator (Entered by EPA	MADI
Headquarters)
NMP Final Schedule6	NPSC
NMP Financial Status6	NPFF
NMP Schedule Quarter6	NPSQ
Pretreatment Program Required Indicator	PRET
QNCR Status Code, Current Year (Manual)7	CYMS
Reissuance Control Indicator	RCIN
River Basin (first four characters)	BASN
SIC Code	SIC2
Type of Permit Issued - EPA/State	EPST
Type of Ownership	TYPO
PIPE SCHEDULE RECORD
Discharge Number	DSCH
Final Limits End Date	FLED
Final Limits Start Date	FLSD
Initial Limits End Date	ILED
Initial Limits Start Date	ILSD
Initial Report Date	STRP
Initial Submission Date - EPA	STSU
Initial Submission Date - State	STSS
Interim Limits End Date	HLED
Interim Limits Start Date	HLSD
Number of Units in Reporting Period	NRFU
Number of Units in Submission Period - EPA	NSUN
Number of Units in Submission Period - State	NSUS
- 4 -

-------
PIPE SCHEDULE RECORD (continued)
Pipe Inactive Code	PIAC
Pipe Inactive Date	PIDT
Pipe Latitude	PLAT
Pipe Longitude	PLON
Report Designator	DRID
Reporting Units	NRPU
Submission Unit - EPA	SUUN
Submission Unit - State	SUUS
SINGLE EVENT VIOLATION RECORD
Single Event Violation Code	SVCO
Single Event Violation Date	SVDT
QNCR Single Event Violation RNC Detection Code	SNCS
QNCR Single Event Violation RNC Detection Date	SNDS
QNCR Single Event Violation RNC Resolution Code	SRCS
QNCR Single Event Violation RNC Resolution Date	SRDS
PRETREATMENT RECORD
SOURCE - PRETREATMENT COMPLIANCE INSPECTION (PCI)/AUDIT
Adoption of Technically-based Local Limits	ADLL
Categorical Industrial Users	CIUS
Technical Evaluation for Local Limits	EVLL
SIUs in SNC with Self-Monitoring	MSNC
Significant Industrial Users without Control	NOCM
Mechanisms
SIUs Not Inspected or Sampled	NOIN
SIUs in SNC with Pretreatment Standards or Reporting	PSNC
Date Permit Was Modified to Require Pretreatment	PTIM
Implementation
Significant Industrial Users	SIUS
SIUs in SNC with Self-Monitoring and Not Inspected	SNIN
or Sampled
PCI/Audit Date	DTIA

-------
SOURCE - PRETREATMENT PERFORMANCE SUMMARY
Formal Enforcement Actions Excluding Civil and	FERF
Criminal Judicial Suits
Industrial Users From Which Penalties Have	IUPN
Been Collected
Civil or Criminal Suits Filed Against SIUs	JUDI
SIUs in SNC with Pretreatment Compliance Schedules	SSNC
SIUs with Significant Violations Published in	SVPU
Newspaper
Pretreatment Performance Summary Start Date	PSSD
Pretreatment Performance Summary End Date	PSED
- 6 -

-------
Listing of Footnotes
1.	These data elements are automatically generated by PCS unless
the user wishes to enter them manually.
2.	These data elements are required for both informal and formal
enforcement actions (when applicable). This includes
administrative penalty orders, both EPA and State issued.
3.	These data elements were added at the request of the PCS Steering
Committee at the 1986 meeting.
4.	There are seven (7) required evidentiary hearing event codes
(when applicable). They are as follows:
01099 Date Granted	10099 Date ALJ Decision Rendered
03099 Date Hearing Scheduled	11099 Date Appealed to Administrator
07099 Date Requested	(EPA issued permits only)
08099 Date Settled
09099 Denied
5.	There are thirteen (13) required permit event codes
(when applicable). They are as follows:
P1099 Application Received	P7299 301(g) Variance
P3099 Draft Permit/Public Notice	P7599 316(a) Variance
P4099 Permit Issued	P7699 316(b) Variance
P5099 Permit Expired	P7799 Fundamental Difference
P7199 301(c) Variance
P7499 301
-------
APPENDIX B
PCS QA PROGRAM SELF-ASSESSMENT

-------
APPENDIX B.
PCS QA PROGRAM SELF-ASSESSMENT
B.0 Overview
This appendix discusses the evaluation of a quality assurance program as a whole. This
evaluation should be conducted at regular intervals (e.g. yearly) to make sure that all of the
necessary components of the program are in place and are functioning together efficiently.
Every QA program requires periodic evaluation and modification to perform up to its full
potential. To accomplish this, OWEC suggests a systematic approach using the following steps:
¦	Evaluate your current QA program
¦	Identify QA problems with your program
¦	Isolate the source or cause of the problems
¦	Identify potential solutions
¦	Select and implement solutions
¦	Regularly re-evaluate your QA program
B.l Evaluate Your Current QA Program
Starting with your current QA program, consider how input documents are received,
screened, and transferred into PCS. A set of questions, relating to the suggestions made in
Sections 1-4, is presented below. Go through these questions and use them to help verify that
your program contains all the components needed for a successful QA program.
B.2 Identify QA Problems With Your Program
As you read through the list of questions you are likely to identify two types of problems.
First, a suggested quality assurance component may be absent from your program. For
example, responsibility for quality assurance may have never been formally assigned to your
B-l

-------
staff. Alternatively, you may identify problems with a component that is already in place, but
is not functioning efficiently. For example, you may have kept a list of phone numbers of
problem facilities in the past, but it is now outdated. Any of the questions on the list that you
answered with a "No" or "< 50" may identify a potential problem with your QA program. As
you work through this list of questions, you may identify other problems with your PCS QA
program. Make a list of all problems as they occur to you.
The next step is to determine whether these problems affect, or could potentially affect, your
PCS data quality. Make sure to consider all four of the PCS data quality targets (timeliness,
accuracy, completeness, and consistency).
B.3 Isolate the Source of Current QA Problems
It is important to realize that some problems may be caused by factors that are not at first
apparent. For example, your staff may be failing to log-in DMRs correctly when they arrive.
The cause of this problem may not lie in the design of the form but may simply be that they did
not attend a training course on the log-in procedures used in your office. Review your list of
problems and try to identify the ultimate cause of the problem.
It is also important to realize that you cannot solve every problem. Some errors occur by
random chance and cannot be corrected (e.g. lost mail). Other problems may be corrected only
at a different administrative level (e.g. PCS computer system enhancements). Focus your
actions on the problems that you can correct through local action.
B.4 Identify Potential Solutions
Once you have identified the problems affecting the quality of your PCS data, isolated their
ultimate causes, and determined that they can indeed be solved at your local level, then list all
of the potential solutions for each problem.
B-2

-------
When identifying potential solutions you should also consider whether changes in staff
management could be applied effectively. Some management topics to consider include how QA
responsibilities are assigned and evaluated, whether PCS training has been adequate, and whether
documented procedures are available to all staff who may need them. A more complete discussion
of management factors worth considering is presented in Section 4.
B.5 Select and Implement Improvements
For each problem, select the one solution from your list of potential solutions that best suits
your particular set of circumstances. Some questions to consider when making this decision
include:
¦	How costly is each solution in terms of staff time and money?
¦	Is a particular solution likely to create a different problem?
¦	How quickly will the benefits be seen in terms of PCS data quality?
¦	Can more than one problem be solved with the same solution?
¦	Is there an applicable solution to the problem in the Sample PCS QA Manual?
After you identify the solution with the highest return to your program, obtain any needed
resources, and implement the solution. The final, and very important, step is to completely
document the actions that you have taken. By recording your actions, you provide a reference
for future evaluations and create additional documentation for your QA program.
B.6 Regularly Re-evaluate Your QA Program
Plan to evaluate your program at regular intervals. Conditions, personnel, budgets, and
regulations continue to change, and unless your QA program is fine-tuned periodically, it will
soon become inefficient and outmoded. OWEC suggests that every program be evaluated on a
yearly schedule.
B-3

-------
QUESTIONS TO EVALUATE YOUR PCS QUALITY ASSURANCE PROGRAM
PCS DATA QUALITY OBJECTIVES
What percentage of your staff knows your data quality target for:
timeliness?	□ < 50	□ 51-75	□ > 75
accuracy?	□ < 50	□ 51-75	□ > 75
completeness?	□ < 50	□ 51-75	□ > 75
consistency?	□ < 50	□ 51-75	□ > 75
QUALITY ASSURANCE PROCEDURES
Data Capture
Do you keep a list of input due dates?	□ No	□ Yes
Do you keep a list of phone numbers of problem facilities?	□ No	□ Yes
What percentage of PCS input documents
are logged in when they are received?	□ < 50	□ 51-75	□ > 75
What percentage of the time do you
follow up on non-receipt or missing data?	□ < 50	□ 51-75	□ > 75
Data Transfer
Do you keep a problem / solution log?	□ No	□ Yes
What percentage of your input documents
are reviewed using checklists?	□ < 50	□ 51-75	□ > 75
What percentage of your Sample DMRs
are written and coded using checklists?	□ < 50	□ 51-75	□ > 75
Edit and Update Error Correction
Do you have established procedures to review
Edit and Update Audit reports?	□ No	□ Yes
What percentage of the time do you use standard procedures
to correct identified errors?	□ < 50	□ 51-75	□ > 75
What percentage of the time do you use the PCS Edit (Dummy)?	□ < 50	□ 51-75	□ > 75
B-4

-------
PCS Data Base Quality Control
What percentage of the time do you use:
the Quality Assurance Report?	□ < 50	□ 51-75	□ > 75
customized Data Quality Audits?	□ < 50	□ 51-75	□ > 75
special techniques to examine the PCS data base?	□ < 50	□ 51-75	□ > 75
STAFF MANAGEMENT
Assignment of Staff Responsibilities
Have you designated a PCS quality assurance overseer?	□ No	□ Yes
What percentage of relevant PCS job descriptions
include assignment of QA responsibility?	□ < 50	□ 51-75	□ > 75
What percentage of the staff member's QA accomplishments
are evaluated during their performance evaluations?	□ < 50	□ 51-75	□ > 75
Establishment of Attainable Goals and Targets
Are your timeliness goals as stringent as the national targets?	□ No	□ Yes
Are your accuracy goals as stringent as national targets?	□ No	□ Yes
Are your consistency goals as stringent as national targets?	□ No	□ Yes
Are your completeness goals as stringent as national targets?	□ No	□ Yes
Tracking Performance Against Goals
Do you measure your DMR data quality monthly?	□ No	□ Yes
Do you measure your non-DMR PCS data quality quarterly?	□ No	□ Yes
Self-Assessment of Your Quality Assurance Program
Do you review and evaluate your PCS QA program at least yearly?	□ No	□ Yes
Management of Data Input Personnel
Does your staff have enough available time to do a good job?	□ No	□ Yes
Are non-performing staff reassigned or reprimanded?	□ No	□ Yes
Are hiring procedures completed quickly for new staff?	□ No	□ Yes
Management Commitment to Quality
Have you issued an office statement emphasizing PCS data quality?	□ No	□ Yes
B-5

-------
Communication between PCS Users
Do you have a phone list to call for help with PCS problems?	□ No	□ Yes
Is your office represented at the yearly national PCS meetings?	□ No	□ Yes
What percentage of your staff have written
PCS QA procedures readily available to them?	□ < 50	□ 51-75	□ > 75
What percentage of your staff have attended PCS training classes?	□ < 50	□ 51-75	□ > 75
B-6

-------
APPENDIX C
SAMPLE
PERMIT COMPLIANCE SYSTEM (PCS)
QUALITY ASSURANCE MANUAL

-------
This appendix provides a sample manual for Environmental Protection Agency (EPA)
Regional offices and states to use in developing and documenting their own quality assurance
procedures for the Permit Compliance System (PCS). It is based on guidelines presented in the
main body of this manual and draws information from documented QA procedures provided to
EPA Headquarters by the Regional offices.
The sample manual is written as if it were the PCS QA Manual for EPA Region 11.
This approach allows the presentation of concrete, usable examples of methods that are
particularly effective in increasing the quality of PCS data. Users of this sample manual should
be aware that for many procedures there may be other equally effective methods. The methods
and procedures presented here have been selected because they are simple to use and will
produce the desired result of improving PCS data quality.

-------
Sample
Permit Compliance System (PCS)
Quality Assurance Manual
Table of Contents
INTRODUCTION	C-1
1.0	Overview	C-1
1.1	PCS Quality Assurance Program	C-1
1.2	PCS Data Quality Standards	C-2
1.3	Key Ingredients for a QA Program	C-4
1.4	QA Goals for PCS	C-5
PCS OPERATIONS	C-6
2.0	Overview	C-6
2.1	Data Capture	C-7
2.1.1	General Document Processing	C-7
2.1.2	DMR Logging and Transmittal	C-10
2.1.3	Problem Facility Phone List	C-14
2.2	Data Transfer	C-15
2.2.1	Permit Facility Level Data Entry	C-18
2.2.2	Permit Pipe Schedule / Limits Data	C-19
2.2.3	DMR Data	C-22
2.2.3.1	Sample Discharge Monitoring Report (DMR)	C-22
2.2.3.2	DMR Data Entry Checklist	C-28
2.2.4	PCS Range Checking	C-31
2.2.5	Compliance Schedule Data	C-34
2.2.6	Enforcement Actions Data	C-36
2.2.7	Inspections Data	C-38
2.2.8	Pretreatment Compliance Inspection Data	C-39
2.2.9	Pretreatment Audit Data	C-40
2.2.10	Permit Event Data	C-41
2.2.11	Evidentiary Hearing Data	C-42
2.2.12	Single Event Data	C-43
2.3	Edit and Update Error Correction	C-45
2.3.1	PCS Edit Audit Reports	C-45
2.3.2	PCS Update Audit Reports	C-49
2.4	PCS Data Base Quality Control	C-51
2.4.1	Pre-formatted & Quick Look QA Retrievals	C-52
2.4.1.1	PCS Quality Assurance Retrieval	C-53
2.4.1.2	Facility Report (FA)	C-56
2.4.1.3	Limits Summary	C-58
2.4.1.4	Abbreviated Limits Summary	C-59
2.4.1.5	DMRs Printed Report	C-60

-------
2.4.1.6	Quick Look Report for Reviewing Permits	C-62
2.4.1.7	DMR Administrative Report (DA)	C-64
2.4.1.8	DMR Administrative Report by Parameter (DP)	C-66
2.4.1.9	DMR Non-Receipt Report	C-68
2.4.1.10	DMR Summary Report (DS)	C-70
2.4.1.11	Violation Log Report	C-72
2.4.1.12	Inspection Scheduling Report	C-74
2.4.1.13	Compliance Schedule Forecast Report	C-76
2.4.1.14	Enforcement Action QL Retrieval	C-77
2.4.2	Special Processing	C-79
2.4.2.1	Effluent Data Statistics (EDS)	C-79
2.4.2.2	PCS Quarterly Noncompliance Report (Selective QNCR)	C-83
2.4.2.3	PCS Quarterly Noncompliance Report (Coordinator's QNCR)	C-85
PCS MANAGEMENT	C-86
3.0	Overview	C-86
3.1	Staff Training	C-88
3.2	PCS Phone Contacts	C-88
3.3	PCS Documentation List	C-88
3.4	PCS Staff Responsibilities	C-92

-------
List of Figures
Figure
1.1	Summary of Recommended PCS Data Quality Targets	C-3
2.1.1	PCS Data Transmittal and Processing Form	C-8
2.1.2	DMR Log-in Form	C-12
2.1.3	DMRs Due Report (DMR Administrative Report)	C-14
2.1.4	Problem Facility Phone List	C-16
2.2.1	Form A for a Sample DMR	C-24
2.2.2	Form B for a Sample DMR	C-26
2.2.3	Effluent DMR Data Key Screen	C-31
2.2.4	DMR Data Entry Screen with Range Checking	C-32
2.3.1	Edit Audit Report (Rejected Transactions)	C-46
2.3.2	Edit Audit Report (Accepted Transactions)	C-47
2.3.3	Edit Audit Report (Summary)	C-48
2.3.4	Update Audit Report (Rejected Transactions)	C-49
2.3.5	Update Audit Report (Accepted Transactions)	C-50
2.3.6	Update Audit Report (Summary)	C-50
2.4.1	PCS Quality Assurance Retrieval	C-54
2.4.2	Facility Report	C-56
2.4.3	Limits Summary Report	C-58
2.4.4	Abbreviated Limits Summary Report	C-59
2.4.5	DMRs Printed Report	C-60
2.4.6	Quick Look Report for Reviewing Permits	C-62
2.4.7	DMR Administrative Report	C-64
2.4.8	DMR Administrative Report by Parameter	C-66
2.4.9	DMR Non-Receipt Report	C-69
2.4.10	DMR Summary Report	C-70
2.4.11	Violation Log Report Retrieval	C-72
2.4.12	Inspection Quick Look Retrieval	C-74
2.4.13	Compliance Schedule Forecast Report Retrieval	C-76
2.4.14	Enforcement Action Quick Look Retrieval Statement	C-77
2.4.15	Statistical Loading Report Generated From EDS	C-80
2.4.16	Quarterly Noncompliance Retrieval (Selective QNCR)	C-83
2.4.17	Quarterly Noncompliance Retrieval (Coordinator's QNCR)	C-85
3.1	PCS Staff Training	C-89
3.2	Phone Numbers for User Support	C-90
3.3	QA Responsibility for PCS Positions	C-93

-------
Region II
Permit Compliance System (PCS)
Quality Assurance Manual
SECTION 1
INTRODUCTION
1.0	Overview
This manual is designed to provide you with quality assurance (QA) procedures for entering
and maintaining reliable and consistent data in the Permit Compliance System (PCS) which may
be used with confidence in environmental decision making. The manual contains three sections.
This first section serves as an introduction and presents background information on PCS and
Region 11's PCS QA program. Topics include PCS data quality standards, key ingredients of
the QA program, and the QA goals for PCS. The second section of this manual focuses on QA
in Region 11's PCS Operations. Four areas in the PCS data handling process where QA
procedures should be implemented are discussed - data capture, data transfer, data edit and
update error correction, and data base quality control. To assist in the development and
evaluation of QA procedures in each of these areas, several forms have been included for the
staff's use. The final section of this manual focuses on PCS Management. Topics briefly
discussed include assigning staff responsibility, establishing attainable goals, tracking
performance, and managing data input personnel, among others. Several forms are also
presented here to help you carry out these management functions.
1.1	PCS Quality Assurance Program
Region 11 is committed to cleaning and protecting our Nation's surface waters by carrying
out the goals of the Clean Water Act. Under Section 402 of the Act, all facilities that discharge
wastewater into waters of the United States must obtain a National Pollutant Discharge
Elimination System (NPDES) permit. As a condition of these permits, facilities must
periodically report their compliance information to EPA or to an NPDES-delegated State. This
C-l

-------
data is entered into PCS. The Region uses information retrieved from PCS to assess
performance in achieving the NPDES program objectives.
To meet the information needs of the NPDES program under the Clean Water Act, EPA's
Office of Wastewater Enforcement and Compliance developed PCS for tracking compliance and
enforcement status. It is a computerized information system for the automated entry, updating,
and retrieval of information on NPDES permitted facilities. PCS is designed using a data base
management system called ADABAS. It resides on the EPA IBM mainframe computer located
in Research Triangle Park, North Carolina.
The PCS Policy Statement, issued in 1985, designated PCS as the national data base for the
NPDES program and established the minimum required standard for data entry - the Water
Enforcement National Data Base (WENDB). Moreover, the PCS Policy Statement requires all
direct users of PCS to develop a QA program that includes monthly tracking of the level of data
entered, appropriate time frames for data entry, and nationally consistent standards for PCS data
completeness and accuracy.
1.2 PCS Data Quality Standards
How well Region 11's PCS data meets the definition of quality is evaluated based on an
objective assessment of four measures - timeliness, accuracy, completeness, and consistency.
Headquarters has proposed national standards for each of these measures which are based on
input from the Regions and States. The recommended standards are presented in Figure 1.1.
Region 11 has adopted these national standards in their entirety for its own PCS QA program.
Timeliness refers to the "punctuality" of information in the data base — as measured by the
length of time between the actual event (or receipt of information about the event) and its
appearance in the data base. The PCS targets for timeliness vary by the type of data being
entered into the system.
C-2

-------
Summary of Recommended PCS Data Quality Targets

Timeliness:
Entered within 5 working days of receipt of report, application, or action:
¦	Permit Facility Data
¦	Compliance Schedule Data
¦	Enforcement Action Data / Enforcement Action Key Data
¦	Single Event Violation Data
¦	Permit Events Data
¦	Evidentiary Hearing Data
Entered within 10 working days of receipt of report, application, or action:
¦	Pipe Schedule Data
¦	Parameter Limits Data
¦	Inspection Data
¦	Pretreatment PCI Audit Data
¦	Measurement/Violation Data
Entered within 30 working days of receipt of report:
¦	Pretreatment Performance Summary Data

Accuracy: 95% of the WENDB elements entered into PCS should be identical with those reported on the DMR, permit, or other input document.

Completeness: 95% of the WENDB elements entered for each facility.

Consistency: 100% of the WENDB elements use the appropriate values defined nationally.

Figure 1.1

-------
Accuracy refers to the absence of erroneous data resulting from mistakes at any point
in the data preparation, entry, or transmission process. Errors sometimes result from mistakes
by key-entry personnel, but they can also be introduced by program or facility personnel who
prepare the source documents used for data entry.
Completeness refers to the amount of required data present in the data base at a specific
point in time. Completeness is important to assure that all pertinent information is available for
use when it is needed. The PCS Policy Statement has designated the WENDB elements as the
minimum set of data elements required for PCS.
Consistency refers to the extent to which appropriate values are used for a data element as
defined nationally (in the case of WENDB data elements). For management reports to be the
most effective, data must be comparable over time within the area of interest. If valid
comparisons are to be made, then comparable codes or values must be used for the same data
elements over time and in different geographical areas (State or Region).
1.3 Key Ingredients for a QA Program
Region 11's QA program is based on a thorough understanding of the recommended
national PCS data quality standards for timeliness, accuracy, completeness, and consistency.
Combining this knowledge with well-documented procedures and effective management
commitment has produced a successful QA program resulting in better PCS data, smoother PCS
operations, and improved operation and management of the NPDES program in Region 11. The
success of Region 11's PCS QA program is due to the adoption and customizing of eight
ingredients to meet current program needs. These ingredients are:
¦	Measurable, well-defined data quality objectives
¦	Well-documented data collection and handling procedures
¦	Procedures for detecting and correcting errors
¦	Procedures to measure and track performance against goals
C-4

-------
¦	Clearly assigned staff responsibilities and oversight
¦	Adequate documentation, training, and communication
¦	Consistent management commitment to data quality
¦	Periodic review and evaluation of the QA program.
1.4 QA Goals for PCS
Every QA Program should set overall goals and establish data quality procedures to achieve
these goals. Among Region 11's QA goals for PCS are:
¦	Ensure consistency of data
¦	Ensure data is reliable for use in environmental decision making
¦	Identify areas where PCS software problems or enhancement recommendations should
be forwarded to Headquarters
¦	Ensure valid compliance/noncompliance statistics are reported to Headquarters,
Congress, and the general public
¦	Pinpoint areas where re-training efforts are necessary
¦	Ensure data is reliable for data integration efforts (e.g., multi-media studies)
¦	Provide effective oversight of delegated programs.
This manual has considered each of the key ingredients for a QA program and incorporates
them into its procedures in order to meet our QA goals for PCS and the NPDES program.

-------

-------
SECTION 2
PCS OPERATIONS
2.0 Overview
There are four major areas in Region 11's NPDES data handling process where QA
procedures have been established to ensure that high quality data are entered and maintained in
the system -- data capture, data transfer, edit and update error correction, and data base quality
control.
¦	Data capture — procedures relating to document handling prior to its entry into PCS, such
as logging in Discharge Monitoring Reports (DMR) from NPDES facilities.
¦	Data transfer — procedures relating to the entry of information from the input documents
into PCS, such as screening and entering the data from a DMR into PCS using PCS-ADE.
¦	Edit & update error correction — procedures to correct data errors resulting from the PCS
Edit or Update process, such as identifying and correcting errors from a PCS Edit Audit
report.
¦	Data base quality control — procedures to identify and correct data errors in the PCS data
base, such as running the PCS QA Retrieval to identify missing or invalid data elements.
The remaining portion of this section considers each of these four areas in turn. By paying
proper attention to data integrity in these areas, you can help maintain Region 11's PCS data
quality at its high level.
C-6

-------
2.1 Data Capture
This section of the manual documents data capture procedures for the NPDES program
documents processed by the Region. Data capture includes the receipt and sorting of input
documents, their preliminary review for missing or inaccurate data, and their submission for data
entry. In Region II, data for entry into PCS is reported on various source documents such as
the permit, the DMR, enforcement actions, and inspection reports. These procedures include
identifying due dates for reports where possible, logging and performing initial screening on
input documents, and verifying that input documents are properly routed for data entry.
The Region has designed two log-in forms to help you manually track source documents
received and when they are routed to the appropriate personnel for data entry. They are:
¦	PCS Data Transmittal and Processing Form
¦	Discharge Monitoring Report (DMR) Log-in Form.
2.1.1 General Document Processing
As NPDES documents are received by the EPA Unit Supervisor, they should be date
stamped. The "PCS Data Transmittal and Processing Form" used for forwarding work to the
PCS contractor (shown in Figure 2.1.1 on page C-8) should then be completed specifying the
type of work to be performed and an EPA Log number assigned to the request. (DMRs are
generally handled in batches and recorded on a separate DMR Log Form. Procedures for
logging and transmitting DMRs are discussed in Section 2.1.2.) The EPA requestor should also
indicate the date the documents are sent to the PCS contractor, verify that all accompanying
documents are included, and note any special requirements in the "Special" or "Comments"
areas of the form.
C-7

-------
[PCS Data Transmittal and Processing Form, with blocks for Requestor, Mail Code, Special,
EPA Date of Receipt, Type of Request, PCS Initials, Log No., dates received and completed by
the Contractor, date distributed to Requestor, and Comments]
Figure 2.1.1
C-8

-------
Instructions to Complete v
PCS Data Transmittal and Processing Form
~	1. Record the Requestor, Mail Code, if Special Processing is required, and Date of
Receipt in the boxes across the top of the form.
~	2. Check the type of request. Add appropriate remarks to "Comments" section at bottom
of form.
~	3. Initial the form, assign a log number, and record the date received and transmitted to
PCS Contractor.
C-9

-------
Upon receipt of the "PCS Data Transmittal and Processing Form," the PCS contractor
should enter the date received and verify that all accompanying documents have been included
to process the work. For example, reissued permit revisions and corrections, permit
modifications, administrative orders, and consent decrees should be accompanied by a current
Limitation Summary report. Any request for special or priority work should be given the
highest priority and processed within two updates if possible. Upon completion of the work,
the contractor should enter the "Completed Contractor" date in the form and the "Distributed
to Requestor" date and return it to the EPA requestor.
2.1.2 DMR Logging and Transmittal
All NPDES permits require the submission of DMRs to the Region or state. The DMRs
represent a summary of the results of monitoring tests taken by the permittee over a monitoring
period in accordance with the limits established in the permit. The states for which Region II
enters data record their DMR information on preprinted DMRs.
You may use the manual form in Figure 2.1.2 to track when DMRs are due or you may
run a DMR Administrative Report at the Parameter Level. This report will automatically
identify those facilities which are due within a specified time frame. Once the submission dates
are established, reports which have not been received should be flagged and follow-up steps
should be taken to determine why the DMR is late. When the DMRs are received at the
Regional office the following steps should be taken by the EPA Unit Supervisor:
~	Date stamp and copy.
~	Order by state and NPDES number.
~	Perform initial screening for obvious errors such as incorrect NPDES numbers,
missing data, and changes to any of the preprinted fields on the preprinted DMRs.
~	Record information on Log-in Forms.
C-10

-------
~	Submit copies of the DMRs, accompanied by a log sheet, to PCS contractor personnel
for further review and data entry.
~	The PCS contractor removes the log sheet, compiles against the batch of DMRs to
ensure that all forms were actually sent for the facility, and verifies that all the
facilities are listed.
~	The PCS contractor notes any discrepancies and returns the batch to the PCS Unit
Supervisor for corrections.
~	The DMR Log-In Form is filed in the Contractor Log book for DMRs.
If the number of DMRs exceeds that which can be easily tracked manually, use the
Generalized Retrieval to generate the due dates for the DMRs from PCS. The retrieval logic
to produce the report is shown on page C-14 and a sample report is illustrated in Figure 2.1.3.
The DMR Log-in Form would then be used only to transmit the DMRs in batches.
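The overdue-flagging logic described above can be pictured with a short Python sketch. This is an illustration only, not PCS logic; the field names and dates below are assumptions.

```python
from datetime import date

def flag_overdue_dmrs(dmrs, today):
    """Return NPDES numbers of DMRs that are due but not yet received.

    Each entry is a dict with 'npdes', 'due', and 'received'
    (None until the report arrives); field names are illustrative.
    """
    return [d["npdes"] for d in dmrs
            if d["received"] is None and d["due"] < today]

# Hypothetical log entries for two facilities.
log = [
    {"npdes": "NY0001234", "due": date(1992, 7, 28), "received": None},
    {"npdes": "NY0005678", "due": date(1992, 7, 28),
     "received": date(1992, 7, 20)},
]
print(flag_overdue_dmrs(log, today=date(1992, 8, 15)))  # ['NY0001234']
```

A facility flagged by such a check would then receive the follow-up contact described above.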
C-ll

-------
DMR Log-in Form
[Blank form with columns for NPDES Number, Pipe Number, Facility Name, Date Due,
Date Received, and Date Transferred to Data Entry]
Figure 2.1.2
C-12

-------
Instructions to Complete DMR Log-in Form
~	1. Record the NPDES Number in the first column, the pipe number in the second column,
the name of the facility in the third column, and the DMR due date in the fourth column.
~	2. When the DMR is obtained from the facility, record the date received in the fifth
column. As appropriate, contact facilities to check on DMR status. Record any
relevant notes from your conversation.
~	3. Review the DMR Log-in form periodically to identify any overdue reports.
~	4. As the DMRs are transferred to Data Entry, record the transfer date on form.
C-13

-------
DMR Administrative Report Retrieval
Example PCS Generalized Retrieval. The retrieval statements below produce a report similar
to the example on the following page.
[PCS Generalized Retrieval statements]
Output produced from the above retrieval statements can be used as a DMR Log-in form.


[Sample PCS DMR Administrative Report at the parameter level, listing each facility's NPDES
number, name, and address, with the parameters and monitoring periods for which DMRs are
overdue]
Figure 2.1.3
2.1.3 Problem Facility Phone List
The Problem Facility Phone List (illustrated in Figure 2.1.4 on page C-16) is to aid you
in contacting the facilities in Region II that are habitually late in sending in their reports.
Instructions for completing the form are on the back. There are several other methods of
maintaining a phone list, such as using a personal computer software package to maintain a word
processing document or a dBase file. Such an approach is an efficient method if a personal
computer is available for use.
C-14

-------
2.2 Data Transfer
Data transfer includes events taking place after the input document has been received,
logged-in, and transmitted to the PCS Contractor until the data has been keyed into the system.
Examples include the detailed screening of the input documents, coding the input, resolution of
any obvious problems (for example, missing data), and entering the data into PCS using PCS-
ADE, PC-Entry, or batch entry. When pursued carefully and diligently, the screening of PCS
data and the timely resolution of the identified problems before data entry will have a significant
effect on PCS data quality. Region II's QA program focuses on establishing and documenting
standard procedures for each major PCS data information type or NPDES source document (i.e.,
Facility Information, DMRs, Permits, Inspections, and Enforcement Actions) so that data are
correctly entered into PCS.
There are two techniques used in Region II for entering data into PCS — a batch method
and an interactive method using PCS-ADE. The batch method involves using the PCS Data
Entry software package on a personal computer. It takes the data and converts it to a card
format to be uploaded to the mainframe. The interactive method involves logging onto the
mainframe and using PCS-ADE to enter the data directly. The majority of the data entry in
Region II is done using PCS-ADE, with the exception of DMR data, which is frequently entered
using PC-Entry.
This section is organized to present QA procedures for data transfer by the type of NPDES
document received for processing in the Region, for example, the Permit, the DMR, or an
Inspection Report. Instructions for preparing, screening, and entering the document information
and follow-up steps are presented for each of the documents in a checklist format. The checklist
approach provides for an orderly review method of the most critical data elements and
ensures each step in the process is followed.
In addition, procedures for using the Range checking option to check measurement data for
validity are discussed in Section 2.2.4 after the DMR data entry procedures.
C-15

-------
Problem Facility Phone List
[Blank form with columns for NPDES Number, Facility Name, Contact, Phone, and
Problem / Comments / Action]
Figure 2.1.4
C-16

-------
Instructions to Complete
Problem Facility Phone List
Record the NPDES Number in first column and the name of the facility in the
second column.
Record the contact person in the third column and their phone number
in the fourth column.
Briefly note the problem and action taken in the last column. Record any relevant
notes from your conversation.
C-17

-------
2.2.1 Permit Facility Level Data Entry
Facility Level data includes general information from the NPDES permit describing each
permitted facility. For example, the facility's name, address, classification, and design flow rate
are included in this data type.
Permit Facility Level Data Entry Checklist
Preparation
~	1. PCS Assistant / Specialist ensures that all of the following input documents are
available:
~	Application
~	Public Notice
~	Permit.
Screening
~	2. PCS Assistant / Specialist reviews documents to check for:
~	Blank permit event dates
~	Original issue date is not greater than current issue date
~	Current issue date greater than reissue, modification, effective, or expiration
date
~	Effective date greater than expiration date
~	Reissue date greater than effective and expiration date
~	Permit modification date greater than effective and expiration date.
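The date checks in this screening step are simple orderings and can be sketched in code. The sketch below is an illustration only; the record layout and key names are assumptions, not PCS data elements.

```python
from datetime import date

def screen_permit_dates(p):
    """Return screening problems for one permit record, mirroring the
    checklist: blank event dates and out-of-order issue, effective,
    and expiration dates. Keys are illustrative; values are dates or None.
    """
    problems = []
    for field in ("original_issue", "current_issue", "effective", "expiration"):
        if p.get(field) is None:
            problems.append("blank permit event date: " + field)
    if p.get("original_issue") and p.get("current_issue") \
            and p["original_issue"] > p["current_issue"]:
        problems.append("original issue date after current issue date")
    if p.get("effective") and p.get("expiration") \
            and p["effective"] > p["expiration"]:
        problems.append("effective date after expiration date")
    return problems

clean = {"original_issue": date(1988, 3, 1), "current_issue": date(1992, 6, 1),
         "effective": date(1992, 7, 1), "expiration": date(1997, 6, 30)}
print(screen_permit_dates(clean))  # []
```

Any problem returned by such a check corresponds to an item the reviewer would resolve before data entry.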
Data Entry
~	3. PCS Contractor enters data using PCS-ADE.
~	4. PCS-ADE automatically checks for the following:
~	Valid NPDES number
~	Valid PCS codes (SIC, City, County, etc.)
~	Valid alpha/numeric entry.
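A format check like the NPDES number validation above can be approximated with a regular expression. The nine-character pattern below (a two-letter state code followed by seven digits, e.g. NY0001234) is an assumption for the sketch; PCS-ADE applies its own key validation.

```python
import re

# Assumed format: two-letter state code followed by seven digits.
# Real PCS validation may accept other forms.
NPDES_RE = re.compile(r"^[A-Z]{2}[0-9]{7}$")

def valid_npdes(number):
    """Return True if the permit number matches the assumed format."""
    return NPDES_RE.match(number) is not None

print(valid_npdes("NY0001234"))  # True
print(valid_npdes("NY123"))      # False
```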
Follow-Up
~	7. PCS Contractor reviews the Edit Audit and Update Audit reports using the
checklists in Sections 2.3.1 and 2.3.2, verifies accepted transactions, and resolves
and corrects errors.
~	8. PCS Contractor ensures all input documents are initialed, are marked completed
with the completion date, and are stored in the permanent files.
C-18

-------
2.2.2 Permit Pipe Schedule / Limits Data
This data type contains information relating directly to permitted effluent limits and
parametric requirements, in addition to information relating to DMR submission and reporting
requirements for specific permitted outfalls.
Permit Pipe Schedule / Limits Data Entry Checklist
Preparation
~	1. PCS Assistant / Specialist ensures that one of the following input documents is
available:
~	Permit (includes general permit)
~	Administrative Order
~	Modified Permit
~	Consent Decree.
~ 2. PCS Assistant / Specialist retrieves PCS data using Limits Summary Report or
DMR Printed Report and Quick Look for Reviewing Permits to compare to new
permit requirements. Examples of these reports are illustrated in Section 2.4.
Screening
~	3. PCS Assistant / PCS Contractor compares reissued or modified permits to the
information in the Limits Summary to determine what changes are required.
Particular attention should be given when making comparisons to ensure that the
PCS Assistant / PCS Contractor:
~	Verifies and notes PCS issue and expiration date.
~	Notes seasonal, quarterly, annual pipes, parameters.
~	Verifies and notes interim/final limits and associated start/end dates.
~	Notes any additional monitoring requirements added from Part III of Permit.
~	Explains any unique permit requirements.
~	Checks for any DMR date greater than effective date of permit.
~	Determines appropriate modification number and COLS if limits are from
Consent Decree or Administrative Order.
C-19

-------
Permit Pipe Schedule / Limits Data Entry Checklist (continued)
~	Examines Monitoring Locations (MLOC) to ensure that they are coded
correctly, that the code used accurately reflects monitoring location
requirements stated in the permit.
~	Examines Unit Codes to ensure that they accurately reflect the units stated
in the permit. Especially check that milligrams per liter (mg/l) and
micrograms per liter (ug/l) are used as stated in the permit.
~	Examines Statistical Base Codes to ensure they are correct.
~	Examines Limit Data to ensure that data in PCS under the headings LQAV
(Limit Quantity Average), LQMX (Limit Quantity Maximum), LQMN
(Limit Quantity Minimum), LCAV (Limit Concentration Average), and
LCMX (Limit Concentration Maximum) are exactly the same as is written
in the permit.
~	Examines any Special Conditions on the Permit to ensure they are correctly
coded in PCS.
~	Examines start and end dates to ensure that there is no overlap with the
reissued permit start and end dates.
~	4. PCS Assistant / PCS Contractor marks any changes on current Limits Summary
Report.
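The field-by-field comparison against the Limits Summary amounts to a record diff, which can be pictured as follows. The field names (mloc, unit_code, lqav, and so on) are assumptions for illustration; a real review works from the printed Limits Summary report.

```python
def compare_limit(pcs_row, permit_row):
    """Return (field, pcs_value, permit_value) triples for every field
    where the PCS limit record disagrees with the reissued permit.
    Field names are illustrative, loosely following the PCS headings.
    """
    fields = ("mloc", "unit_code", "stat_base",
              "lqav", "lqmx", "lqmn", "lcav", "lcmx")
    return [(f, pcs_row.get(f), permit_row.get(f))
            for f in fields if pcs_row.get(f) != permit_row.get(f)]

# Hypothetical records: the unit code disagrees (e.g. mg/l vs ug/l).
pcs = {"mloc": "1", "unit_code": "19", "stat_base": "01", "lcav": 30.0}
permit = {"mloc": "1", "unit_code": "26", "stat_base": "01", "lcav": 30.0}
print(compare_limit(pcs, permit))  # [('unit_code', '19', '26')]
```

Each returned difference corresponds to a change the reviewer would mark on the current Limits Summary Report.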
Data Entry
~	5. PCS Assistant routes material to appropriate personnel for data entry.
~	6. PCS data entry personnel enters outfalls and limits into PCS using PCS-ADE.
PCS-ADE will verify:
~	Alpha/numeric values
~	Overlapping dates
~	Valid PCS codes (unit codes, etc.)
~	Valid transaction code
~	Presence of COLS
C-20

-------
Permit Pipe Schedule / Limits Data Entry Checklist (continued)
~	7. PCS Contractor reviews the Edit Audit and Update Audit reports using the
checklists in Sections 2.3.1 and 2.3.2. Verifies accepted transactions and resolves
and corrects errors.
~	8. PCS Specialist / PCS Contractor runs new limits summary after updates and
compares to source documents for 100% accuracy. Any errors are returned to
PCS Contractor for data entry.
~	9. PCS Contractor ensures all input documents are stored in the permanent files.
C-21

-------
2.2.3 DMR Data
The DMR form is used to record the results of monitoring required by the NPDES permit.
These forms are preprinted with the effluent limits contained in the permit and mailed to the
permittee for reporting. In Region II the primary method of coding and entering DMR data is
the DMR Data Entry Checklist approach. The Region also supports the Sample DMR approach.
2.2.3.1 Sample Discharge Monitoring Report (DMR)
The Sample Discharge Monitoring Report is one method for ensuring that NPDES permits
are written in a form that can be accurately coded and entered into PCS. To use this method
successfully, it is vital that the Permit Writer and the PCS Coder communicate effectively.
The Permit Writer initiates the process by completing the Sample DMR forms with the
required information from the draft permit concerning the facility, contact person, parameters
and numeric limits, sample type, and frequency of analysis. Any special conditions concerning
the permit are noted on the form. It is often helpful to the Permit Writer to communicate with
the PCS Coder concerning coding limits that may apply in Region II. Once the Sample DMR
is completed, it is sent to the PCS Coder.
The PCS Coder then inserts the proper PCS code for all information on the same form.
Special permit conditions are also coded directly on the Sample DMR forms. The PCS Coder
clarifies any questions with the Permit Writer and informs the Permit Writer of any specific
procedures concerning Region II's coding limits. This back and forth communication between
the Permit Writer and the PCS Coder ensures that the finalized permit can be accurately recorded
and coded in PCS.
As the Permit Writer and the PCS Coder become more familiar with the Sample DMR
process, they can work together to help each other become more efficient. For example, the
Permit Writer can indicate appropriate codes for parameters and unique monitoring conditions.
The PCS Coder can help the Permit Writer understand the standard procedures for coding
information for PCS. The Permit Writer and the PCS Coder may use the checklist on the
following page as a guide to complete the Sample Discharge Monitoring Report.
C-22

-------
Sample Discharge Monitoring Report (DMR) Procedures
FOR THE PERMIT WRITER: (See Form A for a Sample DMR completed by the Permit
Writer.)
Using a copy of the permit, complete the Sample DMR by including, at a minimum, the following
items:
~	1. Facility name and address
~	2. Name of person to whom form should be sent
~	3. Facility permit number
~	4. Discharge number (e.g. 001, 002)
~	5. Any notes to appear in upper right corner, at your discretion
~	6. Parameters (e.g. flow, BOD-5)
~	7. Numeric limits
~	8. Statistical base codes (e.g. monthly average)
~	9. Units (e.g. lbs/day, mg/l)
~	10. Frequency of analysis (e.g. continuous, 2/week)
~	11. Sample type (e.g. recorded, 24-hr. composite)
~	12. Note any special monitoring conditions that apply to specific parameters (such as
monitoring location) on "Comments and Explanation" section at bottom of DMR.
~	13. If a parameter has seasonal limits, enter the different limits in two rows and specify
the dates that they apply.
~	14. Place an "X" in the boxes that do not have limits
~	15. Add any additional explanations as necessary for the PCS Coder
FOR THE PCS CODER: (See Form B for a Sample DMR completed by the PCS
Coder).
Insert the following codes on the Sample DMR:
~	1.	Parameter codes (e.g. 50050, 00310)
~	2.	Monitoring locations (e.g. P, Q)
~	3.	Season Codes (e.g. season 1 and 2 for BOD and Ammonia)
~	4.	Statistical base codes (e.g. 01, 13)
~	5.	Unit codes (e.g. 03, 26, 19)
~	6.	Frequency of Analysis codes (e.g. 99/99, 02/07)
~	7.	Sample type codes (e.g. RC, 24, GR)
~	8.	Add any permit conditions as specified by permit writer
~	9.	Clarify any questions with permit writer
C-23

-------
Figure 2.2.1
C-24

-------
Figure 2.2.1 (Continued)
C-25

-------
Figure 2.2.2
C-26

-------
Figure 2.2.2 (Continued)
C-27

-------
2.2.3.2 DMR Data Entry Checklist
Preparation
~ 1. PCS Assistant / Specialist ensures that the following input documents are
available:
~	Preprinted Discharge Monitoring Reports (DMRs)
~	Limits Summary.
Screening
~ 2. PCS Assistant / Specialist screens all preprinted DMRs as follows:
~	A. Orders DMRs by NPDES number within a state.
~	B. Checks the following fields for legibility and for alteration by permittee:
NPDES number, limit type, Discharge/Designator number, and
Monitoring Period.
~	C. Checks remainder of DMR to ensure permittee has not altered any
preprinted field (especially parameter and monitoring location codes).
~	D. Checks Units field for each parameter to ensure that permittee has not
altered preprinted units and that units match Limitations Summary. If
needed, converts to correct units.
~	E. Checks each parameter for real numerical values or acceptable PCS
codes. If fields are left completely blank, or if "N/A" or "Not
applicable" is entered, checks to see if the following pertains to the
parameter:
~	a) the comments field at the bottom of the DMR
~	b) the "Frequency of Analysis" field (if < monitoring period)
~	c) if the parameter is "paired" (i.e. winter/summer temperature).
~	F. If parameters have been added to preprinted DMR, checks Limitations
Summary.
~	G. Checks "Frequency of Analysis" field for valid codes.
C-28

-------
DMR Data Entry Checklist (continued)
~	H. Checks bottom and rear of form and attachments for Notice of Non-
Compliance.
~	I. When the screening and coding process is complete, initials and dates each
DMR and initiates the Data Entry process.
~	3. PCS Assistant / Specialist completes code-sheet for data entry.
~	4. PCS Supervisor reviews code-sheets to:
~	Check for blanks
~	Verify all MGD flow values greater than 100
~	Check for reported maximum values greater than reported average values
~	Verify pH values less than 4.0 or greater than 11.0
~	Check for coding consistency
~	Check key data elements.
~	5. PCS Supervisor marks questionable items and discusses them with staff.
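The supervisor's plausibility checks above translate directly into code. The thresholds (100 MGD flow and the pH 4.0 to 11.0 band) come from the checklist itself; the row layout and field names are assumptions for the sketch.

```python
def review_code_sheet(rows):
    """Flag questionable DMR code-sheet entries for discussion.

    Each row is a dict with 'param' and optional 'flow_mgd', 'ph',
    'avg', and 'max' values (names are illustrative).
    """
    flags = []
    for r in rows:
        if r.get("flow_mgd") is not None and r["flow_mgd"] > 100:
            flags.append((r["param"], "MGD flow greater than 100"))
        if r.get("ph") is not None and not (4.0 <= r["ph"] <= 11.0):
            flags.append((r["param"], "pH outside 4.0-11.0"))
        if (r.get("avg") is not None and r.get("max") is not None
                and r["max"] < r["avg"]):
            flags.append((r["param"], "reported maximum below average"))
    return flags

rows = [{"param": "FLOW", "flow_mgd": 250.0},
        {"param": "PH", "ph": 7.1},
        {"param": "BOD5", "avg": 30.0, "max": 12.0}]
print(review_code_sheet(rows))
```

Flagged items correspond to the questionable entries the supervisor marks and discusses with staff.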
Data Entry (for corrections and late submissions)
~	6. PCS Assistant / Supervisor enters data using PCS-ADE or PC-Entry.
~	7. PCS-ADE automatically checks for the following:
~	Valid alpha/numeric entry
~	Valid date ranges
~	Valid key values
~	Valid PCS codes,
~	8. PCS Assistant / Supervisor notes warning messages and discusses them with
appropriate person.
~	9. PCS Assistant / Supervisor enters corrections.
Data Entry (for Majority of DMRs)
~	6. PCS Assistant / Supervisor routes code-sheets to data entry personnel.
~	7. PCS data entry personnel keys preprinted DMRs:
~	Reviews current limits summary
~	Notes any changes to limits on DMR
C-29

-------
DMR Data Entry Checklist (continued)
~	Batches, logs and enters data.
~	8. PCS data entry personnel enters data using PC-ENTRY (Batch).
~	9. PCS data entry personnel conducts dummy edit.
~	10. Data entry personnel reviews rejected transactions from dummy Edit Audit report.
~	11. Data entry personnel researches and edits data set prior to live edit submittal.
~	12. PCS data entry personnel discusses problems with PCS Supervisor.
~	13. PCS data entry personnel corrects rejected transactions and submits live edit.
Follow-Up
~	14. PCS Assistant / Specialist reviews the Edit Audit and Update Audit reports using
the checklists in Sections 2.3.1 and 2.3.2. Verifies accepted transactions and
resolves and corrects errors. Verifies non-reporting violations in PCS.
~	15. PCS Assistant / Specialist ensures all input documents are stored in the permanent
files.
C-30

-------
2.2.4 PCS Range Checking
Range Checking is one PCS optional feature that may be used to improve the accuracy of
your measurement data. It identifies measurements with values lying outside a range considered
to be valid and highlights extremely high or low values which warrant further investigation.
Several different sets of valid ranges can be specified, allowing you to tailor your range-checking
to specific requirements. The PCS Range Checking capability is available for both Batch Edits
and PCS-ADE.
Checklist to Use PCS Range Checking
Using the PCS Range Checking Capabilities with Batch Edits:
~	1. Enable Range Checking option by using the appropriate JCL statements.
~	2. Select National Default Table or Region or State-specific table by using the
appropriate JCL statements.
~	3. Examine all identified values on the report that fall outside of the normal range
and determine their validity.
Using the PCS Range Checking Capabilities with PCS-ADE: Range checking may be
enabled in PCS-ADE on both the EVIO and the EDMR screens. This example will employ the
EDMR screen.

PCS-ADE                 EFFLUENT DMR DATA                PCDEKEYD
                           KEY SCREEN             SCREEN ID: EDMR
PERMIT # _________
TRANS CODE _
DISCHARGE NUMBER ___
REPORT DESIGNATOR _
RANGE CHECKING? Y/N _
TABLE ID: _____
ACCEPT? Y/N/M: _
                                               VERSION n.n <date>
Figure 2.2.3
C-31

-------
Checklist to Use PCS Range Checking (continued)
~	1. Enable option through the Effluent DMR Data Key Screen. (Enter Y at Range
Checking? Y/N.)
~	2. Select National Default Table or Region or State-specific table through the
Effluent DMR Data Key Screen. If no table is specified, the National Default
Table will be used.
When range checking is enabled, the Effluent DMR Data Screen will display the message
that Range Checking is turned on in the upper right hand corner (See Figure 2.2.4).
EFFLUENT DMR DATA                                      PCDEEDMR
                                                SCREEN ID: EDMR
PERMIT # _________   TRANS CODE _          RANGE CHECKING IS ON
DISCHARGE NUMBER ___   REPORT DESIGNATOR _      TABLE ID: XXXXX
MONITORING PERIOD END DATE ______        - USE PF2 TO OVERRIDE
DMR RECEIVED DATE ______
PIPE NO __   DISCHARGE REASON CODE __
     VPRM   ---QUANTITY---   ---CONCENTRATION---   FREQ. OF   SMPL
            AVG      MAX     MIN    AVG     MAX    ANALYSIS   TYPE
1:   ____   ____     ____    ____   ____    ____   ______     __
2:   ____   ____     ____    ____   ____    ____   ______     __
3:   ____   ____     ____    ____   ____    ____   ______     __
ACCEPT: Y/K/P/N/M: _                          VERSION N.N <DATE>
Figure 2.2.4
~ 3. Examine any value identified as falling outside of the normal range. If the value
is valid press PF2 to override.
C-32

-------
At any time during the PCS-ADE session, the range-checking environment can be changed:
range-checking can be turned on or off, and the Table ID can be changed. This is accomplished
by returning to the EDMR Key Screen and changing these two values.
When range-checking is enabled, each measurement entry is compared against the
corresponding ranges in the selected Table (if an entry exists for that parameter). If the
measurement falls outside the range allowed, a Fatal error will occur. In order to allow PCS-
ADE to accept that measurement entry (assuming it has been correctly entered), the user will
have to press the override function key displayed in the range-checking notice on the data entry
screen. This will force PCS-ADE to accept the data. The override function will only apply to
the currently displayed range-checking error. If multiple range-checking errors are found on a
measurement or group of measurements and the override function is necessary, it will have to
be invoked for each error to be overridden. The override function will have no effect on any
other edit errors.
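The range-checking behavior just described, including the per-entry override, can be sketched as follows. The table layout and the parameter code and band below are illustrative assumptions; the real range tables are maintained within PCS.

```python
def range_check(measurements, table, overridden=frozenset()):
    """Return fatal range errors, loosely mimicking PCS-ADE: a value
    outside its parameter's (low, high) band is an error unless the
    operator has overridden that parameter (the PF2 key in PCS-ADE).
    Parameters with no table entry are not checked.
    """
    errors = []
    for param, value in measurements:
        band = table.get(param)
        if band is None:
            continue  # no entry in the selected table: no check
        low, high = band
        if not (low <= value <= high) and param not in overridden:
            errors.append((param, value))
    return errors

table = {"00400": (4.0, 11.0)}  # illustrative pH band
print(range_check([("00400", 12.3)], table))                        # [('00400', 12.3)]
print(range_check([("00400", 12.3)], table, overridden={"00400"}))  # []
```

Note that, as in PCS-ADE, the override applies only to the specific range-checking error it was invoked for; other errors still surface.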
C-33

-------
2.2.5 Compliance Schedule Data
The Compliance Schedule contains information to track the scheduled versus achieved dates
of specific milestone events which are conditions of the facility's permit. The Compliance
Schedule violations, such as failure to achieve a milestone or failure to submit a required report,
are automatically determined from this information.
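The automatic violation determination can be pictured with a small sketch. The event layout is an assumption for illustration; PCS's actual logic also covers required report submissions.

```python
from datetime import date

def schedule_violations(events, today):
    """Return milestone events in violation: the scheduled date has
    passed with no achieved date recorded. Field names are illustrative.
    """
    return [e["event"] for e in events
            if e["achieved"] is None and e["scheduled"] < today]

# Hypothetical milestones from one facility's schedule.
events = [
    {"event": "begin construction", "scheduled": date(1992, 3, 1),
     "achieved": date(1992, 2, 15)},
    {"event": "attain compliance", "scheduled": date(1992, 6, 1),
     "achieved": None},
]
print(schedule_violations(events, today=date(1992, 8, 1)))
```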
Compliance Schedule Data Entry Checklist
Preparation
~	1. PCS Assistant / Specialist ensures availability of one of the following input
documents:
~	Permit and Modified Permit
~	Administrative Order
~	Consent Decree.
Screening
~	2. PCS Assistant / Specialist prepares code-sheets for data entry.
~	3. PCS Supervisor reviews code-sheets to:
~	Check for blanks
~	Check for coding consistency
~	Check key data elements.
~	4. PCS Supervisor marks questionable items and discusses them with staff.
Data Entry
~	5. PCS Supervisor routes PCS-ADE code-sheets to data entry personnel for entry to
PCS-ADE.
~	6. PCS-ADE automatically checks for the following:
~	Valid alpha/numeric entry
~	Valid date ranges
~	Valid PCS codes
~	Record on file (new transactions)
~	Record exists in data base for transactions being changed, deleted or mass
deleted.
~	Associated violation does not exist for delete transactions.
C-34

-------
Compliance Schedule Data Entry Checklist (continued)
~	7. Data entry personnel notes warning messages and discusses with appropriate
person.
~	8. Data entry personnel reenters corrections.
FoIIow-Up
~	9. PCS Assistant / PCS Contractor reviews the Edit Audit and Update Audit reports
using the checklists in Sections 2.3.1 and 2.3.2, verifies accepted transactions
and resolves and corrects errors.
~	10. PCS Assistant / PCS Contractor verifies compliance schedule number against
events to ensure correct number is used on Pretreatment, Municipal Water
- Pollution Prevention, and Orders for Information.
~	11. PCS Assistant / PCS Contractor verifies that all schedule numbers related to
formal enforcement activities have a docket number present.
~	12. PCS Assistant / PCS Contractor runs a printout of schedule violations with
associated enforcement actions to ensure action is taken on all violations.
~	13. PCS Assistant / PCS Contractor ensures all input documents are stored in the
permanent files.
C-35

-------
2.2.6 Enforcement Actions Data
This data type includes information related to enforcement actions taken by the authorized
agency against the facility as a result of effluent limits violations, non-receipt of DMRs, or
compliance schedule violations.
Enforcement Actions Data Entry Checklist
Preparation
~	1. PCS Assistant / Specialist ensures that all of the following input documents are
available:
~	Administrative Order
~	Consent Decree
~	Warning Letter
~	Record of communication
~	Compliance Review Action Sheet (CRAS)
~	Meeting.
Screening
~	2. PCS Assistant / Specialist prepares code-sheets for data entry.
~	3. PCS Supervisor reviews code-sheets to:
~	Check for blanks
~	Check for correct compliance schedule number
~	Check for coding consistency
~	Check key data elements.
~	4. PCS Supervisor marks questionable items and discusses them with staff.
Data Entry
~	5. PCS Supervisor routes code-sheets to appropriate personnel for data entry using
PCS-ADE.
~	6. PCS-ADE automatically checks for the following:
~	Valid alpha/numeric entry
~	Valid date ranges
~	Valid PCS codes
~	Enforcement action key data elements do not exist in data base for new
transactions
C-36

-------
Enforcement Actions Data Entry Checklist (continued)
~ Enforcement action key data elements do exist in data base for transactions
being changed, deleted or mass deleted.
~	7. Data entry personnel notes warning messages and discusses with appropriate
person.
~	8. Data entry personnel reenters corrections.
FoUow-Up
~	9. PCS Assistant / Specialist reviews the Edit Audit and Update Audit reports using
the checklists in Sections 2.3.1 and 2.3.2. Verifies accepted transactions and
resolves and corrects errors.
~	10. PCS Assistant / Specialist verifies that every enforcement action has an associated
violation and a docket number in PCS.
~	11. PCS Assistant / Specialist verifies Payment and Consent Agreement is entered on
Administrative Penalty Orders.
~	12. PCS Assistant / Specialist ensures all input documents are stored in the permanent
files.
C-37

-------
2.2.7 Inspections Data
This data type contains information relating to the inspections of facilities, such as the
performing inspector, inspection scheduling information, and inspector comments.
Inspections Data Entry Checklist
Preparation
~	1. PCS Assistant / Specialist ensures that Inspection Report is available.
Data Entry
~	2. PCS Assistant / Specialist enters data using PCS-ADE.
~	3. PCS-ADE automatically checks for the following:
~	Valid alpha/numeric entry
~	Valid date ranges
~	Valid PCS codes.
~	4. PCS Assistant / Specialist proofs all transactions (both accepted and rejected).
~	5. PCS Assistant / Specialist discusses problems with inspector or engineer/scientist
performing inspection or audit.
~	6. PCS Assistant / Specialist re-keys rejected transactions.
Follow-Up
~	7. PCS Assistant / Specialist reviews the Edit Audit and Update Audit reports using
the checklists in Sections 2.3.1 and 2.3.2, verifies accepted transactions and
resolves and corrects errors.
~	8. PCS Assistant / Specialist ensures all input documents are stored in the permanent
files.
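The three automatic PCS-ADE edits named in step 3 — valid alpha/numeric entry, valid date ranges, and valid PCS codes — amount to simple field checks. A modern illustrative sketch follows (not the PCS-ADE software; the field kinds and the example code table are hypothetical):

```python
# Illustrative field edits: alpha/numeric entry, MMDDYY date, code table.
import re
from datetime import datetime

VALID_INSPECTION_CODES = {"A", "B", "C", "P"}   # hypothetical code table

def validate_field(value, kind):
    if kind == "alphanumeric":
        return bool(re.fullmatch(r"[A-Za-z0-9]+", value))
    if kind == "date":                     # PCS dates shown as MMDDYY
        try:
            datetime.strptime(value, "%m%d%y")
            return True
        except ValueError:
            return False
    if kind == "code":
        return value in VALID_INSPECTION_CODES
    return False

print(validate_field("XX0012345", "alphanumeric"))  # True
print(validate_field("063092", "date"))             # True
print(validate_field("133092", "date"))             # False: month 13
```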
C-38

-------
2.2.8 Pretreatment Compliance Inspection Data
This data type includes information from either a Pretreatment Audit or a Pretreatment
Compliance Inspection.
Pretreatment Compliance Inspection Data Entry Checklist
Preparation
~	1. PCS Assistant / Specialist ensures that the PCI - Inspection Report is available.
Screening
~	2. Inspector prepares code-sheet for data entry and forwards it to appropriate
personnel for data entry using PCS-ADE.
Data Entry
~	3. PCS-ADE automatically checks for the following:
~	Valid alpha/numeric entry
~	Valid date ranges
~	Parent inspection on file
~	Valid PCS codes
O " Type of inspection - P
~	Totals are compared for SIUS, CITJS, NOIN, and NOCM.
~	4. Data entry personnel notes warning messages and discusses them with sppmpnme,
person.
~	5. Data entry personnel reenters corrections.
Follow-Up
~	6. PCS Assistant / Specialist reviews the Edit Audit and Update Audit reports using
the checklists in Sections 2.3.1 and 2.3.2. Verifies accepted transactions and
resolves and corrects errors.
~	7. PCS Assistant / Specialist verifies Pretreatment Compliance Inspection dates in
inspection family against PPETS.
~	8. PCS Assistant / Specialist ensures all input documents are stored in the permanent
files.
C-39

-------
2.2.9 Pretreatment Audit Data
This data type includes information from the Annual Report or Pretreatment Performance
Summary (PPS) submitted by a pretreatment facility.
Pretreatment Audit Data Entry Checklist
Preparation
~	1. PCS Assistant / Specialist ensures that Pretreatment Audit - Inspection Report is
available.
Screening
~	2. Compliance Engineer/Scientist prepares code-sheet for data entry.
Data Entry
~	3. Compliance Engineer/Scientist routes code-sheet to appropriate personnel for
entry using PCS-ADE.
~	4. PCS-ADE automatically checks for the following:
~	Valid alpha/numeric entry
~	Valid date ranges
~	Parent inspection on file
~	Valid PCS codes
~	PSSD must be entered for new, change, and replace transactions.
~	5. Data entry personnel notes warning messages and discusses them with appropriate
person.
~	6. Data entry personnel reenters corrections.
Follow-Up
~	7. PCS Assistant / Specialist reviews the Edit Audit and Update Audit reports using
the checklists in Sections 2.3.1 and 2.3.2. Verifies accepted transactions and
resolves and corrects errors.
~	8. PCS Assistant / Specialist verifies Pretreatment Audit dates in inspection family
against PPETS.
~	9. PCS Assistant / Specialist ensures all input documents are stored in the permanent
files.
C-40

-------
2.2.10 Permit Event Data
The Permit Event data type contains information to track the events related to permit
issuance.
Permit Event Data Entry Checklist
Preparation
~	1. PCS Assistant / Specialist ensures that the Permit and Modified Permit are
available.
Screening
~	2. PCS Assistant / Specialist prepares code-sheet using the Permit and Modified
Permit.
Data Entry
~	3. PCS Assistant / Specialist enters data using PCS-ADE.
~	4. PCS-ADE automatically checks for the following:
~	Valid alpha/numeric entry
~	Valid date ranges
~	Valid PCS codes
~	Key data elements do not exist in data base for new transactions
~	Key data elements do exist in data base for transactions being changed or
deleted.
~	5. PCS Assistant / Specialist notes warning messages and discusses with appropriate
person.
~	6. PCS Assistant / Specialist reenters corrections.
Follow-Up
~	7. PCS Assistant / Specialist reviews the Edit Audit and Update Audit reports using
the checklists in Sections 2.3.1 and 2.3.2. Verifies accepted transactions and
resolves and corrects errors.
~	8. PCS Assistant / Specialist ensures all input documents are stored in the permanent
files.
C-41

-------
2.2.11 Evidentiary Hearing Data
The Evidentiary Hearing data type contains information related to a facility's appeal or
negotiation of limits or compliance schedules at evidentiary hearings.
Evidentiary Hearing Data Entry Checklist
Preparation
~	1. PCS Assistant / Specialist ensures that all evidentiary hearing documents are
available.
Data Entry
~	2. Data entered by Regional Counsel using PCS-ADE.
~	3. PCS-ADE automatically checks for the following:
~	Valid alpha/numeric entry
~	Valid date ranges
~	Valid PCS codes.
FoUow-Up
~	4. PCS Assistant / Specialist reviews the Edit Audit and Update Audit reports using
the checklists in Sections 2.3.1 and 2.3.2. Verifies accepted transactions and
resolves and corrects errors.
~	5. PCS Assistant / Specialist ensures all input documents are stored in the permanent
files.
C-42

-------
2.2.12 Single Event Data
This data type contains information describing violations not related to effluent limits or
compliance schedules.
Single Event Data Entry Checklist
Preparation
~	1. PCS Assistant / Specialist ensures that all of the following input documents are
available:
~	Non-compliance report
~	Inspection Report
~	Record of communication
~	Administrative Order
~	Consent Decree.
Screening
~	2. PCS Assistant / Specialist prepares code-sheet.
~	3. PCS Supervisor reviews code-sheets to:
~	Check for blanks
~	Check for correct compliance schedule number
~	Review coding for consistency
~	Check key data elements.
~	4. PCS Supervisor returns questionable items to staff for resolution.
Data Entry
~	5. PCS Assistant / Specialist enters data using PCS-ADE.
~	6. PCS-ADE automatically checks for the following:
~	Valid alpha/numeric entry
~	Valid date ranges
~	Valid PCS codes
~	Enforcement action key data elements do not exist in data base for new
transactions
~	Enforcement action key data elements do exist in data base for transactions
being deleted (change transactions not allowed).
C-43

-------
Single Event Data Entry Checklist (continued)
~	7. PCS Assistant / Specialist notes warning messages on code-sheet and discusses
them with PCS Supervisor.
FoUow-Up
~	8. PCS Assistant / Specialist reviews the Edit Audit and Update Audit reports using
the checklists in Sections 2.3.1 and 2.3.2. Verifies accepted transactions and
resolves and corrects errors. Also verifies that related enforcement action/SNC
is entered into PCS.
~	9. PCS Assistant / Specialist ensures all input documents are stored in the permanent
files.
C-44

-------
2.3 Edit and Update Error Correction
This section of Region II's QA manual focuses on procedures to correct data errors
identified as a result of the Edit/Update process in PCS. In Region II, PCS users enter data
into the system via two methods - PCS-ADE or batch mode. PCS-ADE is an on-line,
interactive method that allows users to input data directly into the IBM mainframe. On-line edit
processing allows for the immediate correction of unacceptable data entries. Batch data entry
involves the entry of transactions into the system without interactive edit and correction through
the use of PC-Entry. An Edit Audit report is generated from batch process submittal. A
Dummy Edit may be run first to verify the data. The Update Audit report is generated from
both batch and on-line data entry. The procedures outlined below were developed to identify
and correct rejected transactions on the Edit and Update Audits. If the error rate on the Edit
or Update Audits exceeds ten percent, immediately notify the PCS Supervisor so that the PCS
data capture and entry process can be evaluated and corrected.
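The ten-percent rule above is a simple rejection-rate check on each batch. A minimal modern sketch (illustrative only; the function name is hypothetical):

```python
# Sketch of the ten-percent rule: if rejected transactions exceed 10%
# of a batch, the PCS Supervisor is notified so the data capture and
# entry process can be evaluated.

def needs_supervisor_review(accepted, rejected, threshold=0.10):
    total = accepted + rejected
    if total == 0:
        return False
    return rejected / total > threshold   # "exceeds" is a strict comparison

print(needs_supervisor_review(90, 10))  # False: exactly 10%
print(needs_supervisor_review(85, 15))  # True: 15% > 10%
```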
2.3.1 PCS Edit Audit Reports
The Edit process evaluates each transaction in the batch submittal. Those transactions with
fatal syntactical errors are rejected while those with correct syntax are output to a transaction
file. A Dummy Edit may be run prior to the line edit to avoid entering inaccurate data into the
update. An Edit Audit Report containing three sections is generated. One section lists
transactions rejected by the Edit processor, one section lists transactions accepted by the Edit
processor, and a third section lists a breakdown of the transactions submitted by data type and
transaction type with the number of transactions accepted or rejected. Examples of the three
report sections are shown on the following page with instructions for reviewing the Rejected and
Accepted sections.
C-45

-------
Checklist to Review PCS Edit Audit Reports
Example Output from Edit Audit Report (Rejected Transactions)
[Sample Edit Audit Report output: rejected transactions listed by transaction ID, type, and NPDES number, with the key value, data element, generated value, and an "ERR MSG, WARNING(W) OR INFORMATION(I)" column giving the rejection reason (e.g., RECORD NOT ON FILE (W)).]
Figure 2.3.1
For the rejected transactions on the Edit Audit Report
~	1. Check "ERR MSG, WAMMNG(W)» OR INFORMATION(I)" column for reason
that transaction was rejected.
~	2. Research and resolve error message.
~	3. Route corrections to appropriate person for entry into PCS data base.
C-46

-------
Checklist to Review PCS Edit Audit Reports (continued)
Example Output from Edit Audit Report (Accepted Transactions)
[Sample Edit Audit Report output: accepted transactions, listed in the same column layout as the rejected transactions.]
-------
Checklist to Review PCS Edit Audit Reports (continued)
Example Output from Edit Audit Report (Summary)
[Sample Edit Audit Summary Report: a breakdown by data type (permit record, pipe schedule, parameter limits, measurement violations, compliance schedules, enforcement actions, inspections, pretreatment records, single events, evidentiary hearings, and permit events) showing the number of NEW, CHANGE, REPLACE, and DELETE transactions accepted and rejected, with totals and the percent accepted.]
Figure 2.3.3
For the Edit Audit Summary Report
~ 6. Verify totals as a check that you have examined all transactions.
C-48

-------
2.3.2 PCS Update Audit Reports
Data base update transactions accepted by the Edit processor (batch mode) and PCS-ADE
are initially stored for input to the PCS Update processor. When the Update is executed (usually
on Mondays and Thursdays) by EPA Headquarters staff, the data are actually incorporated into
the PCS data base. The Update process generates an Update Audit report of rejected and
accepted transactions and an Update Audit Summary report identical in format to the Edit Audit
Report generated for batch transactions.
Checklist to Review PCS Update Audit Reports
Example Output from Update Audit Report (Rejected Transactions)
RUN DATE: MM/DD/YY	PERMIT COMPLIANCE SYSTEM REGULAR UPDATE AUDIT REPORT	AUDITRPT
REGION: XX  USER-ID: JXL  BATCH-ID: BROWN	PAGE: 1
REJECTED TRANSACTIONS
TRANSACTION	NPDES		DATA				WARNING(W) OR
ID TYPE	NUMBER	KEY VALUE	ELEMENT	GEN	VALUE	ERR MSG.	INFORMATION(I)
PL CHG	XX0025755	COM 95 01094 1 0	I
	LTYP		P		SEASONAL LIMIT CHG LEAVES MVIO FOUND
	ELSD				ELSD CHANGE MAY REEVALUATE MEAS RO
	ELED		940131
Figure 2.3.4
For the rejected transactions on the Update Audit Report
~	1. Check "ERR MSG, WARNING(W), OR INFORMATION(I)" column for reason
that transaction was rejected.
~	2. Research and resolve error message.
~	3. Route corrections to appropriate person for entry into PCS data base.
C-49

-------
Checklist to Review PCS Update Audit Reports (continued)
Example Output from Update Audit Report (Accepted Transactions)	
[Sample Update Audit Report output: accepted transactions, listed in the same column layout as the rejected transactions.]
Figure 2.3.5
For the accepted transactions on the Update Audit Report
~ 4. Check "ERR MSG, WARNING(W), OR INFORMATION(I)" column. Review
information (I) messages to identify any affecting PCS data quality. Resolve, if
necessary.
Example Output from Update Audit Report (Summary)
[Sample Update Audit Summary Report: a breakdown by data type showing the number of NEW, CHANGE, REPLACE, and DELETE transactions accepted and rejected, with totals and the percent accepted.]
Figure 2.3.6
For the Update Audit Summary Report
~ 5. Verify totals as a check that you have examined all transactions.
C-50

-------
2.4 PCS Data Base Quality Control
[Sidebar: QA program flow diagram (Data Capture through Data Base Quality Control) with "Data Base Quality Control" highlighted.]
Once the data has been entered and uploaded into the PCS data base, the focus of
Region II's quality assurance program shifts from preventing new errors to identifying and
correcting existing errors in the data base. Various PCS Generalized Retrievals are used to
determine the quality and quantity of Region II's data in PCS and to highlight existing errors
in the data base which require correction. Three types of PCS Reports are available:
~	Pre-formatted Reports — System generated reports with standardized formats. They
provide comprehensive information on all of the data found in PCS and are
particularly useful in verifying the accuracy of data in PCS and identifying when
information is due to EPA for entry into PCS.
~	User Designed Reports — The user specifies the types and details of information to
be displayed and the format of the resulting report. These "Quick Look" Reports
allow data to be reported on from any of the data types in PCS and are useful for users
in tailoring the amount and types of information required to QA their data.
~	Special Processing Reports — These reports are a subset of the predefined reports
which are run to track compliance information or produce data which can be
statistically analyzed. In addition to providing information on the program, they serve
as excellent QA reports because the data in the system must be accurate and
comprehensive to produce reliable reports. A thorough review of these reports
frequently spotlights areas requiring QA. Examples of these reports are the QNCR
and the EDS report.
The remainder of this section presents examples of each of these retrievals for use in
quality assuring the data already contained in PCS. Card images are given for each retrieval.
A short checklist which identifies the major items to examine in each of the printed reports
follows as necessary.
C-51

-------
2.4.1 Pre-formatted & Quick Look QA Retrievals
Pre-formatted standardized Generalized Retrievals and Quick Look Retrievals are most
useful for QA. Often, several of these retrievals are used together to examine data at a
particular level (data type). The examples below present a variety of retrievals to QA the
different types of data found in PCS. They provide the retrieval logic necessary to produce the
report, the general format of the report, and the steps necessary to review the output. The
retrievals are grouped according to the type of data being QA'd. Below is a summary of the
retrievals which will be discussed in this section.
All Data Types
¦	Quality Assurance Retrieval
Facility Level data
¦	Facility Report (FA)
Permit limits data
¦	Limits Summary
¦	Abbreviated Limits Summary
¦	DMRs Printed Report
¦	Quick Look Report for Reviewing Permits
DMR Data
¦	DMR Administrative Report (DA)
¦	DMR Administrative Report by Parameter (DP)
¦	DMR Forecast Report
¦	DMR Summary Report
¦	Violations Log Report
Inspections Data
¦	Inspection Scheduling Report
Compliance Schedule
¦	Compliance Schedule Forecast Report
Enforcement Actions
¦	Enforcement Action Quick Look Retrieval
C-52

-------
2.4.1.1 PCS Quality Assurance Retrieval
The PCS QA Retrieval produces a pre-formatted report that displays all the WENDB
elements for all data types in PCS and their associated error messages on a permit by permit
basis. This retrieval examines each WENDB element and determines whether it, or a related
element, has a missing or invalid value. The following statements apply to this retrieval:
~	Any 10 card selection statements can be used including 11 OR cards.
~	The 20 card QA is used for the report type. This will only display Facility data. To
display more than Facility data use one of the three following options:
~	SECTIONS — Lists all data dements for the data types specified, whether there
are errors or not. Four 20 WITH cards can be used with this selection:
~	SECTIONS=M
20 WITH MVDT GE MMDDYY
20 WITH MVDT LE MMDDYY
~	SECTIONS=I
20 WITH DTIN GE MMDDYY
20 WITH DTIN LE MMDDYY
~	MISSING — Lists only data elements that are missing from the data types
specified.
~	ERRORS — Lists only the data elements that have error messages from the data
types specified.
~	The data types that can be selected for the three options are:
F	- Permit Facility and Permit Tracking Events
I	- Inspections
C	— Compliance Schedules
O	— Outfalls
L	— Parameter Limits
M	— Measurements
E	- Enforcement Actions
P	- Administrative Penalty Orders
H	-- Evidentiary Hearings
S	- Single Events
C-53

-------
PCS Quality Assurance Retrievals (continued)
N — Pretreatment Inspections / Audits
X — Pretreatment Performance Summary
A — Selects All Data Types
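The card options above can be assembled mechanically. A minimal modern sketch follows (Python rather than the mainframe tooling of the period; the helper name and sample NPDES ID are hypothetical, while the card layout follows the examples in this manual):

```python
# Hypothetical helper that assembles QA retrieval card images:
# 10 selection cards, then a 20 QA card with an optional
# SECTIONS/MISSING/ERRORS keyword and data-type letters.

def build_qa_cards(npdes_ids, option=None, data_types="A"):
    cards = ["01 HQ QUALITY ASSURANCE RETRIEVAL"]
    for npdes in npdes_ids:
        cards.append(f"10 NPID EQ {npdes}")   # one selection card per permit
    if option is None:
        cards.append("20 QA")                 # Facility data only
    else:
        cards.append(f"20 QA {option}={data_types}")
    return cards

for card in build_qa_cards(["XX0012345"], option="ERRORS", data_types="FL"):
    print(card)
```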
Example of PCS QA
retrieval statements:
00 SYNTAX=NO JOBID=QARTV RMT=255 PRTY=4 TIME=5M BIN=DOW COPIES=1
01 HQ QUALITY ASSURANCE RETRIEVAL
01 FOR REGION XI
10 NPID EQ XX0000450
10 NPID EQ XX0000555
20 QA SECTIONS=L
** DEFAULTS IN EFFECT **
10 IACC EQ A
10 FTYP AB
Output from QA retrieval statements:
[Sample QA Retrieval output: a Permit Facility Data section listing each WENDB element and its value, flagging elements with no values found, followed by a Limits Data section flagging errors such as limit data missing for a pipe.]
Figure 2.4.1
C-54

-------
PCS Quality Assurance Retrievals (continued)
Once the report is generated, follow the checklist below.
~	1. Examine the output for each data type requested and check each error message.
~	2. Identify the invalid or missing WENDB data elements. A single missing WENDB
element may result in many error messages on related data elements.
~	3. Research error message by examining original input documents and resolve.
~	4. Route corrections to appropriate person for entry into PCS data base.
Summary Section from Output of QA Retrieval:
DATE: 06/27/92	QUALITY ASSURANCE RETRIEVAL	PAGE: 4
SUMMARY SECTION
DATA TYPE	MISSING CODES	INVALID VALUES
FACILITY/PERMIT EVENTS	0	1
INSPECTIONS	0	0
COMPLIANCE SCHEDULE	0	0
OUTFALL DATA	0	0
PARAMETER LIMITS	0	4
MEASUREMENTS	0	0
ENFORCEMENT ACTIONS	0	0
ADMINISTRATIVE PENALTIES	0	0
EVIDENTIARY HEARINGS	0	0
SINGLE EVENTS	0	0
PRETREATMENT INSPECTION	0	0
PRETREATMENT SUMMARY	0	0
TOTALS	0	5
Figure 2.4.1 (continued)
~	5. Review Update Audit report to verify correction has been made, as described in
Section 2.3.2.
~	6. Repeat the retrieval to verify that no more errors exist (especially in complex
cases or when many corrections were made).
C-55

-------
2.4.1.2 Facility Report (FA)
Facility reports can be useful in QA by tracking the entry of data into PCS, especially for
permit modifications or re-issues. The Facility Report is a pre-formatted report which lists all
permit facility level information for a given NPDES facility along with any requested
information. In this example, Permit Events data are requested.
Example of Facility Report
retrieval statements.
00 SYNTAX=NO JOBID=CK31A TIME=2M PRTY=2
00 BIN=DOQS RMT=255
01 HQ FACILITY REPORT
10 NPID EQ XX0000429
20 FA SECTIONS=P
** DEFAULTS IN EFFECT **
10 IACC EQ A
10 FTYP AB
Output from Facility Report retrieval statements

[Sample Facility Report output, page 1: general facility data for permit XX0000429 — facility name, address, city, county, state, region, activity status and date, SIC code, type of ownership, inspection and compliance indicators, latitude/longitude, reissue and modification indicators, and mailing address.]
Figure 2.4.2
-------
Facility Report (FA) (continued)
Output from Facility Report retrieval statements
datb: mmm

PCS FACILITY REPORT
PACE; 2
WMmm-.imma
• raw TSACSaHO DATA" PERMIT NUMBER: JO®B®«29
mMSTTM&STAMiAm


PACE: 2
TTFB OP AFfUCATSOM: LA STANDARD A OMC2MAL BSUE DATE llfWm NEW SOURCE COD &

FERWT ISSUED BY: EPA
MMBES OFKBBSUES: HEW SOURCE DATE:

PERMIT TRACKING BVENT
ACTUAL
SCHEDULED FW1IT TRAQtfflia 1VBNT BOBI
IDE2
commmcmrmm
SATE
DATE COMMENTS

WW* AFFIXATION lECEfVED
mmm
mmm qmwirsiqn

rsw AfHJCAHOK COMPLETE
mnsms


sm-99 draft rasMrmfJiuc hotkte
amsm


PEUfflTISSUBB
mimi
1U®I«

WW® reRMT EXPIRED
mmm
tm\m

*tmw raaMrrHTOcnvB
mmm


Figure 2.4.2 (continued)
~	1. For re-issue or modified permits, run a Facility report for existing permit.
~	2. Enter new data into PCS as per established procedure.
~	3. Once data has been entered into PCS, generate a Facility report for the new permit.
~	4. Compare new facility report against the old Facility report or against code sheet to
verify the accuracy of coding and data entry.
~	5. When errors are identified, inform the appropriate person. Follow up to ensure their
correction in the PCS data base.
C-57

-------
2.4.1.3 Limits Summary (LS)
The Limits Summary (LS) is a pre-formatted report which provides all the information
available in PCS on pipe schedules and their related parameter limits associated with a specific
facility. In addition to providing the information on permit limits, it can be useful as a QA tool.
Among its uses are identifying changes to permit limits and verifying new data has been
accurately entered. It is also useful in verifying the accuracy of DMR data entered into PCS
(See Section 2.2.3.2).
Example of Limits Summary
retrieval statements.
00 SYNTAX=NO JOBID=LIMS RMT=255 PRTY=4
00 TIME=2M BIN=DOQS COPIES=1
01 HQ LIMITS SUMMARY FOR REGION XI
10 NPID EQ XX0034505
20 LS
** DEFAULTS IN EFFECT **
10 IACC EQ A
10 FTYP AB
Output from limits Summary Report retrieval statements
[Sample Limits Summary output: facility identification (name, city, county, region), activity status, permit issued and expiration dates, pipe schedule data for each outfall (description, activity status and date, initial/interim/final limits dates, report designator, submission units), and limits data for each parameter (quantity and concentration limits with units, statistical base, season, frequency of analysis, and sample type).]
Figure 2.4.3
C-58

-------
2.4.1.4 Abbreviated Limits Summary
This Quick Look report displays information similar to the Limits Summary Report but in
a more user-friendly format.
Example of Abbreviated
Limits Summary retrieval
statements.
[Abbreviated Limits Summary retrieval card images: 10 cards selecting the NPDES facility, a 30 QL card, and 40 cards listing the data elements to display — outfall, limit type, parameter, quantity and concentration limits (LCMN, LCAV, LCMX), units, statistical base, season, frequency of analysis, and sample type.]
Output from Abbreviated Limits Summary Report retrieval statements.




[Sample Abbreviated Limitation Summary output: one line per parameter limit showing outfall, limit type, parameter, statistical base, season, quantity and concentration limits with units, frequency of analysis, and sample type.]
Figure 2.4.4
-------
2.4.1.5 DMRs Printed Report
This pre-formatted report is created when DMRs are pre-printed and is used in conjunction
with the Quick Look Report for Reviewing Permits (Section 2.4.1.6) in order to manually check
against the permit to ensure that data has been correctly entered into PCS. The DMRs Printed
Report provides an easy format to determine whether limits have been correctly coded.
Example of DMRs Printed
retrieval statements.
[DMRs Printed retrieval card images and sample report output: for each outfall and monitoring period, the report shows the limit data printed on the DMR form (parameter, quantity and concentration limits, units, statistical base, frequency of analysis, and sample type) so it can be checked against the permit.]
Figure 2.4.5
C-60

-------
DMRs Printed Report (continued)
~	L Generate DMRs Printed retrieval.
~	X Compare limits report output to limits lit permit.
~	3, When errors are identified, inform the appropriate person. Follow up to ensure their
correction In the PCS data base.

C-61

-------
2.4.1.6 Quick Look Report for Reviewing Permits
The Quick Look Report for Reviewing Permits is used in conjunction with the DMRs
Printed Report to manually check against the permit that the data has been correctly entered into
PCS. It can verify that limits in PCS are accurate and have the correct statistical base codes and
unit codes.
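The comparison this report supports — checking each coded limit, statistical base code, and unit code against the permit — can be pictured as a field-by-field match. A hedged modern sketch (the dict layout and sample values are hypothetical, not the PCS record format):

```python
# Sketch of a field-by-field comparison of coded PCS limits against the
# permit. Keys are (outfall, parameter); values hold the limit fields.

def limit_mismatches(permit, pcs):
    """Keys where the coded PCS entry differs from the permit (or is absent)."""
    return [key for key, expected in permit.items() if pcs.get(key) != expected]

permit = {("001", "FLOW"): {"limit": 1.5,  "stat_base": "DA", "units": "MGD"},
          ("001", "BOD"):  {"limit": 30.0, "stat_base": "MO", "units": "MG/L"}}
pcs    = {("001", "FLOW"): {"limit": 1.5,  "stat_base": "DA", "units": "MGD"},
          ("001", "BOD"):  {"limit": 30.0, "stat_base": "MO", "units": "LBS/D"}}

print(limit_mismatches(permit, pcs))  # [('001', 'BOD')]
```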
Example of
Permit Limits/
Measurement
Violations
Quick Look
retrieval
statements.
00 SYNTAX=NO JOBID=QLPMT TIME=1M PRTY=2 BIN=DOQS RMT=255
01 HQ QL REPORT FOR REVIEWING PERMITS
10 NPID EQ XX0000555
30 QL TOP=YES
40 / NPID FNMS FLEM RWAT MADI PTRV FTAC /
40 / MISS LTYP PRAM PRAMD STAT FRAN SAMP /
40 / LQMN LQAV LQMX LQUC LCMN LCAV LCMX ALLS MODN /
** DEFAULTS IN EFFECT **
10 IACC EQ A
10 FTYP AB
Output from Permit Limits / Measurement Violations QL retrieval statements
[Sample Quick Look output: one line per parameter limit showing NPDES number, facility name, limit type, parameter, statistical base, season, quantity and concentration limits with units, and measurement violation counts.]
Figure 2.4.6
C-62

-------
Quick Look Report for Reviewing Permits (continued)

~	1. Generate the Quick Look Report for Reviewing Permits.
~	2. Compare the information on limits in the report output to the permit.
~	3. When errors are identified, inform the appropriate person. Follow up to ensure their
correction in the PCS data base.
C-63

-------
2.4.1.7 DMR Administrative Report (DA)
The DMR Administrative Report is a pre-formatted report which provides summary
information on measurement violations tallied by pipe and date. The report lists the number of
nonreporting violations, late violations, effluent violations, and administrative deficiency
violations. The report includes an ERRORS=YES option which you may use to request a DMR
Error Report which gives information on why a DMR will not be produced.
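The tally the DA report performs — violations grouped by outfall (pipe) and category — can be sketched in a few lines (illustrative only; the input records below are hypothetical, not PCS data):

```python
# Sketch of the DA report's tally: measurement violations grouped by
# outfall (pipe) and violation category.
from collections import Counter

violations = [
    {"pipe": "001", "date": "06/30/92", "type": "effluent"},
    {"pipe": "001", "date": "06/30/92", "type": "late"},
    {"pipe": "002", "date": "06/30/92", "type": "nonreporting"},
]

tally = Counter((v["pipe"], v["type"]) for v in violations)
for (pipe, vtype), count in sorted(tally.items()):
    print(pipe, vtype, count)
```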
Example of DMR
Administrative Report
retrieval statements.
Output from DMR Administrative Report
[Sample DMR Administrative Report output for PCS Region XI minors, report period 06/01/92 - 06/30/92: for each facility and outfall, the monitoring-period end date, the dates the DMR was due at EPA and at the State, and counts of nonreporting, effluent, and administrative violations, with report totals.]
Figure 2.4.7
C-64
00 SYNTAX=NO RMT=255 PRTY=2 JOBID=BMRDA
00 TIME=2M BIN=DOW
01 HQ DMR ADMINISTRATIVE REPORT PIPE LEVEL
01 FOR REGION XI MINORS
10 REGN EQ XI
10 MADI EQ M
20 DA
20 WITH SUDB GE 060192
20 WITH SUDB LE 063092
** DEFAULTS IN EFFECT **
10 IACC EQ A
10 FTYP AB

-------
DMR Administrative Report (DA) (continued)
~	1. Generate DA report for a specified timeframe.
~	2. Review the report to identify those facilities with violations.
~	3. Run additional retrievals to review facilities with violations and make corrections if
possible. If further investigation is required to resolve an issue, discuss it with the
appropriate PCS contact.
C-65

-------
2.4.1.8 DMR Administrative Report by Parameter (DP)
The DMR Administrative Report By Parameter (DP) is a pre-formatted report similar to
the DA Report. It provides summary information on measurement violations and is especially
useful in verifying old DMR data. This report lists violations by pipe, but unlike the DA it lists
information by parameter within each pipe. Care should be taken not to rely on this report
entirely, as some data may be incomplete and still not be flagged as missing (such as parameters
lacking concentrations).
Example of DMR Administrative Report by Parameter retrieval statements.
00 SYNTAX=NO RMT=255 PRTY=2 JOBID=DMRDP
00	TIME=2M BIN=D005
01	HQ DP FOR REGION XI
10 REGN EQ XI
20 DP
20 WITH SUDB GE 010192
20 WITH SUDB LE 033192
** DEFAULTS IN EFFECT **
10 IACC EQ A
10 FTYP AB
Output from DMR Administrative Report by Parameter
[The sample output is not legible in the scanned original. For each permit the report lists the facility identification, outfall, limit type, monitoring period end date, parameters by monitoring location, and the dates the DMR was due at EPA and at the State.]
Figure 2.4.8
C-66

-------
DMR Administrative Report by Parameter (DP) (continued)
~	1. Generate DP report for specified timeframe.
~	2. Review report for missing pipes and parameters.
~	3. Contact facility to determine why DMR data has not been received.
C-67

-------
2.4.1.9 DMR Non-Receipt Report
The DMR Non-Receipt Report is a pre-formatted report which lists the instance of non-
reporting, monitoring period end date, outfall, and limit type for each facility selected for a
specified time period. This report is useful in identifying facilities that have not reported DMR
data as required or in identifying facilities which are required to report for a selected timeframe
(i.e., FY1992). You may select three levels of information:
o LEVEL=FACI - will display information where the entire DMR is missing
o LEVEL=PIPE - will display the same information as LEVEL=FACI where all pipes
are missing and, in addition, partial DMRs where entire pipes are missing
o LEVEL=PRAM - will display the same information as LEVEL=PIPE where entire
pipes are missing and, in addition, parameters that are missing from other pipes on the
DMR.
The DETAIL=PIPE option can be used with all LEVEL options to display the pipes where
the entire DMR is missing. The example below illustrates the format for the report using the
LEVEL=PRAM option.
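The three LEVEL options can be thought of as progressively finer filters on the same missing-data question. A minimal sketch (illustrative Python; the data structures are hypothetical, not PCS's):

```python
def missing_dmr_items(expected, received, level="FACI"):
    """Sketch of the three LEVEL options for non-receipt reporting.
    expected/received map each pipe to its set of required/reported parameters."""
    if level == "FACI":
        # report only facilities where the entire DMR is missing
        return ["DMR missing"] if not received else []
    # whole pipes missing from the DMR
    missing = [pipe for pipe in expected if pipe not in received]
    if level == "PIPE":
        return missing
    # LEVEL=PRAM: also parameters missing from pipes that did report
    for pipe, params in expected.items():
        if pipe in received:
            missing += [(pipe, prm) for prm in sorted(params - received[pipe])]
    return missing
```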
Example of DMR Non-
Receipt Report retrieval
statements.
00 SYNTAX=NO RMT=255 PRTY=2 JOBID=DMRDF
00	TIME=2M BIN=D005
01	HQ DMR NON-RECEIPT REPORT
10 SITE EQ XX
20 DF LEVEL=PRAM
20 WITH SUDB GE 060192
20 WITH SUDB LE 063092
** DEFAULTS IN EFFECT **
10 IACC EQ A
10 FTYP AB
C-68

-------
DMR Non-Receipt Report (DF) (continued)
Output from DMR Non-Receipt Report
[The sample output is not legible in the scanned original. For each selected facility the report lists the facility identification, outfall, limit type, monitoring period end date, and the instance of non-reporting at EPA and at the State.]
Figure 2.4.9
~	1. Generate DMR Non-Receipt Report (DF) for the specified timeframe.
~	2. Review the report to identify those facilities with missing DMR data.
~	3. Run additional retrievals to verify the information in the DF report. Resolve problems
if possible. Contact the facility to determine why DMR data has not been reported.
C-69

-------
2.4.1.10 DMR Summary Report (DS)
The DMR Summary Report (DS) is a pre-formatted report which displays totalling
information on reported measurements tallied by municipals and non-municipals within each
State and/or Region for a specified timeframe. You may use this report to determine the
percentage of DMR data the Region is entering into PCS.
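The percentage the DS report provides is a simple ratio; a one-line sketch (illustrative Python, with made-up counts):

```python
def percent_received(received, required):
    """Percentage of required DMRs actually entered into PCS for the timeframe."""
    return 100.0 * received / required if required else 0.0

# e.g., 41 of 82 expected municipal DMRs entered
print(percent_received(41, 82))
```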
Example of DMR Summary
Report retrieval statements.
00 SYNTAX=NO RMT=255 PRTY=2 JOBID=DMRDS
00	TIME=2M BIN=D005
01	HQ DS FOR REGION XI
10 STAT EQ XX
10 MADI EQ M
10 INCL EQ M
20 DS
20 WITH SUDB GE 010192
20 WITH SUDB LE 033192
** DEFAULTS IN EFFECT **
10 IACC EQ A
10 FTYP AB
Output from DMR Summary Report
[The sample output is not legible in the scanned original. For the selected Region it tallies, for municipal and non-municipal facilities, the number of DMRs expected and received and the percentage received for the reporting period.]
Figure 2.4.10
C-70

-------
DMR Summary Report (continued)
~	1. Generate DS report for specified timeframe.
~	2. Review report to determine reporting percentages.
~	3. Run DF or DP report to identify facilities not reporting DMR data as required. Bring
information to attention of PCS coordinators to determine appropriate follow-up
procedures.
C-71

-------
2.4.1.11 Violation Log Report
The Violation Log Report identifies violations of effluent measurement data and non-receipt
of DMR data. In addition, it shows compliance schedule events which have been scheduled but
not achieved.
Example of Violation Log Report retrieval statements.
[The retrieval statements are not legible in the scanned original. The retrieval selects a State and minor facilities, runs a Quick Look (QLTYP=V) for the specified date range, and displays permit, violation, and enforcement data elements.]
Output from Violation Log Report retrieval statements
[The sample output is not legible in the scanned original. For each permit it lists the outfall, parameter, limit and reported values, monitoring period end date, violation codes, and related enforcement actions.]
Figure 2.4.11
C-72

-------
Violation Log Report (continued)
~	1. Generate the Violation Log Report specifying the appropriate dates.
~	2. Review output for violations.
~	3. When errors are identified, inform the appropriate person. Follow up to ensure their
correction in the PCS data base.
C-73

-------
2.4.1.12 Inspection Scheduling Report
The Inspection Scheduling Report is an option associated with the Facility Report. It lists
projected dates for NPDES inspections to be conducted at permitted facilities by date and
quarter. It can be compared to the Inspections Report to determine how well projected goals
are being met.
Example of Inspection
Scheduling Report
retrieval statements.
00	SYNTAX=NO JOBID=QAIS RMT=255 PRTY=4 TIME=3M BIN=D005 COPIES=1
01	HQ INSPECTION SCHEDULING REPORT
01 FOR REGION XI MINORS
10 SITE EQ XX
10 MADI EQ M
20 FA SECT=V OPTS=R
** DEFAULTS IN EFFECT **
10 IACC EQ A
10 FTYP AB
Output from Inspection Scheduling Report retrieval statements
[The sample output is not legible in the scanned original. For each facility it lists the performed inspection type (code and description), the actual inspection date and inspector, the scheduled inspection quarter/year, and any scheduled inspection comments.]
Figure 2.4.12
C-74

-------
Inspection Scheduling Report (continued)
~	1. Generate a Facility Report selecting the Inspection Scheduling option for the specified
timeframe.
~	2. Compare the Inspection Scheduling report to the Inspections report to determine actual
inspections conducted as compared to scheduled inspections.
~	3. Route report to Inspections section for their review.
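Step 2's comparison of scheduled versus actual inspections can be sketched as follows (illustrative Python; the quarterly counts are made up):

```python
# Hypothetical inspection counts keyed by quarter/year
projected = {"1/92": 5, "2/92": 4}
conducted = {"1/92": 3, "2/92": 4}

# Quarters where fewer inspections were conducted than were scheduled
shortfall = {qtr: projected[qtr] - conducted.get(qtr, 0)
             for qtr in projected if conducted.get(qtr, 0) < projected[qtr]}
```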
C-75

-------
2.4.1.13 Compliance Schedule Forecast Report
The Compliance Schedule Forecast Report is a pre-formatted report which may be used to
make sure that all milestone events and reports scheduled in a compliance schedule have been
achieved.
Example of Compliance Schedule Forecast Report retrieval statements.
00 SYNTAX=NO JOBID=QACP RMT=255 PRTY=4 TIME=3M BIN=D005 COPIES=1
01	HQ COMPLIANCE SCHEDULE FORECAST REPORT
01	FOR REGION XI MINORS
10 REGN EQ XI
10 MADI EQ M
20 CP
20 WITH DTSC GE 040192
20 WITH DTSC LE 063092
Output from Compliance Schedule Forecast Report retrieval
[The sample output is not legible in the scanned original. For each permit it lists the compliance schedule events (e.g., submit final plans, start construction, complete construction) with their scheduled and achieved dates.]
Figure 2.4.13
~	1. Generate a Compliance Schedule Forecast Report for the specified timeframe.
~	2. Compare Compliance Schedule Date to the Compliance Schedule Achieved Date.
~	3. Bring any facilities where the scheduled date has not been achieved to attention of the
supervisor.
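Step 2's comparison amounts to flagging any milestone whose scheduled date has passed without an achieved date. A minimal sketch (illustrative Python; the schedule records are hypothetical):

```python
from datetime import date

# Hypothetical compliance schedule records: (event, scheduled date, achieved date or None)
schedule = [
    ("submit final plans", date(1992, 3, 31), date(1992, 3, 15)),
    ("start construction", date(1992, 6, 30), None),
]

today = date(1992, 7, 1)
# Events scheduled in the past but never achieved should go to the supervisor
overdue = [event for event, sched, achieved in schedule
           if achieved is None and sched < today]
```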
C-76

-------
2.4.1.14 Enforcement Action QL Retrieval
The Enforcement Action Quick Look Retrieval is used to ensure that complete
enforcement actions are entered, closed, superseded, and linked to the correct violations.
Example of Enforcement Action Quick Look retrieval statements.
[The retrieval statements are not fully legible in the scanned original. The retrieval selects a single permit (10 NPID EQ ...), runs a Quick Look (20 QLTYP=E), and displays (40 ...) enforcement action and violation data elements such as EMAC, EAIF, ENDT, EVTF, ESVC, and ESVD, with the standard defaults (10 IACC EQ A, 10 FTYP AB) in effect.]
Output from Enforcement Action QL retrieval statements
[The sample output is not legible in the scanned original. For the selected permit it displays each enforcement action with its status, dates, and the violations linked to it.]
Figure 2.4.14
C-77

-------
Enforcement Action QL Retrieval (continued)
~	1. Generate an Enforcement Action Quick Look report for the entered data.
~	2. Compare the Enforcement Action Quick Look report against the code sheet to verify the
accuracy of coding and data entry.
~	3. When errors are identified, inform the appropriate person. Follow up to ensure their
correction in the PCS data base.
C-78

-------
2.4.2 Special Processing
PCS can perform several special processing retrievals as a function of the Generalized
Retrieval Subsystem. Two of the special processing retrievals are the generation of the
Quarterly Non-Compliance Report (QNCR), which lists all active major facilities for which
instances of Reportable Non-Compliance have been recorded within the current reporting period,
and the Effluent Data Statistics (EDS), which allow DMR data to be statistically analyzed or
graphed over time. Both of these reports also serve as useful QA tools, as discussed below.
2.4.2.1 Effluent Data Statistics (EDS)
The EDS software is used to analyze DMR effluent data. In addition to statistical reports,
this software produces mass loading reports and several types of graphs. Since DMR data can
be stored in different unit formats, EDS converts data values to PCS standard units before
conducting an analysis. EDS is especially useful in identifying facilities with DMR data quality
problems, improper unit conversions, and improper use of monitoring location codes.
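The standard-unit conversion EDS performs can be illustrated with a small sketch (the conversion table below is a hypothetical excerpt for concentrations, not PCS's actual standard-unit table):

```python
# Hypothetical factors for converting reported concentration values to mg/L
TO_MG_PER_L = {"mg/L": 1.0, "ug/L": 0.001, "g/L": 1000.0}

def to_standard(value, unit):
    """Convert a reported concentration to the standard unit before analysis."""
    return value * TO_MG_PER_L[unit]
```

A value reported as 250 ug/L would then be analyzed as 0.25 mg/L; a wrong unit code, by contrast, silently shifts the result by a factor of a thousand, which is exactly the kind of error EDS helps surface.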
Example of retrieval statements to produce EDS reports.
[The retrieval statements are not legible in the scanned original.]
Example of a Loading Report Generated from EDS.
[The sample Effluent Discharge Mass Loading Report, a geographical listing by facility, is not legible in the scanned original. For each facility it lists yearly discharge loads by parameter.]
C-79

-------
Effluent Data Statistics (continued)
Example of a Statistical Loading Report Generated from EDS.
[The sample Statistical Loading Report (Region XI load totals for parameter BOD, 5-day) is not legible in the scanned original. For each facility and year it lists average concentration, average flow, average load, excess load, and flow-weighted concentration.]
Figure 2.4.15
After a load and statistical report for a particular geographic area has been generated,
examine the Load Reports for:
~ 1. Inconsistent loads from year-to-year periods. Outliers may be due to:
~	inconsistent unit codes
~	a change in the permit that is not reflected in the way the data is being entered
~	a change in the way the facility is monitoring the data (such as halting the
monitoring of data in mid-year).
~ 2. Examine the Basic Statistics Reports for:
~	Large standard errors. A standard error greater than five is questionable. The
greater the standard error, the greater the likelihood of an error.
~	Monitoring for less than twelve months. A given facility may not be required to
monitor for 12 months. Check the permit to verify the number of months that the
facility is required to monitor.
~	High flow averages. High flow is typical of only a very few facilities (such as
power plants). Verify other occurrences.
~	High concentrations. High concentrations may be due to:
~ data entered with incorrect unit codes.
C-80

-------
Effluent Data Statistics (continued)
~	data entered incorrectly (missing or misplaced decimal point)
~	permit has changed unit codes and this change is not reflected in data being
entered.
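Several of the screening checks above can be automated. A rough sketch (illustrative Python; the standard-error threshold of five follows the text, everything else is an assumption):

```python
import statistics

def qa_flags(monthly_loads):
    """Flag EDS-style data quality symptoms in one year of DMR load values.
    Illustrative only; PCS's own edit checks are more extensive."""
    flags = []
    if len(monthly_loads) < 12:
        flags.append("fewer than 12 values - verify required monitoring frequency")
    if len(monthly_loads) >= 2:
        # standard error of the mean; > 5 is questionable per the guidance above
        std_err = statistics.stdev(monthly_loads) / len(monthly_loads) ** 0.5
        if std_err > 5:
            flags.append("standard error > 5 - check unit codes and decimal points")
    return flags
```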
Example of Check Retrieval

[The sample check retrieval output is not legible in the scanned original. For each facility and pipe it displays the parameter, monitoring location, limits, and reported DMR values for verification against the permit.]

-------
Effluent Data Statistics (continued)
~	Incorrect monitoring end dates. Each monitoring end date must be the last day
of the month. Only one measurement per month.
~	Incorrectly coded monitoring locations.
~	Load double counting due to internal pipes.
~	Correct NRPU (number of reported units for each value). Usually this code is
1, meaning that each value accounts for one unit. For example, if NRPU is 6
there would be just two values for the year.
Check the permit to verify data found in the check retrieval.
Resolve problems as usual in the office. Methods which may prove useful include:
~	Report problems to the state and ask for their cooperation in the correction of these
problems.
~	Call permittees and request copies of DMRs (mail, FAX, or over phone).
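The monitoring end date and NRPU checks above are mechanical enough to sketch (illustrative Python):

```python
import calendar
from datetime import date

def is_month_end(d):
    """A monitoring period end date must be the last day of its month."""
    return d.day == calendar.monthrange(d.year, d.month)[1]

# NRPU arithmetic: if NRPU is 6 (each value accounts for six reporting units),
# a full year of monitoring yields only 12 // 6 = 2 reported values.
values_per_year = 12 // 6
```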
C-82

-------
2.4.2.2 PCS Quarterly Noncompliance Report (Selective QNCR)
The PCS Quarterly Noncompliance Report (QNCR) is used as a QA tool for indicating
whether formal enforcement actions are coded in PCS and specific violations are addressed
properly. Numeric violations are also flagged. This report lists facility name, location, NPDES
number, the instances of reportable non-compliance (RNC), and enforcement actions taken as
a response.
Format for Quarterly
Noncompliance report
Output produced from Quarterly Noncompliance retrieval statements

[The retrieval statements and sample output are not legible in the scanned original. For each facility in reportable non-compliance the report lists the NPDES number, location, pipe and parameter, the instance of noncompliance, RNC dates and status, and the enforcement actions taken.]
C-83
-------
PCS Quarterly Noncompliance Report (Selective QNCR) (continued)
~	1. For invalid facility status codes, check the status of each individual violation:
~	If the status should be RP, check for a missing enforcement action or an incorrect or
missing docket number.
~	If the status should be RE, check to ensure formal enforcement actions are closed.
Additionally, check to make sure you do not need to manually resolve the
violation, as in the case of a single event violation.
~	If the facility should be NC, ensure that:
~	the correct limits and DMR data are in PCS.
~	the schedule is correctly entered into PCS.
~	the flag on a single event violation has been raised.
~	2. For invalid non-reporting violations of DMR data:
~	Ensure DMR data is actually in PCS.
~	Ensure DMR data and limits are accurate.
~	Check for missing "No Discharge" codes.
~	Check that the pipe is supposed to be active.
~	3. For invalid non-reporting violations of Compliance Schedule data:
~	Ensure that the Compliance Schedule data is in PCS.
~	Ensure actual and received dates are accurate.
~	4. For erroneous effluent violations:
~	Check numeric limits and statistical base codes for accuracy.
~	Check DMR data in PCS for accuracy.
~	Check SNC flags to ensure data was evaluated by PCS.
~	If a missing violation relates to an AO or Consent Decree, ensure the correct COLS
was entered on the limit record. Also examine PLFN on the limit record, ERFN
on the Enforcement Action record, and CSFN if there was a schedule.
~	5. If old violations suddenly appear on QNCR, check if the facility was recently upgraded
to a major and had old data in the system. This usually occurs on the QNCR for the
quarter ending December 31.
~	6. For missing enforcement action data:
~	Ensure action was correctly keyed into PCS.
~	If using E3, ensure action was linked to correct violation.
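The status checks in items 1 through 6 boil down to asking what status the violation's enforcement history should produce. A simplified sketch (illustrative Python; actual RNC resolution rules in PCS are more detailed than this):

```python
def expected_rnc_status(has_formal_action, action_closed):
    """Rough expected QNCR status for a violation (simplified illustration)."""
    if not has_formal_action:
        return "NC"                         # noncompliant, no formal action coded
    return "RE" if action_closed else "RP"  # resolved, or resolved-pending
```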
C-84

-------
2.4.2.3 PCS Quarterly Noncompliance Report (Coordinator's QNCR)
The PCS Coordinator's Quarterly Noncompliance Report (QNCR) is used as a QA tool in
the same way as the Selective QNCR for indicating whether formal enforcement actions are
coded in PCS and specific violations are addressed properly. Numeric violations are also
flagged. This report lists facility name, location, NPDES number, the instances of reportable
non-compliance (RNC), and enforcement actions taken as a response. It should be reviewed for
the same information as the Selective QNCR.
Format for Quarterly
Noncompliance report.
[The retrieval statements are not fully legible in the scanned original. The retrieval selects the permits of interest (10 NPID ...) and runs the Coordinator's QNCR, with the standard defaults (10 IACC EQ A, 10 FTYP AB) in effect.]
Output produced from Quarterly Noncompliance retrieval statements
[The sample output is not legible in the scanned original. Like the Selective QNCR, it lists for each facility the NPDES number, location, pipe and parameter, the instance of noncompliance, RNC dates and status, and the enforcement actions taken.]
Figure 2.4.17
C-85

-------
SECTION 3
PCS MANAGEMENT
3.0 Overview
Region 11's QA program encompasses more than a set of well-documented procedures for
avoiding and correcting errors. The PCS managers also administer their programs in such a way
as to emphasize the importance of quality assurance and stress the need for continual
improvement of PCS data quality. Region 11's QA program includes the following management
activities:
¦	Assigned Staff Responsibilities. Region 11 has formally established the duties of its staff
to demonstrate the importance of PCS data quality. The staff know their expected QA
duties and have adjusted their performance accordingly. Region 11 has included
responsibility for PCS quality assurance in relevant job descriptions, established
performance objectives for quality assurance, evaluates quality assurance accomplishments
during performance evaluations, and has designated a PCS quality assurance overseer.
¦	Established, Attainable Goals and Targets. Region 11 has set its PCS data quality targets
for timeliness, accuracy, completeness, and consistency by adopting the recommended
national standards. Region 11 concentrates its efforts on improving its performance in the
areas where its QA program is the weakest. The data quality targets and Region 11's
performance are publicized throughout the PCS office.
¦	Performance Tracked Against Goals. The quality of Region 11's PCS data is measured
at regular intervals and compared to the data quality targets and goals. This evaluation
demonstrates how well Region 11 is doing in terms of its data quality. This evaluation uses
the same, routine method (presented in Appendix B) of gauging data quality performance
at regular intervals (each month, quarter, and year). This routine measurement of
achievements against the quality targets allows Region 11 to track its data quality status
C-86

-------
over time, identify trends for appropriate managerial oversight, and provide information
to use in fine-tuning the PCS QA program.
Quality Assurance Program Assessed Regularly. Periodically Region 11 evaluates how
well its Quality Assurance program functions and determines how to improve it. First, the
current status of the QA program is determined and then measured relative to the OWEC
national standards. Region 11 focuses primarily on problems that affect PCS data quality
and on identifying their causes. Once the cause of a problem is identified, possible
solutions are evaluated and the most appropriate solution implemented.
Management of Data Input Personnel. Region 11 manages its staff to support and
enhance PCS data quality. The PCS staff is allowed to devote the time necessary to
complete their data quality work and to follow up on problems promptly when they occur.
Good staff performance is rewarded. Staff who do not demonstrate a willingness or the
ability to perform quality assurance functions adequately are reprimanded or reassigned.
Region 11 prepares for staff turnover by cross training the existing PCS staff in the job
duties of the other positions and by moving quickly to replace PCS staff who are leaving,
usually while the experienced staff member is still on the job to orient the replacement.
Consistent Commitment to Data Quality. Region 11 provides full management
commitment to PCS data quality. Region 11 has found that unflagging management support
is crucial for the success of its quality assurance programs. While the QA program has
experienced problems and setbacks, through the perseverance of the PCS managers and the
dedication of the PCS staff most, if not all, of these problems have been overcome.
Consistency, and following the established policy even when difficulties arise, has proven
successful.
C-87

-------
¦ Communication. Communication, including system documentation and staff training, is
a vital part of Region 11's quality assurance program. All PCS documentation is easily
available to all staff. PCS QA training is conducted as frequently as necessary and is
targeted to the staff who can benefit most from it.
3.1	Staff Training
Region 11 uses the form presented in Figure 3.1 to help track the PCS training that the staff
has received. The local PCS training courses are added to this form, along with the PCS staff's
names. As staff complete the classes that they need, the form is completed.
3.2	PCS Phone Contacts
The two lists of phone numbers presented in Figure 3.2 are intended to provide a ready
reference for contacts who can help you with PCS problems. Add the phone numbers to reflect
your local, state, or regional contacts and distribute the completed lists to all appropriate
personnel.
3.3	PCS Documentation List
The following list presents documents and references used in Region 11's PCS QA
program. All references are easily available to the PCS staff.
PCS Quality Assurance Reference Manuals
The following references are readily available to all PCS staff;
~	1.	PCS QA Manual.
~	2.	PCS Data Element Dictionary.
~	3.	PCS Generalized Retrieval Manual.
~	4.	PCS INQUIRY Manual.
~	5.	PCS Data Entry, Edit, and Update Manual.
~	6.	PCS Codes and Descriptions Manual.
~	7.	PCS PC Personal Assistance Link (PAL) User's Guide.
C-88

-------
PCS Staff Training
(A blank form for tracking training. For each of five staff members, enter Name and Position; then for each course - Basic PCS Training, Advanced PCS Training (QNCR), Generalized Retrieval Training, Manager's Inquiry (On-line Tutorial), and PCS PAL (On-line Tutorial) - record whether the course is needed (Y/N) and the month/year it was taken.)
Figure 3.1
C-89

-------
Phone Numbers for User Support
PCS USER SUPPORT
U.S. Environmental Protection Agency
401 M Street, S.W. (EN-338)
Washington, D.C. 20460
(FTS/202) 260-8529
NCC-IBM User Support
NCC Network Control Facility
(FTS) 629-4506
(919) 541-4506
NCC Tape Librarian
NCC Training Office
State Contacts
Figure 3.2
C-90

-------
Local Phone Contacts for PCS Problems
(A blank form with columns for Name, Expertise, and Phone Number, and rows for eight local contacts.)
Figure 3.2 (continued)
C-91

-------
3.4 PCS Staff Responsibilities
The final management chart (see Figure 3.3) lists the QA responsibilities of Region 11's
PCS staff. This chart reflects the office organization and the assigned duties of the staff. This
list is also used when writing job descriptions and performance objectives for the staff.
C-92

-------
QA Responsibility for PCS Positions
PCS POSITION: PCS Supervisor / PCS Coordinator
QUALITY ASSURANCE RESPONSIBILITIES:
¦	Overall supervision of PCS QA Program.
¦	Review and evaluation of staff's QA performance.
¦	Review of prepared code-sheets.
¦	Resolution of QA problems.
PCS POSITION: PCS Assistant / PCS Specialist
QUALITY ASSURANCE RESPONSIBILITIES:
¦	Complete and consistent preparation of Sample DMRs (permit writer/coder).
¦	Preparation of code-sheets for data entry.
¦	Timely, Accurate, Complete, and Consistent entry of assigned input documents using PCS-ADE.
¦	Review of Update Audit reports and resolution of errors.
¦	Research and resolution of assigned QA problems.
PCS POSITION: PCS Data Entry Personnel
QUALITY ASSURANCE RESPONSIBILITIES:
¦	Timely, Accurate, Complete, and Consistent entry of DMR data (primarily using batch mode).
¦	Review of Edit Audit (Dummy) report. Research and resolution of QA problems.
¦	Review of Update Audit report. Research and resolution of QA problems.
Figure 3.3
C-93

-------