United States Environmental Protection Agency
Office of Research and Development
Washington, DC 20460
EPA/600/R-98/018
February 1998
EPA GUIDANCE FOR
QUALITY ASSURANCE
PROJECT PLANS
EPA QA/G-5
FOREWORD
The U.S. Environmental Protection Agency (EPA) has developed the Quality Assurance Project
Plan (QAPP) as an important tool for project managers and planners to document the type and quality of
data needed for environmental decisions and to use as the blueprint for collecting and assessing those
data from environmental programs. The development, review, approval, and implementation of the
QAPP is part of the mandatory Agency-wide Quality System that requires all organizations performing
work for EPA to develop and operate management processes and structures for ensuring that data or
information collected are of the needed and expected quality for their desired use. The QAPP is an
integral part of the fundamental principles of quality management that form the foundation of the
Agency's Quality System, and the requirements for a QAPP are contained in EPA QA/R-5, EPA
Requirements for Quality Assurance Project Plans for Environmental Data Operations.
This document is one of the U.S. Environmental Protection Agency Quality System Series
requirements and guidance documents. These documents describe the EPA policies and procedures for
planning, implementing, and assessing the effectiveness of the Quality System. Requirements
documents (identified as EPA QA/R-x) establish criteria and mandatory specifications for quality assurance
(QA) and quality control (QC) activities. Guidance documents (identified as EPA QA/G-x) provide
suggestions and recommendations of a nonmandatory nature for using the various components of the
Quality System. This guidance document contains advice and recommendations on how to meet the
requirements of EPA QA/R-5. In addition to this guidance document on writing a QAPP, other EPA
documents are available to assist the QAPP writer; these are discussed in Appendix A. Effective use of
this document assumes that appropriate management systems for QA and QC have been established by
the implementing organization and are operational. For requirements and guidance on the structure of
this management system, refer to Appendix A.
Questions regarding this document or other documents from the Quality System Series may be
directed to:
U.S. EPA
Quality Assurance Division (8724R)
Office of Research and Development
401 M Street, SW
Washington, DC 20460
Phone: (202) 564-6830
Fax: (202) 565-2441
All requirements and guidance documents are available on EPA's Quality Assurance Division
website:
http://es.epa.gov/ncerqa/qa/qa_docs.html
TABLE OF CONTENTS
CHAPTER I. INTRODUCTION
OVERVIEW
PURPOSE OF QA PLANNING
CHAPTER II. QAPP REQUIREMENTS
EPA POLICY ON QAPPS
QAPP GROUPS AND ELEMENTS
QAPP RESPONSIBILITIES
CHAPTER III. QAPP ELEMENTS
A. PROJECT MANAGEMENT
A1 Title and Approval Sheet
A2 Table of Contents and Document Control Format
A3 Distribution List
A4 Project/Task Organization
A5 Problem Definition/Background
A6 Project/Task Description and Schedule
A7 Quality Objectives and Criteria for Measurement Data
A8 Special Training Requirements/Certification
A9 Documentation and Records
B. MEASUREMENT/DATA ACQUISITION
B1 Sampling Process Design (Experimental Design)
B2 Sampling Methods Requirements
B3 Sample Handling and Custody Requirements
B4 Analytical Methods Requirements
B5 Quality Control Requirements
B6 Instrument/Equipment Testing, Inspection, and Maintenance Requirements
B7 Instrument Calibration and Frequency
B8 Inspection/Acceptance Requirements for Supplies and Consumables
B9 Data Acquisition Requirements (Non-Direct Measurements)
B10 Data Management
C. ASSESSMENT/OVERSIGHT
C1 Assessments and Response Actions
C2 Reports to Management
D. DATA VALIDATION AND USABILITY
D1 Data Review, Validation, and Verification Requirements
D2 Validation and Verification Methods
D3 Reconciliation with Data Quality Objectives
CHAPTER IV. QAPP REVISIONS AND RELATED GUIDANCE
QAPP REVISIONS
COMPARISON WITH PREVIOUS GUIDANCE (QAMS-005/80)
APPENDIX A. CROSSWALKS BETWEEN QUALITY ASSURANCE DOCUMENTS
AA1. Relationship Between E4 and EPA Quality System
AA2. Crosswalk Between QA/R-5 and QAMS-005/80
AA3. Crosswalk Between EPA QA/R-5 and ISO 9000
AA4. Crosswalk Between the DQO Process and the QAPP
AA5. EPA Quality Assurance Documents
APPENDIX B. GLOSSARY OF QUALITY ASSURANCE AND RELATED TERMS
APPENDIX C. CHECKLISTS USEFUL IN QUALITY ASSURANCE REVIEW
AC1. Sample Handling, Preparation, and Analysis Checklist
AC2. QAPP Review Checklist
AC3. Chain-of-Custody Checklist
APPENDIX D. DATA QUALITY INDICATORS
AD1. Principal DQIs: PARCC
AD2. Other Data Quality Indicators
APPENDIX E. QUALITY CONTROL TERMS
AE1. Quality Control Operations
AE2. Quality Control Requirements in Existing Programs
APPENDIX F. SOFTWARE FOR THE DEVELOPMENT AND PREPARATION OF A QUALITY ASSURANCE PROJECT PLAN
AF1. Overview of Potential Need for Software in QAPP Preparation
AF2. Existing Software
AF3. Software Availability and Sources
APPENDIX G. ISSUES IN DATA MANAGEMENT
AG1. Introduction
AG2. Regulatory and Policy Framework
AG3. QA Planning for Information Systems
AG4. References
LIST OF FIGURES
Figure 1. QA Planning and the Data Life Cycle
Figure 2. An Example of a Table of Contents and a Distribution List
Figure 3. An Example of a Project Organization Chart
Figure 4. The DQO Process
Figure 5. An Example of a Sample Log Sheet
Figure 6. An Example of a Sample Label
Figure 7. An Example of a Custody Seal
Figure 8. An Example of a Chain-of-Custody Record
Figure 9. Example of a Record for Consumables
Figure 10. Example of Inspection/Acceptance Testing Requirements
Figure 11. Example of a Log for Tracking Supplies and Consumables
Figure AA1. Relationships Among EPA Quality System Documents at the Program Level
Figure AA2. Relationships Among EPA Quality System Documents at the Project Level
Figure AD1. Measurement Bias and Random Measurement Uncertainties: Shots at a Target
LIST OF TABLES
Table 1. Project Quality Control Checks
Table AA1. Numbering System for EPA's Quality System Documents
Table AA2. Quality System Documents
Table AD1. Principal Types of Error
Table AE1. Comparison of QC Terms
Table AE2. QC Requirements for Programs
Table AE3. QC Requirements for Methods
Table AF1. Software Available to Meet QAPP Development Needs
Table AG1. Project Scope and Risks
Table AG2. Software Development Life Cycle
LIST OF ACRONYMS
ACS American Chemical Society
ADQ Audit of Data Quality
CFR Code of Federal Regulations
DQA Data Quality Assessment
DQI Data Quality Indicator
DQO Data Quality Objective
EPA Environmental Protection Agency
ISO International Organization for Standardization
MSR Management Systems Review
NIST National Institute of Standards and Technology
OSHA Occupational Safety and Health Administration
PARCC Precision, Accuracy, Representativeness, Comparability, and Completeness
PE Performance Evaluation
QA Quality Assurance
QAD Quality Assurance Division
QAMS Quality Assurance Management Staff (now QAD)
QAPP Quality Assurance Project Plan
QC Quality Control
RCRA Resource Conservation and Recovery Act
SOP Standard Operating Procedure
SRM Standard Reference Material
TSA Technical Systems Audit
CHAPTER I
INTRODUCTION
OVERVIEW
This document presents detailed guidance on how to develop a Quality Assurance Project Plan
(QAPP) for environmental data operations performed by or for the U.S. Environmental Protection
Agency (EPA). This guidance discusses how to address and implement the specifications in
Requirements for QA Project Plans for Environmental Data Operations (EPA QA/R-5).
The QAPP is the critical planning document for any environmental data collection operation
because it documents how quality assurance (QA) and quality control (QC) activities will be
implemented during the life cycle of a program, project, or task. The QAPP is the blueprint for
identifying how the quality system of the organization performing the work is reflected in a particular
project and in associated technical goals. QA is a system of management activities designed to ensure
that the data produced by the operation will be of the type and quality needed and expected by the data
user. QA is acknowledged to be a management function emphasizing systems and policies, and it aids
the collection of data of needed and expected quality appropriate to support management decisions in a
resource-efficient manner.
To obtain environmental data for decision making, a project should be conducted in three
phases: planning, implementation, and assessment. The first phase involves the development of Data
Quality Objectives (DQOs) using the DQO Process or a similar structured, systematic planning process.
The DQOs provide statements about the expectations and requirements of the data user (such as the
decision maker). In the second phase, the QAPP translates these requirements into measurement
performance specifications and QA/QC procedures for the data suppliers to provide the information
needed to satisfy the data user's needs. This guidance links the results of the DQO Process with the
QAPP to complete documentation of the planning process. Once the data have been collected and
validated in accordance with the elements of the QAPP, the data should be evaluated to determine
whether the DQOs have been satisfied. In the assessment phase, the Data Quality Assessment (DQA)
Process applies statistical tools to determine whether the data meet the assumptions made during
planning and whether the total error in the data is small enough to support a decision within tolerable
decision error rates expressed by the decision maker. Plans for data validation and DQA are discussed in
the final sections of the QAPP. Thus, the activities addressed and documented in the QAPP cover the
entire project life cycle, integrating elements of the planning, implementation, and assessment phases.
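To illustrate the kind of statistical check applied during DQA, the following is a minimal sketch in Python, assuming a one-sided, one-sample t test of measured concentrations against an action level; the measurements, action level, and error rate shown are hypothetical illustrations, not values from this guidance.

import math
from statistics import mean, stdev

# Hypothetical DQA-style check: can we conclude that the mean contaminant
# concentration is below an action level, within the tolerable decision
# error rate set during planning? All values are illustrative.
measurements = [4.2, 5.1, 3.8, 4.7, 5.5, 4.0, 4.9, 4.4]  # concentrations, mg/kg
action_level = 5.0  # hypothetical action level, mg/kg

n = len(measurements)
xbar = mean(measurements)
s = stdev(measurements)
t_stat = (xbar - action_level) / (s / math.sqrt(n))  # one-sample t statistic

# One-sided critical value for alpha = 0.05 with n - 1 = 7 degrees of
# freedom, taken from a t table.
t_crit = -1.895
if t_stat < t_crit:
    print(f"t = {t_stat:.2f}: data support a mean below the action level.")
else:
    print(f"t = {t_stat:.2f}: cannot conclude the mean is below the action level.")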
A QAPP is composed of four sections of project-related information called "groups," which are
subdivided into specific detailed "elements." The degree to which each QAPP element should be
addressed depends on the specific project and can range from "not applicable" to extensive
documentation. This document provides a discussion and background of the elements of a QAPP that
will typically be necessary. There is no Agency-wide template for QAPP format; however, QAD
encourages organizational consistency in the presentation and content of the elements contained within
the QAPP. The final decision on the specific need for these elements for project-specific QAPPs will be
made by the overseeing or sponsoring EPA organization(s). The Agency encourages the specific
tailoring of implementation documents within the EPA's general QA framework on a project-specific
basis.
Figure 1. QA Planning and the Data Life Cycle. [Figure: the data life cycle in three phases. Planning comprises the Data Quality Objectives Process and Quality Assurance Project Plan development; Implementation comprises field data collection and associated quality assurance/quality control activities; Assessment comprises data validation and Data Quality Assessment. A detail panel, "QA Planning for Data Collection," shows the DQO Process producing outputs (Data Quality Objectives and a Data Collection Design) that serve as inputs to QAPP development, whose output is the Quality Assurance Project Plan.]
PURPOSE OF QA PLANNING
The EPA Quality System is a structured and documented management system describing the
policies, objectives, principles, organization, responsibilities, accountability, and implementation plan of
an organization for ensuring quality in its work processes, products, and services. The Agency's Quality
System is described in EPA QA/G-0, The EPA Quality System.
EPA policy requires that all projects involving the generation, acquisition, and use of
environmental data be planned and documented and have an Agency-approved QAPP prior to the start of
data collection. The primary purpose of the QAPP is to provide an overview of the project, describe the
need for the measurements, and define QA/QC activities to be applied to the project, all within a single
document. The QAPP should be detailed enough to provide a clear description of every aspect of the
project and include information for every member of the project staff, including samplers, lab staff, and
data reviewers. The QAPP facilitates communication among clients, data users, project staff,
management, and external reviewers. Effective implementation of the QAPP assists project managers in
keeping projects on schedule and within the resource budget. Agency QA policy is described in the
Quality Manual and EPA QA/R-1, EPA Quality System Requirements for Environmental Programs.
CHAPTER II
QAPP REQUIREMENTS
EPA POLICY ON QAPPS
It is EPA's internal policy requirement¹ that the collection of environmental data by or for the
Agency be supported by a QA program, or quality system. The authority for this requirement for work
done for EPA through extramural agreements may be found in 48 CFR, Chapter 15, Part 1546 for
contractors, and 40 CFR, Parts 30, 31, and 35 for financial assistance recipients, and may be included in
negotiated interagency agreements and consent agreements in enforcement actions.

¹EPA Order 5360.1, Policy and Program Requirements to Implement the Mandatory Quality Assurance
Program, was issued originally in April 1984 and will be revised in 1998.
A key component of this mandatory quality system is the development, review, approval, and
implementation of the QAPP. A QAPP must address all of the elements contained in QA/R-5 unless
otherwise specified by the EPA QA Manager responsible for the data collection. The format of the
QAPP is decided by the QA approving authority prior to preparation of the QAPP.
The QAPP is the logical product of the planning process for any data collection, as it documents
how QA and QC activities will be planned and implemented. To be complete, the QAPP must meet
certain specifications for detail and coverage, but the extent of detail is dependent on the type of project,
the data to be collected, and the decisions to be made. Overall, the QAPP must provide sufficient detail
to demonstrate that:
• the project's technical and quality objectives are identified and agreed upon,
• the intended measurements or data acquisition methods are consistent with project
objectives,
• the assessment procedures are sufficient for determining if data of the type and quality
needed and expected are obtained, and
• any potential limitations on the use of the data can be identified and documented.
Documents prepared prior to the QAPP (e.g., standard operating procedures [SOPs], test plans, and
sampling plans) can be appended or, in some cases, incorporated by reference.
QAPP GROUPS AND ELEMENTS
The elements of a QAPP are categorized into "groups" according to their function.
Specifications for each element are found in EPA Requirements for Quality Assurance Project Plans
(EPA QA/R-5). Summaries of each requirement of the elements from that document are contained in a
box at the beginning of each specific element description. The elements of a QAPP are:
Group A: Project Management
This group of QAPP elements covers the general areas of project management, project history
and objectives, and roles and responsibilities of the participants. The following nine elements ensure that
the project's goals are clearly stated, that all participants understand the goals and the approach to be
used, and that project planning is documented:
A1 Title and Approval Sheet
A2 Table of Contents and Document Control Format
A3 Distribution List
A4 Project/Task Organization
A5 Problem Definition/Background
A6 Project/Task Description and Schedule
A7 Quality Objectives and Criteria for Measurement Data
A8 Special Training Requirements/Certification
A9 Documentation and Records
Group B: Measurement/Data Acquisition
This group of QAPP elements covers all of the aspects of measurement system design and
implementation, ensuring that appropriate methods for sampling, analysis, data handling, and QC are
employed and will be thoroughly documented:
B1 Sampling Process Design (Experimental Design)
B2 Sampling Methods Requirements
B3 Sample Handling and Custody Requirements
B4 Analytical Methods Requirements
B5 Quality Control Requirements
B6 Instrument/Equipment Testing, Inspection, and Maintenance Requirements
B7 Instrument Calibration and Frequency
B8 Inspection/Acceptance Requirements for Supplies and Consumables
B9 Data Acquisition Requirements (Non-Direct Measurements)
B10 Data Management
Group C: Assessment/Oversight
The purpose of assessment is to ensure that the QAPP is implemented as prescribed. This group
of QAPP elements addresses the activities for assessing the effectiveness of the implementation of the
project and the associated QA/QC activities:
C1 Assessments and Response Actions
C2 Reports to Management
Group D: Data Validation and Usability
Implementation of Group D elements ensures that the individual data elements conform to the
specified criteria, thus enabling reconciliation with the project's objectives. This group of elements
covers the QA activities that occur after the data collection phase of the project has been completed:
D1 Data Review, Validation, and Verification Requirements
D2 Validation and Verification Methods
D3 Reconciliation with Data Quality Objectives
QAPP RESPONSIBILITIES
QAPPs may be prepared by EPA organizations and by groups outside EPA including contractors,
assistance agreement holders, or other Federal agencies under interagency agreements. Generally, all
QAPPs prepared by non-EPA organizations must be approved by EPA for implementation. Writing a
QAPP is often a collaborative effort within an organization, or among organizations, and depends on the
technical expertise, writing skills, knowledge of the project, and availability of the staff. Organizations
are encouraged to involve technical project staff and the QA Manager or the QA Officer in this effort to
ensure that the QAPP has adequate detail and coverage.
None of the environmental data collection work addressed by the QAPP may be started until the
initial QAPP has been approved by the EPA Project Officer and the EPA QA Manager and then
distributed to project personnel, except under circumstances requiring immediate action to protect human
health and the environment or for operations conducted under police powers.
grant conditional or partial approval to a QAPP to permit some work to begin while noncritical
deficiencies in it are being resolved. However, the QA Manager should be consulted to determine the
length of time and nature of the work that may continue and the type of work that may be performed
under a conditionally approved QAPP. Some organizations have defined and outlined these terms as:
• Approval: No remaining identified deficiencies exist in the QAPP and the project may
commence.
• Partial Approval: Some activities identified in the QAPP still contain critical
deficiencies while other activities are acceptable. If the acceptable activities are not
contingent upon the completion of the activities with the deficiencies, a partial approval
may be granted to allow those activities to proceed. Work will continue to resolve the
portions of the QAPP that contain deficiencies.
• Conditional Approval: To expedite the initiation of field work, the entities required to
approve the QAPP may grant approval of the QAPP, or portions thereof, upon agreement
that specific conditions or language will be implemented. In most situations, the
conditional approval is upgraded to final approval once all entities have received,
reviewed, and signed off on the revised or additional QAPP pages.
The organizational group performing the work is responsible for implementing the approved
QAPP. This responsibility includes ensuring that all personnel involved in the work have copies of or
access to the approved QAPP along with all other necessary planning documents. In addition, the group
must ensure that these personnel understand their requirements prior to the start of data generation
activities.
Moreover, organizations are responsible for keeping the QAPP current when technical aspects of
the project change. The QAPP must be re-examined to determine the impact of such changes and
revised to incorporate them. Any revisions to the QAPP must be
re-approved and distributed to all participants in the project.
CHAPTER III
QAPP ELEMENTS
A PROJECT MANAGEMENT
The following project management elements address the procedural aspects of project
development and what to include in the QAPP project background, task description, and quality
objectives elements. Summaries from R-5 are contained in the text box following the title of each
element.
Al TITLE AND APPROVAL SHEET
Include title of plan; name of the organization(s); and names, titles, signatures of appropriate
approving officials, and their approval dates.
The title and approval sheet includes the title of the QAPP; the name(s) of the organization(s)
implementing the project; and the names, titles, signatures, and signature dates of the appropriate
approving officials. The approving officials typically include: the organization's Technical Project
Manager, the organization's Quality Assurance Officer or Manager, the EPA (or other funding agency)
Technical Project Manager/Project Officer, Laboratory Directors, Laboratory QA Officers, the EPA (or
other funding agency) Quality Assurance Officer or Manager, and other key staff, such as the QA Officer
of the prime contractor when a QAPP is prepared by a subcontractor organization.
The purpose of the approval sheet is to enable officials to document their approval of the QAPP.
The title page (along with the organization chart) also identifies the key project officials for the work.
The title and approval sheet should also indicate the date of the revision and a document number, if
appropriate.
A2 TABLE OF CONTENTS AND DOCUMENT CONTROL FORMAT
List sections, figures, tables, references, and appendices.
The table of contents lists all the elements, references, and appendices contained in a QAPP,
including a list of tables and a list of figures that are used in the text. The major headings for most
QAPPs should closely follow the list of required elements; an example is shown in Figure 2. While the
exact format of the QAPP does not have to follow the sequence given here, it is generally more
convenient to do so, and it provides a standard format to the QAPP reviewer. Moreover, consistency in
the format makes the document more familiar to users, who can expect to find a specific item in the same
place in every QAPP.
The table of contents of the QAPP may include a document control component. This information
should appear in the upper right-hand corner of each page of the QAPP when document control format is
desired. For example:
Project No. or Name: ____________
Element or Section No.: ____________
Revision No.: ____________
Revision Date: ____________
Page ____ of ____
This component, together with the distribution list (see element A3), facilitates control of the
document to help ensure that the most current QAPP is in use by all project participants. Each revision
of the QAPP should have a different revision number and date.
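As an illustration only, the following minimal Python sketch shows one way a project team might stamp this document control information onto generated pages; the helper function and all field values are hypothetical, not part of EPA requirements.

def control_header(project, section, revision, rev_date, page, pages):
    """Return a document control block for the upper right-hand corner of a page."""
    return (f"Project: {project}\n"
            f"Element/Section No.: {section}\n"
            f"Revision No.: {revision}\n"
            f"Revision Date: {rev_date}\n"
            f"Page {page} of {pages}")

# Example usage with illustrative values:
print(control_header("Site XYZ Groundwater Study", "B2", 1, "1998-02-05", 7, 42))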
A3 DISTRIBUTION LIST
List all the individuals and their organizations who will receive copies of the approved QAPP
and any subsequent revisions. Include all persons who are responsible for implementation
(including managers), the QA managers, and representatives of all groups involved.
All the persons and document files designated to receive copies of the QAPP, and any planned
future revisions, need to be listed in the QAPP. This list, together with the document control
information, will help the project manager ensure that all key personnel in the implementation of the
QAPP have up-to-date copies of the plan. A typical distribution list appears in Figure 2.
A4 PROJECT/TASK ORGANIZATION
Identify the individuals or organizations participating in the project and discuss their specific
roles and responsibilities. Include principal data users, the decision makers, the project QA
manager, and all persons responsible for implementation.
Ensure that the project QA manager is independent of the unit generating the data.
Provide a concise organization chart showing the relationships and the lines of communication
among all project participants; other data users who are outside of the organization generating
the data; and any subcontractor relationships relevant to environmental data operations.
A4.1 Purpose/Background
The purpose of the project organization is to provide EPA and other involved parties with a clear
understanding of the role that each party plays in the investigation or study and to provide the lines of
authority and reporting for the project.
A4.2 Roles and Responsibilities
The specific roles, activities, and responsibilities of participants, as well as the internal lines of
authority and communication within and between organizations, should be detailed. The position of the
QA Manager or QA Officer should be described. Include the principal data users, the decision maker,
project manager, QA manager, and all persons responsible for implementation of the QAPP. Also
included should be the person responsible for maintaining the QAPP and any individual approving
CONTENTS
Section
List of Tables iv
List of Figures v
A Project Management 1
1 Project/Task Organization 1
2 Problem Definition/Background 3
3 Project/Task Description 4
4 Data Quality Objectives 7
4.1 Project Quality Objectives 7
4.2 Measurement Performance Criteria 8
5 Documentation and Records 10
B Measurement Data Acquisition 11
6 Sampling Process Design 11
7 Analytical Methods Requirements 13
7.1 Organics 13
7.2 Inorganics 14
7.3 Process Control Monitoring 15
8 Quality Control Requirements 16
8.1 Field QC Requirements 16
8.2 Laboratory QC Requirements 17
9 Instrument Calibration and Frequency 19
10 Data Acquisition Requirements 20
11 Data Management 22
C Assessment/Oversight 23
12 Assessment and Response Actions 23
12.1 Technical Systems Audits 23
12.2 Performance Evaluation Audits 23
13 Reports to Management 24
D Data Validation and Usability 24
14 Data Review, Validation, and Verification Requirements 24
15 Reconciliation with Data Quality Objectives 26
15.1 Assessment of Measurement Performance 26
15.2 Data Quality Assessment 27
Distribution List
N. Wentworth, EPA/ORD (Work Assignment Manager)*
B. Waldron, EPA/ORD (QA Manager)
J. Warren, State University (Principal Investigator)
T. Dixon, State University (QA Officer)
G. Johnson, State University (Field Activities)
F. Haeberer, State University (Laboratory Activities)
B. Odom, State University (Data Management)
E. Renard, ABC Laboratories (Subcontractor Laboratory)
P. Lafornara, ABC Laboratories (QA Manager Subcontractor Laboratory)
* indicates approving authority
Figure 2. An Example of a Table of Contents and a Distribution List
deliverables other than the project manager. A concise chart showing the project organization, the lines
of responsibility, and the lines of communication should be presented; an example is given in Figure 3.
For complex projects, it may be useful to include more than one chart—one for the overall project (with
at least the primary contact) and others for each organization. Where direct contact between project
managers and data users does not occur, such as between a project consultant for a potentially
responsible party and the EPA risk assessment staff, the organization chart should show the route by
which information is exchanged.
Figure 3. An Example of a Project Organization Chart. [Figure: an organization chart. EPA Work Assignment Manager *N. Wentworth (202-56..., Office of Research & Development) and EPA QA Manager B. Waldron (202-564-6830, Office of Research & Development; communication only) connect to Principal Investigator J. Warren (202-564-6876, State University Engineering Department). Reporting to the Principal Investigator are the Project QA Officer T. Dixon, postdoctoral fellow (State University Chemistry Department); Field Activities, G. Johnson, graduate student (919-541-7612, State University Engineering Department); Laboratory Activities, F. Haeberer (202-564-6872, State University Chemistry Department); and Data Management, B. Odom, assistant professor (202-564-6881, State University Mathematics Department). Subcontractor: ABC Laboratories (GC/MS analyses only), with Laboratory Manager E. Renard (908-321-4355) and QA Manager P. Lafornara (908-906-6988). *approving authority]
A5 PROBLEM DEFINITION/BACKGROUND
State the specific problem to be solved or decision to be made and include sufficient
background information to provide a historical and scientific perspective for this particular
project.
A5.1 Purpose/Background
The background information provided in this element will place the problem in historical
perspective, giving readers and users of the QAPP a sense of the project's purpose and position relative
to other project and program phases and initiatives.
A5.2 Problem Statement and Background
This discussion must include enough information about the problem, the past history, any
previous work or data, and any other regulatory or legal context to allow a technically trained reader to
make sense of the project objectives and activities. This discussion should include:
• a description of the problem as currently understood, indicating its importance and
programmatic, regulatory, or research context;
• a summary of existing information on the problem, including any conflicts or
uncertainties that are to be resolved by the project;
• a discussion of initial ideas or approaches for resolving the problem that were
considered before selecting the approach described in element A6, "Project/Task
Description"; and
• the identification of the principal data user or decision maker (if known).
Note that the problem statement is the first step of the DQO Process and the decision specification is the
second step of the DQO Process.
A6 PROJECT/TASK DESCRIPTION AND SCHEDULE
Provide a description of the work to be performed and the schedule for implementation.
Include measurements that will be made during the course of the project; applicable technical,
regulatory, or program-specific quality standards, criteria, or objectives; any special personnel
and equipment requirements; assessment tools needed; a schedule for work to be performed;
and project and quality records required, including types of reports needed.
A6.1 Purpose/Background
The purpose of the project/task description element is to provide the participants with a
background understanding of the project and the types of activities to be conducted, including the
measurements that will be taken and the associated QA/QC goals, procedures, and timetables for
collecting the measurements.
A6.2 Description of the Work to be Performed
(1) Measurements that are expected during the course of the project. Describe the
characteristic or property to be studied and the measurement processes and techniques
that will be used to collect data.
(2) Applicable technical quality standards or criteria. Cite any relevant regulatory
standards or criteria pertinent to the project. For example, if environmental data are
collected to test for compliance with a permit limit standard, the standard should be cited
and the numerical limits should be given in the QAPP. The DQO Process refers to these
limits as "action levels," because the type of action taken by the decision maker will
depend on whether the measured levels exceed the limit (Step 5 of the DQO Process).
(3) Any special personnel and equipment requirements that may indicate the
complexity of the project. Describe any special personnel or equipment required for
the specific type of work being planned or measurements being taken.
(4) The assessment techniques needed for the project. The degree of quality assessment
activity for a project will depend on the project's complexity, duration, and objectives. A
discussion of the timing of each planned assessment and a brief outline of the roles of the
different parties to be involved should be included.
(5) A schedule for the work to be performed. The anticipated start and completion dates for the
project should be given. In addition, this discussion should include an approximate
schedule of important project milestones, such as the start of environmental
measurement activities.
(6) Project and quality records required, including the types of reports needed. An
indication of the most important records should be given.
A7 QUALITY OBJECTIVES AND CRITERIA FOR MEASUREMENT DATA
Describe the project quality objectives and measurement performance criteria.
A7.1 Purpose/Background
The purpose of this element is to document the DQOs of the project and to establish performance
criteria for the mandatory systematic planning process and measurement system that will be employed in
generating the data.
A7.2 Specifying Quality Objectives
This element of the QAPP should discuss the
desired quality of the final results of the study to ensure that
the data user's needs are met. The Agency strongly
recommends using the DQO Process (see Figure 4), a
systematic procedure for planning data collection activities,
to ensure that the right type, quality, and quantity of data are
collected to satisfy the data user's needs. DQOs are
qualitative and quantitative statements that:
• clarify the intended use of the data,
• define the type of data needed to support the
decision,
• identify the conditions under which the data
should be collected, and
• specify tolerable limits on the probability of
making a decision error due to uncertainty
in the data.
Data Quality Indicators (DQIs) can be evolved from DQOs for a sampling activity through the use of the
DQO Process (Appendix D). Figure 4 shows the seven steps of the DQO Process, which is explained in
detail in EPA QA/G-4, Guidance for the Data Quality Objectives Process.

Figure 4. The DQO Process:
1. State the Problem
2. Identify the Decision
3. Identify Inputs to the Decision
4. Define the Study Boundaries
5. Develop a Decision Rule
6. Specify Limits on Decision Errors
7. Optimize the Design for Obtaining Data
Appendix A.4 provides a crosswalk between the requirements of the QAPP and the DQO outputs. The
QAPP should include a reference for a full discussion of the proposed DQOs.
For exploratory research, sometimes the goal is to develop questions that may be answered by
subsequent work. Therefore, researchers may modify activities advocated in QA/G-4 to define decision
errors (see EPA QA/G-4R, Data Quality Objectives for Researchers).
A7.3 Specifying Measurement Performance Criteria
While the quality objectives state what the data user's needs are, they do not provide sufficient
information about how these needs can be satisfied. The specialists who will participate in generating
the data need to know the measurement performance criteria that must be satisfied to achieve the overall
quality objectives. One of the most important features of the QAPP is that it links the data user's quality
objectives to verifiable measurement performance criteria. Although the level of rigor with which this is
done and documented will vary widely, this linkage represents an important advancement in the
implementation of QA. Once the measurement performance criteria have been established, sampling and
analytical methods criteria can be specified under the elements contained in Group B.
A8 SPECIAL TRAINING REQUIREMENTS/CERTIFICATION
Identify and describe any specialized training or certification requirements and discuss how
such training will be provided and how the necessary skills will be assured and documented.
A8.1 Purpose/Background
The purpose of this element is to ensure that any specialized training requirements necessary to
complete the project are known and furnished and that the procedures are described in sufficient detail to
ensure that specific training skills can be verified, documented, and updated as necessary.
A8.2 Training
Requirements for specialized training for nonroutine field sampling techniques, field analyses,
laboratory analyses, or data validation should be specified. Depending on the nature of the
environmental data operation, the QAPP may need to address compliance with specifically mandated
training requirements. For example, contractors or employees working at a Superfund site need
specialized training as mandated by Occupational Safety and Health Administration (OSHA) regulations. If
hazardous materials are moved offsite, compliance with the training requirements for shipping hazardous
materials as mandated by the Department of Transportation (DOT) in association with the International
Air Transportation Association may be necessary. This element of the QAPP should show that the
management and project teams are aware of specific health and safety needs as well as any other
organizational safety plans.
A8.3 Certification
Usually, the organizations participating in the project that are responsible for conducting training
and health and safety programs are also responsible for ensuring certification. Training and certification
should be planned well in advance for necessary personnel prior to the implementation of the project.
All certificates or documentation representing completion of specialized training should be maintained in
personnel files.
A9 DOCUMENTATION AND RECORDS
Itemize the information and records that must be included in the data report package and
specify the desired reporting format for hard copy and electronic forms, when used.
Identify any other records and documents applicable to the project, such as audit reports,
interim progress reports, and final reports, that will be produced.
Specify or reference all applicable requirements for the final disposition of records and
documents, including location and length of retention period.
A9.1 Purpose/Background
This element defines which records are critical to the project and what information needs to be
included in reports, as well as the data reporting format and the document control procedures to be used.
Specification of the proper reporting format, compatible with data validation, will facilitate clear, direct
communication of the investigation.
A9.2 Information Included in the Reporting Packages
The selection of which records to include in a data reporting package must be determined based
on how the data will be used. Different "levels of effort" require different supporting QA/QC
documentation. For example, organizations conducting basic research have different reporting
requirements from organizations collecting data in support of litigation or in compliance with permits.
When possible, field and laboratory records should be integrated to provide a continuous reporting track.
The following are examples of different records that may be included in the data reporting package.
A9.2.1 Field Operation Records
The information contained in these records documents overall field operations and generally
consists of the following:
• Sample collection records. These records show that the proper sampling protocol was
performed in the field. At a minimum, this documentation should include the names of
the persons conducting the activity, sample number, sample collection points, maps and
diagrams, equipment/method used, climatic conditions, and unusual observations.
Bound field notebooks are generally used to record raw data and make references to
prescribed procedures and changes in planned activities. They should be formatted to
include pre-numbered pages with date and signature lines.
• Chain-of-custody records. Chain-of-custody records document the progression of
samples as they travel from the original sampling location to the laboratory and finally to
their disposal area. (See Appendix C for an example of a chain-of-custody checklist.)
• QC sample records. These records document the generation of QC samples, such as
field, trip, and equipment rinsate blanks and duplicate samples. They also include
documentation on sample integrity and preservation and include calibration and
standards' traceability documentation capable of providing a reproducible reference
point. Quality control sample records should contain information on the frequency,
conditions, level of standards, and instrument calibration history.
• General field procedures. General field procedures record the procedures used in the
field to collect data and outline potential areas of difficulty in gathering specimens.
• Corrective action reports. Corrective action reports show what methods were used in
cases where general field practices or other standard procedures were violated and
include the methods used to resolve noncompliance.
If applicable, to show regulatory compliance in disposing of waste generated during the data operation,
procedures, manifests, and testing contracts should be included in the field procedures section.
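As a purely illustrative aside, the chain-of-custody progression described above maps naturally onto a simple data structure; the following Python sketch uses hypothetical field names and is not an EPA-specified format.

from dataclasses import dataclass, field
from typing import List

@dataclass
class CustodyTransfer:
    released_by: str  # person relinquishing custody
    received_by: str  # person accepting custody
    date_time: str    # when custody changed hands
    purpose: str      # e.g., "shipment to laboratory", "disposal"

@dataclass
class ChainOfCustodyRecord:
    sample_id: str
    collection_point: str
    collector: str
    transfers: List[CustodyTransfer] = field(default_factory=list)

    def log_transfer(self, released_by, received_by, date_time, purpose):
        """Append one custody transfer to the sample's history."""
        self.transfers.append(
            CustodyTransfer(released_by, received_by, date_time, purpose))

# Example usage with illustrative values:
record = ChainOfCustodyRecord("GW-001", "Well MW-3", "G. Johnson")
record.log_transfer("G. Johnson", "Courier", "1998-02-05 14:30",
                    "shipment to laboratory")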
A9.2.2 Laboratory Records
The following list describes some of the laboratory-specific records that should be compiled if
available and appropriate:
• Sample Data. These records contain the times that samples were analyzed to verify that
they met the holding times prescribed in the analytical methods. Included should be the
overall number of samples, sample location information, any deviations from the SOPs,
time of day, and date. Corrective action procedures to replace samples violating the
protocol also should be noted.
• Sample Management Records. Sample management records document sample receipt,
handling and storage, and scheduling of analyses. The records verify that the chain-of-
custody and proper preservation were maintained, reflect any anomalies in the samples
(such as receipt of damaged samples), note proper log-in of samples into the laboratory,
and address procedures used to ensure that holding time requirements were met.
• Test Methods. Unless analyses are performed exactly as prescribed by SOPs, this
documentation will describe how the analyses were carried out in the laboratory. This
includes sample preparation and analysis, instrument standardization, detection and
reporting limits, and test-specific QC criteria. Documentation demonstrating laboratory
proficiency with each method used could be included.
• QA/QC Reports. These reports will include the general QC records, such as initial
demonstration of capability, instrument calibration, routine monitoring of analytical
performance, calibration verification, etc. Project-specific information from the QA/QC
checks such as blanks (field, reagent, rinsate, and method), spikes (matrix, matrix spike
replicate, analysis matrix spike, and surrogate spike), calibration check samples (zero
check, span check, and mid-range check), replicates, splits, and so on should be included
in these reports to facilitate data quality analysis.
A9.2.3 Data Handling Records
These records document protocols used in data reduction, verification, and validation. Data
reduction addresses data transformation operations such as converting raw data into reportable quantities
and units, use of significant figures, recording of extreme values, blank corrections, etc. Data
verification ensures the accuracy of data transcription and calculations, if necessary, by checking a set of
computer calculations manually. Data validation ensures that QC criteria have been met.
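To make the data reduction step concrete, here is a minimal Python sketch of a blank correction and unit conversion; the calibration slope, dilution factor, and rounding rule are hypothetical placeholders, not method requirements.

def reduce_measurement(raw_response, blank_response, calibration_slope,
                       dilution_factor=1.0, sig_figs=3):
    """Blank-correct a raw instrument response and convert it to a
    reportable concentration, rounded to a set number of significant figures."""
    corrected = raw_response - blank_response        # blank correction
    concentration = corrected / calibration_slope * dilution_factor
    return float(f"{concentration:.{sig_figs}g}")    # significant-figure rounding

# Example usage with illustrative values: prints 9.5
print(reduce_measurement(1523.0, 48.0, 310.5, dilution_factor=2.0))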
A9.3 Data Reporting Package Format and Documentation Control
The format of all data reporting packages must be consistent with the requirements and
procedures used for data validation and data assessment described in Sections B, C, and D of the QAPP.
All individual records that represent actions taken to achieve the objective of the data operation and the
performance of specific QA functions are potential components of the final data reporting package. This
element should discuss how these various components will be assembled to represent a concise and
accurate record of all activities impacting data quality. The discussion should detail the recording
medium for the project, guidelines for hand-recorded data (e.g., using indelible ink), procedures for
correcting data (e.g., single line drawn through errors and initialed by the responsible person), and
documentation control. Procedures for making revisions to technical documents should be clearly
specified and the lines of authority indicated.
A9.4 Data Reporting Package Archiving and Retrieval
The length of storage for the data reporting package may be governed by regulatory
requirements, organizational policy, or contractual project requirements. This element of the QAPP
should note the governing authority for storage of, access to, and final disposal of all records.
A9.5 References
Kanare, Howard M. 1985. Writing the Laboratory Notebook. Washington, DC: American Chemical Society.
U.S. Environmental Protection Agency. 1993. Guidance on Evaluation, Resolution, and Documentation of Analytical Problems
Associated with Compliance Monitoring. EPA/821/B-93/001.
B MEASUREMENT/DATA ACQUISITION
Bl SAMPLING PROCESS DESIGN (EXPERIMENTAL DESIGN)
Describe the experimental design or data collection design for the project.
Classify all measurements as critical or non-critical.
Bl.l Purpose/Background
The purpose of this element is to describe all the relevant components of the experimental
design; define the key parameters to be estimated; indicate the number and type of samples expected;
and describe where, when, and how samples are to be taken. The level of detail should be sufficient that
a person knowledgeable in this area could understand how and why the samples will be collected. This
element provides the main opportunity for QAPP reviewers to ensure that the "right" samples will be
taken. Strategies such as stratification, compositing, and clustering should be discussed, and diagrams or
maps showing sampling points should be included. Most of this information should be available as
outputs from the final steps of the planning (DQO) process.
In addition to describing the design, this element of the QAPP should discuss the following:
• a schedule for project sampling activities,
• a rationale for the design (in terms of meeting DQOs),
• the sampling design assumptions,
• the procedures for locating and selecting environmental samples,
• a classification of measurements as critical or noncritical, and
• the validation of any nonstandard sampling/measurement methods.
Elements B1.2 through B1.7 address these subjects.
B1.2 Scheduled Project Activities, Including Measurement Activities
This element should give anticipated start and completion dates for the project as well as
anticipated dates of major milestones, such as the following:
• schedule of sampling events;
• schedule for analytical services by offsite laboratories;
• schedule for phases of sequential sampling (or testing), if applicable;
• schedule of test or trial runs; and
• schedule for peer review activities.
Bar charts showing the time frames of the various QAPP activities are recommended for identifying
both potential bottlenecks and the need for concurrent activities; an illustrative sketch follows.
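The following Python sketch, assuming matplotlib is available, shows one way to draw such a bar chart; the activity names and time frames are hypothetical.

import matplotlib.pyplot as plt

# Illustrative QAPP activities as (name, start week, duration in weeks).
activities = [
    ("Test/trial runs",       0, 2),
    ("Sampling events",       1, 6),
    ("Offsite lab analyses",  4, 8),
    ("Sequential sampling",   7, 4),
    ("Peer review",          12, 3),
]

fig, ax = plt.subplots()
for row, (name, start, weeks) in enumerate(activities):
    ax.broken_barh([(start, weeks)], (row - 0.3, 0.6))  # one bar per activity
ax.set_yticks(range(len(activities)))
ax.set_yticklabels([name for name, _, _ in activities])
ax.set_xlabel("Project week")
ax.set_title("QAPP activity schedule (illustrative)")
plt.tight_layout()
plt.show()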
B1.3 Rationale for the Design
The objectives for an environmental study should be formulated in the planning stage of any
investigation. The requirements and the rationale of the design for the collection of data are derived
from the quantitative outputs of the DQO Process. The type of design used to collect data depends
heavily on the key characteristic being investigated. For example, if the purpose of the study is to
estimate overall average contamination at a site or location, the characteristic (or parameter) of interest
would be the mean level of contamination. This information is identified in Step 5 of the DQO Process.
The relationship of this parameter to any decision that has to be made from the data collected is obtained
from Steps 2 and 3 of the DQO Process (see Figure 4).
The potential range of values for the parameter of interest should be considered during
development of the data collection methodology and can be greatly influenced by knowledge of potential
ranges in expected concentrations. For example, the number of composite samples needed per unit area
is directly related to the variability in potential contaminant levels expected in that area.
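As an illustration of this relationship (a general statistical result, not a formula prescribed by this guidance), under simple random sampling with approximately normal data, the number of samples needed to estimate a mean satisfies

\[
n \approx \left( \frac{z_{1-\alpha/2}\, \sigma}{d} \right)^{2}
\]

where \(\sigma\) is the expected standard deviation of contaminant levels, \(d\) is the tolerable margin of error in the estimated mean, and \(z_{1-\alpha/2}\) is the standard normal quantile for the chosen confidence level; doubling \(\sigma\) roughly quadruples the required number of samples.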
The choice between a probability-based (statistical) data collection design or a nonrandom
(judgmental) data collection methodology depends on the ultimate use of the data being collected. This
information is specified in Steps 5 and 6 of the DQO Process. Adherence to the data collection design
chosen in Step 7 of the DQO Process directly affects the magnitude of potential decision error rates
(false positive rate and false negative rate) established in Step 6 of the DQO Process. Any procedures for
coping with unanticipated data collection design changes also should be briefly discussed.
B1.4 Design Assumptions
The planning process usually recommends a specific data collection method (Step 7 of the DQO
Process), but the effectiveness of this methodology rests firmly on assumptions made to establish the
data collection design. Typical assumptions include the homogeneity of the medium to be sampled (for
example, sludge, fine silt, or wastewater effluent), the independence in the collection of individual
samples (for example, four separate samples rather than four aliquots derived from a single sample), and
the stability of the conditions during sample collection (for example, the effects of a rainstorm during
collection of wastewater from an industrial plant). The assumptions should have been considered during
the DQO Process and should be summarized together with a contingency plan to account for exceptions
to the proposed sampling plan. An important part of the contingency plan is documenting the procedures
to be adopted in reporting deviations or anomalies observed after the data collection has been completed.
Examples include an extreme lack of homogeneity within a physical sample or the presence of analytes
that were not mentioned in the original sampling plan. Chapter 1 of EPA QA/G-9 provides an overview
of sampling plans and the assumptions needed for their implementation. EPA QA/G-5S provides
guidance on the construction of sampling plans to meet the requirements generated by the DQO Process.
B1.5 Procedures for Locating and Selecting Environmental Samples
The most appropriate plan for a particular sampling application will depend on: the practicality
and feasibility (e.g., determining specific sampling locations) of the plan, the key characteristic (the
parameter established in Step 5 of the DQO Process) to be estimated, and the implementation resource
requirements (e.g., the costs of sample collection, transportation, and analysis).
This element of the QAPP should also describe the frequency of sampling and specific sample
locations (e.g., sample port locations and traverses for emissions source testing, well installation designs
for groundwater investigations) and sampling materials. When decisions on the number and location of
samples will be made in the field, the QAPP should describe how these decisions will be made, whether
driven by actual observations or by field screening data. When locational data are to be collected, stored, and
transmitted, the methodology used must be described (or referenced) and include the following:
• procedures for finding prescribed sample locations,
• contingencies for cases where prescribed locations are inaccessible,
• location bias and its assessment, and
• procedures for reporting deviations from the sampling plan.
When appropriate, a map of the sample locations should be provided and locational map
coordinates supplied. EPA QA/G-5S provides nonmandatory guidance on the practicality of
constructing sampling plans and references to alternative sampling procedures.
B1.6 Classification of Measurements as Critical or Noncritical
All measurements should be classified as critical (i.e., required to achieve project objectives or
limits on decision errors, Step 6 of the DQO Process) or noncritical (for informational purposes only or
needed to provide background information). Critical measurements will undergo closer scrutiny during
the data gathering and review processes and will have first claim on limited budget resources. It is also
possible to include the expected number of samples to be tested by each procedure and the acceptance
criteria for QC checks (as described in element B5, "Quality Control Requirements").
B1.7 Validation of Any Nonstandard Methods
For nonstandard sampling methods, sample matrices, or other unusual situations, appropriate
method validation study information may be needed to confirm the performance of the method for the
particular matrix. The purpose of this validation information is to assess the potential impact on the
representativeness of the data generated. For example, if qualitative data are needed from a modified
method, rigorous validation may not be necessary. Such validation studies may include round-robin
studies performed by EPA or by other organizations. If previous validation studies are not available,
some level of single-user validation study or ruggedness study should be performed during the project
and included as part of the project's final report. This element of the QAPP should clearly reference any
available validation study information.
B2 SAMPLING METHODS REQUIREMENTS
Describe the procedures for collecting samples and identify the sampling methods and
equipment. Include any implementation requirements, support facilities, sample preservation
requirements, and materials needed. Describe the process for preparing and decontaminating
sampling equipment, including the disposal of decontamination by-products; selecting and
preparing sample containers, sample volumes, preservation methods, and maximum holding
times for sampling and/or analysis.
Describe specific performance requirements for the method. Address what to do when a
failure in the sampling occurs, who is responsible for corrective action, and how the
effectiveness of the corrective action shall be determined and documented.
B2.1 Purpose/Background
Environmental samples should reflect the target population and parameters of interest. As with
all other considerations involving environmental measurements, sampling methods should be chosen
with respect to the intended application of the data. Just as methods of analysis vary in accordance with
project needs, sampling methods can also vary according to these requirements. Different sampling
methods have different operational characteristics, such as cost, difficulty, and necessary equipment. In
addition, the sampling method can materially affect the representativeness, comparability, bias, and
precision of the final analytical result.
In the area of environmental sampling, there exists a great variety of sample types. It is beyond
the scope of this document to provide detailed advice for each sampling situation and sample type.
Nevertheless, it is possible to define certain common elements that are pertinent to many sampling
situations with discrete samples (see EPA QA/G-5S).
If a separate sampling and analysis plan is required or created for the project, it should be
included as an appendix to the QAPP. The QAPP should simply refer to the appropriate portions of the
sampling and analysis plan for the pertinent information and not reiterate information.
B2.2 Describe the Sample Collection, Preparation, and Decontamination Procedures
(1) Select and describe appropriate sampling methods from the appropriate compendia of methods.
For each parameter within each sampling situation, identify appropriate sampling methods from
applicable EPA regulations, compendia of methods, or other sources of methods that have been
approved by EPA. When EPA-sanctioned procedures are available, they will usually be
selected. When EPA-sanctioned procedures are not available, standard procedures from other
organizations and disciplines may be used. A complete description of non-EPA methods should
be provided in (or attached to) the QAPP. Procedures for sample homogenization of nonaqueous
matrices may be described in part (2) as a technique for assuring sample representativeness. In
addition, the QAPP should specify the type of sample to be collected (e.g., grab, composite,
depth-integrated, flow-weighted) together with the method of sample preservation.
(2) Discuss sampling methods' requirements. Each medium or contaminant matrix has its own
characteristics that define the method performance and the type of material to be sampled.
Investigators should address the following:
• actual sampling locations,
• choice of sampling method/collection,
• delineation of a properly shaped sample,
• inclusion of all particles within the volume sampled, and
• subsampling to reduce the representative field sample into a representative laboratory
aliquot.
Having identified appropriate and applicable methods, it is necessary to include the
requirements for each method in the QAPP. If there is more than one acceptable sampling
method applicable to a particular situation, it may be necessary to choose one from among them.
DQOs should be considered in choosing these methods to ensure that: a) the sample accurately
represents the portion of the environment to be characterized, b) the sample is of sufficient
volume to support the planned chemical analysis, and c) the sample remains stable during
shipping and handling.
(3) Describe the decontamination procedures and materials. Decontamination is primarily
applicable in situations of sample acquisition from solid, semi-solid, or liquid media, but it
should be addressed, if applicable, for continuous monitors as well. The investigator must
consider the appropriateness of the decontamination procedures for the project at hand. For
example, if contaminants are present in the environmental matrix at the 1% level, it is probably
unnecessary to clean sampling equipment to parts-per-billion (ppb) levels. Conversely, if ppb-
level detection is required, rigorous decontamination or the use of disposable equipment is
required. Decontamination by-products must be disposed of according to EPA policies and the
applicable rules and regulations that would pertain to a particular situation, such as the
regulations of OSHA, the Nuclear Regulatory Commission (NRC), and State and local
governments.
B2.3 Identify Support Facilities for Sampling Methods
Support facilities vary widely in their analysis capabilities, from percentage-level accuracy to
ppb-level accuracy. The investigator must ascertain that the capabilities of the support facilities are
commensurate with the requirements of the sampling plan established in Step 7 of the DQO Process.
B2.4 Describe Sampling/Measurement System Failure Response and Corrective Action Process
This section should address issues of responsibility for the quality of the data, the methods for
making changes and corrections, the criteria for deciding on a new sample location, and how these
changes will be documented. This section should describe what will be done if there are serious flaws
with the implementation of the sampling methodology and how these flaws will be corrected. For
example, if part of the complete set of samples is found to be inadmissible, the QAPP should describe
how replacement samples will be obtained and how these new samples will be integrated into the total
set of data.
B2.5 Describe Sampling Equipment, Preservation, and Holding Time Requirements
This section includes the requirements needed to prevent sample contamination (disposable
samplers or samplers capable of appropriate decontamination), the physical volume of the material to be
collected (the size of composite samples, core material, or the volume of water needed for analysis), the
protection of physical specimens to prevent contamination from outside sources, the temperature
preservation requirements, and the permissible holding times to ensure against degradation of sample
integrity.
B2.6 References
Publications useful in assisting the development of sampling methods include:
Solid and Hazardous Waste Sampling
U.S. Environmental Protection Agency. 1986. Test Methods for Evaluating Solid Waste (SW-846). 3rd Ed., Chapter 9.
U.S. Environmental Protection Agency. 1985. Characterization of Hazardous Waste Sites - A Methods Manual. Vol. I, Site
Investigations. EPA-600/4-84-075. Environmental Monitoring Systems Laboratory. Las Vegas, NV.
U.S. Environmental Protection Agency. 1984. Characterization of Hazardous Waste Sites - A Methods Manual. Vol. II,
Available Sampling Methods. EPA-600/4-84-076. Environmental Monitoring Systems Laboratory. Las Vegas, NV.
U.S. Environmental Protection Agency. 1987. A Compendium of Superfund Field Operations Methods. NTIS PB88-181557.
EPA/540/P-87/001. Washington, DC.
Ambient Air Sampling
U.S. Environmental Protection Agency. 1994. Quality Assurance Handbook for Air Pollution Measurement Systems. Vol. I,
Principles. EPA 600/9-76-005. Section 1.4.8 and Appendix M.S.6.
U.S. Environmental Protection Agency. 1994. Quality Assurance Handbook for Air Pollution Measurement Systems. Vol. II,
EPA 600/R-94-038b. Sections 2.0.1 and 2.0.2 and individual methods.
U.S. Environmental Protection Agency. 1984. Compendium of Methods for the Determination of Toxic Organic Compounds in
Ambient Air. EPA/600-4-84-41. Environmental Monitoring Systems Laboratory. Research Triangle Park, NC.
Supplement: EPA-600-4-87-006. September 1986.
Source Testing (Air)
U.S. Environmental Protection Agency. 1994. Quality Assurance Handbook for Air Pollution Measurement Systems. Vol. Ill,
EPA 600/R-94-038c. Section 3.0 and individual methods.
Water/Ground Water
U.S. Environmental Protection Agency. Handbook: Ground Water. Cincinnati, OH. EPA/625/6-87/016. March 1987.
U.S. Environmental Protection Agency. RCRA Ground Water Monitoring Technical Enforcement Guidance Document.
Washington, DC. 1986.
U.S. Environmental Protection Agency. Standard Methods for the Examination of Water and Wastewater. 16th ed.
Washington, DC. 1985.
Acid Precipitation
U.S. Environmental Protection Agency. 1994. Quality Assurance Handbook for Air Pollution Measurement Systems. Vol. V,
EPA 600/R-94-038e.
Meteorological Measurements
U.S. Environmental Protection Agency. 1989. Quality Assurance Handbook for Air Pollution Measurement Systems. Vol. IV,
EPA 600/4-90-003.
Radioactive Materials and Mixed Waste
U.S. Department of Energy. 1989. Radioactive-Hazardous Mixed Waste Sampling and Analysis: Addendum to SW-846.
Soils and Sediments
U.S. Environmental Protection Agency. 1985. Sediment Sampling Quality Assurance User's Guide. NTIS PB85-233542.
EPA/600/4-85/048. Environmental Monitoring Systems Laboratory. Las Vegas, NV.
U.S. Environmental Protection Agency. 1989. Soil Sampling Quality Assurance User's Guide. EPA/600/8-89/046.
Environmental Monitoring Systems Laboratory. Las Vegas, NV.
Barth, D.S., and T.H. Starks. 1985. Sediment Sampling Quality Assurance User's Guide. EPA/600-4-85/048. Prepared for
Environmental Monitoring and Support Laboratory. Las Vegas, NV.
Statistics, Geostatistics, and Sampling Theory
Myers, J.C. 1997. Geostatistical Error Measurement. New York: Van Nostrand Reinhold.
Pitard, F.F. 1989. Pierre Gy's Sampling Theory and Sampling Practice. Vols. I and II. Boca Raton, FL: CRC Press.
Miscellaneous
American Chemical Society Joint Board/Council Committee on Environmental Improvement. 1990. Practical Guide for
Environmental Sampling and Analysis, Section II. Environmental Analysis. Washington, DC.
ASTM Committee D-34. 1986. Standard Practices for Sampling Wastes from Pipes and Other Point Discharges. Document
No. D34.01-001R7.
Keith, L. 1990. EPA's Sampling and Analysis Methods Database Manual. Austin, TX: Radian Corp.
Keith, L. 1991. Environmental Sampling and Analysis: A Practical Guide. Chelsea, MI: Lewis Publishers, Inc.
B3 SAMPLE HANDLING AND CUSTODY REQUIREMENTS
Describe the requirements and provisions for sample handling and custody in the field, in the
laboratory, and during transport, taking into account the nature of the samples, the maximum
allowable sample holding times before extraction or analysis, and available shipping options
and schedules.
Include examples of sample labels, custody forms, and sample custody logs.
B3.1 Purpose/Background
This element of the QAPP should describe all procedures that are necessary for ensuring that:
(1) samples are collected, transferred, stored, and analyzed by authorized personnel;
(2) sample integrity is maintained during all phases of sample handling and analyses; and
(3) an accurate written record is maintained of sample handling and treatment from the time
of its collection through laboratory procedures to disposal.
Proper sample custody minimizes accidents by assigning responsibility for all stages of sample handling
and ensures that problems will be detected and documented if they occur. A sample is in custody if it is
in actual physical possession or it is in a secured area that is restricted to authorized personnel. The level
of custody necessary is dependent upon the project's DQOs. While enforcement actions necessitate
stringent custody procedures, custody in other types of situations (e.g., academic research) may be
primarily concerned only with the tracking of sample collection, handling, and analysis.
Sample custody procedures are necessary to prove that the sample data correspond to the sample
collected, if data are intended to be legally defensible in court as evidence. In a number of situations, a
complete, detailed, unbroken chain of custody will allow the documentation and data to substitute for the
physical evidence of the samples (which are often hazardous waste) in a civil courtroom. Some statutes
or criminal violations may still necessitate that the physical evidence of sample containers be presented
along with the custody and data documentation.
An outline of the scope of sample custody, from the planning of sample collection through field
sampling and analysis to sample disposal, should also be included. This discussion should further
stress the completion of sample custody procedures, which include the transfer of sample custody from
field personnel to the laboratory, sample custody within the analytical laboratory during sample
preparation and analysis, and data storage.
B3.2 Sample Custody Procedure
The QAPP should discuss the sample custody procedure at a level commensurate with the
intended use of the data. This discussion should include the following:
(1) List the names and responsibilities of all sample custodians in the field and laboratories.
(2) Give a description and example of the sample numbering system.
(3) Define acceptable conditions and plans for maintaining sample integrity in the field prior
to and during shipment to the laboratory (e.g., proper temperature and preservatives).
(4) Give examples of forms and labels used to maintain sample custody and document
sample handling in the field and during shipping. An example of a sample log sheet is
given in Figure 5; an example sample label is given in Figure 6.
(5) Describe the method of sealing shipping containers with chain-of-custody seals. An
example of a seal is given in Figure 7.
(6) Describe procedures that will be used to maintain the chain of custody and document
sample handling during transfer from the field to the laboratory, within the laboratory,
and among contractors. An example of a chain-of-custody record is given in Figure 8.
(7) Provide for the archiving of all shipping documents and associated paperwork.
(8) Discuss procedures that will ensure sample security at all times.
(9) Describe procedures for within-laboratory chain-of-custody together with verification of
the printed name, signature, and initials of the personnel responsible for custody of
samples, extracts, or digests during analysis at the laboratory. Finally, the disposal or
consumption of samples should also be documented. A chain-of-custody
checklist is included in Appendix C to aid in managing this element.
Minor documentation of chain-of-custody procedures is generally applicable when:
• Samples are generated and immediately tested within a facility or site; and
• Continuous rather than discrete or integrated samples are subjected to real- or near real-
time analysis (e.g., continuous monitoring).
The discussion should be as specific as possible about the details of sample storage, transportation, and
delivery to the receiving analytical facility.
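Where sample tracking is managed electronically, the custody record can be mirrored in a simple data
structure. The sketch below (in Python) is illustrative only: the field names are hypothetical
assumptions, not Agency requirements, and any real system should follow the project's own forms (see
Figures 5 through 8).

    # Sketch: one way to represent a custody record; all fields are hypothetical.
    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import List

    @dataclass
    class CustodyTransfer:
        relinquished_by: str
        received_by: str
        timestamp: datetime

    @dataclass
    class SampleRecord:
        sample_id: str                 # from the project's sample numbering system
        station: str
        collected_by: str
        collected_at: datetime
        preservative: str
        transfers: List[CustodyTransfer] = field(default_factory=list)

        def transfer(self, relinquished_by: str, received_by: str) -> None:
            """Record one custody transfer, keeping the chain unbroken."""
            self.transfers.append(
                CustodyTransfer(relinquished_by, received_by, datetime.now()))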
[Figure 5, a sample log sheet, is not reproduced here. The form provides spaces for the survey name,
station number and description, date, time samples taken, total volume, container type, preservative,
the analyses requested (e.g., nutrients, BOD, COD, TOC, solids, alkalinity, DO, pH, conductivity,
temperature, coliforms, turbidity, oil and grease, metals, pesticides, trace organics, phenol, and
cyanide), samplers' signatures, and remarks.]

Figure 5. An Example of a Sample Log Sheet
(Name of Sampling Organization)
Sample Description:
Plant:                     Location:
Date:
Time:
Media:                     Station:
Sample Type:               Preservative:
Sampled By:
Sample ID No.:
Lab No.:
Remarks:

Figure 6. An Example of a Sample Label
[Figure 7, a custody seal, is not reproduced here. The seal bears the words "CUSTODY SEAL" with
spaces for the date and signature.]

Figure 7. An Example of a Custody Seal
[Figure 8, a chain-of-custody record, is not reproduced here. The form provides spaces for station
number, station location, date and time, samplers' signatures, sample type (water composite or grab,
or air), sequence number, number of containers, and analyses required, followed by paired
"Relinquished by"/"Received by" signature blocks with date and time, blocks for receipt by a mobile
laboratory for field analysis and receipt for the laboratory, and the method of shipment.
Distribution: the original accompanies the shipment; one copy goes to the survey coordinator's field
files.]

Figure 8. An Example of a Chain-of-Custody Record
B4 ANALYTICAL METHODS REQUIREMENTS
Identify the analytical methods and equipment required, including sub-sampling or extraction
methods, laboratory decontamination procedures and materials (such as in the case of hazardous
or radioactive samples), waste disposal requirements (if any), and specific performance
requirements for the method.
Identify analytical methods by number, date, and regulatory citation (as appropriate). If a
method allows the user to select from various options, then the method citations should state
exactly which options are being selected. For non-standard methods, such as unusual sample
matrices and situations, appropriate method performance study information is needed to
confirm the performance of the method for the particular matrix. If previous performance
studies are not available, they must be developed during the project and included as part of the
project results.
Address what to do when a failure in the analytical system occurs, who is responsible for
corrective action, and how the effectiveness of the corrective action shall be determined and
documented.
Specify the laboratory turnaround time needed, if important to the project schedule. Specify
whether a field sampling and/or laboratory analysis case narrative is required to provide a
complete description of any difficulties encountered during sampling or analysis.
B4.1 Purpose/Background
The choice of analytical methods will be influenced by the performance criteria, Data Quality
Objectives, and possible regulatory criteria. If appropriate, a citation of analytical procedures may be
sufficient if the analytical method is a complete SOP. For other methods, it may suffice to reference a
procedure (e.g., from Test Methods for Evaluating Solid Waste, SW-846) and further supplement it with
the particular options/variations being used by the lab, the detection limits actually achieved, the
calibration standards and concentrations used, etc. If the procedure is unique or an adaptation of a
"standard" method, complete analytical and sample preparation procedures will need to be attached to
the QAPP.
Specific monitoring methods and requirements to demonstrate compliance traditionally were
specified in the applicable regulations and/or permits. However, this approach is being replaced by the
Performance-Based Measurement System (PBMS). PBMS is a process in which data quality needs,
mandates, or limitations of a program or project are specified and serve as a criterion for selecting
appropriate methods. The regulated body selects the most cost-effective methods that meet the criteria
specified in the PBMS. Under the PBMS framework, the performance of the method employed is
emphasized rather than the specific technique or procedure used in the analysis. Equally stressed in this
system is the requirement that the laboratory document the performance of the method and certify that
appropriate QA/QC procedures have been conducted to verify that performance. PBMS
applies to physical, chemical, and biological techniques of analysis performed in the field as well as in the
laboratory. PBMS does not apply to method-defined parameters.
The QAPP should also address the issue of the quality of analytical data as indicated by the
data's ability to meet the QC acceptance criteria. This section should describe what should be done if
calibration check samples exceed the control limits because of mechanical failure of the instrumentation,
if the calibration curve drifts, or if a reagent blank indicates contamination. This section should
also indicate the authorities responsible for the quality of the data, the protocols for making changes and
implementing corrective actions, and the methods for reporting the data and their limitations.
Laboratory contamination from the processing of hazardous materials, such as toxic or
radioactive samples, and their ultimate disposal should be considered during the planning
stages when selecting analysis methods. Safe handling requirements for project samples in the
laboratory, with appropriate decontamination and waste disposal procedures, should also be described.
B4.2 Subsampling
If subsampling is required, the procedures should be described in this QAPP element, and the
full text of the subsampling operating procedures should be appended to the QAPP. Because
subsampling may involve more than one stage, it is imperative that the procedures be documented fully
so that the results of the analysis can be evaluated properly.
B4.3 Preparation of the Samples
Preparation procedures should be described and standard methods cited and used where possible.
Step-by-step operating procedures for the preparation of the project samples should be listed in an
appendix. The sampling containers, methods of preservation, holding times, holding conditions, number
and types of all QA/QC samples to be collected, percent recovery, and names of the laboratories that will
perform the analyses need to be specifically referenced.
B4.4 Analytical Methods
The citation of an analytical method may not always be sufficient to fully characterize a method
because the analysis of a sample may require deviation from a standard method and selection from the
range of options in the method. The SOP for each analytical method should be cited or attached to the
QAPP, and all deviations or alternative selections should be detailed in the QAPP.
The matrix containing the subject analytes often dictates the sampling and analytical methods.
Gaseous analytes often must be concentrated on a trap in order to collect a measurable quantity. If the
matrix is a liquid or a solid, the analytes usually must be separated from it using various methods of
extraction. Sometimes the analyte is firmly linked by chemical bonds to other elements and must be
subjected to digestion methods to be freed for analysis.
Often the selected analytical methods may be presented conveniently in one or several tables
describing the matrix, the analytes to be measured, the analysis methods and method type, the
precision/accuracy data, the performance acceptance criteria, and the calibration criteria. Appendix C
contains a checklist of many important components to consider when selecting analytical methods.
B4.5 References
Greenberg, A.E., L.S. Clesceri, and A.D. Eaton, eds. 1992. Standard Methods for the Examination of Water and Wastewater.
18th ed. American Public Health Association. Water Environment Federation.
U.S. Environmental Protection Agency. 1996. Quality Control: Variability in Protocols. EPA/600/9-91/034. Risk Reduction
Engineering Laboratory. U.S. EPA. Cincinnati, OH.
U.S. Environmental Protection Agency. Test Methods for Evaluating Solid Waste. SW-846. Chapter 2, "Choosing the
Correct Procedure."
B5 QUALITY CONTROL REQUIREMENTS
Identify required measurement QC checks for both the field and the laboratory. State the
frequency of analysis for each type of QC check, and the sources and levels of spike compounds.
State or reference the required control limits for each QC check and corrective action required
when control limits are exceeded and how the effectiveness of the corrective action shall be
determined and documented.
Describe or reference the procedures to be used to calculate each of the QC statistics.
B5.1 Purpose/Background
QC is "the overall system of technical activities that measures the attributes and performance of
a process, item, or service against defined standards to verify that they meet the stated requirements
established by the customer." QC is both corrective and proactive in establishing techniques to prevent
the generation of unacceptable data, and so the policy for corrective action should be outlined. This
element will rely on information developed in section A7, "Quality Objectives and Criteria for
Measurement Data," which establishes measurement performance criteria.
B5.2 QC Procedures
This element documents any QC checks not defined in other QAPP elements and should
reference other elements that contain this information where possible. Most of the QC acceptance limits
of EPA methods are based on the results of interlaboratory studies. Because of improvements in
measurement methodology and continual improvement efforts in individual laboratories, these
acceptance limits may not be stringent enough for some projects. In some cases, acceptance limits are
based on intralaboratory studies (which often result in narrower acceptance limits than those based on
interlaboratory limits), and consultation with an expert may be necessary. Other elements of the QAPP
that contain related sampling and analytical QC requirements include:
• Sampling Process Design (B1), which identifies the planned field QC samples as well
as procedures for QC sample preparation and handling;
• Sampling Methods Requirements (B2), which includes requirements for determining if
the collected samples accurately represent the population of interest;
• Sample Handling and Custody Requirements (B3), which discusses any QC devices
employed to ensure samples are not tampered with (e.g., custody seals) or subjected to
other unacceptable conditions during transport;
• Analytical Methods Requirements (B4), which includes information on the
subsampling methods and information on the preparation of QC samples in the sample
matrix (e.g., splits, spikes, and replicates); and
• Instrument Calibration and Frequency (B7), which defines prescribed criteria for
triggering recalibration (e.g., failed calibration checks).
Table 1 lists QC checks often included in QAPPs. The need for the specific check depends on
the project objectives.
Table 1. Project Quality Control Checks

  QC Check                      Information Provided
  --------------------------   ------------------------------------------------
  Blanks
    field blank                 transport and field handling bias
    reagent blank               contaminated reagent
    rinsate blank               contaminated equipment
    method blank                response of entire laboratory analytical system
  Spikes
    matrix spike                analytical (preparation + analysis) bias
    matrix spike replicate      analytical bias and precision
    analysis matrix spike       instrumental bias
    surrogate spike             analytical bias
  Calibration Check Samples
    zero check                  calibration drift and memory effects
    span check                  calibration drift and memory effects
    mid-range check             calibration drift and memory effects
  Replicates, splits, etc.
    collocated samples          sampling + measurement precision
    field replicates            precision of all steps after acquisition
    field splits                shipping + interlaboratory precision
    laboratory splits           interlaboratory precision
    laboratory replicates       analytical precision
    analysis replicates         instrument precision
Many QC checks result in measurement data that are used to compute statistical indicators of data
quality. For example, a series of dilute solutions may be measured repeatedly to produce an estimate of
the instrument detection limit. The formulas for calculating such Data Quality Indicators (DQIs) should
be provided or referenced in the text. This element should also prescribe any limits that define
acceptable data quality for these indicators (see also Appendix D, "Data Quality Indicators"). A QC
checklist should be used to discuss the relation of QC to the overall project objectives with respect to:
• the frequency and point in the measurement process at which the check sample is
introduced,
• the traceability of the standards,
• the matrix of the check sample,
• the level or concentration of the analyte of interest,
• the actions to be taken if a QC check identifies a failed or changed measurement system,
• the formulas for estimating DQIs, and
• the procedures for documenting QC results, including control charts.
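For example, the instrument detection limit mentioned above is commonly estimated from replicate
measurements of a dilute solution. The sketch below applies the familiar formula MDL = t * s (as in
40 CFR Part 136, Appendix B); the replicate results are hypothetical, and the computation is offered
only as an illustration of documenting a DQI formula, not as a required procedure.

    # Sketch: method detection limit from replicate measurements of a dilute
    # solution (MDL = t * s). The replicate values are hypothetical.
    import statistics
    from scipy.stats import t

    replicates = [1.9, 2.1, 2.0, 2.2, 1.8, 2.1, 2.0]   # ug/L

    n = len(replicates)
    s = statistics.stdev(replicates)      # sample standard deviation
    t_99 = t.ppf(0.99, df=n - 1)          # one-sided 99% Student's t
    mdl = t_99 * s
    print(f"MDL = {mdl:.3f} ug/L from {n} replicates")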
Finally, this element should describe how the QC check data will be used to determine that
measurement performance is acceptable. This step can be accomplished by establishing QC "warning"
and "control" limits for the statistical data generated by the QC checks (see standard QC textbooks or
refer to EPA QA/G-5T for operational details).
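As one standard illustration (a sketch only; the limits for a given project should come from its own
QC references), Shewhart-style warning and control limits can be set at two and three standard
deviations about the mean of historical QC check results. The values below are hypothetical.

    # Sketch: Shewhart-style warning (mean +/- 2s) and control (mean +/- 3s)
    # limits for QC check-sample results; values are hypothetical.
    import statistics

    qc_results = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.1, 9.7]

    mean = statistics.mean(qc_results)
    s = statistics.stdev(qc_results)
    warning = (mean - 2 * s, mean + 2 * s)
    control = (mean - 3 * s, mean + 3 * s)

    new_result = 10.9
    if not control[0] <= new_result <= control[1]:
        print("Out of control: stop and take corrective action")
    elif not warning[0] <= new_result <= warning[1]:
        print("Warning: investigate the measurement system")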
Depending on the breadth of the potential audience for reviewing and implementing the QAPP, it
may be advantageous to separate the field QC from the laboratory QC requirements.
B6 INSTRUMENT/EQUIPMENT TESTING, INSPECTION, AND MAINTENANCE
REQUIREMENTS
Describe how inspections and acceptance testing of environmental sampling and measurement
systems and their components will be performed and documented.
Identify and discuss the procedure by which final acceptance will be performed by independent
personnel and/or by the EPA Project Officer.
Describe how deficiencies are to be resolved and when re-inspection will be performed.
Describe or reference how periodic preventive and corrective maintenance of measurement or
test equipment shall be performed. Identify the equipment and/or systems requiring periodic
maintenance. Discuss how the availability of critical spare parts, identified in the operating
guidance and/or design specifications of the systems, will be assured and maintained.
B6.1 Purpose/Background
The purpose of this element of the QAPP is to discuss the procedures used to verify that all
instruments and equipment are maintained in sound operating condition and are capable of operating at
acceptable performance levels.
B6.2 Testing, Inspection, and Maintenance
The procedures described should (1) reflect consideration of the possible effect of equipment
failure on overall data quality, including timely delivery of project results; (2) address any relevant site-
specific effects (e.g., environmental conditions); and (3) include procedures for assessing the equipment
status. This element should address the scheduling of routine calibration and maintenance activities, the
steps that will be taken to minimize instrument downtime, and the prescribed corrective action
procedures for addressing unacceptable inspection or assessment results. This element should also
include periodic maintenance procedures and describe the availability of spare parts and how an
inventory of these parts is monitored and maintained. The reader should be supplied with sufficient
information to review the adequacy of the instrument/equipment management program. Appending
SOPs containing this information to the QAPP and referencing the SOPs in the text are acceptable.
Inspection and testing procedures may employ reference materials, such as the National Institute
of Standards and Technology's (NIST's) Standard Reference Materials (SRMs), as well as QC standards
or an equipment certification program. The accuracy of calibration standards is important because all
data will be measured in reference to the standard used. The types of standards or special programs
should be noted in this element, including the inspection and acceptance testing criteria for all
components. The acceptance limits for verifying the accuracy of all working standards against primary
grade standards should also be provided.
B7 INSTRUMENT CALIBRATION AND FREQUENCY
Identify all tools, gauges, instruments, and other sampling, measuring, and test equipment used
for data collection activities affecting quality that must be controlled and, at specified periods,
calibrated to maintain performance within specified limits.
Identify the certified equipment and/or standards used for calibration. Describe or reference
how calibration will be conducted using certified equipment and/or standards with known valid
relationships to nationally recognized performance standards. If no such nationally recognized
standards exist, document the basis for the calibration. Indicate how records of calibration
shall be maintained and be traceable to the instrument.
B7.1 Purpose/Background
This element of the QAPP concerns the calibration procedures that will be used for instrumental
analytical methods and other measurement methods used in environmental measurements. It is
necessary to distinguish between two uses of the term calibration: the checking of physical
measurements against accepted standards, and the determination of the relationship (function) between
response and concentration. The American Chemical Society (ACS) limits the definition of the term
calibration to the checking of physical measurements against accepted standards, and uses the term
standardization to describe the determination of the response function.
B7.2 Identify the Instrumentation Requiring Calibration
The QAPP should identify any equipment or instrumentation that requires calibration to maintain
acceptable performance. While the primary focus of this element is on instruments of the measurement
system (sampling and measurement equipment), all methods require standardization to determine the
relationship between response and concentration.
B7.3 Document the Calibration Method that Will Be Used for Each Instrument
The QAPP must describe the calibration method for each instrument in enough detail for another
researcher to duplicate the calibration method. It may reference external documents such as EPA-
designated calibration procedures or SOPs, provided that these documents can be easily obtained.
Nonstandard calibration methods or modified standard calibration methods should be fully documented
and justified.
Some instrumentation may be calibrated against other instrumentation or apparatus (e.g., NIST
thermometer), while other instruments are calibrated using standard materials traceable to national
reference standards. QAPP documentation for calibration apparatus and calibration standards is
addressed in B7.4 and B7.5.
Calibrations normally involve challenging the measurement system or a component of the
measurement system at a number of different levels over its operating range. The calibration may cover
a narrower range if accuracy in that range is critical, given the end use of the data. Single-point
calibrations are of limited use, and two-point calibrations do not provide information on nonlinearity. If
single- or two-point calibrations are used for critical measurements, the potential shortcomings should be
carefully considered and discussed in the QAPP. Most EPA-approved analytical methods require
multipoint (three or more) calibrations that include zeros, or blanks, and higher levels so that unknowns
fall within the calibration range and are bracketed by calibration points. The number of calibration
points, the calibration range, and any replication (repeated measures at each level) should be given in the
QAPP.
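To illustrate, the sketch below fits a five-level linear calibration with duplicate measurements at
each level, including blanks, and checks that an unknown is bracketed by the calibration range. The
concentrations and responses are hypothetical, and any acceptance criterion (such as a minimum
r-squared) must come from the method or the QAPP itself.

    # Sketch: multipoint linear calibration with replication (hypothetical data).
    import numpy as np

    conc = np.array([0.0, 0.0, 5.0, 5.0, 10.0, 10.0, 20.0, 20.0, 40.0, 40.0])
    resp = np.array([0.02, 0.01, 0.51, 0.49, 1.02, 0.98, 2.01, 1.99, 3.95, 4.05])

    slope, intercept = np.polyfit(conc, resp, 1)   # resp = slope*conc + intercept
    r = np.corrcoef(conc, resp)[0, 1]
    print(f"slope={slope:.4f}, intercept={intercept:.4f}, r^2={r**2:.4f}")

    # Unknowns should fall within the calibrated range (bracketing check).
    unknown_resp = 1.50
    unknown_conc = (unknown_resp - intercept) / slope
    assert conc.min() <= unknown_conc <= conc.max(), "outside calibration range"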
The QAPP should describe how calibration data will be analyzed. The use of statistical QC
techniques to process data across multiple calibrations to detect gradual degradations in the measurement
system should be described. The QAPP should describe any corrective action that will be taken if
calibration (or calibration check) data fail to meet the acceptance criteria, including recalibration.
References to appended SOPs containing the calibration procedures are an acceptable alternative to
describing the calibration procedures within the text of the QAPP.
B7.4 Document the Calibration Apparatus
Some instruments are calibrated using calibration apparatus rather than calibration standards.
For example, an ozone generator is part of a system used to calibrate continuous ozone monitors.
Commercially available calibration apparatus should be listed together with the make (the manufacturer's
name), the model number, and the specific variable control settings that will be used during the
calibrations. A calibration apparatus that is not commercially available should be described in enough
detail for another researcher to duplicate the apparatus and follow the calibration procedure.
B7.5 Document the Calibration Standards
Most measurement systems are calibrated by processing materials that are of known and stable
composition. References describing these calibration standards should be included in the QAPP.
Calibration standards are normally traceable to national reference standards, and the traceability protocol
should be discussed. If the standards are not traceable, the QAPP must include a detailed description of
how the standards will be prepared. Any method used to verify the certified value of the standard
independently should be described.
B7.6 Document Calibration Frequency
The QAPP must describe how often each measurement method will be calibrated. It is desirable
that the calibration frequency be related to any known temporal variability (i.e., drift) of the
measurement system. The calibration procedure may involve less-frequent comprehensive calibrations
and more-frequent simple drift checks. The location of the record of calibration frequency and
maintenance should be referenced.
B7.7 References
American Chemical Society. 1980. "Calibration." Analytical Chemistry, Vol. 52, pp. 2242-2249.
Dieck, R.H. 1992. Measurement Uncertainty Methods and Applications. Research Triangle Park, NC: Instrument Society of
America.
Dux, J.P. 1986. Handbook of Quality Assurance for the Analytical Chemistry Laboratory. New York: Van Nostrand
Reinhold.
ILAC Task Force E. 1984. Guidelines for the Determination of Recalibration Intervals of Testing Equipment Used in Testing
Laboratories. International Organization for Legal Metrology (OIML). International Document No. 10. 11 Rue
Turgot, 75009 Paris, France.
Ku, H.H., ed. 1969. Precision Measurement and Calibration. Selected NBS Papers on Statistical Concepts and Procedures.
Special Publication 300. Vol. 1. Gaithersburg, MD: National Bureau of Standards.
Liggett, W. 1986. "Tests of the Recalibration Period of a Drifting Instrument." In Oceans '86 Conference Record. Vol. 3.
Monitoring Strategies Symposium. The Institute of Electrical and Electronics Engineers, Inc., Service Center.
Piscataway, NJ.
Pontius, P.E. 1974. Notes on the Fundamentals of Measurement as a Production Process. Publication No. NBSIR 74-545.
Gaithersburg, MD: National Bureau of Standards.
Taylor, J.K. 1987. Quality Assurance of Chemical Measurements. Boca Raton, FL: Lewis Publishers, Inc.
B8 INSPECTION/ACCEPTANCE REQUIREMENTS FOR SUPPLIES AND
CONSUMABLES
Describe how and by whom supplies and consumables shall be inspected and accepted for use in
the project. State acceptance criteria for such supplies and consumables.
B8.1 Purpose
The purpose of this element is to establish and document a system for inspecting and accepting
all supplies and consumables that may directly or indirectly affect the quality of the project or task. If
these requirements have been included under another section, it is sufficient to provide a reference.
B8.2 Identification of Critical Supplies and Consumables
Clearly identify and document all supplies and consumables that may directly or indirectly affect
the quality of the project or task. See Figures 9 and 10 for example documentation of
inspection/acceptance testing requirements. Typical examples include sample bottles, calibration gases,
reagents, hoses, materials for decontamination activities, deionized water, and potable water.
For each item identified, document the inspection or acceptance testing requirements or
specifications (e.g., concentration, purity, cell viability, activity, or source of procurement) in addition to
any requirements for certificates of purity or analysis.
B8.3 Establishing Acceptance Criteria
Acceptance criteria must be consistent with overall project technical and quality criteria (e.g.,
concentration must be within ± 2.5%, cell viability must be >90%). If special requirements are needed
for particular supplies or consumables, a clear agreement should be established with the supplier,
including the methods used for evaluation and the provisions for settling disparities.
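A minimal sketch of applying such criteria to an incoming lot follows; the supply names, nominal
values, and measured results are hypothetical and serve only to show the form such a check might take.

    # Sketch: accept/reject an incoming lot against the example criteria above.
    def within_tolerance(measured: float, nominal: float, pct: float) -> bool:
        """True if measured is within +/- pct percent of the nominal value."""
        return abs(measured - nominal) <= nominal * pct / 100.0

    lot = {"calibration_gas_ppm": 49.1, "cell_viability_pct": 93.0}  # hypothetical

    accept = (within_tolerance(lot["calibration_gas_ppm"], 50.0, 2.5)
              and lot["cell_viability_pct"] > 90.0)
    print("Accept lot" if accept else "Reject lot: document and notify the supplier")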
B8.4 Inspection or Acceptance Testing Requirements and Procedures
Inspections or acceptance testing should be documented, including procedures to be followed,
individuals responsible, and frequency of evaluation. In addition, handling and storage conditions for
supplies and consumables should be documented.
B8.5 Tracking and Quality Verification of Supplies and Consumables
Procedures should be established to ensure that inspections or acceptance testing of supplies and
consumables are adequately documented by permanent, dated, and signed records or logs that uniquely
identify the critical supplies or consumables, the date received, the date tested, the date to be retested (if
applicable), and the expiration date. These records should be kept by the responsible individual(s) (see
Figure 11 for an example log). In order to track supplies and consumables, labels with the information
on receipt and testing should be used.
These or similar procedures should be established to enable project personnel to (1) verify, prior
to use, that critical supplies and consumables meet specified project or task quality objectives; and
(2) ensure that supplies and consumables that have not been tested, have expired, or do not meet
acceptance criteria are not used for the project or task.
Unique identification no. (if not clearly shown): ______________
Date received: ______________
Date opened: ______________
Date tested (if performed): ______________
Date to be retested (if applicable): ______________
Expiration date: ______________
Figure 9. Example of a Record for Consumables
[Figure 10 is a blank form with columns for: Critical Supplies and Consumables;
Inspection/Acceptance Testing Requirements; Acceptance Criteria; Testing Method; Frequency;
Responsible Individual; and Handling/Storage Conditions.]
Figure 10. Example of Inspection/Acceptance Testing Requirements
[Figure 11 is a blank log with columns for: Critical Supplies and Consumable (Type, ID No.); Date
Received; Meets Inspection/Acceptance Criteria (Y/N, Include Date); Requires Retesting (Y/N, If Yes,
Include Date); Expiration Date; Comments; and Initials/Date.]
Figure 11. Example of a Log for Tracking Supplies and Consumables
B9 DATA ACQUISITION REQUIREMENTS (NON-DIRECT MEASUREMENTS)
Identify any types of data needed for project implementation or decision making that are
obtained from non-measurement sources such as computer databases, programs, literature
files, and historical databases.
Define the acceptance criteria for the use of such data in the project and discuss any
limitations on the use of the data resulting from uncertainty in its quality.
Document the rationale for the original collection of data and indicate its relevance to this
project.
B9.1 Purpose/Background
This element of the QAPP should clearly identify the intended sources of previously collected
data and other information that will be used in this project. Information that is non-representative or
biased, and is used uncritically, may lead to decision errors.
the generation of new data are also appropriate to the use of previously compiled data (for example, data
sources such as handbooks and computerized databases).
B9.2 Acquisition of Non-Direct Measurement Data
This element's criteria should be developed to support the objectives of element A7. Acceptance
criteria for each collection of data being considered for use in this project should be explicitly stated,
especially with respect to:
• Representativeness. Were the data collected from a population that is sufficiently
similar to the population of interest and the population boundaries? How will potentially
confounding effects (for example, season, time of day, and cell type) be addressed so
that these effects do not unduly alter the summary information?
• Bias. Are there characteristics of the data set that would shift the conclusions? For
example, has bias in analysis results been documented? Is there sufficient information to
estimate and correct bias?
• Precision. How is the spread in the results estimated? Does the estimate of variability
indicate that it is sufficiently small to meet the objectives of this project as stated in
element A7? See also Appendix D.
• Qualifiers. Are the data evaluated in a manner that permits logical decisions on whether
or not the data are applicable to the current project? Is the system of qualifying or
flagging data adequately documented to allow the combination of data sets?
• Summarization. Is the data summarization process clear and sufficiently consistent
with the goals of this project? (See element D2 for further discussion.) Ideally,
observations and transformation equations are available so that their assumptions can be
evaluated against the objectives of the current project.
This element should also include a discussion on limitations on the use of the data and the nature of the
uncertainty of the data.
BIO DATA MANAGEMENT
Describe the project data management scheme, tracing the path of the data from their
generation in the field or laboratory to their final use or storage. Describe or reference the
standard record-keeping procedures, document control system, and the approach used for
data storage and retrieval on electronic media.
Discuss the control mechanism for detecting and correcting errors and for preventing loss of
data during data reduction, data reporting, and data entry to forms, reports, and databases.
Provide examples of any forms or checklists to be used.
Identify and describe all data handling equipment and procedures to process, compile, and
analyze the data, including any required computer hardware and software. Address any
specific performance requirements and describe the procedures that will be followed to
demonstrate acceptability of the hardware/software configuration required.
Describe the process for assuring that applicable Agency information resource management
requirements and locational data requirements are satisfied. If other Agency data
management requirements are applicable, discuss how these requirements are addressed.
B10.1 Purpose/Background
This element should present an overview of all mathematical operations and analyses performed
on raw ("as-collected") data to change their form of expression, location, quantity, or dimensionality.
These operations include data recording, validation, transformation, transmittal, reduction, analysis,
management, storage, and retrieval. A diagram that illustrates the source(s) of the data, the processing
steps, the intermediate and final data files, and the reports produced may be helpful, particularly when
there are multiple data sources and data files. When appropriate, the data values should be subjected to
the same chain-of-custody requirements as outlined in element B3. Appendix G has further details.
B10.2 Data Recording
Any internal checks (including verification and validation checks) that will be used to ensure
data quality during data encoding in the data entry process should be identified, together with the
mechanism for detecting and correcting recording errors. Examples of data entry forms and checklists
should be included.
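The sketch below shows two common entry-time checks, a type check and a plausible-range check; the
field names and limits are hypothetical and would be set by the project.

    # Sketch: simple type and range checks applied as each record is keyed in.
    def check_entry(record: dict) -> list:
        """Return a list of problems found in one keyed-in record."""
        problems = []
        try:
            value = float(record["result"])
            if not 0.0 <= value <= 1000.0:          # plausible-range check
                problems.append(f"result {value} outside expected range")
        except (KeyError, ValueError):
            problems.append("result missing or not numeric")
        if not record.get("sample_id"):
            problems.append("sample_id missing")
        return problems

    print(check_entry({"sample_id": "S-001", "result": "12.4"}))   # []
    print(check_entry({"sample_id": "", "result": "n/a"}))         # two problems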
B10.3 Data Validation
The details of the process of data validation and prespecified criteria should be documented in
this element of the QAPP. This element should address whether the method, instrument, or system
performs the function it is intended to perform consistently, reliably, and accurately in generating the
data. Part D of this
document addresses the overall project data validation, which is performed after the project has been
completed.
B10.4 Data Transformation
Data transformation is the conversion of individual data point values into related values or
possibly symbols using conversion formulas (e.g., units conversion or logarithmic conversion) or a
system for replacement. The transformations can be reversible (e.g., as in the conversion of data points
using a formula) or irreversible (e.g., when a symbol replaces actual values and the value is lost). The
procedures for all data transformations should be described and recorded in this element. The procedure
for converting calibration readings into an equation that will be applied to measurement readings should
be documented in the QAPP. The transformation of data for statistical analysis should be
outlined in element D3, "Reconciliation with Data Quality Objectives."
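The sketch below contrasts reversible transformations (a units conversion and a logarithmic
conversion) with an irreversible one (replacing values below a detection limit with a symbol); all
values and the limit are hypothetical.

    # Sketch: reversible vs. irreversible data transformations.
    import math

    ug_per_l = [12.0, 48.0, 3.0]                      # hypothetical results

    mg_per_l = [v / 1000.0 for v in ug_per_l]         # reversible: multiply back
    log_vals = [math.log10(v) for v in ug_per_l]      # reversible: 10**x recovers v

    detection_limit = 5.0
    flagged = [v if v >= detection_limit else "<DL" for v in ug_per_l]
    # Irreversible: the original 3.0 cannot be recovered from "<DL".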
B10.5 Data Transmittal
Data transmittal occurs when data are transferred from one person or location to another or when
data are copied from one form to another. Some examples of data transmittal are copying raw data from
a notebook onto a data entry form for keying into a computer file and electronic transfer of data over a
telephone or computer network. The QAPP should describe each data transfer step and the procedures
that will be used to characterize data transmittal error rates and to minimize information loss in the
transmittal.
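One way to characterize transmittal error for electronic transfers, offered as an illustration rather
than a required procedure, is to compare checksums computed before and after the transfer; the file
name below is hypothetical.

    # Sketch: verify an electronic transfer by comparing checksums.
    import hashlib

    def file_checksum(path: str) -> str:
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    # The sender records file_checksum("results.csv"); the receiver recomputes
    # it after transfer. Any mismatch indicates a transmittal error.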
B10.6 Data Reduction
Data reduction includes all processes that change the number of data items. This process is
distinct from data transformation in that it entails an irreversible reduction in the size of the data set and
an associated loss of detail. For manual calculations, the QAPP should include an example in which
typical raw data are reduced. For automated data processing, the QAPP should clearly indicate how the
raw data are to be reduced with a well-defined audit trail, and reference to the specific software
documentation should be provided.
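For instance, averaging field replicates to one reportable value per station is a typical reduction;
the sketch below (with hypothetical stations and values) shows why the step is irreversible unless the
replicates are archived.

    # Sketch: reduce replicate results to one mean value per station.
    import statistics

    replicates_by_station = {"ST-01": [4.1, 4.3, 4.2], "ST-02": [7.9, 8.2, 8.0]}

    reduced = {station: statistics.mean(values)
               for station, values in replicates_by_station.items()}
    print(reduced)   # the individual replicate values are no longer present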
B10.7 Data Analysis
Data analysis sometimes involves comparing suitably reduced data with a conceptual model
(e.g., a dispersion model or an infectivity model). It frequently includes computation of summary
statistics, standard errors, confidence intervals, tests of hypotheses relative to model parameters, and
goodness-of-fit tests. This element should briefly outline the proposed methodology for data analysis
and a more detailed discussion should be included in the final report.
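As a brief illustration of one such summary statistic, the sketch below computes a 95% confidence
interval for the mean using Student's t; the data are hypothetical.

    # Sketch: 95% confidence interval for the mean (hypothetical data).
    import statistics
    from scipy.stats import t

    data = [4.2, 4.5, 4.1, 4.4, 4.3, 4.6, 4.2]

    n = len(data)
    mean = statistics.mean(data)
    se = statistics.stdev(data) / n ** 0.5
    half_width = t.ppf(0.975, df=n - 1) * se
    print(f"95% CI: {mean - half_width:.2f} to {mean + half_width:.2f}")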
B10.8 Data Tracking
Data management includes tracking the status of data as they are collected, transmitted, and
processed. The QAPP should describe the established procedures for tracking the flow of data through
the data processing system.
B10.9 Data Storage and Retrieval
The QAPP should discuss data storage and retrieval including security and time of retention, and
it should document the complete control system. The QAPP should also discuss the performance
requirements of the data processing system, including provisions for the batch processing schedule and
the data storage facilities.
C ASSESSMENT/OVERSIGHT
C1 ASSESSMENTS AND RESPONSE ACTIONS
Identify the number, frequency, and type of assessment activities needed for this project.
List and describe the assessments to be used in the project. Discuss the information expected
and the success criteria for each assessment proposed. List the approximate schedule of
activities, and identify potential organizations and participants. Describe how and to whom the
results of the assessments shall be reported.
Define the scope of authority of the assessors, including stop work orders. Define explicitly the
unsatisfactory conditions under which the assessors are authorized to act and provide an
approximate schedule for the assessments to be performed.
Discuss how response actions to non-conforming conditions shall be addressed and by whom.
Identify who is responsible for implementing the response action and describe how response
actions shall be verified and documented.
C1.1 Purpose/Background
During the planning process, many options for sampling design (see EPA QA/G-5S, Guidance
on Sampling Design to Support QAPPs), sample handling, sample cleanup and analysis, and data
reduction are evaluated and chosen for the project. In order to ensure that the data collection is
conducted as planned, a process of evaluation and validation is necessary. This element of the QAPP
describes the internal and external checks necessary to ensure that:
• all elements of the QAPP are correctly implemented as prescribed,
• the quality of the data generated by implementation of the QAPP is adequate, and
• corrective actions, when needed, are implemented in a timely manner and their
effectiveness is confirmed.
Although any external assessments that are planned should be described in the QAPP, the most
important part of this element is documenting all planned internal assessments. Generally, internal
assessments are initiated or performed by the internal QA Officer so the activities described in this
element should be related to the responsibilities of the QA Officer as discussed in Section A4.
C1.2 Assessment Activities and Project Planning
The following is a description of various types of assessment activities available to managers in
evaluating the effectiveness of environmental program implementation.
C1.2.1 Assessment of the Subsidiary Organizations
A. Management Systems Review (MSR). A form of management assessment, this process is
a qualitative assessment of a data collection operation or organization to establish
whether the prevailing quality management structure, policies, practices, and procedures
are adequate for ensuring that the type and quality of data needed are obtained. The
MSR is used to ensure that sufficient management controls are in place and carried out
by the organization to adequately plan, implement, and assess the results of the project.
See the Guidance for the Management Systems Review Process (EPA QA/G-3).
B. Readiness reviews. A readiness review is a technical check to determine if all
components of the project are in place so that work can commence on a specific phase.
C1.2.2 Assessment of Project Activities
A. Surveillance. Surveillance is the continual or frequent monitoring of the status of a
project and the analysis of records to ensure that specified requirements are being
fulfilled.
B. Technical Systems Audit (TSA). A TSA is a thorough and systematic onsite qualitative
audit, where facilities, equipment, personnel, training, procedures, and record keeping
are examined for conformance to the QAPP. The TSA is a powerful audit tool with
broad coverage that may reveal weaknesses in the management structure, policy,
practices, or procedures. The TSA is ideally conducted after work has commenced, but
before it has progressed very far, thus giving opportunity for corrective action.
C. Performance Evaluation (PE). A PE is a type of audit in which the quantitative data
generated by the measurement system are obtained independently and compared with
routinely obtained data to evaluate the proficiency of an analyst or laboratory. "Blind"
PE samples are those whose identity is unknown to those operating the measurement
system. Blind PEs often produce better performance assessments because they are
handled routinely and are not given the special treatment that undisguised PEs
sometimes receive. The QAPP should list the PEs that are planned, identifying:
• the constituents to be measured,
• the target concentration ranges,
• the timing/schedule for PE sample analysis, and
• the aspect of measurement quality to be assessed (e.g., bias, precision,
and detection limit).
A number of EPA regulations and EPA-sanctioned methods require the successful
accomplishment of PEs before the results of the test can be considered valid. PE
materials are now available from commercial sources and a number of EPA Program
Offices coordinate various interlaboratory studies and laboratory proficiency programs.
Participation in these or in the National Voluntary Laboratory Accreditation Program
(NVLAP, run by NIST) should be mentioned in the QAPP.
D. Audit of Data Quality (ADQ). An ADQ reveals how the data were handled, what
judgments were made, and whether uncorrected mistakes were made. Performed prior to
producing a project's final report, ADQs can often identify the means to correct
systematic data reduction errors.
E. Peer review. Peer review is not a TSA, nor strictly an internal QA function, as it may
encompass non-QA aspects of a project and is primarily designed for scientific review.
Whether a planning team chooses ADQs or peer reviews depends upon the nature of the
project, the intended use of the data, the policies established by the sponsor of the
project, and overall conformance to the Program Office or Region's peer-review
policies and procedures. Reviewers are chosen who have technical expertise comparable
to the project's performers but who are independent of the project. ADQs and peer
reviews ensure that the project activities:
• were technically adequate,
• were competently performed,
• were properly documented,
• satisfied established technical requirements, and
• satisfied established QA requirements.
In addition, peer reviews assess the assumptions, calculations, extrapolations, alternative
interpretations, methods, acceptance criteria, and conclusions documented in the
project's report. Any plans for peer review should conform with the Agency's peer-
review policy and guidance. The names, titles, and positions of the peer reviewers
should be included in the final QAPP, as should their report findings, the QAPP authors'
documented responses to their findings, and reference to where responses to peer-review
comments may be located, if necessary.
F. Data Quality Assessment (DQA). DQA involves the application of statistical tools to
determine whether the data meet the assumptions under which the DQOs and data
collection design were developed and whether the total error in the data is tolerable.
Guidance for the Data Quality Assessment Process (EPA QA/G-9) provides
nonmandatory guidance for planning, implementing, and evaluating retrospective
assessments of the quality of the results from environmental data operations.
C1.3 Documentation of Assessments
The following material describes what should be documented in a QAPP after consideration of
the above issues and types of assessments.
C1.3.1 Number, Frequency, and Types of Assessments
Depending upon the nature of the project, there may be more than one assessment. A schedule
of the number, frequencies, and types of assessments required should be given.
C1.3.2 Assessment Personnel
The QAPP should specify the individuals, or at least the specific organizational units, who will
perform the assessments. Internal audits are usually performed by personnel who work for the
organization performing the project work but who are organizationally independent of the management
of the project. External audits are performed by personnel of organizations not connected with the
project but who are technically qualified and who understand the QA requirements of the project.
C1.3.3 Schedule of Assessment Activities
A schedule of audit activities, together with relevant criteria for assessment, should be given to
the extent that it is known in advance of project activities.
C1.3.4 Reporting and Resolution of Issues
Audits, peer reviews, and other assessments often reveal findings of practice or procedure that do
not conform to the written QAPP. Because these issues must be addressed in a timely manner, the
protocol for resolving them should be given here, together with the proposed actions for ensuring that
corrective actions are performed effectively. The person to whom the concerns should be addressed,
the decision-making hierarchy, the schedule and format for oral and written reports, and the
responsibility for corrective action should all be discussed in this element. It should also explicitly define
the unsatisfactory conditions upon which the assessors are authorized to act and list the project personnel
who should receive assessment reports.
C2 REPORTS TO MANAGEMENT
Identify the frequency and distribution of reports issued to inform management of the status
of the project; results of performance evaluations and systems audits; results of periodic data
quality assessments; and significant quality assurance problems and recommended solutions.
Identify the preparer and the recipients of the reports, and the specific actions management is
expected to take as a result of the reports.
C2.1 Purpose/Background
Effective communication between all personnel is an integral part of a quality system. Planned
reports provide a structure for apprising management of the project schedule, the deviations from
approved QA and test plans, the impact of these deviations on data quality, and the potential
uncertainties in decisions based on the data. Verbal communication on deviations from QA plans should
be noted in summary form in element D1 of the QAPP.
C2.2 Frequency, Content, and Distribution of Reports
The QAPP should indicate the frequency, content, and distribution of the reports so that
management may anticipate events and move to ameliorate potentially adverse results. An important
benefit of the status reports is the opportunity to alert the management of data quality problems, propose
viable solutions, and procure additional resources. If program assessment (including the evaluation of
the technical systems, the measurement of performance, and the assessment of data) is not conducted on
a continual basis, the data generated in the program may not meet the quality
requirements. These audit reports, submitted in a timely manner, will provide an opportunity to
implement corrective actions when most appropriate.
C2.3 Identify Responsible Organizations
It is important that the QAPP identify the personnel responsible for preparing the reports,
evaluating their impact, and implementing follow-up actions. It is necessary to understand how any
changes made in one area or procedure may affect another part of the project. Furthermore, the
documentation for all changes should be maintained and included in the reports to management. At the
end of a project, a report documenting the Data Quality Assessment findings to management should be
prepared.
D DATA VALIDATION AND USABILITY
D1 DATA REVIEW, VALIDATION, AND VERIFICATION REQUIREMENTS
State the criteria used to review and validate data.
Provide examples of any forms or checklists to be used.
Identify any project-specific calculations required.
D1.1 Purpose/Background
The purpose of this element is to state the criteria for deciding the degree to which each data
item has met its quality specifications as described in Group B. Investigators should estimate the
potential effect that each deviation from a QAPP may have on the usability of the associated data item,
its contribution to the quality of the reduced and analyzed data, and its effect on the decision.
The process of data verification requires confirmation by examination or provision of objective
evidence that the requirements of these specified QC acceptance criteria are met. In design and
development, verification concerns the process of examining the result of a given activity to determine
conformance to the stated requirements for that activity. For example, have the data been collected
according to a specified method and have the collected data been faithfully recorded and transmitted?
Do the data fulfill specified data format and metadata requirements? The process of data verification
effectively ensures the accuracy of data using validated methods and protocols and is often based on
comparison with reference standards.
The process of data validation requires confirmation by examination and provision of objective
evidence that the particular requirements for a specific intended use have been fulfilled. In design and
development, validation concerns the process of examining a product or result to determine conformance
to user needs. For example, have the data and assessment methodology passed a peer review to evaluate
the adequacy of their accuracy and precision in assessing progress towards meeting the specific
commitment articulated in the objective or subobjective? The method validation process effectively
develops the QC acceptance criteria or specific performance criteria.
Each of the following areas of discussion should be included in the QAPP elements. The
discussion applies to situations in which a sample is separated from its native environment and
transported to a laboratory for analysis and data generation. However, these principles can be adapted to
other situations (for example, in-situ analysis or laboratory research).
D1.2 Sampling Design
How closely a measurement represents the actual environment at a given time and location is a
complex issue that is considered during development of element B1. See Guidance on Sampling Designs
to Support QAPPs (EPA QA/G-5S). Acceptable tolerances for each critical sample coordinate and the
action to be taken if the tolerances are exceeded should be specified in element B1.
Each sample should be checked for conformity to the specifications, including type and location
(spatial and temporal). By noting the deviations in sufficient detail, subsequent data users will be able to
determine the data's usability under scenarios different from those included in project planning. The
strength of conclusions that can be drawn from data (see Guidance Document for Data Quality
Assessment, EPA QA/G-9) has a direct connection to the sampling design and deviations from that
design. Where auxiliary variables are included in the overall data collection effort (for example,
microbiological nutrient characteristics or process conditions), they should be included in this evaluation.
D1.3 Sample Collection Procedures
Details of how a sample is separated from its native time/space location are important for
properly interpreting the measurement results. Element B2 provides these details, which include
sampling and ancillary equipment and procedures (including equipment decontamination). Acceptable
departures (for example, alternate equipment) from the QAPP, and the action to be taken if the
requirements cannot be satisfied, should be specified for each critical aspect. Validation activities should
note potentially unacceptable departures from the QAPP. Comments from field surveillance on
deviations from written sampling plans also should be noted.
D1.4 Sample Handling
Details of how a sample is physically treated and handled during relocation from its original site
to the actual measurement site are extremely important. Correct interpretation of the subsequent
measurement results requires that deviations from element B3 of the QAPP, and the actions taken to
minimize or control the changes, be detailed. Data collection activities should indicate events that occur
during sample handling that may affect the integrity of the samples.
At a minimum, investigators should evaluate the sample containers and the preservation methods
used and ensure that they are appropriate to the nature of the sample and the type of data generated from
the sample. Checks on the identity of the sample (e.g., proper labeling and chain-of-custody records) as
well as proper physical/chemical storage conditions (e.g., chain-of-custody and storage records) should
be made to ensure that the sample continues to be representative of its native environment as it moves
through the analytical process.
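For example, a holding-time check of the kind described above might be automated as in the following
sketch. The record fields, analytes, and holding-time limits shown are hypothetical placeholders, not
requirements drawn from any EPA method.

    # Hypothetical sketch: screen chain-of-custody records for holding-time
    # exceedances so that the affected results can be flagged for the data user.
    from datetime import date

    HOLDING_DAYS = {"mercury": 28, "nitrate": 2}  # illustrative limits only

    records = [
        {"id": "S-01", "analyte": "mercury", "collected": date(1998, 2, 3), "analyzed": date(1998, 3, 10)},
        {"id": "S-02", "analyte": "nitrate", "collected": date(1998, 2, 3), "analyzed": date(1998, 2, 4)},
    ]
    for rec in records:
        held = (rec["analyzed"] - rec["collected"]).days
        limit = HOLDING_DAYS[rec["analyte"]]
        if held > limit:
            print(f"flag {rec['id']}: {rec['analyte']} held {held} d (limit {limit} d)")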
D1.5 Analytical Procedures
Each sample should be verified to ensure that the procedures used to generate the data (as
identified in element B4 of the QAPP) were implemented as specified. Acceptance criteria should be
developed for important components of the procedures, along with suitable codes for characterizing each
sample's deviation from the procedure. Data validation activities should determine how seriously a
sample deviated beyond the acceptable limit so that the potential effects of the deviation can be
evaluated during DQA.
D1.6 Quality Control
Element B5 of the QAPP specifies the QC checks that are to be performed during sample
collection, handling, and analysis. These include analyses of check standards, blanks, spikes, and
replicates, which provide indications of the quality of data being produced by specified components of
the measurement process. For each specified QC check, the procedure, acceptance criteria, and
corrective action (and changes) should be specified. Data validation should document the corrective
actions that were taken, which samples were affected, and the potential effect of the actions on the
validity of the data.
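As an illustration of applying an acceptance criterion to one such QC check, the sketch below evaluates a
matrix-spike recovery. The 75-125% window and the result values are hypothetical; actual criteria come
from the method or the QAPP.

    # Hypothetical sketch: evaluate a matrix-spike recovery against an
    # acceptance window and note when corrective action must be documented.
    def percent_recovery(spiked_result, unspiked_result, spike_added):
        return 100.0 * (spiked_result - unspiked_result) / spike_added

    rec = percent_recovery(spiked_result=14.2, unspiked_result=4.5, spike_added=10.0)
    if 75.0 <= rec <= 125.0:
        print(f"recovery {rec:.0f}%: within acceptance limits")
    else:
        print(f"recovery {rec:.0f}%: outside 75-125%; document corrective action and affected samples")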
D1.7 Calibration
Element B7 addresses the calibration of instruments and equipment and the information that
should be presented to ensure that the calibrations:
• were performed within an acceptable time prior to generation of measurement data;
• were performed in the proper sequence;
• included the proper number of calibration points;
• were performed using standards that "bracketed" the range of reported measurement
results (otherwise, results falling outside the calibration range are flagged as such); and
• had acceptable linearity checks and other checks to ensure that the measurement system
was stable when the calibration was performed.
When calibration problems are identified, any data produced between the suspect calibration event and
any subsequent recalibration should be flagged to alert data users.
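The bracketing check lends itself to a simple screen such as the sketch below; the standard
concentrations and results are hypothetical.

    # Hypothetical sketch: flag any result that falls outside the range
    # "bracketed" by the calibration standards.
    cal_standards = [1.0, 5.0, 10.0, 50.0]   # concentrations of the calibration standards
    low, high = min(cal_standards), max(cal_standards)

    for result in [0.4, 12.0, 75.0]:
        status = "ok" if low <= result <= high else "flag: outside calibration range"
        print(result, status)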
D1.8 Data Reduction and Processing
Checks on data integrity evaluate the accuracy of "raw" data and include the comparison of
important events and the duplicate rekeying of data to identify data entry errors.
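A double-entry comparison of this kind can be sketched as follows; the sample identifiers and values are
hypothetical.

    # Hypothetical sketch of a duplicate-rekeying check: two independently
    # keyed copies of the same results are compared record by record.
    first_entry  = {"S-01": 4.5, "S-02": 7.1, "S-03": 0.8}
    second_entry = {"S-01": 4.5, "S-02": 7.7, "S-03": 0.8}

    for sample_id in sorted(first_entry):
        if first_entry[sample_id] != second_entry.get(sample_id):
            print(f"entry mismatch for {sample_id}: verify against the raw data sheet")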
Data reduction is an irreversible process that involves a loss of detail in the data and may involve
averaging across time (for example, hourly or daily averages) or space (for example, compositing results
from samples thought to be physically equivalent). Since this summarizing process produces few values
to represent a group of many data points, its validity should be well-documented in the QAPP. Potential
data anomalies can be investigated by simple statistical analyses (see Guidance for Data Quality
Assessment, EPA QA/G-9).
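For instance, hourly values might be reduced to a daily mean and screened for anomalies as in the sketch
below. The readings and the two-standard-deviation screen are illustrative only; EPA QA/G-9 describes
formal outlier tests.

    # Hypothetical sketch: reduce hourly readings to a daily mean and apply a
    # simple two-standard-deviation screen for values worth investigating.
    import statistics

    hourly = [3.1, 2.9, 3.0, 3.3, 9.8, 3.2]   # illustrative hourly readings
    daily_mean = statistics.mean(hourly)
    sd = statistics.stdev(hourly)
    suspects = [x for x in hourly if abs(x - daily_mean) > 2 * sd]
    print(f"daily mean {daily_mean:.2f}; values to investigate: {suspects}")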
The information generation step involves the synthesis of the results of previous operations and
the construction of tables and charts suitable for use in reports. How information generation is checked,
the requirements for the outcome, and how deviations from the requirements will be treated, should be
addressed in this element.
D2 VALIDATION AND VERIFICATION METHODS
Describe the process to be used for validating and verifying data, including the chain of
custody for data throughout the life cycle of the project or task.
Discuss how issues shall be resolved and identify the authorities for resolving such issues.
Describe how the results are conveyed to the data users.
Precisely define and interpret how validation issues differ from verification issues for this
project.
D2.1 Purpose/Background
The purpose of this element is to describe, in detail, the process for validating (determining if
data satisfy QAPP-defined user requirements) and verifying (ensuring that conclusions can be correctly
drawn) project data. The amount of data validated is directly related to the DQOs developed for the
project. The percentage validated for the specific project together with its rationale should be outlined or
referenced. The QAPP should have a clear definition of what is implied by "verification" and
"validation."
D2.2 Describe the Process for Validating and Verifying Data
The individuals responsible for data validation together with the lines of authority should be
shown on an organizational chart and may be indicated in the chart in element A4. The chart should
indicate who is responsible for each activity of the overall validation and verification processes.
The data to be validated should be compared to "actual" events using the criteria documented in
the QAPP. The data validation procedure for all environmental measurements should be documented in
the SOPs for specific data validation. Verification and validation issues are discussed at length in
Guidance on Environmental Data Verification and Validation (EPA QA/G-8).
D3 RECONCILIATION WITH DATA QUALITY OBJECTIVES
Describe how the results obtained from the project or task will be reconciled with the
requirements defined by the data user or decision maker.
Outline the proposed methods to analyze the data and determine possible anomalies or
departures from assumptions established in the planning phase of data collection.
Describe how issues will be resolved and discuss how limitations on the use of the data will be
reported to decision makers.
D3.1 Purpose/Background
The purpose of element D3 is to outline and specify, if possible, the acceptable methods for
evaluating the results obtained from the project. This element includes scientific and statistical
evaluations of data to determine if the data are of the right type, quantity, and quality to support their
intended use.
D3.2 Reconciling Results with DQOs
The DQA process has been developed for cases where formal DQOs have been established.
Guidance for Data Quality Assessment (EPA QA/G-9) focuses on evaluating data for fitness in decision
making and also provides many graphical and statistical tools.
DQA is a key part of the assessment phase of the data life cycle, as shown in Figure 1. As the
part of the assessment phase that follows data validation and verification, DQA determines how well the
validated data can support their intended use. If an approach other than DQA has been selected, an
outline of the proposed activities should be included.
CHAPTER IV
QAPP REVISIONS AND RELATED GUIDANCE
QAPP REVISIONS
During the course of environmental data collection, it is possible that changes will occur and
revisions to the QAPP will have to be made. Any changes to the technical procedures should be
evaluated by the EPA QA Officer and Project Officer to determine if they significantly affect the
technical and quality objectives of the project. If so, the QAPP should be revised and reapproved, and a
revised copy should be sent to all the persons on the distribution list.
COMPARISON WITH PREVIOUS GUIDANCE (QAMS-005/80)
EPA's previous guidance for preparing QAPPs, Interim Guidelines and Specifications for
Preparing Quality Assurance Project Plans (QAMS-005/80), was released in December 1980. The
evolution of EPA programs, changing needs, and changes to quality management practices have
mandated the preparation of new guidance. The QAPPs that will be generated based on this guidance
will be slightly different from those in the past because:
• New QAPP specifications are given in the R-5 requirements document.
• Additional guidance documents from the Agency including Guidance for the Data
Quality Objectives Process (EPA QA/G-4), and Guidance for Data Quality Assessment
(EPA QA/G-9), are available on important quality management practices. These
guidance documents show how the DQO Process, the QAPP, and the DQA Process link
together in a coherent way (see Appendix A for a crosswalk between the DQOs and the
QAPP).
• The new guidance includes flexibility in the requirements and reporting format.
However, if an element of the QAPP is not applicable to a particular project, the
rationale for not addressing the element should be included.
• The elements of the QAPP are now organized in an order that corresponds to the
customary planning, implementation, and assessment phases of a project. They have
been categorized into four groups for ease of implementation:
• Project Management,
• Measurement/Data Acquisition,
• Assessment/Oversight, and
• Data Validation and Usability.
• More elements are identified than in the previous QAMS-005/80 guidance, and this
encourages flexibility in the construction of defensible QAPPs.
A comparison between the requirements of QAMS-005/80 and the R-5 document is presented in
Appendix A, "Crosswalk Between EPA QA/R-5 and QAMS-005/80."
APPENDIX A
CROSSWALKS BETWEEN QUALITY ASSURANCE DOCUMENTS
This appendix consists of five sections. The first section describes the relationship between the
systems requirements developed in ANSI/ASQC E4-1994 and the Environmental Protection Agency
(EPA) Quality System requirements. The second section provides a crosswalk between the requirements
document for Quality Assurance Project Plans (QAPPs), EPA QA/R-5, EPA Requirements for Quality
Assurance Project Plans for Environmental Data Operations, and its predecessor document, QAMS
005/80, Interim Guidelines and Specifications for Preparing Quality Assurance Project Plans. The third
section provides a crosswalk between QA/R-5 and the elements of International Organization for
Standardization (ISO) 9000. The fourth section is a crosswalk between the requirements of the QAPP
and the steps of the Data Quality Objectives (DQOs) Process. The final section describes the Agency's
QA documents at the program and project levels.
AA1. RELATIONSHIP BETWEEN E4 AND EPA QUALITY SYSTEM
EPA Order 5360.1 establishes a mandatory Agency-wide Quality System that applies to all
organizations, both internal and external, performing work for EPA. (The authority for the requirements
defined by the Order is contained in the applicable regulations for extramural agreements.) These
organizations must ensure that data collected for the characterization of environmental processes and
conditions are of the appropriate type and quality for their intended use and that environmental
technologies are designed, constructed, and operated according to defined expectations. All EPA
Regional, Office, and Laboratory quality systems established in accordance with these requirements shall
comply with ANSI/ASQC E4-1994, Specifications and Guidelines for Quality Systems for
Environmental Data Collection and Environmental Technology Programs, which conforms generally to
ISO 9000. In addition, EPA has developed two documents, EPA QA/R-1, EPA Quality Systems
Requirements for Environmental Programs, and EPA QA/R-2, EPA Requirements for Quality
Management Plans, that specify the requirements for developing, documenting, implementing, and
assessing a Quality System. This appendix describes these three Agency documents (Order 5360.1, EPA
QA/R-1, and EPA QA/R-2) in order to define their relationships and roles in laying the foundation for
EPA's Quality System.
ANSI/ASQC E4-1994 provides the basis for the preparation of a quality system for an
organization's environmental programs. The document provides the requisite management and technical
area elements necessary for developing and implementing a quality system. The document first describes
the quality management elements that are generally common to environmental problems, regardless of
their technical scope. The document then discusses the specifications and guidelines that apply to
project-specific environmental activities involving the generation, collection, analysis, evaluation, and
reporting of environmental data. Finally, the document contains the minimum specifications and
guidelines that apply to the design, construction, and operation of environmental technology.
EPA QA/R-1 provides the details on EPA quality management requirements to organizations
conducting environmental programs. This document states that "...all EPA organizations and all
organizations performing work for EPA shall develop and establish Quality Systems, as appropriate, that
conform to the American National Standard ANSI/ASQC E4-1994, Specifications and Guidelines for
Quality Systems for Environmental Data Collection and Environmental Technology Programs, and its
additions and supplements from the American National Standards Institute (ANSI) and the American
Society for Quality Control (ASQC)." R-1 applies to all EPA programs and organizations, unless
explicitly exempted, that produce, acquire, or use environmental data, depending on the purposes for
which the data will be used. This document also applies to systems, facilities, processes, and methods
for pollution control, waste treatment, waste remediation, and waste packaging and storage. Essentially,
R-1 formally describes how EPA Order 5360.1 applies to extramural organizations.
EPA Requirements for Quality Management Plans, EPA QA/R-2, discusses the development,
review, approval, and implementation of the Quality Management Plan (QMP). The QMP is a means of
documenting how an organization will plan, implement, and assess the effectiveness of the management
processes and structures (required under R-1) that relate to the Quality System. R-2 describes the
program elements that should be part of a QMP. These requirements match the quality management
elements described in ANSI/ASQC E4-1994 that are generally common to environmental projects.
These elements include the following: (1) management and organization, (2) quality system and
description, (3) personnel qualifications and training, (4) procurement of items and services, (5)
documents and records, (6) computer hardware and software, (7) planning, (8) implementation of work
processes, (9) assessment and response, and (10) quality improvement.
The procedures, roles, and responsibilities for QAPPs are addressed in the organization's QMP.
In essence, the QMP establishes the nature of the requirements for QAPPs for work done by or for that
organization.
AA2. CROSSWALK BETWEEN EPA QA/R-5 AND QAMS-005/80
QAMS-005/80 ELEMENTS -> QA/R-5 ELEMENTS

1.0 Title Page with Provision for Approval Signatures -> A1 Title and Approval Sheet
2.0 Table of Contents -> A2 Table of Contents
3.0 Project Description -> A5 Problem Definition/Background; A6 Project/Task Description
4.0 Project Organization and Responsibility -> A4 Project/Task Organization
5.0 QA Objectives for Measurement Data (PARCC) -> A7 Quality Objectives and Criteria for Measurement Data; A9 Documentation and Records
6.0 Sampling Procedures -> B1 Sampling Process Design; B2 Sampling Methods Requirements
(new element; no QAMS-005/80 counterpart) -> A8 Special Training Requirements or Certification
7.0 Sample Custody -> B3 Sample Handling and Custody Requirements
8.0 Calibration Procedures and Frequency -> B7 Instrument Calibration and Frequency
9.0 Analytical Procedures -> B4 Analytical Methods Requirements
10.0 Data Reduction, Validation, and Reporting -> D1 Data Review, Validation, and Verification Requirements; D2 Validation and Verification Methods; B9 Data Acquisition Requirements; B10 Data Quality Management
11.0 Internal Quality Control Checks and Frequency -> B5 Quality Control Requirements
12.0 Performance and System Audits -> C1 Assessments and Response Actions
13.0 Preventive Maintenance -> B6 Instrument/Equipment Testing, Inspection, and Maintenance Requirements
(new element; no QAMS-005/80 counterpart) -> B8 Inspection/Acceptance Requirements for Supplies and Consumables
14.0 Specific Routine Procedures Used to Assess Data Precision, Accuracy, and Completeness of Specific Measurement Parameters Involved -> D3 Reconciliation with Data Quality Objectives
15.0 Corrective Action -> C1 Assessments and Response Actions
(new element; no QAMS-005/80 counterpart) -> A3 Distribution List
16.0 QA Reports to Management -> C2 Reports to Management
AA3. CROSSWALK BETWEEN EPA QA/R-5 AND ISO 9000
EPA QA/R-5 Elements -> ISO 9000 Elements

A1 Title and Approval Sheet -> N/A
A2 Table of Contents -> N/A
A3 Distribution List -> N/A
A4 Project/Task Organization -> 4 Management Responsibility
A5 Problem Definition/Background -> N/A
A6 Project/Task Description -> N/A
A7 Quality Objectives and Criteria for Measurement Data -> 5 Quality System Principles; 5.2 Structure of the Quality System
A8 Special Training Requirements/Certification -> N/A
A9 Documentation and Records -> N/A
B1 Sampling Process Design -> 8 Quality in Specification and Design
B2 Sampling Methods Requirements -> 10 Quality of Production
B3 Sample Handling and Custody Requirements -> 16 Handling and Post-Production Functions
B4 Analytical Methods Requirements -> 10 Quality of Production
B5 Quality Control Requirements -> 11 Control of Production
B6 Instrument/Equipment Testing, Inspection, and Maintenance Requirements -> 13 Control of Measuring and Test Equipment
B7 Instrument Calibration and Frequency -> N/A
B8 Inspection/Acceptance Requirements for Supplies and Consumables -> 9 Quality in Procurement; 11.2 Material Control and Traceability
B9 Data Acquisition Requirements -> N/A
B10 Data Quality Management -> N/A
C1 Assessments and Response Actions -> 5.4 Auditing the Quality System; 14 Nonconformity; 15 Corrective Action
C2 Reports to Management -> 5.3 Documentation of the Quality System; 6 Economics - Quality Related Costs
D1 Data Review, Validation, and Verification Requirements -> 11.7 Control of Verification Status; 12 Verification Status
D2 Validation and Verification Methods -> N/A
D3 Reconciliation with User Requirements -> 7 Quality in Marketing
AA4. CROSSWALK BETWEEN THE DQO PROCESS AND THE QAPP

For each QAPP element below, the requirement is listed first, followed by its overlap with the DQO Process.

PROJECT MANAGEMENT

A1 Title and Approval Sheet
   Requirements: Title and approval sheet.
   DQO Overlap: N/A

A2 Table of Contents
   Requirements: Document control format.
   DQO Overlap: N/A

A3 Distribution List
   Requirements: Distribution list for the QAPP revisions and final guidance.
   DQO Overlap: List the members of the scoping team. Step 1: State the Problem.

A4 Project/Task Organization
   Requirements: Identify individuals or organizations participating in the project and discuss their roles, responsibilities, and organization.
   DQO Overlap: Step 1: State the Problem requires definition of the DQO scoping or planning team, which includes the decision maker, technical staff, data users, etc. This step also requires the specification of each member's role and responsibilities.

A5 Problem Definition/Background
   Requirements: 1) State the specific problem to be solved or the decision to be made. 2) Identify the decision maker and the principal customer for the results.
   DQO Overlap: Step 1: State the Problem/Step 2: Identify the Decision requires a description of the problem. It also identifies the decision makers who could use the data.

A6 Project/Task Description
   Requirements: 1) Hypothesis test, 2) expected measurements, 3) ARARs or other appropriate standards, 4) assessment tools (technical audits), 5) work schedule and required reports.
   DQO Overlap: Step 1: State the Problem/Step 2: Identify the Decision requires a work schedule. Step 3: Identify the Inputs requires the ARARs or standards and expected measurements. Step 6: Specify Limits on Decision Errors.

A7 Data Quality Objectives for Measurement Data
   Requirements: Decision(s), population parameter of interest, action level, summary statistics, and acceptable limits on decision errors. Also, scope of the project (domain or geographical locale).
   DQO Overlap: Step 1: State the Problem, Step 2: Identify the Decision, Step 4: Define the Boundaries, Step 5: Develop a Decision Rule, Step 6: Specify Limits on Decision Errors.

A8 Special Training Requirements/Certification
   Requirements: Identify special training that personnel will need.
   DQO Overlap: Step 3: Identify the Inputs to the Decision.

A9 Documentation and Records
   Requirements: Itemize the information and records that must be included in a data report package, including report format and requirements for storage, etc.
   DQO Overlap: Step 3: Identify the Inputs to the Decision, Step 7: Optimize the Design for Obtaining Data.
MEASUREMENT/DATA ACQUISITION
B1 Sampling Process Design (Experimental Design)
   Requirements: Outline the experimental design, including sampling design and rationale, sampling frequencies, matrices, and measurement parameter of interest.
   DQO Overlap: Step 5: Develop a Decision Rule, Step 7: Optimize the Design for Obtaining Data.

B2 Sampling Methods Requirements
   Requirements: Sample collection method and approach.
   DQO Overlap: Step 7: Optimize the Design for Obtaining Data.

B3 Sample Handling and Custody Requirements
   Requirements: Describe the provisions for sample labeling, shipment, chain-of-custody forms, and procedures for transferring and maintaining custody of samples.
   DQO Overlap: Step 3: Identify the Inputs to the Decision.

B4 Analytical Methods Requirements
   Requirements: Identify analytical method(s) and equipment for the study, including method performance requirements.
   DQO Overlap: Step 3: Identify the Inputs to the Decision, Step 7: Optimize the Design for Obtaining Data.

B5 Quality Control Requirements
   Requirements: Describe routine (real-time) QC procedures that should be associated with each sampling and measurement technique. List required QC checks and corrective action procedures.
   DQO Overlap: Step 3: Identify the Inputs to the Decision.
B6 Instrument/Equipment Testing, Inspection, and Maintenance Requirements
   Requirements: Discuss how inspection and acceptance testing, including the use of QC samples, must be performed to ensure their intended use as specified by the design.
   DQO Overlap: Step 3: Identify the Inputs to the Decision.

B7 Instrument Calibration and Frequency
   Requirements: Identify tools, gauges and instruments, and other sampling or measurement devices that need calibration. Describe how the calibration should be done.
   DQO Overlap: Step 3: Identify the Inputs to the Decision.

B8 Inspection/Acceptance Requirements for Supplies and Consumables
   Requirements: Define how and by whom the sampling supplies and other consumables will be accepted for use in the project.
   DQO Overlap: N/A

B9 Data Acquisition Requirements (Non-direct Measurements)
   Requirements: Define the criteria for the use of non-measurement data such as data that come from databases or literature.
   DQO Overlap: Step 1: State the Problem, Step 7: Optimize the Design for Obtaining Data.

B10 Data Management Requirements
   Requirements: Outline the data management scheme, including the path and storage of the data and the data record-keeping system. Identify all data handling equipment and procedures that will be used to process, compile, and analyze the data.
   DQO Overlap: Step 3: Identify the Inputs to the Decision, Step 7: Optimize the Design for Obtaining Data.
ASSESSMENT/OVERSIGHT
C1 Assessments and Response Actions
   Requirements: Describe the assessment activities needed for this project. These may include DQA, PE, TSA, and MSR/PR/RR.
   DQO Overlap: Step 5: Develop a Decision Rule, Step 6: Specify Limits on Decision Errors.

C2 Reports to Management
   Requirements: Identify the frequency, content, and distribution of reports issued to keep management informed.
   DQO Overlap: N/A
DATA VALIDATION AND USABILITY
D1 Data Review, Validation, and Verification Requirements
   Requirements: State the criteria used to accept or reject the data based on quality.
   DQO Overlap: Step 7: Optimize the Design for Obtaining Data.

D2 Validation and Verification Methods
   Requirements: Describe the process to be used for validating and verifying data, including the chain-of-custody for data throughout the lifetime of the project.
   DQO Overlap: Step 3: Identify the Inputs to the Decision.

D3 Reconciliation With Data Quality Objectives
   Requirements: Describe how results will be evaluated to determine if DQOs have been satisfied.
   DQO Overlap: Step 7: Optimize the Design for Obtaining Data.
AA5. EPA QUALITY ASSURANCE DOCUMENTS
The Quality Assurance Division issues QA documents for use both internally (National Programs,
Centers, and Laboratories) and externally (state and local agencies, contractors, extramural agreement
holders, and nonprofit groups). The documents span all aspects of QA; copies can be obtained
by writing QAD directly or by visiting the QAD Website:
http://es.epa.gov/ncerqa/qa/qa_docs.html
QAD documents fall into three categories: the EPA Quality Manual (for internal use);
Requirements documents (for external use, labeled 'R-xx'); and Guidance documents (for internal and
external use, labeled 'G-xx'). Requirements documents and the Quality Manual contain the Agency's
QA policies and Guidance documents contain nonmandatory guidance on how to achieve these QA
requirements.
Table Al shows the general numbering system for EPA's Quality System documents, and Table
A2 illustrates some specific documents available and under construction. The auxiliary letter on some of
the documents denotes specialized audiences or areas of interest. Figure Al shows the relationship
among the documents at the Policy and Program levels. Figure A2 demonstrates the sequence and
interrelationship of documents at the Program level.
Not all of the documents listed in Table A2 are available, as some are in various stages of
development and will not be finalized until late 1998. Consult the Website or contact QAD directly for
information on the current status and availability of all QAD documents.
Table AA1. Numbering System for EPA's Quality System Documents
1 = Quality System Policy and Quality Manual
2 = Quality Management Plans (QMPs)
3 = Management Systems Reviews (MSRs)
4 = Data Quality Objectives (DQOs)
5 = Quality Assurance Project Plans (QAPPs)
6 = Standard Operating Procedures (SOPs)
7 = Technical Assessments (TAs)
8 = Data Verification and Validation
9 = Data Quality Assessment (DQA)
10 = Training Issues
Table AA2. Quality System Documents
Overview
QA/G-0    EPA Quality System Description

Program level
QA/R-1    EPA Quality Systems Requirements for Environmental Programs
QA/G-1    Guidance for Developing Quality Systems for Environmental Data Operations
QA/R-2    EPA Requirements for Quality Management Plans
QA/G-2    Guidance for Preparing Quality Management Plans
QA/G-2C   Guide to Satisfying EPA Quality Assurance Requirements for Contracts
QA/G-2EA  Guide to Implementing Quality Assurance in Extramural Agreements
QA/G-2F   Guide to Satisfying EPA Quality Assurance Requirements for Financial Assistance Agreements
QA/G-3    Guidance for the Management Systems Review Process
QA/G-10   Guidance for Determining Quality Training Requirements for Environmental Data Operations

Project level
QA/G-4    Guidance for the Data Quality Objectives Process
QA/G-4CS  The Data Quality Objectives Process: Case Studies
QA/G-4D   Data Quality Objectives Decision Errors Feasibility Trials (DEFT) Software
QA/G-4HW  Guidance for the Data Quality Objectives Process for Hazardous Waste Sites
QA/G-4R   Guidance for the Data Quality Objectives for Researchers
QA/R-5    EPA Requirements for Quality Assurance Project Plans
QA/G-5    EPA Guidance for Quality Assurance Project Plans
QA/G-5I   Guidance for Data Quality Indicators
QA/G-5S   Guidance on Sampling Designs to Support Quality Assurance Project Plans
QA/G-5T   Guidance on Specialized Topics in Quality Assurance
QA/G-6    Guidance for the Preparation of Standard Operating Procedures for Quality-Related Operations
QA/G-7    Guidance on Technical Assessments for Environmental Data Operations
QA/G-8    Guidance on Environmental Data Verification and Validation
QA/G-9    Guidance for Data Quality Assessment: Practical Methods for Data Analysis
QA/G-9D   Data Quality Evaluation Statistical Toolbox (DataQUEST)
[Figure: a program-level diagram. At the Policy level, Agency-wide policies, requirements, and procedures flow from the EPA Quality Manual. At the Program level, the Quality Management Plan (QMP) (R-2/G-2) defines the Quality System structure, procedures, and standards; the Quality Assurance Annual Report and Work Plan (QAARWP) ensures resources for implementing Quality System procedures and feeds revisions to the QMP; Management Systems Reviews (MSRs) (G-3) provide internal assessment of Quality System effectiveness, training needs, and performance measures, and assess the conformity of individual project components to the QMP.]
Figure AA1. Relationships Among EPA Quality System Documents at the Program Level
[Figure: a project-level flow across the PLANNING (initial planning, design, implementation planning), IMPLEMENTATION, and ASSESSMENT phases: R-5, Requirements for QA Project Plans; G-4/G-4R/G-4HW, Guidance on the DQO Process (decision making); G-5, Guidance on QA Project Plans; and G-9, Guidance for Data Quality Assessment, linked by primary outputs/inputs and background/reference information.]
Figure AA2. Relationship Among EPA Quality System Documents at the Project Level
APPENDIX B
GLOSSARY OF QUALITY ASSURANCE AND RELATED TERMS
Acceptance criteria — Specified limits placed on characteristics of an item, process, or service defined
in requirements documents. (ASQC Definitions)
Accuracy — A measure of the closeness of an individual measurement or the average of a number of
measurements to the true value. Accuracy includes a combination of random error (precision) and
systematic error (bias) components that are due to sampling and analytical operations; the EPA
recommends using the terms "precision" and "bias", rather than "accuracy," to convey the information
usually associated with accuracy. Refer to Appendix D, Data Quality Indicators, for a more detailed
definition.
Activity — An all-inclusive term describing a specific set of operations of related tasks to be performed,
either serially or in parallel (e.g., research and development, field sampling, analytical operations,
equipment fabrication), that, in total, result in a product or service.
Assessment — The evaluation process used to measure the performance or effectiveness of a system and
its elements. As used here, assessment is an all-inclusive term used to denote any of the following: audit,
performance evaluation (PE), management systems review (MSR), peer review, inspection, or
surveillance.
Audit (quality) — A systematic and independent examination to determine whether quality activities
and related results comply with planned arrangements and whether these arrangements are implemented
effectively and are suitable to achieve objectives.
Audit of Data Quality (ADQ) — A qualitative and quantitative evaluation of the documentation and
procedures associated with environmental measurements to verify that the resulting data are of
acceptable quality.
Authenticate — The act of establishing an item as genuine, valid, or authoritative.
Bias — The systematic or persistent distortion of a measurement process, which causes errors in one
direction (i.e., the expected sample measurement is different from the sample's true value). Refer to
Appendix D, Data Quality Indicators, for a more detailed definition.
Blank — A sample subjected to the usual analytical or measurement process to establish a zero baseline
or background value. Sometimes used to adjust or correct routine analytical results. A sample that is
intended to contain none of the analytes of interest. A blank is used to detect contamination during
sample handling preparation and/or analysis.
Calibration — A comparison of a measurement standard, instrument, or item with a standard or
instrument of higher accuracy to detect and quantify inaccuracies and to report or eliminate those
inaccuracies by adjustments.
Calibration drift — The deviation in instrument response from a reference value over a period of time
before recalibration.
Certification — The process of testing and evaluation against specifications designed to document,
verify, and recognize the competence of a person, organization, or other entity to perform a function or
service, usually for a specified time.
Chain of custody — An unbroken trail of accountability that ensures the physical security of samples,
data, and records.
Characteristic — Any property or attribute of a datum, item, process, or service that is distinct,
describable, and/or measurable.
Check standard — A standard prepared independently of the calibration standards and analyzed exactly
like the samples. Check standard results are used to estimate analytical precision and to indicate the
presence of bias due to the calibration of the analytical system.
Collocated samples — Two or more portions collected at the same point in time and space so as to be
considered identical. These samples are also known as field replicates and should be identified as such.
Comparability — A measure of the confidence with which one data set or method can be compared to
another.
Completeness — A measure of the amount of valid data obtained from a measurement system compared
to the amount that was expected to be obtained under correct, normal conditions. Refer to Appendix D,
Data Quality Indicators, for a more detailed definition.
Confidence Interval — The numerical interval constructed around a point estimate of a population
parameter, combined with a probability statement (the confidence coefficient) linking it to the
population's true parameter value. If the same confidence interval construction technique and
assumptions are used to calculate future intervals, they will include the unknown population parameter
with the same specified probability.
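As an illustration (a standard construction, not part of this glossary), a two-sided confidence interval for
a population mean based on n measurements with sample mean \bar{x} and sample standard deviation s is

    \bar{x} \pm t_{1-\alpha/2,\, n-1} \frac{s}{\sqrt{n}}

where 1 - \alpha is the confidence coefficient and t is the corresponding quantile of Student's t distribution.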
Confidentiality procedure — A procedure used to protect confidential business information (including
proprietary data and personnel records) from unauthorized access.
Configuration — The functional, physical, and procedural characteristics of an item, experiment, or
document.
Conformance — An affirmative indication or judgment that a product or service has met the
requirements of the relevant specification, contract, or regulation; also, the state of meeting the
requirements.
Consensus standard — A standard established by a group representing a cross section of a particular
industry or trade, or a part thereof.
Contractor — Any organization or individual contracting to furnish services or items or to perform
work.
Corrective action — Any measures taken to rectify conditions adverse to quality and, where possible, to
preclude their recurrence.
Data Quality Assessment (DQA) — The scientific and statistical evaluation of data to determine if data
obtained from environmental operations are of the right type, quality, and quantity to support their
intended use. The five steps of the DQA Process include: 1) reviewing the DQOs and sampling design,
2) conducting a preliminary data review, 3) selecting the statistical test, 4) verifying the assumptions of
the statistical test, and 5) drawing conclusions from the data.
Data Quality Indicators (DQIs) — The quantitative statistics and qualitative descriptors that are used to
interpret the degree of acceptability or utility of data to the user. The principal data quality indicators are
bias, precision, accuracy (bias is preferred), comparability, completeness, and representativeness.
Data Quality Objectives (DQOs) — The qualitative and quantitative statements derived from the DQO
Process that clarify a study's technical and quality objectives, define the appropriate type of data, and
specify tolerable levels of potential decision errors that will be used as the basis for establishing the
quality and quantity of data needed to support decisions.
Data Quality Objectives (DQO) Process — A systematic strategic planning tool based on the scientific
method that identifies and defines the type, quality, and quantity of data needed to satisfy a specified use.
DQOs are the qualitative and quantitative outputs from the DQO Process.
Data reduction — The process of transforming the number of data items by arithmetic or statistical
calculations, standard curves, and concentration factors, and collating them into a more useful form.
Data reduction is irreversible and generally results in a reduced data set and an associated loss of detail.
Data usability — The process of ensuring or determining whether the quality of the data produced meets
the intended use of the data.
Deficiency — An unauthorized deviation from acceptable procedures or practices, or a defect in an item.
Demonstrated capability — The capability to meet a procurement's technical and quality specifications
through evidence presented by the supplier to substantiate its claims and in a manner defined by the
customer.
Design — The specifications, drawings, design criteria, and performance requirements. Also, the result
of deliberate planning, analysis, mathematical manipulations, and design processes.
Design change — Any revision or alteration of the technical requirements defined by approved and
issued design output documents and approved and issued changes thereto.
Design review — A documented evaluation by a team, including personnel such as the responsible
designers, the client for whom the work or product is being designed, and a quality assurance (QA)
representative but excluding the original designers, to determine if a proposed design will meet the
established design criteria and perform as expected when implemented.
Detection Limit (DL) — A measure of the capability of an analytical method to distinguish samples that
do not contain a specific analyte from samples that contain low concentrations of the analyte; the lowest
concentration or amount of the target analyte that can be determined to be different from zero by a single
measurement at a stated level of probability. DLs are analyte- and matrix-specific and may be
laboratory-dependent.
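For illustration, the widely used procedure of 40 CFR 136, Appendix B estimates a method detection
limit from n (at least seven) replicate analyses of a low-concentration spiked sample as

    \mathrm{MDL} = t_{(n-1,\, 0.99)} \cdot s

where s is the standard deviation of the replicate results and t_{(n-1, 0.99)} is the Student's t value for
n - 1 degrees of freedom at the 99% confidence level.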
Distribution — 1) The apportionment of an environmental contaminant at a point over time, over an area,
or within a volume; 2) a probability function (density function, mass function, or distribution function)
used to describe a set of observations (statistical sample) or a population from which the observations are
generated.
Document control — The policies and procedures used by an organization to ensure that its documents
and their revisions are proposed, reviewed, approved for release, inventoried, distributed, archived,
stored, and retrieved in accordance with the organization's requirements.
Duplicate samples — Two samples taken from and representative of the same population and carried
through all steps of the sampling and analytical procedures in an identical manner. Duplicate samples are
used to assess variance of the total method, including sampling and analysis. See also collocated sample.
Environmental conditions — The description of a physical medium (e.g., air, water, soil, sediment) or a
biological system expressed in terms of its physical, chemical, radiological, or biological characteristics.
Environmental data — Any parameters or pieces of information collected or produced from
measurements, analyses, or models of environmental processes, conditions, and effects of pollutants on
human health and the ecology, including results from laboratory analyses or from experimental systems
representing such processes and conditions.
Environmental data operations — Any work performed to obtain, use, or report information pertaining
to environmental processes and conditions.
Environmental monitoring — The process of measuring or collecting environmental data.
Environmental processes — Any manufactured or natural processes that produce discharges to, or that
impact, the ambient environment.
Environmental programs — An all-inclusive term pertaining to any work or activities involving the
environment, including but not limited to: characterization of environmental processes and conditions;
environmental monitoring; environmental research and development; the design, construction, and
operation of environmental technologies; and laboratory operations on environmental samples.
Environmental technology — An all-inclusive term used to describe pollution control devices and
systems, waste treatment processes and storage facilities, and site remediation technologies and their
components that may be utilized to remove pollutants or contaminants from, or to prevent them from
entering, the environment. Examples include wet scrubbers (air), soil washing (soil), granulated
activated carbon unit (water), and filtration (air, water). Usually, this term applies to hardware-based
systems; however, it can also apply to methods or techniques used for pollution prevention, pollutant
reduction, or containment of contamination to prevent further movement of the contaminants, such as
capping, solidification or vitrification, and biological treatment.
Estimate — A characteristic from the sample from which inferences on parameters can be made.
Evidentiary records — Any records identified as part of litigation and subject to restricted access,
custody, use, and disposal.
Expedited change — An abbreviated method of revising a document at the work location where the
document is used when the normal change process would cause unnecessary or intolerable delay in the
work.
Field blank — A blank used to provide information about contaminants that may be introduced during
sample collection, storage, and transport. A clean sample, carried to the sampling site, exposed to
sampling conditions, returned to the laboratory, and treated as an environmental sample.
Field (matrix) spike — A sample prepared at the sampling point (i.e., in the field) by adding a known
mass of the target analyte to a specified amount of the sample. Field matrix spikes are used, for example,
to determine the effect of the sample preservation, shipment, storage, and preparation on analyte recovery
efficiency (the analytical bias).
Field split samples — Two or more representative portions taken from the same sample and submitted
for analysis to different laboratories to estimate interlaboratory precision.
Financial assistance — The process by which funds are provided by one organization (usually
governmental) to another organization for the purpose of performing work or furnishing services or
items. Financial assistance mechanisms include grants, cooperative agreements, and governmental
interagency agreements.
Finding — An assessment conclusion that identifies a condition having a significant effect on an item or
activity. An assessment finding may be positive or negative, and is normally accompanied by specific
examples of the observed condition.
Goodness-of-fit test — The application of the chi-square distribution in comparing the frequency
distribution of a statistic observed in a sample with the expected frequency distribution based on some
theoretical model.
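For illustration (a standard statistical result), the test statistic compares observed counts O_i with
expected counts E_i over k categories:

    \chi^2 = \sum_{i=1}^{k} \frac{(O_i - E_i)^2}{E_i}

Large values relative to the chi-square distribution (with degrees of freedom reduced by the number of
estimated parameters) indicate a poor fit to the theoretical model.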
Grade — The category or rank given to entities having the same functional use but different
requirements for quality.
Graded approach — The process of basing the level of application of managerial controls applied to an
item or work according to the intended use of the results and the degree of confidence needed in the
quality of the results. (See also Data Quality Objectives (DQO) Process.)
Guidance — A suggested practice that is not mandatory, intended as an aid or example in complying
with a standard or requirement.
Guideline — A suggested practice that is not mandatory in programs intended to comply with a standard.
Hazardous waste — Any waste material that satisfies the definition of hazardous waste given in 40 CFR
261, "Identification and Listing of Hazardous Waste."
Holding time — The period of time a sample may be stored prior to its required analysis. While
exceeding the holding time does not necessarily negate the veracity of analytical results, it causes the
qualifying or "flagging" of any data not meeting all of the specified acceptance criteria.
Identification error — The misidentification of an analyte. In this error type, the contaminant of
concern is unidentified and the measured concentration is incorrectly assigned to another contaminant.
Independent assessment — An assessment performed by a qualified individual, group, or organization
that is not a part of the organization directly performing and accountable for the work being assessed.
Inspection — The examination or measurement of an item or activity to verify conformance to specific
requirements.
Internal standard — A standard added to a test portion of a sample in a known amount and carried
through the entire determination procedure as a reference for calibrating and controlling the precision
and bias of the applied analytical method.
Laboratory split samples — Two or more representative portions taken from the same sample and
analyzed by different laboratories to estimate the interlaboratory precision or variability and the data
comparability.
Limit of quantitation — The minimum concentration of an analyte or category of analytes in a specific
matrix that can be identified and quantified above the method detection limit and within specified limits
of precision and bias during routine analytical operating conditions.
Management — Those individuals directly responsible and accountable for planning, implementing, and
assessing work.
Management system — A structured, nontechnical system describing the policies, objectives,
principles, organizational authority, responsibilities, accountability, and implementation plan of an
organization for conducting work and producing items and services.
Management Systems Review (MSR) — The qualitative assessment of a data collection operation
and/or organization(s) to establish whether the prevailing quality management structure, policies,
practices, and procedures are adequate for ensuring that the type and quality of data needed are obtained.
Matrix spike — A sample prepared by adding a known mass of a target analyte to a specified amount of
matrix sample for which an independent estimate of the target analyte concentration is available. Spiked
samples are used, for example, to determine the effect of the matrix on a method's recovery efficiency.
Mean (arithmetic) — The sum of all the values of a set of measurements divided by the number of
values in the set; a measure of central tendency.
Mean squared error — A statistical term for variance added to the square of the bias.
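Written as a formula (a standard decomposition, shown for illustration):

    \mathrm{MSE} = \sigma^2 + (\mathrm{bias})^2

where \sigma^2 is the variance of the measurement process.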
Measurement and Testing Equipment (M&TE) — Tools, gauges, instruments, sampling devices, or
systems used to calibrate, measure, test, or inspect in order to control or acquire data to verify
conformance to specified requirements.
Memory effects error — The effect that a relatively high concentration sample has on the measurement
of a lower concentration sample of the same analyte when the higher concentration sample precedes the
lower concentration sample in the same analytical instrument.
Method — A body of procedures and techniques for performing an activity (e.g., sampling, chemical
analysis, quantification), systematically presented in the order in which they are to be executed.
Method blank — A blank prepared to represent the sample matrix as closely as possible and analyzed
exactly like the calibration standards, samples, and quality control (QC) samples. Results of method
blanks provide an estimate of the within-batch variability of the blank response and an indication of bias
introduced by the analytical procedure.
Mid-range check — A standard used to establish whether the middle of a measurement method's
calibrated range is still within specifications.
Mixed waste — A hazardous waste material, as defined by 40 CFR 261 under the Resource Conservation
and Recovery Act (RCRA), that is mixed with radioactive waste subject to the requirements of the Atomic
Energy Act.
Must — When used in a sentence, a term denoting a requirement that has to be met.
Nonconformance — A deficiency in a characteristic, documentation, or procedure that renders the
quality of an item or activity unacceptable or indeterminate; nonfulfillment of a specified requirement.
Objective evidence — Any documented statement of fact, other information, or record, either
quantitative or qualitative, pertaining to the quality of an item or activity, based on observations,
measurements, or tests that can be verified.
Observation — An assessment conclusion that identifies a condition (either positive or negative) that
does not represent a significant impact on an item or activity. An observation may identify a condition
that has not yet caused a degradation of quality.
Organization — A company, corporation, firm, enterprise, or institution, or part thereof, whether
incorporated or not, public or private, that has its own functions and administration.
Organization structure — The responsibilities, authorities, and relationships, arranged in a pattern,
through which an organization performs its functions.
Outlier — An extreme observation that is shown to have a low probability of belonging to a specified
data population.
Parameter — A quantity, usually unknown, such as a mean or a standard deviation characterizing a
population. Commonly misused for "variable," "characteristic," or "property."
Peer review — A documented critical review of work generally beyond the state of the art or
characterized by the existence of potential uncertainty. Conducted by qualified individuals (or an
organization) who are independent of those who performed the work but collectively equivalent in
technical expertise (i.e., peers) to those who performed the original work. Peer reviews are conducted to
ensure that activities are technically adequate, competently performed, properly documented, and satisfy
established technical and quality requirements. A peer review is an in-depth assessment of the assumptions, calculations,
extrapolations, alternate interpretations, methodology, acceptance criteria, and conclusions pertaining to
specific work and of the documentation that supports them. Peer reviews provide an evaluation of a
subject where quantitative methods of analysis or measures of success are unavailable or undefined, such
as in research and development.
EPA QA/G-5 B-7 QA98
-------
Performance Evaluation (PE) — A type of audit in which the quantitative data generated in a
measurement system are obtained independently and compared with routinely obtained data to evaluate
the proficiency of an analyst or laboratory.
Pollution prevention — An organized, comprehensive effort to systematically reduce or eliminate
pollutants or contaminants prior to their generation or their release or discharge into the environment.
Precision — A measure of mutual agreement among individual measurements of the same property,
usually under prescribed similar conditions, expressed generally in terms of the standard deviation. Refer
to Appendix D, Data Quality Indicators, for a more detailed definition.
Procedure — A specified way to perform an activity.
Process — A set of interrelated resources and activities that transforms inputs into outputs. Examples of
processes include analysis, design, data collection, operation, fabrication, and calculation.
Project — An organized set of activities within a program.
Qualified data — Any data that have been modified or adjusted as part of statistical or mathematical
evaluation, data validation, or data verification operations.
Qualified services — An indication that suppliers providing services have been evaluated and
determined to meet the technical and quality requirements of the client as provided by approved
procurement documents and demonstrated by the supplier to the client's satisfaction.
Quality — The totality of features and characteristics of a product or service that bears on its ability to
meet the stated or implied needs and expectations of the user.
Quality Assurance (QA) — An integrated system of management activities involving planning,
implementation, assessment, reporting, and quality improvement to ensure that a process, item, or service
is of the type and quality needed and expected by the client.
Quality Assurance Program Description/Plan — See quality management plan.
Quality Assurance Project Plan (QAPP) — A formal document describing in comprehensive detail the
necessary quality assurance (QA), quality control (QC), and other technical activities that must be
implemented to ensure that the results of the work performed will satisfy the stated performance criteria.
The QAPP components are divided into four classes: 1) Project Management, 2) Measurement/Data
Acquisition, 3) Assessment/Oversight, and 4) Data Validation and Usability. Requirements for preparing
QAPPs can be found in EPA QA/R-5.
Quality Control (QC) — The overall system of technical activities that measures the attributes and
performance of a process, item, or service against defined standards to verify that they meet the stated
requirements established by the customer; operational techniques and activities that are used to fulfill
requirements for quality. The system of activities and checks used to ensure that measurement systems
are maintained within prescribed limits, providing protection against "out of control" conditions and
ensuring the results are of acceptable quality.
Quality control (QC) sample — An uncontaminated sample matrix spiked with known amounts of
analytes from a source independent of the calibration standards. Generally used to establish
intralaboratory or analyst-specific precision and bias or to assess the performance of all or a portion of the
measurement system.
EPA QA/G-5 B-8 QA98
-------
Quality improvement — A management program for improving the quality of operations. Such
management programs generally entail a formal mechanism for encouraging worker recommendations
with timely management evaluation and feedback or implementation.
Quality management — That aspect of the overall management system of the organization that
determines and implements the quality policy. Quality management includes strategic planning,
allocation of resources, and other systematic activities (e.g., planning, implementation, and assessment)
pertaining to the quality system.
Quality Management Plan (QMP) — A formal document that describes the quality system in terms of
the organization's structure, the functional responsibilities of management and staff, the lines of
authority, and the required interfaces for those planning, implementing, and assessing all activities
conducted.
Quality system — A structured and documented management system describing the policies, objectives,
principles, organizational authority, responsibilities, accountability, and implementation plan of an
organization for ensuring quality in its work processes, products (items), and services. The quality
system provides the framework for planning, implementing, and assessing work performed by the
organization and for carrying out required quality assurance (QA) and quality control (QC).
Radioactive waste — Waste material containing, or contaminated by, radionuclides, subject to the
requirements of the Atomic Energy Act.
Readiness review — A systematic, documented review of the readiness for the start-up or continued use
of a facility, process, or activity. Readiness reviews are typically conducted before proceeding beyond
project milestones and prior to initiation of a major phase of work.
Record (quality) — A document that furnishes objective evidence of the quality of items or activities
and that has been verified and authenticated as technically complete and correct. Records may include
photographs, drawings, magnetic tape, and other data recording media.
Recovery — The act of determining whether or not the methodology measures all of the analyte
contained in a sample. Refer to Appendix D, Data Quality Indicators, for a more detailed definition.
Remediation — The process of reducing the concentration of a contaminant (or contaminants) in air,
water, or soil media to a level that poses an acceptable risk to human health.
Repeatability — The degree of agreement between independent test results produced by the same
analyst, using the same test method and equipment on random aliquots of the same sample within a short
time period.
Reporting limit — The lowest concentration or amount of the target analyte required to be reported
from a data collection project. Reporting limits are generally greater than detection limits and are usually
not associated with a probability level.
EPA QA/G-5 B-9 QA98
-------
Representativeness — A measure of the degree to which data accurately and precisely represent a
characteristic of a population, a parameter variation at a sampling point, a process condition, or an
environmental condition. See also Appendix D, Data Quality Indicators.
Reproducibility — The precision, usually expressed as variance, that measures the variability among the
results of measurements of the same sample at different laboratories.
Requirement — A formal statement of a need and the expected manner in which it is to be met.
Research (applied) — A process, the objective of which is to gain the knowledge or understanding
necessary for determining the means by which a recognized and specific need may be met.
Research (basic) — A process, the objective of which is to gain fuller knowledge or understanding of
the fundamental aspects of phenomena and of observable facts without specific applications toward
processes or products in mind.
Research development/demonstration — The systematic use of the knowledge and understanding
gained from research and directed toward the production of useful materials, devices, systems, or
methods, including prototypes and processes.
Round-robin study — A method validation study involving a predetermined number of laboratories or
analysts, all analyzing the same sample(s) by the same method. In a round-robin study, all results are
compared and used to develop summary statistics such as interlaboratory precision and method bias or
recovery efficiency.
Ruggedness study — The carefully ordered testing of an analytical method while making slight
variations in test conditions (as might be expected in routine use) to determine how such variations affect
test results. If a variation affects the results significantly, the method restrictions are tightened to
minimize this variability.
Scientific method — The principles and processes regarded as necessary for scientific investigation,
including rules for concept or hypothesis formulation, conduct of experiments, and validation of
hypotheses by analysis of observations.
Self-assessment — The assessments of work conducted by individuals, groups, or organizations directly
responsible for overseeing and/or performing the work.
Sensitivity — The capability of a method or instrument to discriminate between measurement responses
representing different levels of a variable of interest. Refer to Appendix D, Data Quality Indicators, for a
more detailed definition.
Service — The result generated by activities at the interface between the supplier and the customer, and
the supplier internal activities to meet customer needs. Such activities in environmental programs
include design, inspection, laboratory and/or field analysis, repair, and installation.
Shall — A term denoting a requirement that is mandatory whenever the criterion for conformance with
the specification permits no deviation. This term does not prohibit the use of alternative approaches or
methods for implementing the specification so long as the requirement is fulfilled.
EPA QA/G-5 B-10 QA98
-------
Significant condition — Any state, status, incident, or situation of an environmental process or
condition, or environmental technology in which the work being performed will be adversely affected
sufficiently to require corrective action to satisfy quality objectives or specifications and safety
requirements.
Software life cycle — The period of time that starts when a software product is conceived and ends
when the software product is no longer available for routine use. The software life cycle typically
includes a requirement phase, a design phase, an implementation phase, a test phase, an installation and
check-out phase, an operation and maintenance phase, and sometimes a retirement phase.
Source reduction — Any practice that reduces the quantity of hazardous substances, contaminants, or
pollutants.
Span check — A standard used to establish that a measurement method is not deviating from its
calibrated range.
Specification — A document stating requirements and referring to or including drawings or other
relevant documents. Specifications should indicate the means and criteria for determining conformance.
Spike — A substance that is added to an environmental sample to increase the concentration of target
analytes by known amounts; used to assess measurement accuracy (spike recovery). Spike duplicates
are used to assess measurement precision.
Split samples — Two or more representative portions taken from one sample in the field or in the
laboratory and analyzed by different analysts or laboratories. Split samples are quality control (QC)
samples that are used to assess analytical variability and comparability.
Standard deviation — A measure of the dispersion or imprecision of a sample or population distribution,
expressed as the positive square root of the variance; it has the same unit of measurement as the mean.
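For reference (a standard statistical convention, not specific to this guidance), for n measurements x_1, ..., x_n with arithmetic mean \bar{x}, the sample standard deviation is

    s = \sqrt{\frac{1}{n-1} \sum_{i=1}^{n} (x_i - \bar{x})^2}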
Standard Operating Procedure (SOP) — A written document that details the method for an operation,
analysis, or action with thoroughly prescribed techniques and steps and that is officially approved as the
method for performing certain routine or repetitive tasks.
Supplier — Any individual or organization furnishing items or services or performing work according to
a procurement document or a financial assistance agreement. An all-inclusive term used in place of any
of the following: vendor, seller, contractor, subcontractor, fabricator, or consultant.
Surrogate spike or analyte — A pure substance with properties that mimic the analyte of interest. It is
unlikely to be found in environmental samples and is added to them to establish that the analytical
method has been performed properly.
Surveillance (quality) — Continual or frequent monitoring and verification of the status of an entity and
the analysis of records to ensure that specified requirements are being fulfilled.
Technical review — A documented critical review of work that has been performed within the state of
the art. The review is accomplished by one or more qualified reviewers who are independent of those
who performed the work but are collectively equivalent in technical expertise to those who performed the
original work. The review is an in-depth analysis and evaluation of documents, activities, material, data,
EPA QA/G-5 B-11 QA98
-------
or items that require technical verification or validation for applicability, correctness, adequacy,
completeness, and assurance that established requirements have been satisfied.
Technical Systems Audit (TSA) — A thorough, systematic, on-site qualitative audit of facilities,
equipment, personnel, training, procedures, record keeping, data validation, data management, and
reporting aspects of a system.
Traceability — The ability to trace the history, application, or location of an entity by means of recorded
identifications. In a calibration sense, traceability relates measuring equipment to national or
international standards, primary standards, basic physical constants or properties, or reference materials.
In a data collection sense, it relates calculations and data generated throughout the project back to the
requirements for the quality of the project.
Trip blank — A clean sample of a matrix that is taken to the sampling site and transported to the
laboratory for analysis without having been exposed to sampling procedures.
Validation — Confirmation by examination and provision of objective evidence that the particular
requirements for a specific intended use have been fulfilled. In design and development, validation
concerns the process of examining a product or result to determine conformance to user needs. See also
Appendix G, Data Management.
Variance (statistical) — A measure of the dispersion of a sample or population distribution.
Verification — Confirmation by examination and provision of objective evidence that specified
requirements have been fulfilled. In design and development, verification concerns the process of
examining a result of a given activity to determine conformance to the stated requirements for that
activity.
EPA QA/G-5 B-12 QA98
-------
APPENDIX C
CHECKLISTS USEFUL IN QUALITY ASSURANCE REVIEW
This appendix contains three checklists:
AC.1 Sample Handling, Preparation, and Analysis Checklist
AC.2 QAPP Review Checklist
AC.3 Chain-of-Custody Checklist
These three checklists were developed as tools for quality assurance (QA) managers to screen for
completeness of documentation. This appendix is not intended to be used or adapted for auditing
purposes. The items listed on the checklists are not ranked or identified to indicate which are trivial
and which are of major importance. When using these checklists, it is extremely important that a
mechanism be established for assessing and addressing significant comments or violations during the
data assessment (e.g., Data Quality Assessment [DQA]) stage.
AC.1 SAMPLE HANDLING, PREPARATION, AND ANALYSIS CHECKLIST
This checklist covers most of the appropriate elements performed during the analysis of
environmental samples. Functions not appropriate for a specific analysis should be annotated.
Information on the collection and handling of samples should be documented completely enough
to allow the details of sample collection and handling to be re-created. All information should be entered
in ink in a permanently bound logbook at the time the information is generated. Errors should not be
erased or obliterated but should be corrected by drawing a single line through the erroneous information
and by entering, initialing, and dating the correct information. Blank spaces should have an obliterating
line drawn through them to prevent the addition of information. Each set of information should be
identified with a printed name, signature, and initials.
Sample Handling
• Field Logs Documentation of events occurring during field sampling to
identify individual field samples.
• Sample Labels Links individual samples with the field log and the chain-of-
custody record.
• Chain-of-Custody Records Documentation of exchange and transportation of samples from
the field to final analysis.
• Sample Receipt Log Documentation of receipt by the laboratory or organization of
the entire set of individual samples for analysis.
Sample Preparation and Analysis
• Sample Preparation Log Documents the preparation of samples for a specific method.
• Sample Analysis Log Records information on the analysis of samples and the
resulting analytical data.
• Instrument Run Log Records analyses of calibration standards, field samples, and
quality control (QC) samples.
Chemical Standards
• Chemical Standard Receipt Log Records receipt of analytical standards and chemicals.
• Standards/Reagent Preparation Log Records of the preparation of internal standards, reagents,
spiking solutions, surrogate solutions, and reference materials.
EPA QA/G-5 C-1 QA98
-------
AC.1 SAMPLE HANDLING, PREPARATION, AND ANALYSIS CHECKLIST
Field Logs
ELEMENT                                                            COMMENT
Project name/ID and location
Sampling personnel
Geological observations including map
Atmospheric conditions
Field measurements
Sample dates, times, and locations
Sample identifications present
Sample matrix identified
Sample descriptions (e.g., odors and colors)
Number of samples taken per location
Sampling method/equipment
Description of any QC samples
Any deviations from the sampling plan
Difficulties in sampling or unusual circumstances
Sample Labels
ELEMENT                                                            COMMENT
Sample ID
Date and time of collection
Sampler's signature
Characteristic or parameter investigated
Preservative used
Chain of Custody Records
ELEMENT                                                            COMMENT
Project name/ID and location
Sample custodian signatures verified and on file
Date and time of each transfer
Carrier ID number
Integrity of shipping container and seals verified
Standard Operating Procedures (SOPs) for receipt on file
Samples stored in same area
Holding time protocol verified
SOPs for sample preservation on file
Identification of proposed analytical method verified
Proposed analytical method documentation verified
QA Plan for proposed analytical method on file
EPA QA/G-5 C-2 QA98
-------
AC.1 SAMPLE HANDLING, PREPARATION, AND ANALYSIS CHECKLIST (CONTINUED)
Sample Receipt Log
ELEMENT                                                            COMMENT
Date and time of receipt
Sample collection date
Client sample ID
Number of samples
Sample matrices
Requested analysis, including method number(s)
Signature of the sample custodian or designee
Sampling kit code (if applicable)
Sampling condition
Chain-of-custody violations and identities
SAMPLE PREPARATION AND ANALYSIS
Sample Preparation Log
ELEMENT                                                            COMMENT
Parameter/analyte of investigation
Method number
Date and time of preparation
Analyst's initials or signature
Initial sample volume or weight
Final sample volume
Concentration and amount of spiking solutions used
QC samples included with the sample batch
ID for reagents, standards, and spiking solutions used
Sample Analysis Log
ELEMENT                                                            COMMENT
Parameter/analyte of investigation
Method number/reference
Date and time of analysis
Analyst's initials or signature
Laboratory sample ID
Sample aliquot
Dilution factors and final sample volumes (if applicable)
Absorbance values, peak heights, or initial concentration readings
Final analyte concentration
Calibration data (if applicable)
Correlation coefficient (including parameters)
Calculations of key quantities available
Comments on interferences or unusual observations
QC information, including percent recovery
EPA QA/G-5 C-3 QA98
-------
AC.1 SAMPLE HANDLING, PREPARATION, AND ANALYSIS CHECKLIST (CONTINUED)
Instrument Run Log
ELEMENT                                                            COMMENT
Name/type of instrument
Instrument manufacturer and model number
Serial number
Date received and date placed in service
Instrument ID assigned by the laboratory (if used)
Service contract information, including service representative details
Description of each maintenance or repair activity performed
Date and time of each maintenance or repair activity
Initials of maintenance or repair technicians
CHEMICAL STANDARDS
Chemical/Standard Receipt Log
ELEMENT                                                            COMMENT
Laboratory control number
Date of receipt
Initials or signature of person receiving chemical
Chemical name and catalog number
Vendor name and log number
Concentration or purity of standard
Expiration date
Standards/Reagent Preparation Log
ELEMENT                                                            COMMENT
Date of preparation
Initials of analyst preparing the standard solution or reagent
Concentration or purity of standard or reagent
Volume or weight of the stock solution or neat materials
Final volume of the solution being prepared
Laboratory ID/control number assigned to the new solution
Name of standard or reagent
Standardization of reagents, titrants, etc. (if applicable)
Expiration date
References
Rosecrance, A., and L. Kibler. 1994. "Generating Defensible Data." Environmental Testing and Analysis. May/June.
Rosecrance, A., and L. Kibler. 1996. "Documentation and Record Keeping Guidelines." In Proceedings of the 12th
Annual Waste Testing and Quality Assurance Symposium. July.
EPA QA/G-5 C-4 QA98
-------
AC.2 QAPP REVIEW CHECKLIST
ELEMENT                                                            COMMENTS
Al. Title and Approval Sheet
Title
Organization's name
Dated signature of project manager
Dated signature of quality assurance officer
Other signatures, as needed
A2. Table of Contents
A3. Distribution List
A4. Project/Task Organization
Identifies key individuals, with their responsibilities (data users, decision-
makers, project QA manager, subcontractors, etc.)
Organization chart shows lines of authority and reporting responsibilities
A5. Problem Definition/Background
Clearly states problem or decision to be resolved
Provides historical and background information
A6. Project/Task Description
Lists measurements to be made
Cites applicable technical, regulatory, or program-specific quality standards,
criteria, or objectives
Notes special personnel or equipment requirements
Provides work schedule
Notes required project and QA records/reports
A7. Quality Objectives and Criteria for Measurement Data
States project objectives and limits, both qualitatively and quantitatively
States and characterizes measurement quality objectives as to applicable
action levels or criteria
A8. Special Training Requirements/Certification
Special training requirements or certifications listed
States how provided, documented, and assured
A9. Documentation and Records
Lists information and records to be included in data report (e.g., raw data,
field logs, results of QC checks, problems encountered)
States requested lab turnaround time
Gives retention time and location for records and reports
Bl. Sampling Process Design (Experimental Design)
States the following:
Type and number of samples required
Sampling design and rationale
Sampling locations and frequency
Sample matrices
EPA QA/G-5 C-5 QA98
-------
AC.2 QAPP REVIEW CHECKLIST (CONTINUED)
ELEMENT                                                            COMMENTS
Classification of each measurement parameter as either critical or needed for
information only
Appropriate validation study information, for nonstandard situations
B2. Sampling Methods Requirements
Identifies sample collection procedures and methods
Lists equipment needs
Identifies support facilities
Identifies individuals responsible for corrective action
Describes process for preparation and decontamination of sampling
equipment
Describes selection and preparation of sample containers and sample volumes
Describes preservation methods and maximum holding times
B3. Sample Handling and Custody Requirements
Notes sample handling requirements
Notes chain-of-custody procedures, if required
B4. Analytical Methods Requirements
Identifies analytical methods to be followed (with all options) and required
equipment
Provides validation information for nonstandard methods
Identifies individuals responsible for corrective action
Specifies needed laboratory turnaround time
B5. Quality Control Requirements
Identifies QC procedures and frequency for each sampling, analysis, or
measurement technique, as well as associated acceptance criteria and
corrective action
References procedures used to calculate QC statistics including precision and
bias/accuracy
B6. Instrument/Equipment Testing, Inspection, and Maintenance Requirements
Identifies acceptance testing of sampling and measurement systems
Describes equipment preventive and corrective maintenance
Notes availability and location of spare parts
B7. Instrument Calibration and Frequency
Identifies equipment needing calibration and frequency for such calibration
Notes required calibration standards and/or equipment
Cites how calibration records will be maintained and traced to the equipment
B8. Inspection/Acceptance Requirements for Supplies and Consumables
States acceptance criteria for supplies and consumables
Notes responsible individuals
B9. Data Acquisition Requirements for Nondirect Measurements
EPA QA/G-5 C-6 QA98
-------
AC.2 QAPP REVIEW CHECKLIST (CONTINUED)
ELEMENT                                                            COMMENTS
Identifies type of data needed from nonmeasurement sources (e.g., computer
databases and literature files), along with acceptance criteria for their use
Describes any limitations of such data
Documents rationale for original collection of data and its relevance to this
project
B10. Data Management
Describes standard record-keeping and data storage and retrieval requirements
Checklists or standard forms attached to QAPP
Describes data handling equipment and procedures used to process, compile,
and analyze data (e.g., required computer hardware and software)
Describes process for assuring that applicable Office of Information Resource
Management requirements are satisfied
Cl. Assessments and Response Actions
Lists required number, frequency and type of assessments, with approximate
dates and names of responsible personnel (assessments include but are not
limited to peer reviews, management systems reviews, technical systems
audits, performance evaluations, and audits of data quality)
Identifies individuals responsible for corrective actions
C2. Reports to Management
Identifies frequency and distribution of reports for:
Project status
Results of performance evaluations and audits
Results of periodic data quality assessments
Any significant QA problems
Preparers and recipients of reports
Dl. Data Review, Validation, and Verification
States criteria for accepting, rejecting, or qualifying data
Includes project-specific calculations or algorithms
D2. Validation and Verification Methods
Describes process for data validation and verification
Identifies issue resolution procedure and responsible individuals
Identifies method for conveying these results to data users
D3. Reconciliation with User Requirements
Describes process for reconciling project results with DQOs and reporting
limitations on use of data
References
Personal communication, Margo Hunt, EPA Region II, February 1996.
Personal communication, Robert Dona, EPA Region VII, November 1997.
EPA QA/G-5 C-7 QA98
-------
AC.3 CHAIN-OF-CUSTODY CHECKLIST
Item                                                           Y    N    Comment
1. Is a sample custodian designated?
If yes, name of sample custodian.
2. Are the sample custodian's procedures and responsibilities
documented?
If yes, where are these documented?
3. Are written Standard Operating Procedures (SOPs) developed
for receipt of samples?
If yes, where are the SOPs documented (laboratory manual,
written instructions, etc.)?
4. Is the receipt of chain-of-custody record(s) with samples being
documented?
If yes, where is this documented?
5. Is the nonreceipt of chain-of-custody record(s) with samples
being documented?
If yes, where is this documented?
6. Is the integrity of the shipping container(s) being documented
(custody seal(s) intact, container locked, or sealed properly,
etc.)?
If yes, where is security documented?
7. Is the lack of integrity of the shipping container(s) being
documented (i.e., evidence of tampering, custody seals broken
or damaged, locks unlocked or missing, etc.)?
If yes, where is nonsecurity documented?
8. Is agreement between chain-of-custody records and sample
tags being verified and documented?
If yes, state source of verification and location of
documentation.
9. Are sample tag numbers recorded by the sample custodian?
If yes, where are they recorded?
10. Are written SOPs developed for sample storage?
If yes, where are the SOPs documented (laboratory manual,
written instructions, etc.)?
11. Are samples stored in a secure area?
If yes, where and how are they stored?
12. Is sample identification maintained?
If yes, how?
13. Is sample extract (or inorganics concentrate) identification
maintained?
If yes, how?
14. Are samples that require preservation stored in such a way as
to maintain their preservation?
If yes, how are the samples stored?
EPA QA/G-5 C-8 QA98
-------
AC.3 CHAIN-OF-CUSTODY CHECKLIST (CONTINUED)
Item                                                           Y    N    Comment
15. Based upon sample records examined to determine holding
times, are sample holding time limitations being satisfied?
Sample records used to determine holding times:
16. Are written SOPs developed for sample handling and
tracking?
If yes, where are the SOPs documented (laboratory manual,
written instructions, etc.)?
17. Do laboratory records indicate personnel receiving and
transferring samples in the laboratory?
If yes, what laboratory records document this?
18. Does each instrument used for sample analysis (GC, GC/MS,
AA, etc.) have an instrument log?
If no, which instruments do not?
19. Are analytical methods documented and available to the
analysts?
If yes, where are these documented?
20. Are QA procedures documented and available to the analysts?
If yes, where are these documented?
21. Are written SOPs developed for compiling and maintaining
sample document files?
If yes, where are the SOPs documented (laboratory manual,
written instructions, etc.)?
22. Are sample documents filed by case number?
If no, how are documents filed?
23. Are sample document files inventoried?
24. Are documents in the case files consecutively numbered
according to the file inventories?
25. Are documents in the case files stored in a secure area?
If yes, where and how are they stored?
26. Has the laboratory received any confidential documents?
27. Are confidential documents segregated from other laboratory
documents?
If no, how are they filed?
28. Are confidential documents stored in a secure manner?
If yes, where and how are they stored?
29. Was a debriefing held with laboratory personnel after the audit
was completed?
30. Were any recommendations made to laboratory personnel
during the debriefing?
EPA QA/G-5 C-9 QA98
-------
APPENDIX D
DATA QUALITY INDICATORS
INTRODUCTION
Data Quality Indicators (DQIs) are qualitative and quantitative descriptors used in interpreting
the degree of acceptability or utility of data. The principal DQIs are precision, bias, representativeness,
comparability, and completeness. Secondary DQIs include sensitivity, recovery, memory effects, limit of
quantitation, repeatability, and reproducibility. Establishing acceptance criteria for the DQIs sets
quantitative goals for the quality of data generated in the analytical measurement process. DQIs may be
expressed for entire measurement systems, but it is customary to apply them only to laboratory
measurement processes. The issues of design and sampling errors, the most influential
components of variability, are discussed separately in EPA QA/G-5S, Guidance on Sampling Designs to
Support QAPPs.
Of the five principal DQIs, precision and bias are the quantitative measures, representativeness
and comparability are qualitative, and completeness is a combination of both quantitative and qualitative
measures.
The five principal DQIs are also referred to by the acronym PARCC, with the "A" in PARCC
referring to accuracy instead of bias. This inconsistency results because some analysts believe accuracy
and bias are synonymous, and PARCC is a more convenient acronym than PBRCC. Accuracy comprises
both random error (precision) and systematic error (bias), and these indicators are discussed separately in
this appendix. DQIs are discussed at length in EPA QA/G-5I, Guidance on Data Quality Indicators.
AD1. PRINCIPAL DQIs: PARCC
AD1.1 PARCC: Precision
Precision is a measure of agreement among replicate measurements of the same property, under
prescribed similar conditions. This agreement is calculated as either the range (R) or as the standard
deviation (s). It may also be expressed as a percentage of the mean of the measurements, such as relative
range (RR) (for duplicates) or relative standard deviation (RSD).
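As an illustration only (this sketch is not part of the guidance, and the function name is hypothetical), the following Python fragment computes these precision statistics for a set of replicate measurements:

    import statistics

    def precision_stats(replicates):
        """Precision statistics for replicate measurements of one property.
        Relative range (RR) is conventionally reported for duplicates."""
        mean = statistics.mean(replicates)
        s = statistics.stdev(replicates)         # sample standard deviation
        r = max(replicates) - min(replicates)    # range
        return {"mean": mean,
                "s": s,
                "RSD%": 100.0 * s / mean,        # relative standard deviation
                "R": r,
                "RR%": 100.0 * r / mean}         # relative range

    # Example: duplicate analyses of the same sample
    print(precision_stats([10.2, 9.8]))

For the duplicate pair above, the range is 0.4, the mean is 10.0, and the relative range is therefore 4%.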
For analytical procedures, precision may be specified as either intralaboratory (within a
laboratory) or interlaboratory (between laboratories) precision. Intralaboratory precision estimates
represent the agreement expected when a single laboratory uses the same method to make repeated
measurements of the same sample. Interlaboratory precision refers to the agreement expected when two
or more laboratories analyze the same or identical samples with the same method. Intralaboratory
precision is more commonly reported; however, where available, both intralaboratory and interlaboratory
precision are listed in the data compilation.
When possible, a sample subdivided in the field and preserved separately is used to assess the
variability of sample handling, preservation, and storage along with the variability of the analysis process.
When collocated samples are collected, processed, and analyzed by the same organization,
intralaboratory precision information on sample acquisition, handling, shipping, storage, preparation, and
analysis is obtained. Both samples can be carried through the steps in the measurement process together
EPA QA/G-5 D-1 QA98
-------
to provide an estimate of short-term precision. Likewise, the two samples, if separated and processed at
different times or by different people and/or analyzed using different instruments, provide an estimate of
long-term precision.
AD1.2 PARCC: Bias
Bias is the systematic or persistent distortion of a measurement process that causes errors in one
direction. Bias assessments for environmental measurements are made using personnel, equipment, and
spiking materials or reference materials as independent as possible from those used in the calibration of
the measurement system. When possible, bias assessments should be based on analysis of spiked
samples rather than reference materials so that the effect of the matrix on recovery is incorporated into
the assessment. A documented spiking protocol and consistency in following that protocol are important
to obtaining meaningful data quality estimates. Spikes should be added at different concentration levels
to cover the range of expected sample concentrations. For some measurement systems (e.g., continuous
analyzers used to measure pollutants in ambient air), spiking samples may not be practical, so
assessments should be made using appropriate blind reference materials.
For certain multianalyte methods, bias assessments may be complicated by interferences among
multiple analytes, which prevents all of the analytes from being spiked into a single sample. For such
methods, lower spiking frequencies can be employed for analytes that are seldom or never found. The
use of spiked surrogate compounds for multianalyte gas chromatography/mass spectrometry (GC/MS)
procedures, while not ideal, may be the best available procedure for assessment of bias.
AD1.3 PARCC: Accuracy
Accuracy is a measure of the closeness of an individual measurement or the average of a number
of measurements to the true value. Accuracy includes a combination of random error (precision) and
systematic error (bias) components that result from sampling and analytical operations.
Accuracy is determined by analyzing a reference material of known pollutant concentration or by
reanalyzing a sample to which a material of known concentration or amount of pollutant has been added.
Accuracy is usually expressed either as a percent recovery (P) or as a percent bias (P - 100).
Determination of accuracy always includes the effects of variability (precision); therefore, accuracy is
used as a combination of bias and precision. The combination is known statistically as mean square
error.
Mean square error (MSE) is the quantitative term for overall quality of individual measurements
or estimators. To be accurate, data must be both precise and unbiased. Using the analogy of archery, to
be accurate, one must have one's arrows land close together and, on average, at the spot where they are
aimed. That is, the arrows must all land near the bull's-eye (see Figure AD.1).
Mean square error is the sum of the variance plus the square of the bias. (The bias is squared to
eliminate concern over whether the bias is positive or negative.) Frequently, it is impossible to quantify
all of the components of the mean square error—especially the biases—but it is important to attempt to
quantify the magnitude of such potential biases, often by comparison with auxiliary data.
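Restating the definitions above in symbols, with P denoting percent recovery,

    P = 100 \times \frac{\text{measured value}}{\text{true value}}, \qquad \text{percent bias} = P - 100

and

    \mathrm{MSE} = \sigma^{2} + (\mathrm{bias})^{2}

where \sigma^{2} is the variance (imprecision) of the measurement process.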
AD1.4 PARCC: Representativeness
Representativeness is a measure of the degree to which data accurately and precisely represent a
characteristic of a population parameter at a sampling point or for a process condition or environmental
EPA QA/G-5 D-2 QA98
-------
(a) High bias + low precision = low accuracy
(b) Low bias + low precision = low accuracy
(c) High bias + high precision = low accuracy
(d) Low bias + high precision = high accuracy
Figure AD.1. Measurement Bias and Random Measurement Uncertainties: Shots at a Target
condition. Representativeness is a qualitative term that should be evaluated to determine whether in situ
and other measurements are made and physical samples collected in such a manner that the resulting data
appropriately reflect the media and phenomenon measured or studied.
AD1.5 PARCC: Comparability
Comparability is the qualitative term that expresses the confidence that two data sets can
contribute to a common analysis and interpretation. Comparability must be carefully evaluated to
establish whether two data sets can be considered equivalent in regard to the measurement of a specific
variable or groups of variables. In a laboratory analysis, the term comparability focuses on method type
comparison, holding times, stability issues, and aspects of overall analytical quantitation.
There are a number of issues that can make two data sets comparable, and the presence of each
of the following items enhances their comparability:
• two data sets should contain the same set of variables of interest;
• units in which these variables were measured should be convertible to a common metric;
• similar analytic procedures and quality assurance should be used to collect data for both
data sets;
• time of measurements of certain characteristics (variables) should be similar for both
data sets;
EPA QA/G-5 D-3 QA98
-------
• measuring devices used for both data sets should have approximately similar detection
levels;
• rules for excluding certain types of observations from both samples should be similar;
• samples within data sets should be selected in a similar manner;
• sampling frames from which the samples were selected should be similar; and
• number of observations in both data sets should be of the same order of magnitude.
These characteristics vary in importance depending on the final use of the data. The closer two
data sets are with regard to these characteristics, the more appropriate it will be to compare them. Large
differences between characteristics may be of only minor importance, depending on the decision that is
to be made from the data.
Comparability is very important when conducting meta-analysis, which combines the results of
numerous studies to identify commonalities that are then hypothesized to hold over a range of
experimental conditions. Meta-analysis can be very misleading if the studies being evaluated are not
truly comparable. Without proper consideration of comparability, the findings of the meta-analysis may
be due to an artifact of methodological differences among the studies rather than due to differences in
experimentally controlled conditions. The use of expert opinion to classify the importance of differences
in characteristics among data sets is invaluable.
AD1.6 PARCC: Completeness
Completeness is a measure of the amount of valid data obtained from a measurement system,
expressed as a percentage of the number of valid measurements that should have been collected (i.e.,
measurements that were planned to be collected).
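Restating this definition as a formula, percent completeness is

    \%C = 100 \times \frac{n_{\text{valid}}}{n_{\text{planned}}}

where n_valid is the number of valid measurements actually obtained and n_planned is the number of measurements that were planned.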
Completeness is not intended to be a measure of representativeness; that is, it does not describe
how closely the measured results reflect the actual concentration or distribution of the pollutant in the
media sampled. A project could produce 100% data completeness (i.e., all samples planned were
actually collected and found to be valid), but the results may not be representative of the pollutant
concentration actually present.
Alternatively, there could be only 70% data completeness (30% lost or found invalid), but, due to
the nature of the sample design, the results could still be representative of the target population and yield
valid estimates. Lack of completeness is a vital concern with stratified sampling. Substantial incomplete
sampling of one or more strata can seriously compromise the validity of conclusions from the study. In
other situations (for example, simple random sampling of a relatively homogeneous medium), lack of
completeness results only in a loss of statistical power. The degree to which lack of completeness affects
the outcome of the study is a function of many variables ranging from deficiencies in the number of field
samples acquired to failure to analyze as many replications as deemed necessary by the QAPP and
DQOs. The intensity of effect due to incompleteness of data is sometimes best expressed as a qualitative
measure and not just as a quantitative percentage.
Completeness can have an effect on the DQO parameters. Lack of completeness may require
reconsideration of the limits for the false negative and positive error rates because insufficient
completeness will decrease the power of the statistical test.
The following four situations demonstrate the importance of considering the planned use of the
data when determining the completeness of a study. The purpose of the study is to determine whether the
average concentration of dioxin in surface soil is no more than 1.0 ppb. The DQOs specified that the
EPA QA/G-5 D-4 QA98
-------
sample average should estimate the true average concentration to within ±0.30 ppb with 95%
confidence. The resulting sampling design called for 30 samples to be drawn according to a simple
random sampling scheme. The results were as follows:
   Study Result              Completeness    Outcome
1. 1.5 ppb ± 0.28 ppb        97%             satisfies DQOs and study purpose
2. 500 ppb ± 0.28 ppb        87%             satisfies DQOs and study purpose
3. 1.5 ppb ± 0.60 ppb        93%             does not satisfy either
4. 500 ppb ± 0.60 ppb        67%             fails DQOs but meets study purpose
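The decision logic behind this table can be sketched in a few lines of Python (an illustration only; the function name and structure are hypothetical, and the 95% confidence half-widths are taken directly from the table above):

    def evaluate_study(mean_ppb, half_width_ppb,
                       dqo_half_width=0.30, action_level=1.0):
        """A result meets the DQOs if its 95% confidence half-width is within
        the specified +/-0.30 ppb; it answers the study question (is the
        average no more than 1.0 ppb?) if the confidence interval does not
        straddle the 1.0 ppb level."""
        meets_dqo = half_width_ppb <= dqo_half_width
        low, high = mean_ppb - half_width_ppb, mean_ppb + half_width_ppb
        answers_question = (low > action_level) or (high <= action_level)
        return meets_dqo, answers_question

    for mean, hw in [(1.5, 0.28), (500.0, 0.28), (1.5, 0.60), (500.0, 0.60)]:
        print(mean, hw, evaluate_study(mean, hw))

Only situation 3 returns (False, False): its interval (0.9 to 2.1 ppb) straddles the 1.0 ppb level, so neither the DQOs nor the study purpose is satisfied.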
For all but the third situation, the data that were collected achieved their purpose, either meeting
the data quality requirements originally set out or providing a conclusive answer to the study question.
The degree of incompleteness did not affect some situations (situations 2 and 4) but may have been a
prime cause of situation 3 failing the DQO requirements. Expert opinion would then be required to
ascertain whether further samples for situation 3 would be necessary in order to meet the established DQOs.
Several factors may result in lack of completeness: (1) the DQOs may have been based on poor
assumptions, (2) the survey design may have been poorly implemented, or (3) the design may have
proven impossible to carry out given resource limitations. Lack of completeness should always be
investigated, and the lessons learned from conducting the study should be incorporated into the planning
of future studies.
AD2. OTHER DATA QUALITY INDICATORS
AD2.1 Sensitivity
Sensitivity is the capability of a method or instrument to discriminate between measurement
responses representing different levels of a variable of interest. Sensitivity is determined from the value
of the standard deviation at the concentration level of interest. It represents the minimum difference in
concentration that can be distinguished between two samples with a high degree of confidence.
AD2.2 Recovery
Recovery is an indicator of bias in a measurement. This is best evaluated by the measurement of
reference materials or other samples of known composition. In the absence of reference materials, spikes
or surrogates may be added to the sample matrix. The recovery is often stated as the percentage
measured with respect to what was added. Complete recovery (100%) is the ultimate goal. At a
minimum, recoveries should be constant and should not differ significantly from an acceptable value.
This means that control charts or some other means should be used for verification. Significantly low
recoveries should be pointed out, and any corrections made for recovery should be stated explicitly.
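As a minimal sketch (not part of the guidance; the function name and units are hypothetical), percent recovery for a spiked sample can be computed as follows:

    def percent_recovery(spiked_result, unspiked_result, amount_added):
        """Percent recovery: the fraction of the known added amount that the
        measurement actually recovers, expressed as a percentage."""
        return 100.0 * (spiked_result - unspiked_result) / amount_added

    # Example: a sample measures 2.0 ug/L; after adding a 10.0 ug/L spike,
    # the spiked sample measures 11.5 ug/L, giving 95% recovery.
    print(percent_recovery(11.5, 2.0, 10.0))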
AD2.3 Memory Effects
A memory effect occurs when a relatively high-concentration sample influences the measurement
of a lower concentration sample of the same analyte when the higher concentration sample precedes the
lower concentration sample in the same analytical instrument. This represents a fault in an analytical
measurement system that reduces accuracy.
EPA QA/G-5 D-5 QA98
-------
AD2.4 Limit of Quantitation
The limit of quantitation is the minimum concentration of an analyte or category of analytes in a
specific matrix that can be identified and quantified above the method detection limit and within
specified limits of precision and bias during routine analytical operating conditions.
AD2.5 Repeatability
Repeatability is the degree of agreement between independent test results produced by the same
analyst using the same test method and equipment on random aliquots of the same sample within a short
time period.
AD2.6 Reproducibility
Reproducibility is the precision that measures the variability among the results of measurements
of the same sample at different laboratories. It is usually expressed as a variance; low values of
variance indicate a high degree of reproducibility.
AD2.7 DQIs and the QAPP
At a minimum, the following DQIs should be addressed in the QAPP: accuracy and/or bias,
precision, completeness, comparability, and representativeness. Accuracy (or bias), precision,
completeness, and comparability should be addressed in Section A7.3, Specifying Measurement
Performance Criteria. Refer to that section of the G-5 text for a discussion of the information to present
and a suggested format. Representativeness should be discussed in Sections B4.2 (Subsampling) and B1
(Sampling Design).
Table AD.1. Principal Types of Error

Random Error (precision; "P" in PARCC)
• Natural variability in the population from which the sample is taken.
• Measurement system variability, introduced at each step of the sample handling and
measurement processes.

Systematic Error (accuracy/bias; "A" in PARCC)
• Interferences that are present in the sample matrix.
• Loss (or addition) of contaminants during sample collection and handling.
• Loss (or addition) of contaminants during sample preparation and analysis.
• Calibration error or drift in the response function estimated by the calibration curve.
EPA QA/G-5 D-6 QA98
-------
Lack of Representativeness ("R" in PARCC)
• Sample is not representative of the population, which often occurs in judgmental sampling
because not all the units of the population have equal or known selection probabilities.
• Sample collection method does not extract the material from its natural setting in a way that
accurately captures the desired qualities to be measured.
• Subsample (taken from a sample for chemical analysis) is not representative of the sample,
which occurs because the sample is not homogeneous and the subsample is taken from the most
readily available portion of the sample; consequently, other parts of the sample had less chance
of being selected for analysis.

Lack of Comparability ("C" in PARCC)
• Failure to use similar data collection methods, analytical procedures, and QA protocols.
• Failure to measure the same parameters over different data sets.

Lack of Completeness ("C" in PARCC)
• Loss of a sample, loss of data, or inability to collect the planned number of samples.
• Data discarded because they are of unknown or unacceptable quality.
AD2.8 References
American Society for Quality Control. 1996. Definitions of Environmental Quality Assurance Terms.
Milwaukee, WI: ASQC Press.
Gilbert, R.O. 1987. Statistical Methods for Environmental Pollution Monitoring. New York: Van
Nostrand.
Ott, W.R. 1985. Environmental Statistics and Data Analysis. Boca Raton, FL: Lewis Publishers Inc.
Taylor, J.K. and T.W. Stanley, eds. 1985. Quality Assurance for Environmental Measurements.
Philadelphia, PA: American Society for Testing and Materials.
Taylor, J.K. 1987. Quality Assurance of Chemical Measurements. Chelsea, MI: Lewis Publishers Inc.
U.S. Environmental Protection Agency. 1984. Chapter 5. Calculation of Precision, Bias, and Method
Detection Limit for Chemical and Physical Measurements.
U.S. Environmental Protection Agency. 1994. AEERL Quality Assurance Procedures Manual for
Contractors and Financial Assistance Recipients.
EPA QA/G-5 D-7 QA98
-------
U.S. Environmental Protection Agency. 1994. EPA Requirements for Quality Management Plans. EPA
QA/R-2, Draft Interim Final. August.
Youden, W.J. 1967. Journal of the Association of Official Analytical Chemists. Vol. 50. p. 1007.
EPA QA/G-5 D-8 QA98
-------
APPENDIX E
QUALITY CONTROL TERMS
AE1. QUALITY CONTROL OPERATIONS
Quality control (QC) plays an increasingly important role in environmental studies, especially
when those studies are conducted to decide how to address an environmental problem. To minimize the
chance of making an incorrect decision, data of adequate quality must be collected. The purpose of QC
is to ensure that measurement and other data-producing systems operate within defined performance
limits as specified in planning. QC programs can both lower the chances of making an incorrect decision
and help the data user understand the level of uncertainty that surrounds the decision. QC operations
help identify where error is occurring, what the magnitude of that error is, and how that error might
impact the decision-making process. This appendix provides a brief overview of this complex topic. It
surveys the different types of QC samples that can be applied to environmental studies and evaluates
how they are currently deployed as specified by EPA methods and regulations.
AE1.1 General Objectives
The most important QC questions a project manager should consider are:
• What are the QC requirements for the methods to be used in the project?
• What types of problems in environmental measurement systems do these requirements
enable the Agency to detect?
Addressing these questions should provide the manager with the background needed for defining
a uniform, minimum set of QC requirements for any environmental data collection activity.
Understanding existing QC requirements for environmental data generation activities provides a
framework for considering what set of QC requirements should be considered "core" requirements
irrespective of the end use of the data.
While it is difficult to define a standard of data quality regardless of its intended use, core QC
requirements can be established that will enable one to provide data of known quality in accordance with
the Agency's QA program. This program requires that all environmental data collection efforts gather
information on bias, variability, and sample contamination. These error types are incurred throughout
the data generation process, including all sampling and analytical activities (i.e., sample collection,
handling, transport, and preparation; sample analysis; and subsampling). The principal issue is what
level of detail in the error structure QC operations should be capable of revealing, given that it is
impractical to explore every known potential source of error.
AE1.2 Background
Many of the essential elements of a Quality Assurance Project Plan (QAPP) apply directly to
sampling and analytical activities and include: quality assurance (QA) objectives for measurement data
specified in terms of Data Quality Indicators (precision, accuracy, bias, representativeness, and
comparability); sampling procedures; sample custody; calibration procedures and frequency; analytical
procedures; internal QC checks and frequency; performance and system audits and frequency; and
specific routine procedures that should be used to assess both data precision and the completeness of the
specific measurement parameters involved.
EPA QA/G-5 E-1 QA98
-------
AE1.3 Definitions and Terminology
In order to ensure that managers have a uniform perspective of QC requirements, it is necessary
to discuss some basic terminology and definitions. QC and QA, total study error and its components,
types of QC operations, and Good Laboratory Practices (GLPs) will be discussed here. Specific
definitions of these terms and others are provided in Appendix B, Glossary of Quality Assurance and
Monitoring Terms, while Table E.1 summarizes the results of a study on how these terms are defined and
used in EPA and non-EPA literature. Five commonly available sources are discussed in Table E.1:
Appendix B in EPA QA/G-5; American Society for Quality Control (1996); van Ee, Blume, and Starks
(1989); Taylor (1987); and Keith (1988).
AE1.3.1 Quality Control vs. Quality Assurance
All of the cited literature provides somewhat similar definitions for both QA and QC. QC
activities are designed to control the quality of a product so that it meets the user's needs. QA includes
QC as one of the activities needed to ensure that the product meets defined standards of quality.
These two terms have been defined in slightly different ways by other authors, but all are in
agreement that QC is a component of QA. Many authors define QC as "those laboratory operations
whose objective is to ensure that the data generated by the laboratory are of known accuracy to some
stated, quantitative degree of probability" (Dux 1986). The objective of QC is not to eliminate or
minimize errors but to measure or estimate what they are in the system as it exists. The same authors
then define QA as the ability to prove that the quality of the data is as reported. QA relies heavily on
documentation, including documentation of implemented QC procedures, accountability, traceability,
and precautions to protect raw data.
AE1.3.2 QC Samples
Table E.1 offers a broad survey of commonly used QC terms, including the definitions of QC
sample types that span the measurement process. The authors cited in Table E.1 define different sample
types in varied ways; however, the definitions are not contradictory.
AE1.3.3 Good Laboratory Practices
The Food and Drug Administration (FDA) promulgated the first version of the Good Laboratory
Practices (GLPs) in 1978. The EPA enacted similar requirements in 1983 for Resource
Conservation and Recovery Act (RCRA) and Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA)
compliance. The FIFRA GLPs were revised in 1988. Though much of the content relates to laboratory
animal science, many requirements are relevant to the analytical chemist. The GLP standards for FIFRA
(40 Code of Federal Regulations [CFR] Part 160) and the Toxic Substances Control Act (TSCA) (40
CFR 792) are similar (Dux 1986). Selected topics of FIFRA subparts A through K appear below.
Subpart A General Provisions.
Subpart B Organization and Personnel. Includes QA unit.
Subpart C Facilities. Includes: facilities for handling test, control, and reference
substances; laboratory operations areas; and specimen and data storage
facilities.
Subpart D Equipment. Includes: maintenance and calibration of equipment.
Subpart E Testing Facilities Operation. Includes: standard operation procedures
(SOPs); reagents and solutions.
Subpart F Test, Control, and Reference Substances. Includes: characterization and
handling; mixtures of substances with carriers.
Subpart G Protocol for and Conduct of a Study.
Subpart H Reserved.
Subpart I Reserved.
Subpart J Records and Reports. Includes: reporting of study results; storage and
retrieval of records and data; and retention of records.
GLPs are defined similarly by the Agency and by Taylor (1987) as an acceptable way to perform
some basic laboratory operation or activity that is known or believed to influence the quality of its
outputs.
AE2. QUALITY CONTROL REQUIREMENTS IN EXISTING PROGRAMS
To identify QC requirements for this section, standard EPA method references, such as SW-846,
and the CFR were reviewed together with information on non-EPA methods identified through a
computerized literature search. Within the EPA literature, some of the major programs were reviewed,
including the Drinking Water and Air programs and the Contract Laboratory Program (CLP). Different types of
methods, such as gas chromatography (GC), atomic absorption (AA), and inductively coupled plasma
(ICP), and different media were included in this process, but it was not intended to be exhaustive.
AE2.1 Summary of QC Requirements by Program and Method
Table AE.2 presents the frequency of QC requirements for selected programs, and Table
AE.3 presents corresponding information for selected methods. In cases where different programs use dissimilar terms for
similar QC samples, the table uses the term from the program or method.
AE2.2 Comparing Various QC Requirements
AE2.2.1 QC Requirements for Program Offices
Table AE.2 shows that QC requirements vary considerably and are established by the Program
Office responsible for the data collection activity. Ambient air monitoring methods (Office of Air
Quality Planning and Standards [OAQPS]) require periodic analysis of standards for assessment of
accuracy (combination of imprecision and bias) for manual methods, and analysis of collocated samples
for the assessment of imprecision. Prevention of Significant Deterioration (PSD) and State and Local
Air Monitoring Stations (SLAMS) make a unique distinction in defining two terms: precision checks and
accuracy checks. These checks entail essentially the same QC requirements, but they are performed by
different parties; the accuracy check is essentially an external audit, while the precision check is an
internal QC operation. It should be noted that some water methods require additional QC operations for
GC/MS analyses beyond those required for other methods (e.g., tuning, isotope dilution).
In general, the wet chemistry analytical methods (the toxicity characteristic leaching procedure
[TCLP] being a preparation method) require periodic analysis of blanks and calibration standards. Most
require analysis of matrix spikes and replicate samples, the exceptions being the 200 series (no spikes or
replicates) and the 600 series (the GC/MS methods require no replicates).
While the QC operations for the PSD and SLAMS methods appear minimal, these monitoring
programs require active QA programs that include procedures for zero/span checks. (The zero check
may be considered a blank sample, while the span check may be considered a calibration check sample.)
The Program Office Quality Assurance Officer (QAO) or representative should have details on
specific QC requirements.
AE2.2.2 QC Requirements Organized by Type of Potential Problem
Table AE.3 lists the QC requirements of various EPA measurement methods and presents the
required frequencies for different kinds of QC operations. The table is divided into four sections, one for
each general type of QC problem:
• Contamination: This occurs when the analyte of interest or an interferant is introduced
through any of a number of sources, including contaminated sample equipment,
containers, and reagents. The contaminant can be the analyte of interest or another
chemical that interferes with the measurement of the analyte or causes loss or generation
of the analyte.
• Calibration Drift: This is a nonrandom change in the measurement system over time,
such as a (systematic) change in instrument response over time. It is often detectable by
periodic remeasurement of a standard.
• Bias: This can be regarded as a systematic error caused by contamination and calibration
drift and also by numerous other causes, such as extraction efficiency by the solvent,
matrix effect, and losses during shipping and handling.
• Imprecision: This is a random error, observed as different results from repeated
measurements of the same or identical samples.
For internal consistency, the names of QC operations used in Table AE.3 are those given in the specific
reference methods.
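Two of the four problem types above, bias and imprecision, can be quantified directly wherever repeated
measurements of a standard of known concentration are available. The following sketch is illustrative
only; all data and names are hypothetical, not drawn from any EPA method:

    # Sketch: estimating bias and imprecision from repeated measurements of a
    # standard of known concentration. All values are hypothetical.
    from statistics import mean, stdev

    true_value = 50.0                          # known concentration of the standard
    results = [48.9, 51.2, 49.5, 50.8, 47.6]   # repeated measurements (hypothetical)

    bias = mean(results) - true_value          # systematic error component
    imprecision = stdev(results)               # random error component (sample s)

    print(f"bias = {bias:+.2f}, imprecision (s) = {imprecision:.2f}")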
AE2.3 Using QC Data
The relationships between monitoring design specifications and the final use of the data
described above incorporate two significant assumptions: (1) laboratory measurements, through the use
of internal standards or other adjustments that are integral to the analytical protocol, are unbiased; and
(2) the variance structure of these measurements does not change over time. Bias enters as a
consequence of under-recovery of the contaminant of interest during the sample preparation stage of the
analytical protocol and as undetected drift in calibration parameters. The variance of measurements also
may change over time due to unintentional changes in the way samples are prepared and/or to
degradation of the electromechanical instrumentation used to analyze the samples. QC samples are
intended to detect bias and variability changes and should be specified in the QAPP.
QC samples that address bias are calibration check standards (CCSs) and spiked samples
(performance check samples [PCSs]). CCSs typically consist of reagent water samples spiked with the
concentrations used to develop the calibration curve. Measurements obtained by analyzing these
samples, which reflect the existing calibration relationship, are compared to the actual concentrations
that were added to the samples. If the difference exceeds a prespecified calibration test limit, the
measurement system is considered "out of control" and the calibration function is re-estimated.
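In code, this control test reduces to a single comparison. The sketch below is a minimal illustration;
the 10% calibration test limit is a placeholder, since actual limits come from the governing method or
QAPP:

    # Sketch: flagging an out-of-control calibration from a CCS result.
    # The 10% test limit is a hypothetical example; use the limit specified
    # in the governing method or QAPP.
    def ccs_in_control(measured, true, limit_fraction=0.10):
        """Return True if the CCS measurement is within the calibration test limit."""
        return abs(measured - true) <= limit_fraction * true

    if not ccs_in_control(measured=54.1, true=50.0):
        print("Out of control: re-estimate the calibration function.")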
Detecting a change in calibration parameters is a statistical decision problem in detecting a
material change in the calibration function. In many QC programs, CCSs typically are analyzed at the
beginning and end of each shift and after any other QC sample has detected a failure. By definition,
significant change in the calibration parameters leads to biased measurements of field samples. This can
be detected through use of statistical tests.
A spiked sample typically has the same matrix characteristics found in field samples, but it has
been spiked (as soon after the sample is taken as is practical) with a known concentration of the target
contaminant. Because spiked samples are intended to detect recovery changes, they are processed
through the same preparation steps as field samples, and the spiked sample measurement is used to form
an estimate of recovery. Significant changes lead to the conclusion that measurements of field samples
are biased.
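A minimal sketch of the recovery arithmetic follows; the data and the 75-125% acceptance window are
illustrative assumptions, not requirements of any program:

    # Sketch: estimating percent recovery from a spiked sample. The values and
    # the 75-125% acceptance window are hypothetical.
    def percent_recovery(spiked_result, unspiked_result, amount_spiked):
        return 100.0 * (spiked_result - unspiked_result) / amount_spiked

    r = percent_recovery(spiked_result=18.2, unspiked_result=9.0, amount_spiked=10.0)
    if not 75.0 <= r <= 125.0:
        print(f"Recovery {r:.0f}% suggests biased field-sample measurements.")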
The second of the two monitoring program assumptions identified at the beginning of this
section is a constant variance structure for monitoring data over time. Measurements from split (or
duplicate) field samples provide a check on this variance assumption. Changes in measurement
variability, for example a uniform increase in the standard deviation or changes in the way variability
depends on concentration, have a direct impact on subsequent investigations.
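One common screening statistic for such duplicate results is the relative percent difference (RPD). The
sketch below is illustrative; the 20% screening value and the data are assumptions:

    # Sketch: screening field duplicates with the relative percent difference
    # (RPD). The 20% screening value and the data are hypothetical.
    def rpd(x1, x2):
        return 100.0 * abs(x1 - x2) / ((x1 + x2) / 2.0)

    pairs = [(10.1, 9.7), (4.2, 5.1), (22.0, 19.5)]  # duplicate results (hypothetical)
    for x1, x2 in pairs:
        if rpd(x1, x2) > 20.0:
            print(f"Pair ({x1}, {x2}): RPD {rpd(x1, x2):.1f}% exceeds screening value.")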
AE2.4 Classifying QC Samples: Control versus Assessment
QC programs are designed foremost to detect a measurement process entering an "out of control"
state so that corrective measures can be initiated. QC samples used in this way are performing a control
function. Each of the three types of QC samples previously discussed, CCSs, spiked samples, and split
(or duplicate) samples, may be used for control. In addition, spiked samples and split samples also may
be used to estimate measurement bias and variability. QC samples that also can be used to estimate
measurement parameters are sometimes referred to as quality assessment samples. These should not be
confused with the much larger Data Quality Assessment Process; see also EPA QA/G-9, Guidance for
Data Quality Assessment.
QC samples that are used for control must be analyzed and reported soon after they are obtained
if their intervention potential is to be realized. Among the three types of QC samples discussed above,
CCSs are the most likely to be effective for control purposes. Spiked samples and split samples
generally are not effective for control purposes, in part because they are analyzed "blind" and therefore
the results cannot be reviewed immediately. Spiked samples and split samples, however, may be used
for control if consecutive batches of similar field samples are being analyzed.
Spiked samples and split samples can be effective quality assessment samples. For example,
spiked samples may be used to indicate the presence of bias. The resulting bias estimate is applied as a
correcting adjustment to individual measurements or to batches of measurements before the measurements are used
in compliance tests. The adjustment improves the test by eliminating bias. However, the variance of the
adjusted estimate used in the test is greater than the variance of the unadjusted estimate.
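A minimal sketch of such an adjustment, assuming hypothetical recovery data, is shown below; because the
mean recovery is itself an estimate, the adjusted results carry its additional uncertainty, which is the
variance inflation noted above:

    # Sketch: bias-correcting field measurements with a mean recovery estimated
    # from spiked samples. All data are hypothetical. Because r_bar is itself an
    # estimate, each adjusted value inherits its uncertainty, so the variance of
    # the adjusted results exceeds that of the unadjusted results.
    from statistics import mean

    recoveries = [0.82, 0.78, 0.85, 0.80]   # fractional recoveries from spikes
    r_bar = mean(recoveries)                # estimated mean recovery

    measurements = [12.4, 15.1, 9.8]        # field-sample results (hypothetical)
    adjusted = [m / r_bar for m in measurements]
    print([round(a, 2) for a in adjusted])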
Split (or duplicate) samples also can be used as quality assessment samples, but their application
in the monitoring program is not as constructive as the application of spiked samples. Split samples lead
to an estimate of the measurement replication component of variability. (The variance of a measurement
has, at a minimum, a sampling component and a measurement replication component, which is
sometimes referred to as measurement error. If the sampling design involves stratification, the variance
will include additional components.) If the estimate based on split samples suggests a measurement
replication standard deviation larger than the value assumed in establishing the original sampling design,
a loss in efficiency will result.
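Under these definitions, the measurement replication component can be estimated from paired split
results: the difference of two replicate measurements has variance twice the replication variance. A
sketch with hypothetical data:

    # Sketch: estimating the measurement replication variance from k pairs of
    # split-sample results (hypothetical data). The difference of two replicate
    # measurements has variance 2*sigma^2, so sigma^2 is estimated by the mean
    # of d^2/2 over the pairs.
    splits = [(10.1, 9.7), (4.2, 4.9), (22.0, 20.6), (7.3, 7.8)]
    k = len(splits)
    s2_replication = sum((a - b) ** 2 for a, b in splits) / (2 * k)
    print(f"estimated measurement replication variance = {s2_replication:.3f}")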
Table AE1. Comparison of QC Terms. The four sources compared are: (1) ASQC, Definitions of
Environmental Quality Assurance Terms, or EPA QA/G-5 Appendix B; (2) van Ee, Blume, and Starks,
A Rationale for the Assessment of Errors in the Sampling of Soils; (3) John Keenan Taylor, Quality
Assurance of Chemical Measurements; and (4) Lawrence H. Keith, ed., Principles of Environmental
Sampling. For each term, the definitions below appear in source order; sources without a definition
for a term are omitted.
Blank sample
A clean sample or a sample of matrix
processed so as to measure artifacts in the
measurement (sampling and analysis)
process.
Blanks provide a measure of various
cross-contamination sources, background
levels in reagents, decontamination
efficiency, and other potential error that
can be introduced from sources other
than the sample. A rinsate blank
(decontamination sample) measures any
chemical that may have been on the
sampling and sample preparation tools
after the decontamination process is
completed.
The measured value obtained when a
specified component of a sample is not
present during measurement. Measured
value/signal for the component is
believed to be due to artifacts; it should
be deducted from a measured value to
give a net value due to the component
contained in a sample. The blank
measurement must be made to make the
correction process valid.
Samples expected to have negligible or
unmeasurable amounts of the substance
of interest. They are necessary for
determining some of the uncertainty due
to random errors. Three kinds required
for proper quality assurance: equipment
blanks, field blanks, and sampling
blanks.
Blind sample
A subsample submitted for analysis with
a composition and identity known to the
submitter but unknown to the analyst.
Used to test analyst or laboratory
proficiency in execution of the
measurement process.
Single-Blind Samples: Field Rinsate
Blanks, Preparation Rinsate Blank, Trip
Blank
A sample submitted for analysis whose
composition is known to the submitter
but unknown to the analyst. One way to
test the proficiency of a measurement
process.
Calibration
standard
A substance or reference material used to
calibrate an instrument, (calibration
check standard, reference standard,
quality control check sample)
In physical calibration, an artifact
measured periodically, the results of
which typically are plotted on a control
chart to evaluate the measurement
process.
Or quality control calibration standard
(CCS). In most laboratory procedures, a
solution containing the analyte of
interest at a low but measurable
concentration. Standard deviation of the
CCSs is a measure of instrument
precision unless the CCS is analyzed as
a sample, in which case it is a measure
of method precision.
Check sample
Example: ICP Interference Check Sample -
Part A contains potential interfering
analytes. Part B contains both the
interfering analytes and the
analytes of interest. Parts A and B are analyzed
separately to determine the potential for
interferences.
Check standard
A substance or reference material
obtained from a source independent from
the source of the calibration standard;
used to prepare check samples (control
standard).
Laboratory control standards are
certified standards, generally supplied by
an outside source. They are used to
ensure that the accuracy of the analysis
is in control.
Double blind
samples
Samples that cannot be distinguished
from routine samples by the analytical
laboratory. Examples: Field Evaluation
Samples, Low Level Field Evaluation
Samples, External Laboratory Evaluation
Samples, Low Level External Laboratory
Evaluation Samples, Field Matrix Spike,
Field Duplicate, Field Split
A sample known by the submitter but
submitted to an analyst so that neither its
composition nor its identification as a
check sample are known to the analyst.
Duplicate
measurement
A second measurement made on the same
(or identical) sample of material to assist
in the evaluation of measurement variance.
Duplicate
sample
Two samples taken from and
representative of the same population and
carried through all steps of the sampling
and analytical procedures in an identical
manner. Used to assess variance of the
total method including sampling and
analysis.
Field duplicate - an additional sample
taken near the routine field sample to
determine total within-batch
measurement variability.
Analytical laboratory duplicate - a
subsample of a routine sample analyzed
by the same method. Used to determine
method precision. It is non-blind, so it
can be used by the analyst only for
internal control; it does not provide an
unbiased estimate of analytical precision.
A second sample randomly selected from
a population of interest to assist in the
evaluation of sample variance.
Error
The difference between a computed,
observed, or measured value or condition
and the true, specified, or theoretical
value or condition.
Difference between the true or expected
value and the measured value of a
quantity or parameter.
Field blank
Used to estimate incidental or accidental
contamination of a sample during the
collection procedure. One should be
allowed per sampling team per day per
collection apparatus. Examples include
matched-matrix blank, sampling media
or trip blank, equipment blank.
Good
Laboratory
Practices
(GLPs)
Either general guidelines or formal
regulations for performing basic
laboratory operations or activities that are
known or believed to influence the
quality and integrity of the results.
An acceptable way to perform some basic
operation or activity in a laboratory that
is known or believed to influence the
quality of its outputs. GLPs ordinarily
are essentially independent of the
measurement techniques used.
Instrument
blank
Also called system blank. Used to
establish baseline response of an
analytical system in the absence of a
sample. Not a simulated sample but a
measure of instrument or system
background response.
Method blank
One of the most important in any
process. Deionized, distilled (DDI) water
processed through the analytical procedure
as a normal sample.
After use to determine the lower limit of
detection, a reagent blank is analyzed for
each 20 samples and whenever a new
batch of reagents is used.
Non-blind
sample
QC samples with a concentration and
origin known to the analytical laboratory.
Examples: Laboratory Control Sample,
Pre-digest Spike, Post-digest Spike,
Analytical Laboratory Duplicate, Initial
Calibration Verification and Continuing
Calibration Verification Solutions, Initial
Calibration Blank and Continuing
Calibration Blank Solution, CRDL
Standard for ICP and AA, Linear Range
Verification Check Standard, ICP
Interference Check Sample.
Performance
Evaluation
(PE)
A type of audit in which the quantitative
data generated in a measurement system
are obtained independently and compared
with routinely obtained data to evaluate
the proficiency of an analyst or
laboratory.
(Defined in EPA QA/G-5, App. B)
Quality
assessment
Assessment is the evaluation of
environmental data to determine if they
meet the quality criteria required for a
specific application.
The overall system of activities that
provides an objective measure of the
quality of data produced.
The overall system of activities whose
purpose is to provide assurance that the
quality control activities are done
effectively. It involves a continuing
evaluation of performance of the
production system and the quality of the
products produced.
Quality
assessment
sample (QAS)
Those samples that allow statements to
be made concerning the quality of the
measurement system. Allow assessment
and control of data quality to assure that
it meets original objectives. Three
categories: double-blind, single-blind,
and non-blind.
Quality
assurance
(QA)
An integrated system of activities
involving planning, quality control,
quality assessment, reporting and quality
improvement to ensure that a product or
service meets defined standards of
quality with a stated level of confidence.
A system of activities whose purpose is
to provide to the producer or user of a
product or service the assurance that it
meets defined standards of quality. It
consists of two separate, but related
activities, quality control and quality
assessment.
Same as van Ee.
Quality
control (QC)
The overall system of technical activities
whose purpose is to measure and control
the quality of a product or service so that
it meets the needs of users. The aim is to
provide quality that is satisfactory,
adequate, dependable, and economical.
The overall system of activities whose
purpose is to control the quality of the
measurement data so that they meet the
needs of the user.
The overall system of activities whose
purpose is to control the quality of a
product or service so that it meets the
needs of users. The aim is to provide
quality that is satisfactory, adequate,
dependable, and economic.
Quality
control sample
An uncontaminated sample matrix spiked
with known amounts of analytes from a
source independent from the calibration
standards. Generally used to establish
intralaboratory or analyst specific
precision and bias or to assess
performance of all or part of the
measurement system. (Laboratory control
sample)
(Defined in EPA QA/G-5, App. B)
A sample of well-characterized soil,
whose analyte concentrations are known
to the laboratory. Used for internal
laboratory control. Also called QC audit
sample.
A material of known composition that is
analyzed concurrently with test samples
to evaluate a measurement process.
Used in quality control procedures to
determine whether or not the analytical
procedures are in control.
Reagent blank
A sample consisting of reagent(s),
without the target analyte or sample
matrix, introduced into analytical
procedure at the appropriate point and
carried through all subsequent steps to
determine the contribution of the reagents
in the absence of matrix and the involved
analytical steps to error in the observed
value (analytical blank, laboratory blank).
(Defined in EPA QA/G-5, App. B)
Also called method blank. Used to
detect and quantitate contamination
introduced during sample preparation
and analysis. Contains all reagents used
in sample preparation and analysis and is
carried through the complete analytical
procedure.
Reference
material
A material or substance, one or more
properties of which are sufficiently well
established to be used for the calibration
of an apparatus, the assessment of a
measurement method, or for the
assignment of values to materials.
Sample
preparation
blank
Required when methods like stirring,
mixing, blending, or subsampling are
used to prepare a sample prior to
analysis. One should be prepared per 20
samples processed.
Sampling
equipment
blank
Used to determine types of contaminants
introduced through contact with
sampling equipment; also to verify the
effectiveness of cleaning procedures.
Prepared by collecting water or solvents
used to rinse sampling equipment.
Solvent blank
Used to detect and quantitate solvent
impurities; the calibration standard
corresponds to zero analyte
concentration. Consists only of solvent
used to dilute the sample.
Spiked sample
A sample prepared by adding a known
mass of target analyte to a specified
amount of matrix sample for which an
independent estimate of target analyte
concentration is available. Spiked
samples are used, for example, to
determine the effect of the matrix on a
method's recovery efficiency (matrix
spike).
A sample prepared by adding a known
amount of reference chemical to one of a
pair of split samples. Comparing the
results of the analysis of a spiked
member to that of the non-spiked
member of the split measures spike
recovery and provides a measure of the
analytical bias.
Field matrix spike - a routine sample
spiked with the contaminant of interest in
the field.
Matrix control or field spike - for sample
matrices where a complex mixture (e.g.
sediments, sludges) may interfere with
analysis, a field spike may be required to
estimate the magnitude of those
interferences. Losses from transport,
storage, treatment, and analysis can be
assessed by adding a known amount of
the analyte of interest to the sample in
the field.
Split sample
Two or more representative portions
taken from a sample or subsample and
analyzed by different analysts or
laboratories. Split samples are used to
replicate the measurement of the
variable(s) of interest.
Samples can provide: a measure of
within-sample variability; spiking
materials to test recovery; and a measure
of analytical and extraction errors.
Where the sample is split determines the
components of variance that are
measured. Field split - a sample is
homogenized and split into two samples
of theoretically equal concentration at the
sampling site. Indicates within-batch
measurement error. Also called
replicates.
A replicate portion or subsample of a
total sample obtained in such a manner
that it is not believed to differ significantly
from other portions of the same sample.
Total
measurement
error
The sum of all the errors that occur from
the taking of the sample through the
reporting of results; the difference
between the reported result and the true
value of the population that was to have
been sampled.
Transport
blank
Used to estimate sample contamination
from the container and preservative
during transport and storage of the
sample. One should be allowed per day
per type of sample.
Trip blank
A clean sample of matrix that is carried
to the sampling site and transported to
the laboratory for analysis without having
been exposed to sampling procedures.
(Defined in EPA QA/G-5, App. B)
Used when volatile organics are sampled.
Consists of actual sample containers
filled with ASTM Type II water, kept
with routine samples throughout
sampling event, packaged for shipment
with routine samples and sent with each
shipping container to the laboratory.
Used to determine the presence or
absence of contamination during
shipment.
A type of field blank also called
sampling media blank. To detect
contamination associated with the
sampling media such as filters, traps, and
sample bottles. Consists of sampling
media used for sample collection.
Table AE2. QC Requirements for Programs
Potential
Problems:
QC
Samples to
Identify
Potential
Problems:
CLP
Organics:
1991
Statement of
Work,
Exhibit E
Contamination
Blanks
Volatiles
Semi-
volatiles
Pesticides/
Aroclor
A method
blank once
every 12
hours.
A method
blank with
every batch.
Instrument
blank at start
of analyses
and every 12
hours.
Method
blank with
each case,
14 days, or
batch.
Sulfur
blanks are
sometimes
required.
Calibration
Drift
Calibration
Check
Samples
Continuing
calibration
standard every
12 hours. BFB
analysis once
every 12 hours.
DFTPP analysis
once every 12
hours.
Continuing
calibration
standard every
12 hours.
Performance
evaluation
mixture to
bracket 12-hour
periods.
Bias
Spike
Matrix spike
with every
case, batch, 20
samples, or 14
days.
Matrix spike
with every
case, batch, 20
samples, or 14
days.
Matrix spike
with every 20
samples.
Standard
3 system monitoring
compounds added to
every sample.
8 surrogates spiked into
each sample.
2 surrogates added to
each sample.
Imprecision
Replicate
Matrix spike
duplicate with
every case,
batch, 20
samples, or 14
days.
Matrix spike
duplicate with
every case,
batch, 20
samples, or 14
days.
Matrix spike
duplicate with
every 20
samples.
Collocated
Other
Table AE2. QC Requirements for Programs (continued)
CLP
Inorganics:
1991
Statement
of Work,
Exhibit E
PSD
40 CFR
Part 58
Appendix B
Contamination
Blanks
Initial calibration blank; then
continuing calibration blank
10% or every 2 hours.
Preparation blank with every
batch.
Calibration
Drift
Calibration
Check
Samples
Initial
calibration
verification
standard; then
continuing
calibration
verification
10% or every 2
hours.
Bias
Spike
1 spike for
every batch.
Method of
standard
additions for
AA if spikes
indicate
problem.
Standard
Interference check
sample for ICP
2x/8 hours.
Laboratory control
sample with each
batch.
For SO2, NO2, O3, and
CO, response check
1/sampling quarter. For
TSP and lead, sample
flow check
1/sampling quarter.
For lead, check with
audit strips 1/quarter.
Imprecision
Replicate
1 duplicate/
batch. For AA,
duplicate
injections.
Collocated
For TSP and
lead, collocated
sample 1/week
or every 3rd
day for
continuous
sampling.
Other
For SO2, NO2,
O3, and CO,
precision
check once
every 2
weeks.
Table AE2. QC Requirements for Programs (continued)
SLAMS
40 CFR
Part 58
Appendix A
A Rationale
for the
Assessment
of Errors in
the
Sampling of
Soils, by
van Ee,
Blume, and
Starks
Contamination
Blanks
Preparation rinsate blanks and
field rinsate blanks discussed,
but no frequency given.
Calibration
Drift
Calibration
Check
Samples
Bias
Spike
Standard
For automated SO2,
NO2, O3, and CO
response check for at
least 1 analyzer (25%
of all) each quarter.
For manual SO2 and
NO2, analyze audit
standard solution each
day samples are
analyzed (at least
2x/quarter). For TSP,
PM10, and lead,
sample flow rate
check at least 1
analyzer/quarter (25%
of all analyzers). For
lead, check with audit
strips 1/quarter.
At least 20 pairs of
field evaluation
samples. At least 20
pairs of external
laboratory evaluation
samples if estimating
components of
variance is important.
Imprecision
Replicate
At least 20
pairs or 10
triples of field
duplicates. At
least 20 pairs of
preparation
splits if
estimating
variance is
important.
Collocated
For manual
methods,
including lead,
collocated
sample 1/week.
Other
For automated
SO2, NO2, O3,
and CO,
precision
check once
every 2
weeks.
Table AE3. QC Requirements for Methods
Potential
Problems:
QC Samples to
Identify Potential
Problems:
SW-846 Method
7000 (Proposed
Update I)
Atomic Absorption
SW-846 Method
8000 (Proposed
Update I) Gas
Chromatography
503.1 Volatile
Aromatic and
Unsaturated Organic
Compounds in
Water by Purge and
Trap GC (from
PB89-220461)
200 Atomic
Absorption Methods
(from EPA-600-4-
79-020)
624-Purgeables
40 CFR Part 136,
Appendix A
Contamination
Blanks
Reagent blank as
part of daily
calibration.
Reagent blank
before sample
analysis and for
each batch of up
to 20 samples.
Laboratory
reagent blank with
each batch. Field
reagent blank with
each set of field
samples.
Reagent blank at
least daily.
Reagent water
blank daily.
Calibration Drift
Calibration
Check Samples
Mid-range standard
analyzed every 10
samples.
A daily calibration
sample analyzed.
Calibration verified
daily with 1 or more
calibration standards.
Daily checks at least
with reagent blank and
1 standard.
Verification with an
additional standard
every 20 samples.
Analyze BFB every
day analyses are
performed.
Bias
Spike
1 spiked matrix
sample analyzed
every 20 samples or
analytical batch.
Method of standard
additions required
for difficult matrices.
1 matrix spike for
each batch of up to
20 samples.
Laboratory-fortified
blank with each
batch or 20 samples.
Spike a minimum of
5% of samples.
Standard
QC check sample
required, but
frequency not
specified.
QC sample
analyzed at least
quarterly.
Analysis of an
unknown
performance
sample at least
once per year.
Surrogate
standards used
with all samples.
Analyze QC
check samples as
5% of analyses.
Imprecision
Replicate
1 replicate sample
every 20 samples or
analytical batch; 1
spiked replicate
sample for each
matrix type.
1 replicate or matrix
spike replicate for
each analytical batch
of up to 20 samples.
Samples collected in
duplicate.
Laboratory-fortified
blanks analyzed in
duplicate at least
quarterly.
Collocated
Other
Table AE3. QC Requirements for Methods (continued)
1624-Volatile
Organic Compounds
by Isotope Dilution
GC/MS
40 CFR Part 136,
Appendix A
TCLP-Fed. Reg.,
Vol 55, No. 126
Friday, June 29,
1990
SW-846 Method
6010 (Proposed
Update I)
Inductively Coupled
Plasma Atomic
Emission
Spectroscopy
Contamination
Blanks
Blanks analyzed
initially and with
each sample lot.
1 blank for every
20 extractions.
At least 1 reagent
blank with every
sample batch.
Calibration Drift
Calibration
Check Samples
Aqueous standard with
BFB, internal
standards, and
pollutants is analyzed
daily. A standard used
to compare syringe
injection with purge
and trap.
Verify calibration
every 10 samples and
at the end of the
analytical run with a
blank and standard.
Bias
Spike
All samples spiked
with labeled
compounds.
1 matrix spike for
each waste type and
for each batch.
Spiked replicate
samples analyzed at
a frequency of 20%.
Standard
An interference
check sample
analyzed at the
beginning and end
of each run or 8-
hour shift.
Imprecision
Replicate
8 aliquots of the
aqueous
performance
standard analyzed
initially.
1 replicate with
every batch or 20
samples. Also spiked
replicates analyzed,
as discussed under
"Spikes."
Collocated
Other
AE2.5 References
American Society for Quality Control. Environmental Restoration Committee. Terms and Definitions Task Group.
1996. Definitions of Environmental Quality Assurance Terms. Milwaukee, WI: Quality Press.
American Society for Quality Control. Chemical Process Industries Division. 1987. Quality Assurance for the
Chemical Process Industries, a Manual of Good Practices. Washington, DC.
Dux, James P. 1986. Handbook of Quality Assurance for the Analytical Chemistry Laboratory.
Federal Insecticide, Fungicide and Rodenticide Act (FIFRA). 1989. Good Laboratory Practices Standards. Final
Rule. Federal Register, vol. 54, no. 158, August.
Good Laboratory Practices: An Agrochemical Perspective. 1987. Division of Agrochemicals, 194th Meeting of the
American Chemical Society.
Grant, E.L. and R.S. Leavenworth. 1988. Statistical Quality Control, 6th Edition. New York: McGraw-Hill.
Griffith, Gary K. 1996. Statistical Process Control Methods for Long and Short Runs, 2nd Edition. Milwaukee,
WI: ASQC Quality Press.
Hayes, Glenn E. and Harry G. Romig. 1988. Modern Quality Control. Revised Edition. Encino, CA.
Juran, J.M. and Frank M. Gryna. 1993. Quality Planning and Analysis, 3rd Edition. New York: McGraw-Hill.
Keith, Lawrence H., ed. 1988. Principles of Environmental Sampling. Washington, DC: American Chemical
Society Press.
Taylor, John Keenan. 1987. Quality Assurance of Chemical Measurements. Chelsea, MI: Lewis Publishers, Inc.
van Ee, J. Jeffrey, Louis J. Blume, and Thomas H. Starks. 1989. A Rationale for the Assessment of Errors in
Sampling of Soils. EPA/600/X-89/203.
APPENDIX F
SOFTWARE FOR THE DEVELOPMENT AND PREPARATION OF A QUALITY
ASSURANCE PROJECT PLAN
This appendix contains three sections:
AF1. an overview of the potential need for software in QAPP preparation,
AF2. information on existing software, and
AF3. information on software availability and sources.
The information presented in this appendix on various types of software that may be useful in
constructing a QAPP is only a subset of what is available to the QA Manager. Mention of certain
products or software does not constitute endorsement, only that some potentially useful material can be
obtained from those products.
AF1. OVERVIEW OF POTENTIAL NEED FOR SOFTWARE IN QAPP PREPARATION
The general structure of a QAPP can be adapted easily for an organization's needs by automating
some of the components of the QAPP. Several commercial and governmental organizations have
produced software to facilitate this automation. The software needs are categorized under the four
classes of QAPP elements. Within each category is an explanation of the general functions of the
software that could prove useful in preparing, reviewing, or implementing a QAPP. In addition, the
QAPP elements to which the software applies are listed.
AF1.1 Class A: Project Management
This type of software can be used to produce planning documentation and to prepare the
QAPP document. In addition, this type of software can be used to produce other project documentation
such as Standard Operating Procedures (SOPs), Quality Management Plans (QMPs), and Data Quality
Objectives (DQOs) reports.
GENERAL SOFTWARE FUNCTIONS (applicable QAPP elements in parentheses):
•	Provides the user guidance on what to address in each QAPP element and serves as a
template for the production of the QAPP document. (All elements)
•	Generates flowcharts to assist in preparing project organization charts and in
illustrating processes that occur in the project, such as sample collection and
analysis or data management. (A4, B10)
•	Identifies training or certification required for personnel in given program
areas. (A8)
•	Provides applicable regulatory standards (e.g., action or clean-up levels) for the
various program areas (e.g., air, water, and solid waste). (A6)
•	Provides guidance on implementing the DQO Process. (A5, A6, A7)
AF1.2 Class B: Measurement and Data Acquisition
This type of software can be used to assist in the design of a sampling plan. In addition, this
software can provide information on analytical methods and sample collection and handling.
GENERAL SOFTWARE FUNCTIONS (applicable QAPP elements in parentheses):
•	Assists in the development of sampling designs that will meet specified DQOs. The
software should handle a variety of general design types with and without
compositing, such as simple random sampling, grid sampling, and stratified
sampling. (B1)
•	Provides information on analytical procedures and sampling methods for various
contaminants and media, including QC data for the analytical method (method
detection limit [MDL], precision, and bias), references to standard methods, and
SOPs (where calibration and maintenance information can be found). (B2, B4, B5, B6, B7)
•	Assists in tracking samples and documenting sample handling and custody. (B3)
•	Integrates QC design and sampling design to meet DQOs and facilitate Data Quality
Assessment (DQA). (B1, B5, B10)
AF1.3 Class C: Assessment and Oversight
This software can assist in assessment and oversight activities.
GENERAL SOFTWARE FUNCTIONS (applicable QAPP elements in parentheses):
•	Produces checklists, checklist templates, or logic diagrams (such as problem
diagnostics) for Technical Systems Audits (TSAs), Management Systems Reviews
(MSRs), and Audits of Data Quality (ADQs). (C1)
•	Performs DQA and facilitates corrective actions during the implementation phase as
preliminary or field screening data become available. (C1, C2)
AF1.4 Class D: Data Validation and Usability
This software assists in validating data and assessing its usability.
GENERAL SOFTWARE FUNCTIONS (applicable QAPP elements in parentheses):
•	Assists in performing data validation and assessing data usability. (D2)
•	Assists in performing Data Quality Assessment. (D3)
AF2. EXISTING SOFTWARE
This information is summarized as a list of identified software; a more detailed description of
each item is found in Section AF3. A variety of commercial software packages are available to assist in
statistical analysis, laboratory QC, and related activities, but this appendix focuses on software used
specifically by those preparing, implementing, and reviewing QAPPs. See Table AF.1 for a summary of
the software described below.
AF2.1 Template Software
Several applications have been implemented in word-processing software that provide guidance
on how to complete each QAPP element and a template for the discussion portion. Three examples of
these applications are:
•	Quality Integrated Work Plan Template (QIWP) (Section AF3, No. 2)
•	Region 2 QAPP Template (Section AF3, No. 3)
•	Region 5 QAPP Template (Section AF3, No. 4)
A more sophisticated application, Quality Assurance Sampling Plan for Environmental Response
(QASPER), was identified that combines a template with links to a variety of lists that provide the user
response options (Section AF3, No. 1).
AF2.2 Flowcharting Software
Various flowcharting software is commercially available. One example found in QA/QC
literature is allCLEAR III (Section AF3, No. 5). Other more sophisticated packages link the flowchart
diagrams to active databases or simulation modeling capabilities.
AF2.3 Regulatory Standards Software
This software provides regulatory limits under the various statutes for a wide variety of
contaminants:
• Environmental Monitoring Methods Index (EMMI) (Section AF3, No. 6)
• Clean-Up Criteria for Contaminated Soil and Groundwater (an example of a commercially
available product) (Section AF3, No. 8)
AF2.4 Sampling Design Software
A variety of software has been developed to assist in the creation of sampling designs:
• Decision Error Feasibility Trials (DEFT) (Section AF3, No. 9)
• GeoEAS (Section AF3, No. 10)
• ELIPGRID-PC (Section AF3, No. 11)
• DQOPro (Section AF3, No. 12)
In addition, there are many statistical packages that support sampling design.
AF2.5 Analytical Methods Software
This software provides information on method detection limits (MDLs) and method summaries
for a wide variety of analytical methods:
• EMMI (Section AF3, No. 6)
• EPA's Sampling and Analysis Methods Database (Section AF3, No. 7)
AF2.6 Data Validation Software
The Research Data Management and Quality Control System (RDMQ) (Section AF3, No. 13) is
a data management system that allows for the verification, flagging, and interpretation of data.
AF2.7 Data Quality Assessment Software
Several software packages have been developed to perform data quality assessment tasks.
Examples of this software include:
• DataQUEST (Section AF3, No. 14)
• ASSESS (Section AF3, No. 15)
• RRELSTAT (Section AF3, No. 16)
Note that most commercially available statistical packages (not listed above) perform a variety of
DQA tasks.
AF2.8 QAPP Review
QATRACK (Section AF3, No. 17) is used to track QAPPs undergoing the review process.
Table AF.1. Software Available to Meet QAPP Development Needs

SOFTWARE NEED                                QAPP ELEMENTS        EXISTING SOFTWARE

PROJECT MANAGEMENT
  Template guidance                          All elements         QASPER, QIWP, QAPP Templates
  Flowcharting                               A4, B10              allCLEAR III
  Regulatory standards                       A6                   EMMI; Clean-Up Criteria for
                                                                  Contaminated Soil and Groundwater

MEASUREMENT AND DATA ACQUISITION
  Sample design                              B1                   DEFT, GeoEAS, ELIPGRID-PC, DRUMs,
                                                                  DQOPro, miscellaneous statistical
                                                                  packages
  Analytical and sampling procedures         B2, B4, B5, B6, B7   EMMI, EPA's Sampling and Analysis
                                                                  Database
  Integrating QC design and sampling         B1, B5, B10          DQOPro
  design to meet DQOs and facilitate DQA

ASSESSMENT AND OVERSIGHT
  Data Quality Assessment                    C1, C2               DataQUEST, ASSESS, RRELSTAT

DATA VALIDATION AND USABILITY
  Data validation                            D2                   RDMQ
  Data Quality Assessment                    D3                   DataQUEST, ASSESS, RRELSTAT,
                                                                  miscellaneous statistical packages
AF3. SOFTWARE AVAILABILITY AND SOURCES
The wide variety of existing software has potential to meet the needs identified for preparing
QAPPs. As illustrated in Table AF.1, at least one example of a software tool was identified that could
potentially be applied to aspects of QAPP preparation or implementation for all but three of the need
areas. The capabilities of the existing software should match the QAPP needs, as most of the software
was developed for use with a QAPP or for environmental data collection or analysis. Software not
designed for these uses could be modified or used to form the basis of an application that is more tailored
to QAPP preparation or implementation.
AF3.1 Quality Assurance Sampling Plan for Environmental Response (QASPER), Version 4.0
QASPER allows the creation and editing of a Quality Assurance sampling plan for
environmental response. The plan template consists of 11 sections: (1) title page, (2) site background,
(3) data use objectives, (4) sampling design, (5) sampling and analysis, (6) SOPs, (7) QA requirements,
(8) data validation, (9) deliverables, (10) project organization and responsibilities, and (11) attachments.
While preparing the plan, the user may enter the required information or select from the options provided
in a variety of "picklists." The picklists cover topics such as holding times, methods, preservatives, and
sampling approaches. The user may add or delete options from the picklists. QASPER also provides
various utility functions such as backing up, restoring, exporting, and importing a plan. Output may be
directed to a file or a printer. Contact: EPA, (732) 906-6921, Quality Assurance Sampling Plan for
Environmental Response (QASPER) Version 4.0 User's Guide; latest version is QASPER Version 4.1,
January 1995.
AF3.2 Quality Integrated Work Plan (QIWP) Template for R&D and Monitoring Projects
The QIWP template is a tool designed to assist with planning, managing, and implementing a
specific monitoring or R&D project. The QIWP template is formatted with comment boxes that provide
guidance on the information to provide in each section. When activated, the text in the comment boxes
will appear on screen; however, it will not appear in a printout. An asterisk indicates where the user
should begin entering the discussion for each section. The QIWP document control format is already set
up in the template header. When a particular element is considered not applicable, the rationale for that
decision must be stated in response to that element. Once the user is satisfied with the information
entered under all elements of the template, the resulting printout is the combined project work plan and
QA plan. In addition, a printout of the QIWP template, prior to entering project related information, can
be used as a checklist for planning and review purposes. Other software packages available are the
QIWP Template for Model Development Projects and the QIWP Template for Model Application
Projects. Contact: EPA, (919) 541-3779 and North American Research Strategy for Tropospheric Ozone
(NARSTO) homepage.
AF3.3 Region 2 QAPP Template
This package contains an annotated template containing instructions for completing each section
of the QAPP. The users are also instructed where to insert their discussions within the template. After
completing the QAPP, the italicized instructions are not printed, leaving only the preparer's discussion.
In addition, a table of contents is automatically generated. The template describes the information that
should be provided under the main topics of project management, measurement/data acquisition, data,
assessment/oversight, and references. The project management section covers the introduction, goals of
the project, organization of the project participants and of QA, and DQOs. The measurement/data
acquisition section discusses the topics to address to describe the statistical research design and
sampling. This section also covers the elements related to sample analysis: description of the instrument,
calibration, QC, consumables, and preventative maintenance. The data section provides for a discussion
of the data management procedures. The assessment/oversight section covers audits and QA reports.
The next section is a list of references. Finally, six tables are provided as examples for displaying
information on the following topics: (1) measurement quality criteria; (2) sample collection, handling,
and preservation; (3) instrument data and interferences; (4) instrument calibration, (5) QC checks; and
(6) preventive maintenance. Contact: EPA, (401) 782-3163, or (503) 754-4670.
AF3.4 Region 5 QAPP Template
This software consists of two model documents (one for Superfund sites and one for RCRA
sites) that describe the preparation of a QAPP in a series of elements. Each element contains two types
of information: (1) content requirements that are presented as smaller text and (2) structural guidance
that is presented as larger text and headed by the appropriate section number. This information is
intended to show to the QAPP preparer the requirements that must be described in each element and the
level of detail that is typically needed to gain Region 5 approval. Example text is provided that should be
deleted and replaced with the specific site information.
A TSCA Model Plan template is also available that attempts to be a comprehensive guide to all
the data gathering activities for Fiscal Year 94 Title IV grantees. In this template, headers are provided
in "background" format, and text that may apply to specific situations is in an italic font. Open spaces
indicate where the preparer's input is required. Contact: EPA, (312) 886-6234.
AF3.5 allCLEAR III
This software enables the creation of simple process diagrams, organizational charts, or decision
trees. It also creates diagrams from text outlines, spreadsheets, and database information. Contact:
American Society for Quality Control Quality Press, Publications Catalogue, (800) 248-1946.
AF3.6 Environmental Monitoring Methods Index (EMMI)
This software consists of an analytical methods database containing more than 4,200 analytes,
3,400 analytical and biological methods, and 47 regulatory and nonregulatory lists. EMMI cross-
references analytes, methods, and lists and has information about related laws, organizations, and other
chemical databases. This information does not include measurement method performance such as
precision and bias. Contact: DynCorp Environmental Technical Support, (703) 519-1222.
AF3.7 EPA's Sampling and Analysis Methods Database, 2nd Edition
This software has a menu-driven program allowing the user to search a database of 178 EPA-
approved analytical methods with more than 1,300 method and analyte summaries. The database covers
industrial chemicals, pesticides, herbicides, dioxins, and PCBs and focuses on water, soil matrices, and
quality parameters. The software generates reports that are stand-alone documents that can be browsed,
printed, or copied to files. Each report contains information for initial method selection such as
applicable matrices, analytical interferences and elimination recommendations, sampling and
preservation requirements, MDLs, and precision, accuracy, and applicable concentration ranges.
Contact: Radian Corporation, (512) 454-4797.
AF3.8 Clean-Up Criteria for Contaminated Soil and Groundwater, 2nd edition
This software consists of a one-volume document and diskette summarizing cleanup criteria
developed by EPA, all 50 State regulatory agencies, and select countries outside the United States.
Contact: ASTM Publications Catalogue, (610) 832-9585, http://www.astm.org.
AF3.9 Decision Error Feasibility Trials (DEFT)
This package allows quick generation of cost information about several simple sampling designs
based on the DQO constraints. The DQO constraints can be evaluated to determine their appropriateness
and feasibility before the sampling and analysis design is finalized.
This software supports the Guidance for the Data Quality Objectives Process, EPA QA/G-4, that
provides general guidance to organizations on developing data quality criteria and performance
specifications for decision-making. The Data Quality Objectives Decision Error Feasibility Trials
(DEFT) User's Guide, contains detailed instructions on how to use DEFT software and provides
background information on the sampling designs that the software uses. Contact: EPA, (202) 564-6830.
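As a rough illustration of the kind of calculation DEFT automates, the textbook sample-size formula for
a one-sample test of a mean is sketched below; this is not DEFT's actual algorithm, and all inputs are
hypothetical:

    # Sketch of the kind of calculation DEFT automates: the number of samples
    # for a one-sample test of a mean, given DQO error limits. This is a
    # textbook formula, not DEFT's actual algorithm; inputs are hypothetical.
    from statistics import NormalDist
    from math import ceil

    alpha, beta = 0.05, 0.20   # false positive / false negative error limits
    sigma = 3.0                # assumed standard deviation of measurements
    delta = 2.0                # width of the gray region (minimum difference)

    z = NormalDist().inv_cdf
    n = ceil(((z(1 - alpha) + z(1 - beta)) * sigma / delta) ** 2)
    print(f"approximately {n} samples needed")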
AF3.10 GeoEAS
Geostatistical Environmental Assessment Software (GeoEAS) is a collection of interactive
software tools for performing two-dimensional geostatistical analyses of spatially distributed data.
Programs are provided for data file management, data transformations, univariate statistics, variogram
analysis, cross-validation, kriging, contour mapping, post plots, and line/scatter plots. Users may alter
parameters and re-calculate results or reproduce graphs, providing a "what if" analysis capability.
This software and a user's guide can be downloaded through the Office of Research and
Development (ORD) World Wide Web site at http://www.epa.gov/ORD or
http://www.epa.gov/ORD/nerl.htm. Contact: GEO-EAS 1.2.1 User's Guide, EPA/600/8-91/008, April
1991; EPA, (702) 798-2248.
AF3.11 ELIPGRID-PC
ELIPGRID-PC calculates the probabilities related to hitting a single hot spot. The user has the
following options: (1) calculating the probability of detecting a hot spot of given size and shape when
using a specified grid, (2) calculating the grid size required to find a hot spot of given size and shape with
specified confidence, (3) calculating the size of the smallest hot spot likely to be hit with a specified
sampling grid, (4) calculating a grid size based on fixed sampling cost, and (5) displaying a graph of the
probability of hitting a hot spot versus sampling costs. Contact: ELIPGRID-PC: Upgraded
Version, Oak Ridge National Laboratory/TM-13103, (970) 248-6259.
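For intuition about the quantity ELIPGRID-PC reports under option (1), the Monte Carlo sketch below
estimates the probability that a square grid hits a circular hot spot; ELIPGRID-PC itself uses an
analytical method and also handles elliptical shapes:

    # Monte Carlo sketch of the quantity ELIPGRID-PC computes: the probability
    # that a square sampling grid of spacing g hits a circular hot spot of
    # radius r (r <= g) whose center is uniformly located. ELIPGRID-PC itself
    # uses an analytical method and also handles elliptical hot spots.
    import random

    def hit_probability(r, g, trials=100_000):
        hits = 0
        corners = [(0.0, 0.0), (g, 0.0), (0.0, g), (g, g)]
        for _ in range(trials):
            # By symmetry, place the hot-spot center uniformly in a single grid
            # cell; the nearest grid nodes are the four cell corners.
            x, y = random.uniform(0, g), random.uniform(0, g)
            if any((x - cx) ** 2 + (y - cy) ** 2 <= r * r for cx, cy in corners):
                hits += 1
        return hits / trials

    print(hit_probability(r=5.0, g=20.0))   # roughly pi*r^2/g^2, about 0.196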
AF3.12 DQOPro
This software consists of a series of three computer programs that calculate the number of
samples needed to meet specific DQOs. DQOPro provides answers for three objectives: (1) determining
the rate at which an event occurs, (2) determining an estimate of an average within a tolerable error, and
(3) determining the sampling grid necessary to detect "hot-spots." Contact: Radian International, (512)
454-4797.
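For intuition about objective (1), the textbook form of that calculation is sketched below, assuming
independent samples; this is not DQOPro's actual code:

    # Sketch of DQOPro objective (1) in textbook form: samples needed to observe
    # at least one occurrence of an event of prevalence p with confidence c,
    # assuming independent samples. This is not DQOPro's actual code.
    from math import ceil, log

    def n_for_detection(p, c=0.95):
        return ceil(log(1.0 - c) / log(1.0 - p))

    print(n_for_detection(p=0.10))   # about 29 samples for 95% confidence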
AF3.13 Research Data Management and Quality Control System (RDMQ)
This software is a data management system that allows for the verification, flagging, and
interpretation of data. RDMQ is a menu-driven application with facilities for loading data, applying QC
checks, viewing and changing data, producing tabular and graphical reports, and exporting data in ASCII
files. RDMQ provides a shell environment that allows the user to perform these tasks in a structured
manner. Contact: Environment Canada, (416) 639-5722, or EPA, (919) 541-2408.
AF3.14 DataQUEST
This tool is designed to provide a quick and easy way for managers and analysts to perform
baseline Data Quality Assessment. The goal of the system is to allow those not familiar with standard
statistical packages to review data and verify assumptions that are important in implementing the DQA
Process. This software supports the Guidance for Data Quality Assessment, EPA QA/G-9, that
demonstrates the use of the DQA Process in evaluating environmental data sets. Contact: EPA, (202)
564-6830.
AF3.15 ASSESS I.Ola
This software tool was designed to calculate variances for quality assessment samples in a
measurement process. The software performs the following functions: (1) transforming the entire data
set, (2) producing scatter plots of the data, (3) displaying error bar graphs that demonstrate the variance,
and (4) generating reports of the results and header information. Contact: EPA, (702) 798-2367.
AF3.17 QATRACK
This Microsoft Access software provides a database that tracks QAPPs requiring approval. Data
are entered into QATRACK during the assistance agreement start-up stage, as soon as the QA manager
reviews and signs the agreement. Users can edit the data, query the database to perform data reviews,
and archive files once the QAPP is approved. Contact: EPA, (919) 541-2408.
APPENDIX G
ISSUES IN DATA MANAGEMENT
AG1. INTRODUCTION
EPA QA/G-5 provides guidance on many different operations that involve generating, collecting,
manipulating, and interpreting environmental data. These activities include field sampling, sample
handling and storage, laboratory analysis, modeling, data storage and retrieval, and Data Quality
Assessment. All these activities generate data or require data to be manipulated in some way, usually
with the aid of a computerized data management tool such as a database, spreadsheet, computer model, or
statistical program.
This appendix expands the guidance currently provided in EPA QA/G-5, Section B10, Data
Management. Guidance is provided on Quality Assurance (QA) considerations and planning for the
development, implementation, and testing of computer-based tools that perform the data management
aspects of the overall environmental project described in the Quality Assurance Project Plan (QAPP).
These data management aspects include data storage, data acquisition, data transformations, data
reduction, modeling, and other data management tasks associated with environmental data collection
projects. This guidance can be used for applications developed in-house or for those developed using
commercial software. It can be used for systems of different sizes, from individual spreadsheet
applications to large integrated systems. The amount of planning and documentation involved is
tailored according to the use of the data and the size and complexity of the application.
This appendix incorporates into EPA QA/G-5 the QA elements of guidance from the EPA Office
of Information Resources Management (OIRM) and applicable industry standards, such as those of the
Institute of Electrical and Electronics Engineers, relating to development of information and data
management systems. Because data and information system development projects differ widely in many
different respects, this appendix does not attempt to address the low-level details of planning,
implementation, and assessment, nor does it provide step-by-step procedures to follow when developing a
data management system. These details are left to other EPA guidance documents (see Section AG2.4),
national consensus standards, and the best judgment of the personnel on each project.
AG2. REGULATORY AND POLICY FRAMEWORK
This section provides a brief overview of the legislation, policies, standards, and guidelines most
applicable to the development of EPA data management and information systems. Sections AG2.1 and
AG2.2 of this overview are intended to provide the QAPP preparer (specifically the preparer of the data
management section) with a general understanding of the relevant agency-level policies; Sections AG2.3
and AG2.4 provide a reference for the major guidance documents containing more specific and detailed
information on development of data management systems.
AG2.1 Legislation
The following is a summary of the major legislative policies that pertain to information
technology and the development of data management systems. The two most relevant pieces of legislation
are:
(1) the Paperwork Reduction Act (PRA) of 1980 (P.L. 96-511) as amended in 1986 (P.L. 99-500)
and 1995 (P.L. 104-13), and
(2) the Clinger-Cohen Act of 1996 (P.L. 104-208). (Note that the Clinger-Cohen Act is the
amended title for the Information Technology Management Reform Act and the Federal
Acquisition Reform Act of 1996 (P.L. 104-106)).
The overall purpose of the PRA is to reduce paperwork and enhance the economy and efficiency
of the government and private sector by improving Federal information policy development and
implementation. The PRA establishes a broad mandate for executive agencies to perform their
information activities in an efficient, effective, and economical manner. The 1995 amendments
established several broad objectives for improving the management of Federal information resources.
These objectives include maximizing the utility of information, improving the quality and use of
information to strengthen decision making, and establishing uniform resource management policies.
The Clinger-Cohen Act (CCA) sets forth requirements for the Office of Management and Budget
(OMB) and the individual executive agencies. OMB responsibilities include promoting and improving
the acquisition, use, and disposal of information technology by the Federal Government to improve the
productivity, efficiency, and effectiveness of Federal programs. In addition, the CCA requires each
agency to design and implement a process for maximizing the value and assessing and managing the risks
of information technology acquisitions. The CCA also requires each agency to utilize the same
performance- and results-based management practices as encouraged by OMB.
AG2.2 Policy Circulars and Executive Orders
Circular A-130 implements OMB authority under the PRA and sets forth the policy that applies
to the information activities of all the executive agencies. The policies include requirements for
information management planning as well as information systems and information technology
management. Part of the information management policy is that agencies, when creating or collecting
data, need to plan from the outset how to perform the following data management functions: (1) data
processing and transmission, (2) data end use and integrity protection, (3) data access, (4) data
dissemination, (5) data storage and retrieval, and (6) data disposal. In addition, these planning activities
need to be documented. The information systems and information technology management policies
describe an information system life cycle that is defined as the phases through which an information
system passes. These phases are typically characterized as initiation, development, operation, and
termination. However, no specific number of phases is set, and the life cycle management techniques
that agencies use may vary depending on the complexity and risk inherent in the project. In addition, the
division between the phases of the system life cycle may not be distinct.
Current implementation of the CCA comes through Executive Order 13011, which outlines the
responsibilities of the executive agencies. The agencies are to strengthen the quality of decisions about
the use of information resources to meet mission needs and to establish mission-based performance
measures for information systems. In addition, they are to establish agency-wide and project-level
management structures and processes that are responsible and accountable for managing, selecting,
controlling, and evaluating investments in information systems.
AG2.3 Federal Information Processing Standards
The National Institute of Standards and Technology (NIST) develops standards for Federal
computer systems. NIST issues these standards and guidelines as Federal Information Processing
Standards (FIPS) for government-wide use. NIST develops FIPS when there are compelling Federal
government requirements (such as for security and interoperability) and there are no acceptable industry
standards or solutions. FIPS publications include standards, guidelines, and program information
documents in the following seven subject areas: (1) general publications, (2) hardware standards and
guidelines, (3) software standards and guidelines, (4) data standards and guidelines, (5) computer
security standards and guidelines, (6) operations standards and guidelines, and (7) telecommunications
standards. Additional information about FIPS, including ordering information and a list and description
of the individual documents, is available online using the World Wide Web (WWW) at the following
Uniform Resource Locator (URL) address: http://www.nist.gov/itl/div879/pubs/.
AG2.4 EPA Guidance
EPA's Office of Information Resources Management (OIRM), which has the primary functional
responsibility for Information Resources Management (IRM) policy development and overall
management of EPA's IRM program, has published several IRM guidance documents. The Information
Resources Management Policy Manual 2100 establishes a policy framework for managing information
resources in the Agency. The document is intended to provide a structure for the implementation of
legislation, such as the PRA, concerning the management of Federal information resources. Also, the
manual establishes the authorities and responsibilities under which the OIRM will function. The Policy
Manual consists of twenty chapters that cover subjects such as software management, information
security, system life cycle management, and information and data management. The Policy Manual can
be obtained online using the WWW at the following URL address: http://www.epa.gov/irmpoli8/.
The System Design and Development Guidance document provides a framework that Agency
managers can use to document a problem and justify the need for an information-system-based solution.
The document also provides guidance for identifying solutions to specified problems and for information
system development. The guidance consists of three volumes (A, B, and C). Volume A provides a
method for documenting the need for an information system and developing an initial system concept
that describes the inputs, outputs, and processes of the proposed system. Volume B provides guidance
for developing design options that satisfy the initial system concept developed in Volume A. Volume B
also gives guidance for selecting the most cost-effective solution. Volume C describes the system-design
and development process (and the required associated documentation) and outlines a software
management plan that is used to ensure the quality of EPA software design, development,
implementation, and maintenance efforts. This document can be obtained online using the WWW at the
following URL address: http://www.epa.gov/irmpoli8/.
Additional EPA guidance documents pertaining to information system development, operations,
and maintenance are listed in Section AG4, References. Up-to-date OIRM documents can be obtained
online using the WWW at the following URL address: http://www.epa.gov/irmpoli8/.
Another source of guidance is EPA Quality Assurance Division's (QAD) Development
Management System Template. The template includes a description of the roles of management in
planning for the development of data management systems. The responsible project officer or
contracting officer representative outlines a management scheme based upon the planning and
documentation activities that satisfy OIRM policy or an organization's Quality Management Plan. The
project manager works with the quality assurance manager to identify the tasks, work products, and
management procedures for the project.
AG3. QA PLANNING FOR INFORMATION SYSTEMS
Data generated or managed by an information system must be defensible and appropriate to their
final use or the conclusions to be drawn from the data. To help ensure that data will be defensible,
project teams should include adequate QA planning in the development of data management or other
information systems. There are three elements to QA planning for data management:
• Needs Analysis—identifying applicable qualitative and quantitative requirements and
establishing corresponding quality goals.
• Planning and Implementing—implementing an appropriate planning and management
framework for achieving these goals.
• Verification—testing and auditing to determine that the established goals are being met.
AG3.1 Quality Assurance Needs Analysis
The type and magnitude of the QA effort needed in developing a new information system
depends on the qualitative and quantitative criteria that the data must meet and on the complexity and
magnitude of the project. Other specific concerns such as security and system performance also help
define the QA program requirements. Only by establishing the ultimate needs and objectives for data
quality in the early planning stages can appropriate decisions be made to guide the system development
process to a successful conclusion.
AG3.1.1 Quantitative and Qualitative Criteria
Considerations similar to those in the Data Quality Objectives (DQO) framework can be used to
identify and define the general criteria that computer-processed data must meet. For example, very high
standards must be set for information systems that generate or manage data supporting Congressional
testimony, the development of new laws and regulations, litigation, or real-time health and safety
protection. More modest levels of defensibility and rigor are required for data used for technology
assessment or "proof of principle," where no litigation or regulatory actions are expected. Still lower
levels of defensibility apply to basic exploratory research requiring extremely fast turn-around, or high
flexibility and adaptability. In this case, the work may have to be replicated under tighter controls or the
results carefully reviewed prior to publication. By analyzing the end-use needs, appropriate criteria can
be established to guide the information system development process.
More detailed criteria can also be developed to address the specific ways in which computer-
generated or computer-processed results can be in error. The following are some specific questions to be
asked when quantitative or qualitative objectives are being defined:
• What is the required level of accuracy/uncertainty for numerical approximations?
• Are the correct data elements being used in calculations (e.g., the correct "cell" in a
spreadsheet)?
• Have the appropriate statistical models, mathematical formulas, etc. been chosen?
• What "chain-of-custody" requirements pertain to the data and results?
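The first of these questions can often be answered by exercising the computation on a case with a
known closed-form answer and comparing the difference against the project's accuracy objective. The
Python sketch below does this for a hypothetical trapezoidal-rule routine; the tolerance and the test
integral are illustrative assumptions.

    import math

    REQUIRED_ABS_ERROR = 1e-4   # hypothetical accuracy objective

    def trapezoid(f, a, b, n):
        # Composite trapezoidal rule with n equal intervals.
        h = (b - a) / n
        interior = sum(f(a + i * h) for i in range(1, n))
        return h * (0.5 * (f(a) + f(b)) + interior)

    # Check the routine against an integral with a known exact value:
    # the integral of sin(x) from 0 to pi is exactly 2.
    approx = trapezoid(math.sin, 0.0, math.pi, 200)
    assert abs(approx - 2.0) <= REQUIRED_ABS_ERROR, approx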
AG3.1.2 Project Scope, Magnitude, and Complexity Criteria
Software and systems development projects vary widely in scope and magnitude. Application of
effective management controls (including the QA program) is critical for successful performance on
large projects. Risks associated with large, complex projects commonly include cost overruns and
schedule delays. The integrity of results can also be compromised by rushing to complete an overdue
project. Table AG1 summarizes risks as a function of project size or scope.
Table AG1. Project Scope and Risks

Large Project (information system development is a major component)
    Potential risks: major budget overruns; schedule slippage; unusable system or data; public
    relations problems.

Medium Size Project (including projects in which an information system is not the major component)
    Potential risks: budget overrun; schedule slippage; uncertain data quality.

Small Projects (including projects in which computer-related development is a minor component)
    Potential risks: lack of confidence in data; lack of data traceability.

Projects with ad hoc software development and data management practices (no QA program)
    Potential risks: schedule slippage; lack of confidence in data; inefficient use of time and resources.
EPA OIRM's Chapter 17, System Life Cycle Management, in Information Resources
Management Policy Manual, provides a similar rationale for categorizing information systems. Four
system types are defined based on the significance of the risk assessment for the Information System.
Major factors included in this risk assessment are the importance of the data, the cost of the system, and
the organizational scope of the system. For the purposes of a management review, OIRM defines
Information Systems using the following classes:
• A Major Agency System is a system that is mission critical for multiple AAships or
Regions, is an Agency Core Financial System, or has a life cycle cost greater than $25
million or $5 million annually.
• A Major AAship or Regional System is a system that is mission critical for one AAship
or Regional Office or has a life cycle cost greater than $10 million or $1 million
annually.
• A Significant Program Office System is a system that is mission critical in one Program
Office or has a life cycle cost greater than $2 million or $100,000 annually.
• A Local Office or Individual Use System is a system for a local office or an individual
user, or one that costs less than $100,000 annually for one project.
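Read as a decision rule, these classes depend only on mission-critical scope and two cost thresholds.
The following toy Python encoding of the thresholds listed above is for illustration; it is not an OIRM
tool, and the scope labels are invented for the example.

    def oirm_system_class(life_cycle_cost, annual_cost, scope):
        # scope: "agency" (mission critical for multiple AAships/Regions
        # or an Agency Core Financial System), "aaship", "program", or
        # None.  Costs are in dollars; thresholds are those listed above.
        if scope == "agency" or life_cycle_cost > 25e6 or annual_cost > 5e6:
            return "Major Agency System"
        if scope == "aaship" or life_cycle_cost > 10e6 or annual_cost > 1e6:
            return "Major AAship or Regional System"
        if scope == "program" or life_cycle_cost > 2e6 or annual_cost > 100e3:
            return "Significant Program Office System"
        return "Local Office or Individual Use System"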
AG3.1.3 Other Quality Issues
While the issues discussed in the preceding two sections are of key importance in determining
the necessary level of the QA effort, there are many individual quality issues that should not be
overlooked in defining the requirements for a particular project. These issues should be addressed in
project planning, implementation, and testing. Some commonly encountered issues are discussed in the
following text.
AG3.1.3.1 Security Issues. There are many different types of threats to data security and
communications. Common concerns include viruses, hackers, and interception of e-mail. If these
concerns apply to a particular system, they should be addressed during system planning, and tests
and audits may be planned to assess system vulnerability. Some of the management and QA
techniques that can be employed in this assessment include:
• reviewing the project requirements documentation to ensure that security issues are
included among project requirements;
• reviewing the testing documents to ensure that security features are adequately and
thoroughly tested; and
• planning audits to be conducted by security personnel outside the immediate project
team.
AG3.1.3.2 Communication Issues. Most business computers are extensively interconnected through the
Internet, agency networks, or local networks. Computer communications is a rapidly changing area of
technology. Consequently, communications software and hardware are frequently the source of
problems to developers and users. Some communication issues that might be addressed in system
planning, design, and testing include the following:
• adequately defining the communication interfaces;
• thoroughly testing the communications hardware and software, including "stress testing"
under high load and adverse conditions; and
• conducting a beta test that encompasses users with a variety of different hardware and
communications connections.
AG3.1.3.3 Software Installation Issues. Many software packages are being developed and distributed by
the Agency to run on the individual user's personal computer. Many of these use auto-installation
routines that copy files into various directories and modify system initialization and registry files.
Planning the necessary system requirements should address the following considerations:
• testing on as many different platforms as possible including various combinations of
processors, memory sizes, video controllers, and printers (Beta Testing can be extremely
helpful for this);
• including an "uninstall" program that not only deletes files, but also properly removes
entries in the initialization and registry files; and
• ensuring that both the "setup" and "uninstall" routines are thoroughly tested and
debugged before release.
AG3.1.3.4 Response Time Issues. A frequently overlooked aspect of computerized systems is the
impact of system load on response time. Response time is important not only
for real-time data acquisition and control systems, but also for interactive user interfaces. It is a good
idea to establish quantitative objectives for response time performance for all interactive and real-time
systems. These goals must be explicit and testable. A typical specification might be that the user should
not wait longer than x seconds for a response after submitting a request to the program.
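A check of such an objective can be automated. The Python sketch below times a single request
against a stated limit; the threshold value and the request function are placeholders for a project's
actual objective and system call.

    import time

    MAX_RESPONSE_SECONDS = 2.0   # stands in for the "x" in the objective

    def check_response_time(request_fn, *args):
        # Run one request and verify that it meets the response-time
        # objective; request_fn is a placeholder for the system call.
        start = time.perf_counter()
        result = request_fn(*args)
        elapsed = time.perf_counter() - start
        assert elapsed <= MAX_RESPONSE_SECONDS, (
            "response took %.2f s; objective is %.1f s"
            % (elapsed, MAX_RESPONSE_SECONDS))
        return result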
AG3.1.3.5 Compliance with EPA and other Federal Policies and Regulations. Since individual
managers and scientists may not track information systems regulations and policy, requirements should
be determined at project inception. Some of the more important policies have been summarized in
Section AG2 of this appendix. Many of the policies and guidances are aimed at ensuring individual
project success, while others are intended to foster Agency-wide goals, including consistency of
hardware and software platforms, purchasing economies, and security. For example, EPA's Acquisition
Regulation requires Agency contractors to collect and review OIRM's most recent policies by
downloading the most current documents available online at OIRM's WWW Site.
AG3.2 System Development Planning
Proper planning, execution, and QA protocols are vital to the success of projects involving
information systems development, software development, or computer data processing. The project
management team should work closely with the responsible QA staff to implement a program that best
suits the needs of the individual project. A few of the issues to be addressed include the level of
documentation required, schedule, personnel assignments, and change control. The following section
describes a commonly used planning framework and associated documentation that is based on the
widely recognized software- or system-development life cycle.
AG3.2.1 System Development Life Cycle
Software and information system development projects tend to evolve in distinct phases.
Recognition of this fact can be helpful in planning and managing a new project. Table AG2 outlines eight
commonly recognized stages in the system development life cycle, along with typical activities and
documentation for each stage. This approach can be modified to meet the needs of individual projects.
Table AG2. Software Development Life Cycle

Needs Assessment and High-Level Requirements Definition
    Typical activities: assessment of needs and requirements through literature search and interviews
    with users and other experts.
    Documentation: Needs Assessment Documentation (e.g., QA Project Plan); Requirements Document.

Detailed Requirements Analysis
    Typical activities: listing of all inputs, outputs, actions, computations, etc., that the system is to
    perform; listing of ancillary needs such as security and user interface requirements; design team
    meetings.
    Documentation: Detailed Requirements Document, including performance, security, and user
    interface requirements, etc.; System Development Standards.

System Design
    Typical activities: translation of requirements into a design to be implemented.
    Documentation: Design Document(s), including Technical Design (algorithms, etc.) and
    Software/Systems Design.

Implementation Controls
    Typical activities: coding and configuration control; design/implementation team meetings.
    Documentation: in-line comments; change control documentation.

Testing, Verification, and Validation
    Typical activities: verification that the system meets requirements; verification that the design has
    been correctly implemented; beta testing (users outside team); acceptance testing (for final
    acceptance of a contracted product); implementation of necessary corrective actions.
    Documentation: Test Plan; Test Result Documentation; Corrective Action Documentation; Beta Test
    Comments; Acceptance Test Results.

Installation and Training
    Typical activities: installing the data management system and training users.
    Documentation: Installation Documentation; User's Guide.

Operations, Maintenance, and User Support
    Typical activities: use of the system or data, requiring usage instructions and maintenance resources.
    Documentation: User's Guide; Maintenance Manual or Programmer's Manual.

System Retirement and Archival
    Typical activities: recording information on how data or software can be retrieved if needed.
    Documentation: project files; Final Report.
AG3.2.2 Planning Documentation
Individual project and QA managers should tailor documentation to meet the specific needs of
their project. References in Section AG4 such as EPA System Design and Development Guidance and
Chapter 17, System Life Cycle Management, in Information Resources Management Policy Manual
describe in more detail the various types of documentation related to the system life cycle planning
phases. The following list expands on some of the planning documentation listed in Table AG2:
• Requirements Documentation—The high-level requirements document gives an
overview of the functions an information system must perform. Detailed requirements
documents define all critical functions that the completed information system must
support. Performance goals derived from analysis of the project's DQOs should be
included among the requirements. In addition, frequently overlooked issues such as
those described in Section AG3.1.3 should be addressed. Requirements documentation
should be reviewed by the end-user, if possible, to ensure that critical functions and other
requirements have not been overlooked.
• Design Documentation—Design documents are used to plan and describe the structure of
the computer program. These are particularly important in multi-programmer projects in
which modules written by different individuals must interact. Even in small or single-
programmer projects, a formal design document can be useful for communication and
for later reference.
• Coding Standards or SOPs—These may apply to a single project, an entire organizational
Branch, or other functional group. Uniform standards for code formats, subroutine
calling conventions, and in-line documentation can significantly improve the
maintainability of software.
• Testing Plans—Testing, which is discussed in Section AG3.3, must be planned in
advance and must address all original requirements and performance goals. Specific
procedures for the corrective action and retesting process should be described in QA
planning documents and implemented in the Testing Plan.
• Data Dictionary—A data dictionary can be useful to developers, users, and maintenance
programmers who may need to modify the system later. The data dictionary is often
developed before code is written as part of the design process. The dictionary should be
updated as necessary when new elements are added to the data structure. A data
dictionary need not be a separately written document. For example, the record definition
files required for many database systems can serve this purpose, provided that they are
available in a form that is readily accessible to the user or maintenance programmer.
• User's Manual—The user's manual can often borrow heavily from the requirements
document because all of the software's functions should be specified there. The scope of
the user's manual should take into account such issues as the level and sophistication of
the intended user and the complexity of the interface. Online help can also be used to
serve this function.
• Maintenance Manual—The maintenance manual's purpose is to explain a program's logic
and organization for the maintenance programmer. This manual should also contain
crucial references documenting algorithms, numerical methods, and assumptions.
Instructions on how to rebuild the system from source code must be included. The
maintenance manual will often borrow heavily from the design manual.
• Source Code—It is usually not necessary to print the source code in hard copy form
unless needed for a specific purpose. However, it is very important to archive computer-
readable copies of source code according to the policies of each Office, Region, National
Center, or Laboratory.
AG3.3 Audits and Testing
As with any project involving generation or handling of environmental data, audits can be used
to verify that goals and objectives are being met. Audits of the Information System development
process, audits of security, and data verification audits may be particularly helpful when conducted by
personnel outside the immediate project team. Security audits by someone with expertise in this field
can be valuable when data confidentiality and prevention of tampering are important issues. Data
verification audits can be conducted using a known data set. Such a data set might be developed by an
end-user or an outside expert to verify that the information system produces the expected results.
Testing procedures and criteria need not be specified in detail by the QA Project Plan (or
equivalent document); however, the general extent and approach to testing should be described. QA
planning documents for developing a new information system should generally provide the following
project elements:
• a list of planned test documentation to be written;
• a description of the types of testing that will be conducted;
• a schedule for testing and audits; and
• a section on corrective actions.
The purpose of testing is not simply to detect errors but also to verify that the completed software
meets user requirements. In designing any test, the "correct" or "acceptable" outputs should be known in
advance, if possible. Testing should be planned in an orderly, structured way and documented. A phased
approach to testing, which is often employed in larger scale information system development projects,
might employ a sequence of testing procedures such as those presented in Sections AG3.3.1 through
AG3.3.5.
AG3.3.1 Individual Module Tests
Individual module tests are applied to the smallest separable units of a system. For sequential
programming languages, such as FORTRAN, BASIC, or C, individual modules might include functions
and subroutines. For other types of software (e.g., spreadsheets), defining a functional module is more
problematic, because the software may not be designed in a modular way. However, well-planned design
strategies, such as compartmentalized design, can ease the testing effort.
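A minimal module test, using Python's standard unittest framework, exercises one routine against
known inputs and outputs; the function under test here is hypothetical.

    import unittest

    def dilution_corrected(concentration, dilution_factor):
        # Hypothetical module under test: apply a dilution correction.
        if dilution_factor <= 0:
            raise ValueError("dilution factor must be positive")
        return concentration * dilution_factor

    class TestDilutionCorrected(unittest.TestCase):
        def test_known_value(self):
            self.assertAlmostEqual(dilution_corrected(2.5, 10.0), 25.0)

        def test_rejects_bad_factor(self):
            with self.assertRaises(ValueError):
                dilution_corrected(2.5, 0.0)

    if __name__ == "__main__":
        unittest.main()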
AG3.3.2 Integration Tests
Integration tests are done to check the interfaces between modules and to detect unanticipated
interactions between them. Integration testing should be done in a hierarchical way, increasing the
number of modules tested and the subsystem complexity as testing proceeds. Each level of subsystem
integration should ideally correspond to a unified subset of system functions such as the "user interface."
Because all the elements may not be present, it may be necessary to develop test data sets or
hardware/software test beds to conduct the tests effectively.
When problems are encountered at any level of integration or system testing, it is necessary to
track the errors back to their origin, which may be any phase of the project. When the original reason for
the problem is identified, all affected modules and subsystems should be corrected and retested as
described in the next section.
AG3.3.3 Regression Testing
After a system module has been modified, all testing performed on the original version of the
module should be repeated, including all integration tests that involve the module. This reduces the
chance that new "bugs" introduced while modifying the code to correct an existing problem will go
undetected. Spreadsheets may be particularly difficult to test thoroughly after changes
have been made because their data dependencies are often difficult to trace. In such cases, it may be
useful to have a suite of tests that can be run whenever a change is made to verify that other functions are
not affected.
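One way to build such a suite is to archive input/expected-output pairs from a validated version of the
module and replay them after every change. The Python sketch below assumes a simple JSON baseline
file; the format is an illustrative convention, not a prescribed one.

    import json

    def run_regression_suite(compute_fn, baseline_path):
        # Replay archived cases recorded when the module was last known
        # to be correct; the JSON layout (a list of {"inputs": [...],
        # "expected": value} objects) is an assumed convention.
        with open(baseline_path) as f:
            cases = json.load(f)
        failures = []
        for case in cases:
            result = compute_fn(*case["inputs"])
            if abs(result - case["expected"]) > 1e-9:
                failures.append((case["inputs"], case["expected"], result))
        return failures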
AG3.3.4 System Testing
Testing the full system is the ultimate level of integration testing and should be done in a realistic
simulation of the end-user's operational environment. If a detailed requirements document was written,
each requirement should be tested systematically. It is often helpful for a representative end-user to
participate in the system test to verify that all requirements have been implemented as intended.
Elements of the special tests described in Section AG3.3.5 can be incorporated into the system test.
For some projects, the in-house system test may be the final stage of testing. For larger or more
critical projects, formal acceptance tests or beta testing would follow. The system test should exercise all
functions possible, and the data sets used to demonstrate the software should be as realistic as possible.
AG3.3.5 Other Special Testing
AG3.3.5.1 Stress Testing should be included in the system-level testing whenever a system might be
load-sensitive (e.g., real-time data acquisition and control systems). The stress test should attempt to
simulate the maximum input, output, and computational load expected during peak usage. The specific
rates of input, output, and processing for which the system is designed are important criteria in the
original requirements specification. The maximum load is a key quality indicator and should have been
specified early in planning. The load can be defined quantitatively using criteria such as the frequency of
inputs and outputs or the number of computations or disk accesses per unit of time. Developing an
artificial test bed to supply the necessary inputs may be necessary. The test bed can consist of hardware,
software, or a combination of the two that presents the system with realistic inputs to be processed. The
project team can write programs to carry out this testing, or automated tools may be available
commercially. Test data sets may be necessary if the software needs external inputs to run.
AG3.3.5.2 Acceptance Testing refers to contractually required testing that must be done before
acceptance by the customer and final payment. Specific procedures and the criteria for passing the
acceptance test should be listed before the test is done. A stress test is a recommended part of the
acceptance test, along with thorough evaluation of the user interface.
AG3.3.5.3 Beta Testing refers to a system-level verification in which copies of the software are
distributed outside the project group. In beta testing, the users typically do not have a supplied testing
protocol to follow; instead, they use the software as they would normally and record any anomalies
encountered. Users report these observations to the developers, who address the problems before release
of the final version.
AG3.3.5.4 Spreadsheet testing is particularly difficult because of an inherent lack of readability and
structure. One of the best ways to test spreadsheets is to challenge them with known data, although this
can be very time-consuming. Another approach is to independently recode some or all of the spreadsheet
and compare results. Software packages for spreadsheet analysis exist, but their usefulness for testing
must be evaluated on a case-by-case basis.
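As an illustration of challenging a spreadsheet with known or independently recomputed values, the
following Python sketch uses the third-party openpyxl package to read a workbook's cached cell values
and compare a stored total against an independent recomputation. The file name, sheet name, and cell
addresses are hypothetical.

    # Requires the third-party openpyxl package; workbook name, sheet
    # name, and cell addresses are hypothetical.
    import openpyxl

    wb = openpyxl.load_workbook("emissions_summary.xlsx", data_only=True)
    ws = wb["Summary"]

    # Independently recompute the total that cell B10 is supposed to
    # hold (here, the sum of B2:B9) and compare with the stored value.
    # data_only=True returns the last values calculated by the
    # spreadsheet program rather than the formulas themselves.
    recomputed = sum(ws.cell(row=r, column=2).value for r in range(2, 10))
    assert abs(ws["B10"].value - recomputed) < 1e-9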
AG3.4 Examples
The following examples present three different data management projects: a computer model, a
spreadsheet, and a local-area-network-distributed database. Some of the QA, management, and testing
issues peculiar to each type of project are discussed.
AG3.4.1 Model Development
Mathematical models are widely used in the environmental sciences. Modeling is necessary
when the complexity of a particular situation makes a simple solution impossible, as when many different
processes are closely coupled and occur simultaneously. Some models are used to generate data that may
be used for planning and regulatory purposes.
A high level of mathematical and scientific expertise is required to develop and test the
algorithms used to represent the different physical processes. This expertise is often a scarce and
valuable resource. Consequently, a team approach may be used under which the senior scientific staff
concentrates on developing, testing, and documenting the "core" algorithms, while support staff take care
of other duties on the development project, including developing the user interface, communications,
coding, and documentation. Quality Assurance planning for developing a new model should include the
following:
• The staffing section of the QAPP should state the relevant qualifications for the key
scientific personnel. The need for peer review of novel algorithms should be addressed
if new research developments are to be incorporated in the model. Guidance documents
on conducting peer review of models are referenced in Section AG4.5. The topics
addressed include verification testing, model code documentation, and review of the
conceptual and mathematical performance of a model.
• The end use of the data produced will dictate how exhaustively the models must be
tested and the types of demonstrations that should be done before release. A regulatory
model should be compared with existing regulatory models using identical or similar
data sets. Environmental models such as air dispersion models can be compared with
actual field data. However, care should be taken in evaluating discrepancies between
model results and the field data, because differences between monitoring data and model
results can arise from a variety of sources.
• Capabilities and needs of the end-users will dictate how much effort is spent developing
and testing the user interface and on providing user documentation and online help
functions. User interface issues should be addressed in the requirements definition, and
these functions should be tested exhaustively. Beta testing results should be reviewed
carefully to identify problems with the user interface.
• It may be possible to develop specific objectives for parameters such as bias and
precision by modeling cases that have known and accurate results. This is usually
possible only in relatively simple cases, since new models are usually developed to
expand beyond the capabilities of currently available models.
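When such a reference case exists, bias and precision can be computed directly from the paired model
and reference values, as in the short Python sketch below (function names are illustrative).

    import math

    def bias_and_rmse(modeled, known):
        # Compare model output against a case with a known, accurate
        # solution: bias is the mean signed error; RMSE summarizes
        # precision.
        errors = [m - k for m, k in zip(modeled, known)]
        bias = sum(errors) / len(errors)
        rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
        return bias, rmse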
AG3.4.2 Spreadsheet for Data Processing in an Ongoing Project
Spreadsheets have replaced hand calculators for many simple applications but can sometimes
grow to approach the complexity of a database management system. Spreadsheets developed on an ad
hoc basis are usually not tested in any systematic way and may not be archived with project data.
Consequently, there can be little accountability for the correctness of calculations, even when those
results are used for sensitive applications such as regulatory reporting. This lack of testing and
verification can present significant risks. The following QA guidelines are suggested for spreadsheets
developed or used in support of projects involving environmental data:
• QA or other project planning documents should indicate all data processing tasks to be
done using spreadsheets. The origin of any spreadsheets obtained from outside the
project group should be documented.
• Spreadsheets should be developed by personnel with the appropriate education and
training. Personnel who maintain or use the spreadsheet should also have appropriate
qualifications and training.
• Documentation should be provided for correct use and maintenance of the spreadsheet.
• Data quality audits for projects processing environmental data should examine all
spreadsheets used to produce reportable data for the project. Questions such as the
following should be asked during the audit:
Have all critical calculations performed by the spreadsheet been verified (i.e.,
has the spreadsheet been tested)? Is there a record of validation including the
date and the specific inputs and outputs?
Have significant changes been made to the spreadsheet since the last time its
output was validated?
Are users properly trained in the use of the spreadsheet? Do they have sufficient
reference material? An interview with users other than the spreadsheet
developer may be helpful in determining this.
What provisions are there for quality control of manual data input? As with any
other type of manual data entry situation, duplicate key entry or similar means of
quality control should be used when entering large data sets (a minimal comparison
sketch follows this list).
Does the spreadsheet incorporate complex table lookup functions or macros?
These features significantly complicate spreadsheets and can make their detailed
operation virtually impossible to understand fully. In such cases, the auditor
should review the reasonableness of outputs produced by the spreadsheet using a
known data set.
• Provisions should be made for archiving the spreadsheet in a format that is usable in case
data have to be reprocessed. The time window in which data may have to be reprocessed
should be considered. Some spreadsheets (as well as other types of computer software)
can sometimes remain in use long after the original project has ended, and
documentation must be provided so that the spreadsheet's functions can be understood at
a later time.
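The duplicate key entry check mentioned in the audit questions above can be as simple as comparing
two independently keyed copies of the same records position by position, as in the following Python
sketch (the flat-list record layout is an assumption).

    def compare_duplicate_entries(first_pass, second_pass):
        # Compare two independent manual entries of the same records
        # (flat lists of values, an assumed layout) and return the
        # positions that disagree, for reconciliation against the
        # original data sheets.
        if len(first_pass) != len(second_pass):
            raise ValueError("entry files differ in length")
        return [i for i, (a, b) in enumerate(zip(first_pass, second_pass))
                if a != b]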
AG3.4.3 A Local-Area-Network-Distributed Database Application
Communication software is complex and is evolving rapidly. This leads to fundamental concerns
in the areas of security, privacy, and accountability. The following example, based on a real system now
in use for reporting and distributing environmental data, will illustrate some of the QA considerations
relevant to a relatively simple distributed application:
The data base application resides on a centralized server with PC- or workstation-based clients
accessing the data over a local area network (LAN). Users can also communicate with the server
using dial-up access or via the Internet. Because the system relies on a commercially available
communications product, it could be developed by existing project personnel, none of whom have
formal training in computer science. The database programming was done using a popular,
commercially available data base development product. Individual project team members and
some outside users can log on remotely and are able to add and modify data, query the data base,
and generate reports.
Management and QA planning for this project should address the normal concerns of ensuring
that the system is acquired and installed within budget and on schedule, and that calculations and reports
are correct. QA concerns specific to this system include the following:
AG3.4.3.1 Security. There are many potential security vulnerabilities, and planners should identify as
many of these as possible and state explicitly how they will be prevented. Specific tests should be
conducted that address the security features of the system. Some specific methods for addressing
security vulnerabilities include the following:
• Using separate passwords for user log on, for remote dial-in, and for access to sensitive
portions of the database.
• Restricting downloads of files that could contain viruses and performing regular virus
checks on all machines on the network. Viruses are easily transmitted over the Internet,
and can spread rapidly over LANs. Viruses represent both an operational and a security
risk. Recent viruses have infected word processor macros. Because word processing
files are frequently interchanged via the WWW and e-mail, even such "nonexecutable"
files can pose a danger.
AG3.4.3.2 Privacy. Since this system may contain records of proprietary business data and voluntarily
submitted emissions information, the records must be kept private. The means for ensuring that privacy
is protected was a fundamental requirement for the system. Regular QA reviews are done to verify that
established privacy-related procedures are being followed. These include encrypting the identifying
records and restricting use only to personnel with special password-protected access.
AG3.4.3.3 Personnel Qualifications and Training. Although it is common for technical or clerical
personnel to develop small information systems using currently available "user-friendly" software and
systems environments, this practice can represent a significant risk to a larger project. On the project
described in the example, the qualifications of project staff had been carefully evaluated with respect to
experience with similar information-systems development projects of comparable magnitude. A key
person with this experience was identified and was made the lead programmer/developer. The QA
Officer also had significant computer background and was able to provide additional support during
project implementation. A number of books, utility programs, and other aids were purchased for the
project.
AG3.4.3.4 Data Defensibility and Traceability. With many different users having read/write access to a
common data set, assurance of data integrity was a concern. If absolute traceability of each data item had
been required, an Audit Trail, which records each transaction and includes the date, time, and person
responsible for the change, would be a fundamental part of the requirement. However, an audit trail was
not deemed necessary for this particular project. Backup copies of the data base are being maintained on
a weekly basis and are archived. This serves the dual purpose of providing a backup and a means of
tracing any data tampering that might occur. Backups have proved valuable in this relatively open
environment when a user inadvertently overwrites or deletes files. Occasional internal audits are
performed to detect any unexplained changes in the data set over time.
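One low-cost way to support such audits, offered here only as a sketch, is to record a cryptographic
digest of each archived backup and compare later exports against it; Python's standard hashlib module
suffices.

    import hashlib

    def file_digest(path):
        # SHA-256 digest of a backup file, computed in chunks so large
        # archives do not have to fit in memory.
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                digest.update(chunk)
        return digest.hexdigest()

    # Recording the digest when each weekly backup is archived, and
    # comparing later exports against it, flags unexplained changes.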
AG4. REFERENCES
This section provides references that were used in developing this appendix along with
documents that provide more detailed coverage of the topics listed below.
AG4.1 General
U.S. Environmental Protection Agency. 1995. Air Pollution Prevention and Control Division Quality
Assurance Procedures Manual, Appendix G Quality Assurance Planning for Software and Data
Management Projects. Revision 1. Research Triangle Park, NC.
U.S. Environmental Protection Agency. 1996. EPA Guidance for Quality Assurance Project Plans, EPA
QA/G-5. Washington, DC.
AG4.2 Legislation
Clinger-Cohen Act of 1996 (P.L. 104-208). (Note that the Clinger-Cohen Act is the amended title for
the Information Technology Management Reform Act and the Federal Acquisition Reform Act
of 1996 (P.L. 104-106).)
Information Technology Management Reform Act of 1996.
Paperwork Reduction Act of 1980 (P.L. 96-511) as amended in 1986 (P.L. 99-500) and 1995
(P.L. 104-13).
AG4.3 Executive Orders and Policy Directives
Executive Order 13011, Federal Information Technology, July 17, 1996.
Office of Management and Budget Circular Number A-130, Management of Federal Information
Resources, February 1996.
AG4.4 System Development, Operations and Maintenance Guidance Documents
Institute of Electrical and Electronics Engineers. 1994. Software Engineering. Piscataway, NJ.
U.S. Environmental Protection Agency. 1987. Data Standards for the Electronic Transmission of
Laboratory Measurement Results. EPA Directive Number 2180.2. Washington, DC.
U.S. Environmental Protection Agency. 1993. EPA Information Security Manual. EPA Directive
Number 2195. Washington, DC.
U.S. Environmental Protection Agency. 1993. EPA System Design and Development Guidance. EPA
Directive Number 2182. Washington, DC.
U.S. Environmental Protection Agency. 1993. Hardware and Software Standards. Washington, DC.
U.S. Environmental Protection Agency. 1993. Operations and Maintenance Manual. EPA Directive
Number 2181. Washington, DC.
U.S. Environmental Protection Agency. 1994. EPA Information Paper, Distributed System Management,
Draft for Comments. EPA 722/003. Washington, DC.
U.S. Environmental Protection Agency. 1995. Information Resources Management Policy Manual. EPA
Directive Number 2100. Washington, DC.
U.S. Environmental Protection Agency. 1995. Information Technology Architecture Road Map. EPA
612/002A. Washington, DC.
AG4.5 Modeling
U.S. Environmental Protection Agency. 1994. Guidance for Conducting External Peer Review of
Environmental Regulatory Models. EPA 100-B-94-001.
U.S. Environmental Protection Agency. 1994. Report of the Agency Task Force on Environmental
Regulatory Modeling. EPA 500-R-94-001.