-------
QA Project Plans (QAPPs)
Required planning documents that explain how
environmental data collection activities are planned,
implemented, documented, and assessed during the life
cycle of a specific program, project, or task
QA Project Plans are required when environmental data
operations occur for:
-Contracts, work assignments, delivery orders
- Grants, cooperative agreements
-Interagency agreements (when negotiated)
-State-EPA agreements
- Responses to statutory or regulatory requirements and to
consent agreements
QMPs vs. QAPPs
Quality Management Plans reflect activities and
policies common to all projects.
Quality Assurance Project Plans reflect specific
projects.
[Diagram: multiple project-specific QAPPs under a single QMP]
-------
Systematic Planning
Planning is the key to successful programs
EPA policy requires that all work be planned using a
Systematic Planning Process
Quality requirements should be based on a Graded
Approach
Effective planning must include all stakeholders (data
users, data producers, decision makers) to ensure
needs are defined at the outset
Planning must be documented
Data Quality Objectives (DQO) Process
Purpose:
To define type and quality of data that a
decision maker needs before carrying out
data collection
- Saves time and money
- Doing it right the first time
- Obtaining data sufficient for analysis
Responsibility:
Developed by project team of data users and
data generators
Guidance:
Guidance for the DQO Process (QA/G-4),
September 1994
Software:
DEFT Software for the DQO Process
(QA/G-4D), September 1994
OverQSys -7/00 29-30
-------
Common Elements in All Systematic
Planning Approaches
Questions to be answered:
-Who is making the decision?
-Why are data being collected?
-What data are needed to make the decision?
-Why does the decision maker need that type and
quality of data?
- How does the decision maker plan to use the data
to make a defensible decision?
-What are the "measures of success" for the
project?
Get only the type, quantity, and quality of data
necessary
Standard Operating Procedures (SOPs)
Purpose:
To document routine technical and
administrative activities to ensure
consistency in the quality of the
product
Responsibility:
Appropriate technical personnel
working with QA Manager
Documentation:
Guidance for the Preparation of
Standard Operating Procedures
(QA/G-6), November 1995
OverQSys -7/00 31-32
-------
Standard Operating Procedures (SOPs)
Written documents that give precise descriptions of
routine procedures
Detail stepwise process for sample collection
operations, laboratory analyses, or equipment use
Ensure consistency and conformance with
organizational practices
Serve as training aids on methods and instrument
use
Provide ready reference and documentation of proper
procedures
Quality Assurance Annual Report and
Work Plans (QAARWPs)
Purpose: To summarize the results of the
implementation of an EPA organization's
quality system in the previous fiscal year
and to describe QA activities planned for
the upcoming year
Responsibility: Senior Management
Documentation: EPA Quality Manual for Environmental
Programs (EPA Order 5360 A1)
-------
Quality Assurance Annual Report and
Work Plans (QAARWPs)
QAARWPs are:
Management tools for documenting the past fiscal
year's activity and for estimating the workload for the
current year
Required by EPA Order 5360.1 A2
Submitted annually (usually in November) to the
Quality Staff, Office of Environmental Information
Quality System Assessment
Management
Technical
Data Quality
OverQSys -7/00 35-36
-------
Management Assessments
Purpose: To determine conformance with an
approved QMP and to assess the
suitability and effectiveness of its
implementation
Responsibility: EPA Managers of individual organizations
Documentation: Guidance for the Management Systems
Review (MSR) Process (QA/G-3MSR)
- Management Systems Reviews
- Process Quality Audits
- Product/Service Quality Audits
-------
Quality System Audit
A documented activity performed to verify, by
examination and evaluation of objective evidence,
that applicable elements of the quality system are
appropriate and have been developed, documented,
and effectively implemented in accordance and in
conjunction with specified requirements.
Such requirements may be defined by:
-EPA Orders
-Extramural Agreement Regulations
-Approved Quality Management Plans
Quality System Audit
QSA uses a quantitative approach to documented
quality systems.
Findings are based on objective evidence.
QSA is a conformance/compliance audit:
-Does the quality system conform to specifications?
-Does the quality system comply with regulations?
-Does the quality system satisfy the QMP?
QSA does not judge the quality of individual data sets.
-------
Management Systems Review
The qualitative assessment of QA and QC practices
to establish if they conform to policies and
requirements and are adequately implemented to
satisfy needs and expectations.
Such policies and requirements may be given by:
-EPA Orders
-Extramural Agreement Regulations
-Approved Quality Management Plans
Management Systems Review
Similar to, but less quantitative than, the QSA.
Applies best to situations where the quality system is
not well documented.
Investigative in nature: seeks to determine what is
actually happening.
Interviews are the primary data collection method.
MSR does not judge the quality of individual data
sets.
-------
Process Quality Audits
A verification by evaluation of an operation or
method against documented instructions and
standards to measure conformance to these
standards and the effectiveness of the instructions.
PQA examines a PART of the quality system.
Process is largely the same as the QSA.
Process Quality Audits
PQA is typically shorter and less complex than a
QSA.
PQA is less labor intensive.
Reporting of results may be less formal and quicker.
-------
Product/Service Quality Audit
An in-depth examination of a particular product,
result, or service to evaluate whether it conforms to
the documented specifications, performance
standards, and customer/user requirements.
PSQA determines the value added by the quality
system to the results achieved.
The quality of individual data sets and other results
is integral to the PSQA, along with:
-conformance to Agency policy
-compliance with regulations
Product/Service Quality Audit
PSQA is labor intensive and typically time
consuming.
PSQA requires extensive planning to determine:
-scope of the audit
-issues to be addressed
-potential impacts or vulnerabilities of
non-compliances
-limits of time and resources
PSQA uses interviews, detailed file and case study
reviews, and analyses of data to determine value of
results.
-------
Product/Service Quality Audit
PSQA requires detailed reporting to link the results to
the audit criteria.
PSQA provides high potential for:
- Comprehensiveness
-Value to management
- Contentiousness
Technical Assessments
Purpose: To evaluate the implementation of a
project or activity against its defined
technical or quality procedures or criteria
Responsibility: Project Managers with the assistance of
the appropriate technical personnel, their
EPA Manager, and QA Manager
Documentation: Guidance on Technical Assessments for
Environmental Data Operations (QA/G-7),
January 2000
-------
Data Quality Assessment (DQA)
Determine if environmental data are of the type, quantity,
and quality needed
Scientific and statistical evaluation of data
The DQA Process may be performed:
- During a project to check the process of data collection
- At the end of a project to check if objectives were met
The DQA Process provides a tool for confirming that the
systematic planning criteria were met
SUMMARY
Authorities
Internal EPA Policies
- EPA Order 5360.1 A2
- EPA Manual 5360 A1
External Policies
- 48 CFR 46
- 40 CFR 30, 31, 35
Quality System Tools
Quality Management Plans
QA Project Plans
Standard Operating Procedures
Systematic Planning and the Data Quality
Objectives Process
Assessments
Quality Assurance Annual Report and Work Plan
-------
Technical Assessments
Technical Assessments are self or independent
evaluation processes used to measure the
conformance, performance, or effectiveness of
systems
Technical Assessments include:
-Technical Systems Audits
-Readiness Reviews
-Surveillances
- Performance Evaluations
-Audits of Data Quality
- Peer Reviews
Data Quality Assessment (DQA)
Purpose:
To assess type, quantity, and quality of data
Verifies DQOs
- Develops DQOs if not fully
developed
Verifies QAPP components
- Verifies sample collection procedures
Responsibility:
Appropriate technical personnel
Documentation:
Guidance for DQA: Practical Methods for
Data Analysis (QA/G-9), July 1996
Data Quality Evaluation Statistical Toolbox
(DataQUEST) (QA/G-9D), December 1997
-------
SUMMARY (continued)
Quality System Documents
-EPA Requirements for Quality Management Plans
(QA/R-2)
-EPA Requirements for Quality Assurance Project
Plans (QA/R-5)
-Guidance for Quality Assurance Project Plans
(QA/G-5)
-Guidance for the Data Quality Objectives Process
(QA/G-4)
-Guidance for the Preparation of Standard Operating
Procedures for Quality-Related Documents
(QA/G-6)
-------
HANDOUTS
Learning is not compulsory, but neither is survival.
- W. Edwards Deming
-------
HANDOUT #1
OVERVIEW OF QUALITY SYSTEM REQUIREMENTS
Pre-Course Self Assessment
Yes / No
1. I know what a Quality System is.
2. I can describe my organization's Quality System.
3. I know who my Quality Assurance Manager is.
4. I know EPA's Quality System requirements.
5. I understand the purpose and applicability of:
- Quality Management Plans
- Quality Assurance Project Plans
- Systematic Planning
- Assessments
- Standard Operating Procedures
-------
HANDOUT #2
Quality-Related Definitions
(From EPA Manual 5360, July 1998)
assessment - the evaluation process used to measure the performance or effectiveness of a
system and its elements. As used here, assessment is an all-inclusive term used to denote any of
the following: audit, performance evaluation, management systems review, peer review,
inspection, or surveillance.
audit (quality) - a systematic and independent examination to determine whether quality
activities and related results comply with planned arrangements and whether these arrangements
are implemented effectively and are suitable to achieve objectives.
bias - the systematic or persistent distortion of a measurement process which causes errors in one
direction (i.e., the expected sample measurement is different from the sample's true value).
calibration - comparison of a measurement standard, instrument, or item with a standard or
instrument of higher accuracy to detect and quantify inaccuracies and to report or eliminate those
inaccuracies by adjustments.
data quality assessment (DQA) - a statistical and scientific evaluation of the data set to
determine the validity and performance of the data collection design and statistical test, and to
determine the adequacy of the data set for its intended use.
data quality objectives (DQOs) - qualitative and quantitative statements derived from the DQO
Process that clarify study objectives, define the appropriate type of data, and specify tolerable
levels of potential decision errors that will be used as the basis for establishing the quality and
quantity of data needed to support decisions.
data quality objectives process - a systematic planning tool to facilitate the planning of
environmental data collection activities. Data quality objectives are the qualitative and
quantitative outputs from the DQO Process.
design - specifications, drawings, design criteria, and performance requirements. Also the result
of deliberate planning, analysis, mathematical manipulations, and design processes.
document - any compilation of information which describes, defines, specifies, reports, certifies,
requires, or provides data or results pertaining to environmental programs.
environmental conditions - the description of a physical medium (e.g., air, water, soil,
sediment) or biological system expressed in terms of its physical, chemical, radiological, or
biological characteristics.
environmental data - any measurements or information that describe environmental processes,
-------
location, or conditions; ecological or health effects and consequences; or the performance of
environmental technology. For EPA, environmental data include information collected directly
from measurements, produced from models, and compiled from other sources such as data bases
or the literature.
environmental data operations - work performed to obtain, use, or report information
pertaining to environmental processes and conditions.
environmental processes - manufactured or natural processes that produce discharges to, or that
impact, the ambient environment.
environmental programs - work or activities involving the environment, including but not
limited to: characterization of environmental processes and conditions; environmental
monitoring; environmental research and development; and the design, construction, and
operation of environmental technologies; and laboratory operations on environmental samples.
environmental technology - an all-inclusive term used to describe pollution control devices and
systems, waste treatment processes and storage facilities, and site remediation technologies and
their components that may be utilized to remove pollutants or contaminants from or prevent them
from entering the environment. Examples include wet scrubbers (air), soil washing (soil),
granulated activated carbon unit (water), and filtration (air, water). Usually, this term applies to
hardware-based systems; however, it also applies to methods or techniques used for pollution
prevention, pollutant reduction, or containment of contamination to prevent further movement of
the contaminants, such as capping, solidification or vitrification, and biological treatment.
extramural agreement - a legal agreement between EPA and an organization outside EPA for
items or services to be provided. Such agreements include contracts, work assignments, delivery
orders, task orders, cooperative agreements, research grants, state and local grants, and EPA-
funded interagency agreements.
financial assistance - the process by which funds are provided by one organization (usually
government) to another organization for the purpose of performing work or furnishing services or
items. Financial assistance mechanisms include grants, cooperative agreements, and government
interagency agreements.
graded approach - the process of basing the level of application of managerial controls applied
to an item or work according to the intended use of the results and the degree of confidence
needed in the quality of the results.
independent assessment - an assessment performed by a qualified individual, group, or
organization that is not a part of the organization directly performing and accountable for the
work being assessed.
management - those individuals directly responsible and accountable for planning,
implementing, and assessing work.
management assessment - the qualitative assessment of a particular program operation and/or
organization(s) to establish whether the prevailing quality management structure, policies,
-------
practices, and procedures are adequate for ensuring that the type and quality of results needed are
obtained. A management assessment may either be performed by those immediately responsible
for overseeing and/or performing the work (i.e., a management self-assessment) or by someone
other than the group performing the work (i.e., a management independent assessment).
management system - a structured non-technical system describing the policies, objectives,
principles, organizational authority, responsibilities, accountability, and implementation plan of
an organization for conducting work and producing items and services.
management systems review (MSR) - the qualitative assessment of a data collection operation
and/or organization(s) to establish whether the prevailing quality management structure, policies,
practices, and procedures are adequate for ensuring that the type and quality of data needed are
obtained.
measurement and testing equipment - tools, gauges, instruments, sampling devices or systems
used to calibrate, measure, test, or inspect in order to control or acquire data to verify
conformance to specified requirements.
method - a body of procedures and techniques for performing an activity (e.g., sampling,
chemical analysis, quantification) systematically presented in the order in which they are to be
executed.
observation - an assessment conclusion that identifies a condition (either positive or negative)
which does not represent a significant impact on an item or activity. An observation may
identify a condition which does not yet cause a degradation of quality.
organization - a company, corporation, firm, enterprise, or institution, or part thereof, whether
incorporated or not, public or private, that has its own functions and administration. In the
context of this Manual, an EPA organization is an office, region, national center or laboratory.
peer review - a documented critical review of work by qualified individuals (or organizations)
who are independent of those who performed the work, but are collectively equivalent in
technical expertise. A peer review is conducted to ensure that activities are technically adequate,
competently performed, properly documented, and satisfy established technical and quality
requirements. The peer review is an in-depth assessment of the assumptions, calculations,
extrapolations, alternate interpretations, methodology, acceptance criteria, and conclusions
pertaining to specific work and of the documentation that supports them.
performance evaluation (PE) - a type of audit in which the quantitative data generated in a
measurement system are obtained independently and compared with routinely obtained data to
evaluate the proficiency of an analyst or laboratory.
-------
precision - a measure of mutual agreement among individual measurements of the same
property, usually under prescribed similar conditions, expressed generally in terms of the
standard deviation.
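[Illustrative aside, not part of the EPA Manual 5360 text: the short sketch below computes the quantities named in the bias and precision definitions from a set of replicate measurements. The replicate values and the true value are made-up assumptions.]

```python
import statistics

# Hypothetical replicate measurements of a check standard whose true
# value is 100 units; the numbers are illustrative only.
replicates = [98.2, 101.5, 99.7, 100.9, 97.8]
true_value = 100.0

mean = statistics.mean(replicates)
std_dev = statistics.stdev(replicates)     # precision expressed as the standard deviation
rsd_percent = 100 * std_dev / mean         # relative standard deviation, a common alternative
bias = mean - true_value                   # systematic offset from the true value

print(f"precision (std dev) = {std_dev:.2f}, RSD = {rsd_percent:.1f}%, bias = {bias:.2f}")
```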
process - a set of interrelated resources and activities which transforms inputs into outputs.
Examples of processes include analysis, design, data collection, operation, fabrication, and
calculation.
quality - the totality of features and characteristics of a product or service that bear on its ability
to meet the stated or implied needs and expectations of the user.
quality assurance (QA) - an integrated system of management activities involving planning,
implementation, documentation, assessment, reporting, and quality improvement to ensure that a
process, item, or service is of the type and quality needed and expected by the customer.
quality assurance manager (QAM) - the individual designated as the principal manager within
the organization having management oversight and responsibilities for planning, documenting,
coordinating, and assessing the effectiveness of the quality system for the organization.
quality assurance project plan (QAPP) - a document describing in comprehensive detail the
necessary QA, QC, and other technical activities that must be implemented to ensure that the
results of the work performed will satisfy the stated performance criteria.
quality control (QC) - the overall system of technical activities that measures the attributes and
performance of a process, item, or service against defined standards to verify that they meet the
stated requirements established by the customer; operational techniques and activities that are
used to fulfill requirements for quality.
quality improvement - a management program for improving the quality of operations. Such
management programs generally entail a formal mechanism for encouraging worker
recommendations with timely management evaluation and feedback or implementation.
quality management - that aspect of the overall management system of the organization that
determines and implements the quality policy. Quality management includes strategic planning,
allocation of resources, and other systematic activities (e.g., planning, implementation,
documentation, and assessment) pertaining to the quality system.
quality management plan (QMP) - a document that describes a quality system in terms of the
organizational structure, policy and procedures, functional responsibilities of management and
staff, lines of authority, and required interfaces for those planning, implementing, documenting,
and assessing all activities conducted.
-------
quality system - a structured and documented management system describing the policies,
objectives, principles, organizational authority, responsibilities, accountability, and
implementation plan of an organization for ensuring quality in its work processes, products
(items), and services. The quality system provides the framework for planning, implementing,
documenting, and assessing work performed by the organization and for carrying out required
QA and QC.
readiness review - a systematic, documented review of the readiness for the start-up or
continued use of a facility, process, or activity. Readiness reviews are typically conducted before
proceeding beyond project milestones and prior to initiation of a major phase of work.
record - a completed document that provides objective evidence of an item or process. Records
may include photographs, drawings, magnetic tape, and other data recording media.
scientific method - the principles and processes regarded as necessary for scientific
investigation, including rules for concept or hypothesis formulation, conduct of experiments, and
validation of hypotheses by analysis of observations.
self-assessment - assessments of work conducted by individuals, groups, or organizations
directly responsible for overseeing and/or performing the work.
standard operating procedure (SOP) - a written document that details the method for an
operation, analysis, or action with thoroughly prescribed techniques and steps, and that is
officially approved as the method for performing certain routine or repetitive tasks.
supplier - any individual or organization furnishing items or services or performing work
according to a procurement document or financial assistance agreement. This is an all-inclusive
term used in place of any of the following: vendor, seller, contractor, subcontractor, fabricator, or
consultant.
surveillance (quality) - continual or frequent monitoring and verification of the status of an
entity and the analysis of records to ensure that specified requirements are being fulfilled.
technical assessment - the evaluation process used to measure the performance or effectiveness
of a technical system and its elements with respect to documented specifications and objectives.
Such assessments may include qualitative and quantitative evaluations. A technical assessment
may either be performed by those immediately responsible for overseeing and/or performing the
work (i.e., a technical self-assessment) or by someone other than the group performing the work
(i.e., a technical independent assessment).
-------
technical review - a documented critical review of work that has been performed within the state
of the art. The review is accomplished by one or more qualified reviewers who are independent
of those who performed the work, but are collectively equivalent in technical expertise to those
who performed the original work. The review is an in-depth analysis and evaluation of
documents, activities, material, data, or items that require technical verification or validation for
applicability, correctness, adequacy, completeness, and assurance that established requirements
are satisfied.
technical systems audit (TSA) - a thorough, systematic, on-site, qualitative audit of facilities,
equipment, personnel, training, procedures, record keeping, data validation, data management,
and reporting aspects of a system.
user - an organization, group, or individual that utilizes the results or products from
environmental programs or a customer for whom the results or products were collected or
created.
validation - confirmation by examination and provision of objective evidence that the particular
requirements for a specific intended use are fulfilled. In design and development, validation
concerns the process of examining a product or result to determine conformance to user needs.
verification - confirmation by examination and provision of objective evidence that specified
requirements have been fulfilled. In design and development, verification concerns the process
of examining a result of a given activity to determine conformance to the stated requirements for
that activity.
-------
HANDOUT #3
Overview of Quality System Requirements
-------
HANDOUT #4
E4 Structure
ANSI/ASQC E4-1994, Specifications and Guidelines for
Quality Systems for Environmental Data Collection
and Environmental Technology Programs
EPA has adopted the American National Standard ANSI/ASQC E4-1994, Specifications and
Guidelines for Quality Systems for Environmental Data Collection and Environmental
Technology Programs, as the basis for its quality system. E4 is a national consensus standard
authorized by the American National Standards Institute (ANSI) and developed by the American
Society for Quality (ASQ).
The standard is modular in design and is organized into three parts. Part A describes the quality
management elements that are common to environmental programs regardless of their technical
scope. The other parts of the standard contain the quality system elements applicable to technical
areas. The specific applicability of the standard (or parts thereof) to individual environmental
programs is left to the user of the standard to determine.
Part A describes the quality management elements needed for managing environmental programs
effectively. These include:
management and organization,
quality system and description,
personnel qualification and training
procurement of items and services,
documentation and records,
computer hardware and software,
planning,
implementation of work processes,
assessment and response, and
quality improvement.
Part A defines the framework containing the common quality management practices that enable
project-specific operations to be planned, implemented, and assessed. These elements are used
in conjunction with the other parts of the standard to formulate a complete quality system.
Part B contains the additional quality system elements needed to plan, implement, and assess
environmentally-related data operations, including the collection, handling, analysis, and
evaluation of environmentally-related data. The Part B elements must be used in conjunction
with Part A in order to provide an adequate quality system for collecting and evaluating
environmental data. Such data include chemical, biological, toxicological, ecological, radiological,
and physical data. These data may be obtained directly from the environment or from
systems representing environmental conditions, such as laboratories or test chambers. The
activities described in Part B have traditionally been associated with environmental monitoring.
-------
Part B elements also apply to the collection of environmental data that are used directly to
design, construct, or operate environmental technology. Environmental data also include data
derived from samples collected from the environment, the results of other analytical testing
(e.g., geophysical, hydrogeological) of environmental conditions, and process or physical
parameters from the operation of environmental technologies. The program elements contained
in Part B are:
planning and scoping,
design of data collection operations,
implementation of planned operations,
quality assessment and response, and
assessment and verification of data usability.
Part C provides the additional quality system elements pertaining to environmental technology
(and their system components) that remediate environmental contamination, prevent or remove
pollutants from process discharges, or dispose of or store hazardous, radioactive, and/or mixed
wastes. The Part C elements must be used in conjunction with Part A to provide an adequate
quality system for the design, construction, and operation of environmental technology. The
program elements contained in Part C are:
planning,
design of systems,
construction/fabrication of systems and components,
operation of systems,
quality assessment and response, and
verification and acceptance of systems.
The Part C elements describe the project-specific activities needed to plan, implement, and assess
the design, construction, and operation of such technologies, and to ensure that the technologies
will perform as intended. Environmental process or condition characterization activities that
produce data used in support of the design, construction, and operation of environmental
technology must be conducted according to the specifications of Part B.
Copies of ANSI/ASQC E4 may be purchased from:
ASQC Quality Press
P.O. Box 3005
Milwaukee, WI 53201-3005
Phone: (800) 248-1946
www.asq.org
-------
HANDOUT #5
Determining the Quality Requirements for Financial Agreements with EPA
Contractor:
  Contract - 48 CFR 1546, 48 CFR 46
  Cooperative Agreement - N/A
  Grant* - N/A
  Inter-Agency Agreement - N/A
  Unfunded Mandate - N/A
Federal Agency:
  Contract - N/A
  Cooperative Agreement - N/A
  Grant* - N/A
  Inter-Agency Agreement - Negotiated into each agreement
  Unfunded Mandate - Contained in specific Federal Regulation that requires data collection
Hospital:
  Contract - 48 CFR 1546, 48 CFR 46
  Cooperative Agreement - 40 CFR 30
  Grant* - 40 CFR 30
  Inter-Agency Agreement - N/A
  Unfunded Mandate - Contained in specific Federal Regulation that requires data collection
Institute of Higher Education:
  Contract - 48 CFR 1546, 48 CFR 46
  Cooperative Agreement - 40 CFR 30
  Grant* - 40 CFR 30
  Inter-Agency Agreement - N/A
  Unfunded Mandate - Contained in specific Federal Regulation that requires data collection
Local Government:
  Contract - 48 CFR 1546, 48 CFR 46
  Cooperative Agreement - 40 CFR 31, 40 CFR 35
  Grant* - 40 CFR 31, 40 CFR 35
  Inter-Agency Agreement - N/A
  Unfunded Mandate - Contained in specific Federal Regulation that requires data collection
Non-profit Organization:
  Contract - 48 CFR 1546, 48 CFR 46
  Cooperative Agreement - 40 CFR 30
  Grant* - 40 CFR 30
  Inter-Agency Agreement - N/A
  Unfunded Mandate - Contained in specific Federal Regulation that requires data collection
Regulated Entity:
  Contract - N/A
  Cooperative Agreement - N/A
  Grant* - N/A
  Inter-Agency Agreement - N/A
  Unfunded Mandate - Contained in specific Federal Regulation that requires data collection
State Government:
  Contract - 48 CFR 1546, 48 CFR 46
  Cooperative Agreement - 40 CFR 31, 40 CFR 35
  Grant* - 40 CFR 31, 40 CFR 35
  Inter-Agency Agreement - N/A
  Unfunded Mandate - Contained in specific Federal Regulation that requires data collection
Tribal Government:
  Contract - 48 CFR 1546, 48 CFR 46
  Cooperative Agreement - 40 CFR 31, 40 CFR 35
  Grant* - 40 CFR 31, 40 CFR 35
  Inter-Agency Agreement - N/A
  Unfunded Mandate - Contained in specific Federal Regulation that requires data collection
*Grants include Performance Partnership Grants and Performance Partnership Agreements.
-------
Federal Regulations
48 CFR 1546: Requires the development of a Quality Management Plan and/or a Quality
Assurance Project Plan. This regulation will be removed pending notice in the Federal Register.
48 CFR 46: Allows Federal Agencies to select a national consensus standard as a basis for their
quality requirements. EPA intends to select ANSI/ASQC E4 as the basis for its quality
requirements and require that applicants/contractors, through revised clauses, submit a Quality
Management Plan (or equivalent) and/or a Quality Assurance Project Plan (or equivalent) to
demonstrate conformance to the standard. The selection of E4, the revised contracting clauses,
and the removal of 48 CFR 1546 will be effective pending notice in the Federal Register.
40 CFR 30: Grantee must comply with the American National Standard, ANSI/ASQC E4. EPA
requires that grantees submit a Quality Management Plan and/or a Quality Assurance Project
Plan to demonstrate conformance.
40 CFR 31: Requires grantee to develop and implement quality assurance practices to produce
data of adequate quality to meet project objectives. To clarify this requirement, EPA has issued
clarifying language, posted at www.epa.gov/ogd/Qa.htm, which is consistent with 40 CFR Part
30. In essence, the clarifying language states that a grantee must have a quality system that
conforms to the American National Standard, ANSI/ASQC E4-1994 and is required to submit a
Quality Management Plan and/or a Quality Assurance Project Plan.
40 CFR 35: Requires grantee to comply with 40 CFR 31. For the full text of 40 CFR 35, see
qa_cfrs.html#40PART35.
-------
OCE
Revision No.: 0
Date: October 1, 1994
Page: i of iii
QUALITY MANAGEMENT PLAN
FOR
THE OFFICE FOR A CLEAN ENVIRONMENT
(Document Control No. QA-OCE-01)
Office for a Clean Environment
Office of the Assistant Administrator
for Ensuring a Clean Environment
U. S. Environmental Protection Agency
October 1, 1994
-------
QUALITY MANAGEMENT PLAN (QMP) EXAMPLE
INSTRUCTIONS:
You have just been appointed Director of the Office for a Clean Environment (OCE). Among the
items waiting on your desk for approval is your new office's Quality Management Plan
submission. You study this document carefully in order to learn how a key component of your
organization is structured.
Since you understand the importance of Quality Management Plans (QMPs) and know what
belongs in them, it's easy for you to determine that the document you are reviewing is a dreadful
mess. There are at least ten fundamental flaws in this plan. Your new office clearly needs help!
In order to focus on the necessary revision process, you should now do the following:
1. On a piece of paper, list the major deficiencies in this QMP.
2. Briefly describe how to correct each deficiency. After you have
listed the plan's deficiencies and described how to correct them, be
prepared, as part of the ensuing class discussion, to discuss them.
3. Discuss some consequences to your organization's quality
assurance effort if the deficiencies are left uncorrected.
-------
OCE
Revision No.: 0
Date: October 1, 1994
Page: ii of iii
QUALITY MANAGEMENT PLAN IDENTIFICATION AND APPROVAL FORM
Document Title:
Document Control No.:
Organization Title:
Address:
Responsible Official:
QA Manager:
Date:
Concurrence:
Title:
Date:
Quality Management Plan
QA-OCE-01
Office for a Clean Environment
P. O. Box 12345
Washington, DC 20460
Your name
Phone No. 555-1212
James MacArthur
Phone No. 555-1414
October 1, 1994
Your name
Director, Office for a Clean Environment
October 3, 1994
Approval for the Agency:
Title:
Robert J. Huggett, Ph.D.
Assistant Administrator for Research and Development
Date:
-------
OCE
Revision No 0
Date- Octobei 1. 1994
Paae in of in
TABLE OF CONTENTS

SECTION                                              DATE OF LAST REVISION
Cover Page                                           October 1, 1994
Identification and Approval Page                     October 1, 1994
Table of Contents                                    October 1, 1994
1. Management and Organization                       October 1, 1994
2. Quality System and Description                    October 1, 1994
3. Personnel Qualification and Training              October 1, 1994
4. Procurement of Items and Services                 October 1, 1994
5. Documents and Records                             October 1, 1994
6. Computer Hardware and Software                    October 1, 1994
7. Planning                                          October 1, 1994
8. Implementation of Work Processes                  October 1, 1994
9. Assessment and Response                           October 1, 1994
10. Quality Improvement                              October 1, 1994
-------
OCE
Revision No.: 0
Date: October 1, 1994
Page: 1 of 4
1. MANAGEMENT AND ORGANIZATION
1.1 ENVIRONMENTAL DATA COLLECTION ACTIVITIES (EDCAs)
OCE is responsible for ensuring that all aspects of the environment we live in are clean and
healthy. To carry out its mission, OCE is organized into five divisions: 1) the Division of Clean
Air; 2) the Division of Clean Water; 3) the Division of Clean Land; 4) the Division of Research
and Development; and 5) the Division of Administration and Support. Each division is headed
by a Division Director who reports directly to the Director, OCE. The Director, OCE,
determines which aspects of the environment are not clean and healthy and reports these findings
to the Assistant Administrator, EPA. OCE's Quality Assurance Officer reports directly to the
Director of the Division of Administration and Support.
Data collection activities are carried out under the direction of the responsible Division Director,
who communicates the results of these activities to the Director, OCE.
1.1.1 In-House Projects
Intramural projects involving environmentally-related measurements are conducted by the
Divisions of Clean Air, Clean Water, and Clean Land. The Division Director determines which
projects require QA Project Plans.
1.1.2 Extramural Projects
Extramural projects involving environmentally-related measurements are also conducted by
independent contractors. The Division Director designates a staff person to oversee the
implementation of these projects. The designated individual determines the need to develop QA
Project Plans.
1.2 DATA GENERATION DELEGATED TO REGIONAL OFFICES
OCE periodically meets with Regional personnel on QA issues to assure that the Regional
Offices give appropriate priority to QA.
1.3 QA RESPONSIBILITIES
1.3.1 Organization, Delegations, and Responsibilities
The Director, OCE, is responsible for ensuring that QA is an integral part of OCE operations.
Division Directors are responsible for overseeing the collection of data derived from
environmentally-related measurements whose quality is known. Management of in-house and
extramural projects is delegated to a responsible staff person.
-------
OCE
Revision No.: 0
Date: October 1, 1994
Page: 2 of 4
1.3.2 Responsibilities/Authorities of the QAO
The Quality Assurance Officer (QAO) is an important link in the management and
implementation of OCE's QA program. The QAO reports directly to the Chief, Administration
Section in the Division of Administration and Support. The QAO is responsible for ensuring that
OCE has an approved Quality Management Plan.
2. QUALITY SYSTEM AND DESCRIPTION
2.1 QA POLICY STATEMENT
It is the policy of the Office for a Clean Environment (OCE) to ensure the generation of data
derived from the environmentally-related measurements whose quality (i.e., precision, bias,
completeness, representativeness, and comparability) is known. The Office's quality assurance
effort is accomplished through the development and implementation of a Quality Management
Plan.
This document sets forth the QA policies, procedures, and management systems needed by OCE
to implement its QA program.
2.2 QA PROGRAM TOOLS
The OCE uses the latest QA tools available to obtain good data. These tools include QA Project
Plans (QAPPs), audits, and Standard Operating Procedures (SOPs). The OCE always uses the
latest guidance from headquarters on these QA tools and adds changes to the guidance for
application to specific projects as determined by the project officer.
3 PERSONNEL QUALIFICATIONS AND TRAINING
OCE has a trained and competent staff capable of ensuring that the Office's QA effort is carried
out efficiently and in a timely manner. Consequently, there is no need for a formal QA training
program.
4 PROCUREMENT OF ITEMS AND SERVICES
The OCE follows the procurement regulations to the letter in acquiring any necessary items or
services. The project officer determines if QA is required on any procurement and tells the
procurement office what should be included in the procurement.
-------
OCE
Revision No.: 0
Date: October 1, 1994
Page: 3 of 4
5 DOCUMENTS AND RECORDS
5.1 DOCUMENTS
The OCE keeps its documents in a library where all staff can have access to them, including
documents prepared by contractors and others.
5.2 RECORDS
The official records for a contract are kept by the Contracts Officer. Other records may be kept
by the Project Officer until a project is finished. The Project Officer determines how long to
retain the records and how to dispose of them.
6. COMPUTER HARDWARE AND SOFTWARE
The OCE uses the latest in personal computers to perform project work. Project officers use PCs
to evaluate data collected and to track projects with spreadsheets. When too many data are
received, the data are loaded into the EPA mainframe computer so that models can be used on
that data.
The OCE has a cooperative agreement with Whatsamatta U. that provides a Ph.D. statistician and
two part-time statistics students to perform analyses of data collected and audit results.
7. PLANNING
7.1 PLANNING FOR DATA GENERATION
7.1.1 Data Quality Objectives
Data Quality Objectives (DQOs) are established for many projects conducted by the Divisions of
Clean Air, Clean Land, and Clean Water. The decision to develop DQOs for a specific project
rests with the Division Directors. When the decision is made to develop DQOs for a particular
project, they are developed by the Project Officer with lead responsibility for the activity. (DQOs
are based primarily on the capabilities and limitations of the applicable equipment and
measurement methods.)
7.2 ANNUAL PLANNING
The Director, Division of Administration and Support, meets annually with the QAO to plan
OCE quality assurance efforts for the year. Although QA activities are not a line item in the
OCE budget, the QAO tries to ensure that adequate resources are provided for OCE's quality
assurance program.
-------
OCE
Revision No.: 0
Date: October 1, 1994
Page: 4 of 4
8 IMPLEMENTATION OF WORK PROCESSES
The Project Officer is responsible for implementing the projects of the OCE through contracts,
work assignments, cooperative agreements, and grants.
9 ASSESSMENT AND RESPONSE
9.1 THE AUDIT/REVIEW PROGRAM
OCE periodically conducts audits of its QA program and data collection activities. The Division
of Administration and Support schedules and conducts the following audits on an "as needed"
basis:
Data quality audits;
Performance evaluations;
Technical systems audits.
Follow-up of audit results is left up to the appropriate Project Officer.
10. QUALITY IMPROVEMENT
The Director, OCE, reviews the performance of Project Officers in the Office to determine where
improvements can be made. Division Directors determine when corrective actions are needed to
improve work.
-------
INTEGRATING QUALITY ASSURANCE
INTO PROJECT DEVELOPMENT
AGENDA
August 9, 2000
8:30 Introductions and Welcome
     Implementing the Quality System
     BREAK
     QAPP Part A: Project Management and Sampling Design
     Developing DQOs: Steps 1-5
     Boundaries Setting Activity
12:00 LUNCH
1:00 DQO Process Step 6: Setting Limits on Decision Errors
     DQO Step 7/Sampling Design Process
     BREAK
     Sampling Design (DEFT) Exercise
5:00 Dismiss
-------
-------
Integrating Quality Assurance
into Project Development
Trainers
Malcolm Bertoni
Alison Boshes
Research Triangle Institute
202/728-2067
Nancy Hassig
Battelle Pacific Northwest Division
650/969-3969
-------
Overview
This course is advanced and requires prerequisite
training and/or experience.
Day One
- How do you plan to collect the right data?
- How are your project and data quality objectives
translated into a sampling design?
Day Two
- How do you document your planning and
implementation activities in the project's QAPP?
- How do you verify that the data you collected
met the assumptions you made during planning?
- What do your data tell you?
Learning Objectives
1. Participants will be able to explain the value of
systematic processes in developing QAPPs.
2. Participants will be able to describe the links
between DQOs, DQIs, DQA, and the QAPP.
3. Participants will be aware of and be able to use
resources and tools, including: G-4, G-4D (DEFT),
G-5, G-9, and G-9D (DataQUEST).
4. Participants will be able to explain how the outputs
of DQO Process Steps 6 and 7, precision and bias
assumptions, and distributional assumptions are
used to develop a sample collection design.
-------
Approach
This training workshop is interactive. It will consist
of small and large group discussions, group and
individual exercises, lecture, and demonstrations.
You will work in teams. Your team is comprised of
the people sitting around your table. Together, you
will complete part of the DQO Process, develop a
sampling design, prepare sections of a QAPP, and
analyze data.
The site you will be working with during the training
exercises is a simulated hazardous waste site. It is
called the Electronic Manufacturing Corporation of
America (EMCA) site, also known as SimSITE and is
based on an actual Superfund site.
Day 1
8:30 Introductions and welcome
Implementing the Quality System
Quality System exercise
BREAK
QAPP: project management & sampling design
DQO Process Steps 1 -5
Boundaries setting activity
12:00 LUNCH
1:00 DQO Process Step 6
DQO Process Step 7
BREAK
Sampling design exercise
5:00 Summarize and dismiss
-------
Map of EMCA site
[Site map figure: dirt road area]
Oil containing PCBs (> 50 ppm) sprayed over dirt road
to control dust
PCBs are carcinogens and cause harmful liver,
skin, reproductive, and developmental effects
Contaminated media: surface soil
Primary pathways: direct ingestion of soil,
inhalation of dust
Receptors: adults, children
-------
Regulatory Context
Superfund site (CERCLA/SARA)
~ Site is on National Priorities List
Currently in early Remedial Investigation stage
Preliminary Assessment / Site Inspection
determined the types of contamination
Candidate for the streamlined approach advocated
by the EPA Region
Focus will be on one operable unit of the site
Sociopolitical Context
Nearby residential development has a homeowners
association: some members are active in
environmental advocacy
Mayor and majority of City Council interested in
redevelopment of site
Residential developer and Fortune 500 manufacturer
both interested in property
EPA Regional Administrator is interested in
promoting Superfund streamlining process
-------
Project Resources and Constraints
Site manager has access to EPA scientists,
contractors, and consultants with expertise in a
variety of scientific and engineering disciplines
Project funded by Superfund; preliminary field
investigation budget of $200K for PCB area
Schedule calls for draft report on field investigation
results in 6 months
Writing Activity
Turn to the back of your notebook and locate the
participant's journal.
Answer the following questions in the space provided on
page J-1.
What are the one or two most pressing or important QA
issues that you have to deal with in your work?
Considering the objectives, agenda, and your personal
experience, list some expectations you have of this course.
-------
Warm-up Activity
At your tables, please introduce yourself to your
teammates by giving the following information:
- Name,
- Affiliation, and
- A brief summary of your technical
background.
(please be brief)
-------
Implementing the Quality
System
What is a Quality System?
A structured and documented management system.
Describes the policies, objectives, principles,
organizational authority, responsibility,
accountability, and implementation plan of an
organization for ensuring quality in its work
processes, products (items), and services.
Provides the framework for planning, implementing,
and assessing work performed by the organization
and for carrying out required QA and QC activities.
-------
EPA Quality System Model
[Diagram: Internal EPA Policies (EPA Order 5360.1, EPA Manual 5360), Quality System Documentation, and Supporting System Elements: Training/Communication, Annual Review and Planning, Systems Assessments, Systematic Planning]
-------
Graded Approach to QA
The Graded Approach defines quality
requirements according to:
- The type of work being done; and
- The risk of making a wrong decision from
the data collected
Examples of Varying Quality Requirements:
- Bench-level research investigation
- Superfund remedial investigation
- Enforcement/compliance determination
QA in a Project's Lifecycle
There are three key phases to a project:
- Planning (Data Quality Objectives)
- Implementation (QA Project Plans)
- Assessment (Data Quality Assessment)
-------
QAPPs are Required
QAPPs are required for all environmental data
collection operations involving direct
measurements performed by or for EPA.
(EPA Order 5360.1 CHG 1 (July 1998), "Policy and Program
Requirements to Implement the Mandatory Quality
Assurance Program")
QAPPs must demonstrate...
The project's technical and quality objectives
are identified and agreed upon by management
Intended measurements and acquisition
methods are consistent with project objectives
Assessment procedures are sufficient for
determining if data are of type and quality
needed for decision-making
Limitations on the use of data are identified
-------
Data Quality Objectives Process
The DQO Process is the Agency's
recommended planning process for decision
making that provides for:
- Early involvement of the decision maker
- A graded approach to data quality
requirements
- More effective sampling and analysis
programs
- A basis for judging the usability of the
collected data
Data Quality Objectives (DQOs)
The full set of planning information needed to design
a project, including:
- What decision will be made?
- What data will be gathered?
- Where and when?
- How will data be used?
- What quality must the data be?
- How much time and money is available and
needed?
- What are tolerable levels of uncertainty?
This information can be used to produce an optimal
study design meeting the decision maker's needs in a
resource-effective manner
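The tolerable uncertainty specified in DQO Steps 6 and 7 is what drives the number of samples in the design; DEFT automates that calculation. As a rough, hedged illustration only, the sketch below applies the standard normal-approximation formula for the number of samples needed for a one-sample test of a mean against an action level. The action level, gray region, and standard deviation shown are made-up assumptions, not outputs of DEFT or values from any EPA project.

```python
import math
from scipy import stats

def approx_sample_size(action_level, gray_region_lower, sigma, alpha=0.05, beta=0.20):
    """Approximate n for a one-sample test of the mean against an action level.

    Normal-approximation formula:
        n = ((z_{1-alpha} + z_{1-beta}) * sigma / delta)^2 + 0.5 * z_{1-alpha}^2
    where delta is the width of the gray region.
    """
    delta = action_level - gray_region_lower          # width of the gray region
    z_a = stats.norm.ppf(1 - alpha)                   # controls the false rejection rate
    z_b = stats.norm.ppf(1 - beta)                    # controls the false acceptance rate
    n = ((z_a + z_b) * sigma / delta) ** 2 + 0.5 * z_a ** 2
    return math.ceil(n)                               # round up to be conservative

# Hypothetical inputs: 50 ppm action level, gray region from 40 to 50 ppm,
# estimated standard deviation of 15 ppm.
print(approx_sample_size(action_level=50, gray_region_lower=40, sigma=15))
```

Tightening the tolerable error rates or narrowing the gray region in this sketch increases the required sample size, which is the trade-off the DQO team negotiates in Steps 6 and 7.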
-------
Seven Steps of the DQO Process
1. State the Problem
2. Identify the Decision
3. Identify Inputs to the Decision
4. Define the Boundaries
5. Develop a Decision Rule
6. Specify Limits on Decision Errors
7. Optimize the Design
QA Project Plans (QAPPs)
QAPPs must be approved by EPA prior to the
start of the data collection
QAPPs are mandatory when environmental data
operations occur for:
Contracts, work assignments, and delivery
orders
Assistance agreements
Interagency agreements (when negotiated)
-------
QAPP Structure
QAPP is composed of 24 elements
Elements are grouped into 4 classes:
Part A: Project Management
Part B: Measurement/Data Acquisition
Part C: Assessment/Oversight
Part D: Data Validation and Usability
Not all projects require 24 elements
Other projects may require additional
information not in 24 elements
Integrating QA Processes with
Documentation Requirements
[Diagram: process activities, documentation, and the information flow between them]
-------
DQA Process - Retrospective
Were the data quality objectives achieved?
Were the data quality objectives
meaningful?
Were the assumptions viable?
Were the statistical tests powerful enough?
DQA Process - Prospective
What supplemental information is needed?
Are further data really necessary?
Should parts of the DQOs be changed?
Where can the DQO process be improved?
-------
Steps of the DQA Process
1. Review the DQOs and Sampling Design
2. Conduct Preliminary Data Review
3. Select a Statistical Test
4. Verify Assumptions Underlying Test
5. Draw Conclusions from the Data
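As a hedged illustration of Steps 2 through 5 above (not an excerpt from QA/G-9 or DataQUEST), the sketch below runs a one-sample t-test of measured concentrations against an action level. The measurement values, action level, and significance level are made-up assumptions.

```python
import numpy as np
from scipy import stats

# Hypothetical surface-soil PCB measurements (ppm) and action level;
# these numbers are illustrative only.
measurements = np.array([12.0, 48.5, 33.1, 27.4, 55.2, 18.9, 41.0, 22.3])
action_level = 50.0
alpha = 0.05

# Step 2 (preliminary data review): summary statistics.
print("mean =", measurements.mean(), "std dev =", measurements.std(ddof=1))

# Steps 3-4: the one-sample t-test assumes approximately normal data;
# a normality check (e.g., Shapiro-Wilk) helps verify that assumption.
print("Shapiro-Wilk p-value =", stats.shapiro(measurements).pvalue)

# Step 5: test whether the mean concentration is below the action level.
t_stat, p_value = stats.ttest_1samp(measurements, action_level, alternative="less")
if p_value < alpha:
    print("Conclude the mean concentration is below the action level.")
else:
    print("Cannot conclude the mean is below the action level; the null hypothesis is retained.")
```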
EPA Guidance
Planning: EPA QA/G-4, Guidance for the
Data Quality Objectives Process
Implementation: EPA QA/G-5, EPA Guidance
for Quality Assurance Project Plans
Assessment: EPA QA/G-9, Guidance for
Data Quality Assessment
-------
Exercise
Turn to page J-2 of your participant handbook.
1. Circle the component(s) of the Quality System with which
you have the most experience or for which you have the
most responsibility.
2. Think about your answer to question 1 on page J-1 and draw
a box around the components involved in addressing the
issues you identified as especially pressing or important.
Discuss your answers at your tables.
Day One- Planning
QAPP Part A
DQO Process
Sampling design
-------
Day Two- Implementation and
Assessment
Data Quality Indicators
QAPP Part B
QAPP Parts C and D
Data Quality Assessment
Application Planning
Summary
The EPA Quality System captures the data life
cycle (planning, implementation, assessment,
and documentation) at three levels:
policy
organization
project
This course focuses on the elements of the
quality system at the project level.
-------
Quality Assurance Project Plan:
Project Management and
Sampling Design
Integrating QA Processes with
Documentation Requirements
[Diagram: information flow from the DQO Process through the QAPP (Parts A-D), data collection, DQA, and the project report]
-------
QAPP Documents
Requirement document:
EPA QA/R-5 EPA Requirements for Quality
Assurance Project Plans for Environmental Data
Operations
Guidance documents:
EPA QA/G-5 EPA Guidance for Quality Assurance
Project Plans
EPA QA/G-5S Guidance on Sampling Design to
Support Quality Assurance Project Plans (Draft)
EPA QA/G-5i Data Quality Indicators (Draft)
QAPP Structure
QAPP is composed of 24 elements
Elements are grouped into 4 classes:
Part A: Project Management
Part B: Measurement/Data Acquisition
Part C: Assessment/Oversight
Part D: Data Validation and Usability
Not all projects require 24 elements
Other projects may require additional information
not in 24 elements
-------
QAPP Structure
A. Project Management
Project history and objectives, roles and responsibilities
of participants
A1 - Title and Approval Sheet
A2 - Table of Contents and Document Control Format
A3 - Distribution List
A4 - Project/Task Organization
A5 - Problem Definition/Background
A6 - Project/Task Description and Schedule
A7 - Quality Objectives and Criteria for Measurement Data
A8 - Special Training Requirements/Certification
A9 - Documentation and Records
Seven Steps of the DQO Process
1. State the Problem
2. Identify the Decision
3. Identify Inputs to the Decision
4. Define the Boundaries
5. Develop a Decision Rule
6. Specify Limits on Decision Errors
7. Optimize the Design
-------
DQO and QAPP Linkage
[Matrix mapping QAPP Part A elements (A1-A9) to DQO Process Steps 1-7; the individual linkages are detailed on the following slides]
Source: EPA QA/G-5
A4: Project/Task Organization
Identifies key individuals, with their responsibilities
(data users, decision-makers, project QA manager,
subcontractors, etc.)
DQO Step 1, State the Problem
Organization chart showing lines of authority and
reporting responsibilities
-------
A5: Problem Definition/Background
Clearly states the problem or decision to be resolved
DQO Step 2, Identify the Decision
Provides historical and background information
DQO Step 1, State the Problem
A6: Project/Task Description
and Schedule
Lists measurements to be made
DQO Step 3, Identify Inputs to the Decision;
DQO Step 5, Develop a Decision Rule
Cites applicable technical, regulatory, or program-specific
quality standards, criteria, or objectives
DQO Step 3, Identify Inputs to the Decision;
DQO Step 6, Specify Tolerable Limits on Decision Errors
Specifies special personnel or equipment requirements
DQO Step 4, Define Boundaries
Provides work schedule
Specifies required project and QA records/reports
-------
A7: Quality Objectives and Criteria
for Measurement Data
States project objectives and limits, both qualitatively and
quantitatively
DQO Step 5, Develop a Decision Rule;
DQO Step 6, Specify Tolerable Limits on Decision Errors;
DQO Step 7, Optimize the Design
States and characterizes measurement quality objectives
for the applicable action levels or criteria
DQO Step 6, Specify Tolerable Limits on Decision Errors;
DQO Step 7, Optimize the Design
QAPP Structure
B. Measurement/Data Acquisition
B1 - Sampling Process Design
B2 - Sampling Methods Requirements
B3 - Sampling Handling and Custody Requirements
B4 - Analytical Methods Requirements
B5 - Quality Control Requirements
B6 - Instrument/Equipment Testing, Inspection, and
Maintenance Requirements
B7 - Instrument Calibration and Frequency
B8 - Inspection/Acceptance Requirements for
Supplies and Consumables
B9 - Data Acquisition Requirements (Non-direct
Measurements)
B10- Data Management
354a.pre-4/11/99-11-12
-------
DQO and QAPP Linkage
[Matrix mapping QAPP Part B elements to the DQO Process Steps each draws on; Source: EPA QA/G-5]
-------
QAPP Structure
C. Assessment/Oversight
Assessing the effectiveness of the implementation of
the project and associated QA/QC
C1 - Assessments and Response Actions
C2 - Reports to Management
QAPP Structure
D. Data Validation and Usability
Determining whether or not the data conform to the specified criteria
D1 - Data Review, Validation, and Verification Requirements
D2 - Validation and Verification Methods
D3 - Reconciliation with Data Quality Objectives
354a. pre-4/11/99-15-16
-------
DQOs and QAPP: Planning Phase
Outputs of DQO Process and resulting sampling design
are documented in the QAPP
QAPP includes additional information: management
authorities, personnel, schedule, policies, and
procedures for data collection (including SOPs)
Selection of analytical lab methods (e.g., SW-846 methods, equipment needs) is part of the DQO Process
DQO Process identifies
- Type
- Quantity
- Quality of data required for decision making
DQOs and QAPP: Implementation Phase
Data collected according to methods and procedures
documented in QAPP
During data collection, technical assessments (TAs) conducted
to evaluate compliance with QAPP (and hence DQOs)
TAs generate QA/QC data (which were justified and prescribed in the sampling design - DQO Step 7)
Any change to the following elements may require another iteration of the DQO Process:
- conceptual model
- boundaries
- decisions to be made
- feasibility of sampling design
- decision error limits
354a.pre-4/11/99-17-18
-------
DQOs and QAPP: Assessment Phase
Data Quality Assessment (DQA) Process assesses whether data meet:
- Stakeholders' performance criteria (DQO Step 6)
- Assumptions of the conceptual model (DQO Steps 2 and 3), and
- Choice of statistical test for the decision rule (DQA Step 3, DQO Step 7)
If DQOs/DQIs cannot be met with the data being collected, a second iteration of the DQO Process may be required:
- Collect more data
- Relax decision performance criteria
- Select a more appropriate (or more powerful) statistical test
- Change the decision rule
354a.pre-4/11/99-19-20
-------
Developing DQOs
Steps 1 - 5
Problem
(Investigation or Study)
Resource Effective Data
Collection Design
State the Problem
Identify the Decision
Identify Inputs to the
Decision
Define Boundaries of the
Study
Develop a Decision Rule
Specify Limits on Decision
Errors
Optimize the Design
348a-9/1/99-1-2
-------
Overview
Goal: Develop Data Quality Objectives for EMCA
Site
Gain experience with DQO Process
Use results later to develop sampling designs
and a quality assurance project plan (QAPP)
Approach: Work through the first 5 steps of the
DQO process
Discuss background information and DQO
Process as a group
Use worksheets to document DQO outputs
What is the DQO Process?
The DQO Process is a systematic planning
process for generating environmental data that
will be sufficient for their intended use.
348a-9/1/99-3-4
-------
The DQO Process
Designed to answer:
What do you need to know?
Why do you need environmental data?
How will you use the data?
How good does the answer need to be?
What are DQOs?
DQOs are quantitative and qualitative criteria that:
Clarify study objectives
Define appropriate types of data to collect
Specify the tolerable probabilities of potential decision errors
Identify the effects of these decision errors
348a-9/1/99-5-6
-------
1. All collected data have error
2. Absolute certainty comes with a high price
3. DQO Process defines tolerable error rates
4. Without DQOs, the quality of decisions is unknown
5. DQO Process is based on the scientific method
Getting Ready
Get the right people involved in the right way
- Stakeholders
- Decision maker
- Technical experts
- Environmental scientist with statistical training
Prepare for developing DQOs
- Gather existing site knowledge
- Consider overall project objectives
- Be realistic about resource and sociopolitical constraints
348a-9/1/99-7-8
-------
Step 1: State the Problem
Activities
~ Identify planning team members
* Develop conceptual site model
~ Develop list of anticipated contaminants and
define exposure scenarios
~ Consider resource and logistical constraints
* Summarize knowledge of site
Outputs
ป List of known and expected contaminants
ป Conceptual site model, exposure scenarios
- Summary of previous response actions, data
collection activities
Map of EMCA Site
[Site map showing: dirt road area; city water well field; main area of EMCA industrial activities; nearby residential development]
348a-9/1/99-9-10
-------
Elements of Conceptual Site Model
Source: waste oil contaminated with PCBs (>50 ppm) sprayed over dirt road to control dust
- PCBs are carcinogens and cause harmful liver, skin, reproductive, and developmental effects
Contaminated media: surface soil
Primary pathways: direct ingestion of soil, inhalation of dust
Receptors: adults, children
Step 2: Identify the Decision
Principal study questions:
- Does PCB contamination pose an unacceptable risk at the EMCA/ECC site?
- If so, what is the extent of unacceptable contamination?
Alternative actions:
- Site evaluation accomplished; no further
action
- Design and implement remedial action
348a-9/1/99-11-12
-------
Identify the Decision
What actions will
resolve the
problem?
State each decision
in terms of whether
to take action.
Identify each
data-driven
decision and
develop DQOs for
each.
Step 3: Identify Inputs to the Decision
o Identify the information needed to resolve the
decision statement and to establish the action
level (e.g., applicable technical, regulatory,
program-specific quality standards, criteria,
or objectives)
Determine sources for the needed information
Confirm that appropriate measurement
methods exist to provide the data
EXAMPLE DECISION LOGIC
Identify each data-driven decision and
develop DQOs for each
-------
Inputs and Sources of Information
PCB action level = 1 ppm
- Policy, soil screening guidance
Future land use = industrial
- Community groups, regulators
PCB toxicity = carcinogenic; chronic, long-term exposure
- Toxicologist
Remediation options = removal, washing
- Vendors, stakeholders
Step 4: Define Boundaries
Spatial and temporal boundaries
- Definition of target population
- Specification of subpopulations (i.e., areas/time periods within which target population is heterogeneous)
Scale of decision making
Practical constraints
348a-9/1 /99-15-16
-------
Boundaries for EMCA Site
Spatial boundaries
- Surface soil = top two inches of soil
- Subpopulations - left for exercise
Temporal boundaries
- Data used for estimating long-term exposures
- Low volatility & stability of PCBs in soil, moderate climate => temporal flexibility in sampling
Scale of decision making
- Residential scenario => 1/2 acre
- Industrial scenario => larger areas
Setting the Scale of Decision Making
What is the basis for the action level?
If risk based, what is the appropriate scale(s) for aggregation, given expected future land use?
If remediation based, what is the smallest practical volume/area that can be remediated using the most likely remedial alternatives (e.g., removal, in situ treatment)?
If regulatory based, what time/space requirements are specified by regulations?
348a-9/1/99-17-18
-------
Tips for Setting Boundaries
Ensure that each subpopulation for which a
separate decision is desired is well-defined and
appropriate
* Each additional subpopulation will increase
data requirements
Utilize site maps to depict subpopulation of
interest and corresponding sampling design
Make sure the design team has an adequate
understanding to select appropriate sampling
methods (e.g., depth of surface soil)
Step 5: Develop a Decision Rule
Combine the decision rule elements into an "If...
then..." statement that provides the logical basis
for choosing among alternative actions
- Parameter of interest = mean
- Action level = PCB Soil Screening Level of 1 ppm
- Alternative actions = design remedial action, no action
- Scale of decision making = sampling zone
- Surface soil definition = top 2 inches
After Step 7, ask "Will collected data be
sufficient to resolve this decision?"
348a-9/1/99-19-20
-------
Decision Rule Example
If the mean concentration of PCBs within a decision unit (size, depth, location) is greater than 1 ppm, then take remedial action. Otherwise, no further action is required.
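A minimal sketch of this decision rule in code (Python); the function and constant names below are illustrative, not part of the course materials:

    # Example decision rule for a single decision unit (illustrative names).
    ACTION_LEVEL_PPM = 1.0  # PCB Soil Screening Level

    def decide(mean_pcb_ppm):
        # Apply the "If... then..." decision rule to a decision unit's mean.
        if mean_pcb_ppm > ACTION_LEVEL_PPM:
            return "take remedial action"
        return "no further action required"

    print(decide(2.3))   # take remedial action
    print(decide(0.4))   # no further action required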
Data Quality Objectives:
Outputs from Each Step of the Process
Get consensus from all stakeholders
348a-9/1/99-21-22
DQOs
Problem:
Decision:
Inputs:
Boundaries: _
Decision Rule:
Limits on Decision Errors:
-------
Next Steps
Documenting DQO Outputs in the QAPP
Step 6: Specify Limits on Decision Errors
- How to determine decision error rates
based on potential consequences of an
error
Step 7: Optimizing the Data Collection Design
- How the DQO outputs from Steps 1 to 6
are used in sampling design
ป Different types of designs
348a-9/1/99-23-24
-------
Boundaries Setting Activity
1. Open the "Boundaries Setting Activity" envelope
that contains instructions, one 11 x 17 OU2 site map,
and several 8.5 x 11 copies of that map.
2. Turn to pages J-5 through J-7 in your journal to view
other site maps and a conceptual site model. Given
this information, discuss a rationale for multiple
decision units (DUs).
3. In your group, reach consensus on DU boundaries.
4. Using the ruler and colored pencils at your table,
document your group's boundaries on the 11x17
OU2 map.
348b-4/11/99-1-2
-------
DQO Process Step 6:
Specifying Limits on
Decision Errors
Step 6 Overview
Describe decision errors
Determine potential consequences
Define the baseline condition
Specify quantitative limits on decision errors
348c-8/2000- 1-2
-------
What are Decision Errors?
Decision errors occur when data mislead a
decision maker to draw a conclusion that is
inconsistent with the true state of nature
We're doing our best by basing our
decision on scientific observations
(good decision)
However, by chance (or undetected
problems) our observations unwittingly
lead us to an erroneous conclusion
(bad outcome)
Good News / Bad News
Usually the data will be "close enough" to
the truth and we will make the right decision
The consequences of being wrong might be
severe enough to impact negatively on
human health
~ If so, we need to reduce the chance of
making a decision error to tolerably low
levels
348c - 8/2000 - 3-4
-------
Probability of Making Decision Errors
Depends on Variance
Variability in
observations is due to
variation in the
population and
random errors from
sampling and
measurement
[Figure: distribution of observations around a true mean of 30 ppm relative to an action level of 50 ppm, plotted against contaminant concentration]
The greater the
variation, the greater
the chance of
observing values that
are far away from the
true mean
Probability of Making Decision Errors
Also Depends on Context
If the true mean is much
less than the action level,
high values are less likely
to be observed, which
reduces the chance of
reaching a wrong
conclusion
If the true mean is close
to the action level, higher
values are more likely to
be observed, which
increases the chance of
drawing an erroneous
conclusion
348c - 8/2000 - 5-6
-------
Decision Errors Derive from
Total Study Error
[Diagram: total study error = sampling error (population variation, sampling design, field sample collection) + measurement error (sample handling and storage, analytical method)]
Data values may not agree with the true population distribution
348c - 8/2000 - 7-8
-------
Another View: Truth Table
[Truth table: concentration decided vs. true concentration]
Decide "Above Standard" when the true concentration is below the standard: Decision Error (false alarm)
Decide "Above Standard" when the true concentration is above the standard: Correct Decision
Decide "Below Standard" when the true concentration is below the standard: Correct Decision
Decide "Below Standard" when the true concentration is above the standard: Decision Error (miss)
EMCA Decision Errors
The two decision errors for the EMCA Site:
Decide to proceed with remedial design
when the true mean PCB concentration is
less than 1 ppm
Decide to take no further action when the
true mean PCB concentration is greater
than 1 ppm
-------
Determine Potential Consequences of
Decision Errors
Anticipate what might happen if you commit a decision error
Consequences may include:
- Human health risks
- Ecological risks
- Political risks
- Social risks
- Economic risks
- Schedule risks
Worksheet:
Consequences of Decision Errors
Proceed with remedial design when true PCB mean < 1
Take no further action when true PCB mean > 1
-------
Define the Baseline Condition
Establishes where the "burden of proof" lies
Can be based on any of several considerations:
- Regulatory requirements
- Severity of decision error consequence
- Preponderance of prior evidence
- Technical issues
Under Soil Screening framework (Superfund and
Agency guidance), presume that the site is
contaminated
Site Condition
Baseline: Site is unacceptably contaminated (i.e., true mean PCB concentration is greater than 1 ppm)
- Scientists and statisticians call this the Null Hypothesis
Alternative: Site is considered clean (i.e., true mean PCB concentration is less than 1 ppm)
- Scientists and statisticians call this the Alternative Hypothesis
348c-8/2000- 13-14
-------
Decision Errors
Baseline Condition: true PCB mean > 1
Type I Error (false positive or false rejection):
Data indicate the site is clean when, in fact,
the site is unacceptably contaminated
Type II Error (false negative or false
acceptance):
Data indicate the site is unacceptably
contaminated when, in fact, the site is clean
Setting Quantitative Limits
What range of mean PCB contamination is
possible at the site?
What is the range of the true PCB mean
concentrations where the consequences of
your decision are relatively minor?
What are your tolerable probability limits for
making an incorrect decision?
348c-8/2000- 15-16
-------
An Ideal Sampling and
Measurement System
[Figure: ideal performance curve - the probability of taking remedial action jumps from 0 to 1 exactly at the 1 ppm action level as the true PCB mean (ppm) increases]
Real-World Sampling and
Measurement Systems
[Figure: probability of taking remedial action vs. true PCB mean (ppm) - a performance curve with relatively high precision (low variability / more samples) rises more steeply near the action level than a curve with relatively low precision (high variability / fewer samples)]
348c-8/2000-17-18
-------
Decision Performance Goal Diagram:
Parameter Range of Concern
[Decision performance goal diagram: probability of taking remedial action plotted against the true PCB mean (ppm) over the parameter range of concern]
Gray Region
Gray region - the range of possible mean PCB
concentration values where the consequences of
decision errors are relatively minor (too close to call)
Bounded on one side by the action level (1 ppm)
Bounded on the other side by the mean PCB concentration where the consequences of making a decision error begin to be significant
348c - 8/2000 -19-20
-------
Quantitative Limits:
Example of Gray Region
[Figure: example gray region shown on the true PCB mean (ppm) axis]
Assigning Probability Limits
[Figure: tolerable decision error probabilities assigned across the true PCB mean (ppm) axis]
348c-8/2000-21-22
-------
Setting Decision Error Limits:
Some Existing Starting Values
Organization | Media | Baseline condition | Gray Region | Type I error rate (alpha) | Type II error rate (beta)
Superfund Soil Screening Guidance | soil | Contaminated | 1/2 SSL to 2x SSL | .05 | .20
Superfund DQO Guidance | soil, groundwater, air | Contaminated | none specified | .01 | .01
Superfund Data Usability for Risk Assessment | all media | (Not contaminated) | (no specific recommendation) | .20 | .10
Washington State | soil, groundwater | | | .05 for comparing data with background or standard |
RCRA ASTM | groundwater | | | .05 for detection monitoring; .01 for single comparison |
Construct a "What If" Table for the EMCA Site
Measured Conc. (ppm) | Decision | True Conc. (ppm) | Error Type | Tolerable Decision Error Rate
> 1 | Cleanup | 0 to d | |
> 1 | Cleanup | d to 1 | Gray Region |
< 1 | Leave | 1 to 5 | |
d = the lower bound of the gray region
348c - 8/2000 - 23-24
-------
Next Steps
Complete your DQOs (exercise)
~ Specify your team's limits on decision
errors
DQO Step 7: Optimizing the Data Collection
Design
ปSampling design process
~ DEFT software
348c - 8/2000 - 25-26
-------
Step 7: Optimizing the
Design for the EMCA Site
What is Needed to Complete Step 7?
The Decision
- parameter of interest (mean, boundary, maximum)
- decision unit
The Decision Error Limits
- consequences of inappropriate actions
- how decision errors could be made based on sampling data
Conceptual Site Model
- where do we expect to find contamination?
- how do we expect it to be distributed?
Cost information
348d-8/2000-1-2
-------
"Optimize" the Sampling Design
"Optima!" measured in terms of decision error
Provides suitable data to make decisions about the site
Risk versus cost tradeoffs are required
~ EPA QA/G-4 is silent on how to make the tradeoffs
~ Methods are available: cost/benefit, decision analysis,
value of information
- Must be acceptable to all stakeholders
Step 7 outputs
~ Number of samples required (in DQI & DQA lectures)
- Quality of samples required (in DQI lecture)
- Type of samples required (Step 4, DQO Process)
~ Location (spatial) and timing (temporal) of samples
required (focus of this lecture)
What do DQOs tell us for
developing the design?
[Diagram: DQO outputs feed the sampling design]
348d - 8/2000 - 3-4
-------
Types of Sampling Design Approaches
Haphazard
Judgmental
Search
Probability Based
- Simple random
- Stratified
- Multistage
- Cluster
- Systematic
- Composite
- Double
Non-Probability Based Designs
Benefits
- Useful in exploratory and feasibility studies
- Analysis of historical data
Drawbacks
- Judgment samples are non-probability-based, and inference to the general population is problematic
- Utility of a judgment sample is only as good as the conceptual model used to define the target population and the expert's knowledge
- Cannot determine decision error or data variability
-------
Probability Based Designs
Benefits
- Provides ability to estimate uncertainty
- Reproducible results within decision error limits
- Provides ability to make statistical inferences
- Ability to handle multiple objectives and decision error criteria
Drawbacks
- Can be more expensive than judgmental sampling
- Optimal design depends on a good conceptual model
Type of Sample: Hybrid
[Diagram: hybrid designs combine judgmental sampling and simple random sampling, e.g., adaptive, stratified, and random stratified sampling]
348d - 8/2000 - 7-8
-------
Probability Based Sampling
[Figure: example layouts for simple random sampling, cluster sampling, stratified random sampling, systematic grid sampling, and two-stage sampling (primary units with random sampling within blocks)]
Which designs may work best?
Simple random sampling - hard to do in the field; not
feasible given project constraints.
Systematic grid sampling - inconsistent with DQOs (not
searching for hot spots).
Composite sampling - potentially feasible; will
investigate.
Stratified random sampling -- potentially feasible; will investigate later if necessary.
Other designs - may be possible if initial designs are
infeasible; will investigate later if necessary.
348d-8/2000-9-10
-------
Evaluating design alternatives
Professional judgment may be necessary to reduce the
number of potential designs to a more manageable
number.
Each design must be associated with a statistical test
that can be applied to the data, consistent with DQOs.
Statistical aspects of sampling design - design will
control decision error by controlling major sources of
data variability (e.g., spatial, support, lab)
Resources: Guidance on Sampling Design to Support Quality Assurance Project Plans (EPA QA/G-5S); Guidance for Data Quality Assessment (EPA QA/G-9).
How should we evaluate the
performance and cost of each design?
1. Model variability (sources are additive)
2. Model performance (design the statistical test)
3. Model cost
348d-8/2000-11-12
-------
Modeling Variability
[Diagram: total variability is decomposed into inherent variability (temporal/seasonal, spatial), physical support variability (specimen collection & handling, compositing), measurement variability (subsample acquisition, subsample analysis), and data handling variability (non-response, censoring, adjustments, aggregation)]
Develop a model for the sampling variability of the parameter of interest.
To estimate variability, examine data from a pilot study, studies of a similar population, or expert opinion.
Variability is measured by variance or standard deviation
Modeling Variance
(or Standard Deviation)
Generally speaking:
Variability(total) = Variability(field) + Variability(laboratory), i.e., σ²(total) = σ²(field) + σ²(lab)
348d -8/2000- 13-14
-------
Compositing
[Figure: mini-samples mixed in the field to form a composite]
Compositing is a "physical averaging" that reduces the variability of data
Usually applied when estimating a mean
Very useful when cost of analysis is high compared to cost of sample collection in the field
Very useful when relatively large field variability obscures interpretation of data
The Effect of Compositing
Define r = standard deviation (lab) / standard deviation (total)
Then form "n" composite samples, each made of "m" mini-samples; the composite standard deviation is:
Standard Dev(composite) = C_rm x Standard Dev(total)
where C_rm is a constant depending on "r" and the number of mini-samples
348d-8/2000- 15-16
-------
Increasing Compositing is Not Always
Cost-effective
[Figure: plot against the number of mini-samples (m) per composite, with curves for r = 0.5, 0.4, 0.3, and 0.2 - the gains from adding mini-samples flatten out as m increases]
Improving/Reducing the
Estimate of Variance
Improve the estimate of variability using past records, preliminary results, expert opinion, or a pilot study
Reduce variance using design features
- Divide the problem into homogeneous areas
- Further reduce variability by compositing within decision units
Investigate the use of more sophisticated schemes
- Double sampling screening
- Sequential systematic compositing
- Adaptive cluster sampling
348d-8/2000- 17-18
-------
EMCA/ECC Site
Decision made to use composite sampling
Team must:
- Review budget constraints specified earlier in the DQO Process
- Develop a cost model based upon estimates of field and lab costs and calculate the cost of collecting and analyzing n samples with m increments per sample.
Tradeoffs Will Be Necessary
The exact combination of:
~ samples (n), mini-samples (m);
ป quality versus quantity of data;
- risk versus cost-benefit
will be subject to negotiation.
Each team will probably take a slightly different
approach in evaluating these tradeoffs
A more complex task may need simulation studies
and computer-intensive methods
348d - 8/2000 - 19-20
-------
Selecting the Design..
o Team must document the design, overall rationale
for selecting that design, and implementation
procedures (QAPP element B1).
This includes types of samples required, the sample
size, decision units, temporal frequency, matrices,
parameter(s) of interest, and rationale for the design.
Documenting key assumptions of the design will be
a starting point for Data Quality Assessment.
Selecting the Design...
If none of the acceptable designs are affordable,
revisit the DQOs and revise the budget constraints,
alter the method(s) used to collect and/or analyze the
data, or revise the performance criteria.
If the least-cost design that satisfies the DQOs is too
inexpensive, tighten the performance criteria or use
a better (and presumably more expensive) method to
achieve even higher power.
If one of the designs achieves the DQOs and is
cost-feasible, select this design.
348d - 8/2000 - 21-22
-------
The EMCA Site - Preliminaries
Contract laboratory indicates SD(laboratory) to be about 0.3 (SD(total) = 1.2), making the r-ratio about 0.25
Project manager wants to use compositing to reduce variability by about two-thirds
The EMCA Site - Composite Mini-Samples
Reducing total variability by roughly 66% (i.e., C_rm = .33) with an r-ratio of 0.25 implies roughly 16 mini-samples needed per composite
Field crew experienced in obtaining composites of 16
Cost of collecting mini-samples relatively low
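A quick check of these numbers under a common additive-variance model of compositing, Var(composite) = Var(field)/m + Var(lab). That model is an assumption; the course's C_rm constant may be tabulated somewhat differently (Python):

    import math

    # EMCA inputs from the slides: SD(lab) ~ 0.3, SD(total) ~ 1.2, so r ~ 0.25.
    sd_total, sd_lab = 1.2, 0.3
    sd_field = math.sqrt(sd_total**2 - sd_lab**2)

    def composite_sd_ratio(m):
        # SD(composite) / SD(total) for m mini-samples per composite (assumed model).
        return math.sqrt(sd_field**2 / m + sd_lab**2) / sd_total

    print(round(sd_lab / sd_total, 2))        # r-ratio = 0.25
    print(round(composite_sd_ratio(16), 2))   # ~0.35, i.e., roughly a two-thirds reduction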
348d - 8/2000 - 23-24
-------
Sampling Design Exercise
Overview
Exercise is divided into three parts:
- Review/confirm decision unit (DU) boundaries
and estimate total variability
ป Establish sample size(s) subject to a budget
constraint
- Document your design
You have a $200,000 budget to establish a sampling
design by working through these three tasks with
your team.
358c-8/2000-1-2
-------
Getting started.
1. Review DU boundaries.
2. Inventory exercise packet:
- Instructions
- 1 Sampling Design Calculations worksheet
- 1 OU2 Sampling Plan (11" x 17")
- 3 pilot study envelopes (DO NOT OPEN!)
- 1 diskette
3. Decide whether you will purchase pilot study package(s) to
improve your estimate of variability.
4. Revise boundaries of DUs, if necessary.
5. Use available information to generate estimates of total
variability for each DU (record on worksheet).
6. Using SimCost.xls, calculate "r" for each DU.
Developing Your Design...
1. Use DEFT to generate sample size (n) and
number of increments per sample (m) for each
DU.
- Laboratory costs per sample = $300
- Field costs per sample = $15
2. Use SimCost.xls to calculate the cost of the
design you established using DEFT. Are you
within budget? If not, iterate back to Step 1.
3. Record final design on worksheet and save the SimCost file as your team name on disk.
358c - 8/2000 - 3-4
-------
Documenting Your Design...
1. Draw your decision units on your Sampling Plan (11"
x 17"). Number the DUs and write m and n for each
DU on the map.
2. Document costs on Sampling Design Calculations
(SDC) worksheet.
3. Turn in completed: 1) Sampling Plan, 2) SDC
worksheet, 3) diskette, 4) pilot packages
4. Make sure to write your team name on EVERYTHING!
358c - 8/2000 - 5-6
-------
2
-------
Integrating Quality Assurance
into Project Development
Day 2
Overview
Day One
- How do you plan to collect the right data?
- How are your project and data quality objectives translated into a sampling design?
Day Two
- How do you document your planning and implementation activities in the project's QAPP?
- How do you verify that the data you collected met the assumptions you made during planning? What do your data tell you?
lntro_Day2 - 8/2000 - 1-2
-------
Learning Objectives
1. Participants will be able to explain the value of
systematic processes in developing QAPPs.
2. Participants will be able to describe the links between DQOs, DQIs, DQA, and the QAPP.
3. Participants will be aware of and be able to use QAD
resources and tools, including: G-4, G-4D (DEFT),
G-5, G-9, and G-9D (DataQUEST).
4. Participants will be able to explain how the outputs
of DQO Process Steps 6 and 7, precision and bias
assumptions, and distributional assumptions are
used to develop a design.
Day 2
8:30 Reconvene, overview of Day 2
Relating DQIs to Sampling Design
Design Comparison I
BREAK
QAPP Part B
QAPP Part B exercise
QAPP Parts C and D
12:00 LUNCH
1:00 QAPP C exercise
Data Quality Assessment
DQA exercise
BREAK
Design comparison II
Application planning
5:00 Wrap up and dismiss
lntro_Day2 - 8/2000 - 3-4
-------
What's in store for today...
QAPP Parts B, C, and D; DQIs; DQA.
We used SimSITE to generate data for each table's
design. You will assess your data.
Combination of lecture, exercises, discussion, and
hands-on practice will be similar to yesterday.
lntro_Day2 - 8/2000 - 5-6
-------
Relating DQIs to
Sampling Design
A Review:
What Was Done in DQO Step 7
Select an "Optimal" Sampling Design
- Number of Samples
- Type of Samples
- Location of Samples
- Quality of Samples (e.g., field screening vs. CLP)
The assumed variance must match the total variance used in the sample size calculation
Cost of Desired Samples. If not within budget, need to make tradeoffs (revisit Step 6, possibly Steps 1-6, relax decision error rates, accept lower data quality)
-------
DQO and DQIs
DQOs: How good does this decision have to be?
DQIs: How good do the data have to be?
DQOs are Driver for DQIs
DQIs = used in interpreting the degree of acceptability
or utility of data.
What are the Primary and Secondary
DQIs?
Primary DQIs: The PARCCS Parameters
Precision
Accuracy (or Bias)
Representativeness
Comparability
Completeness
Sensitivity
Secondary DQIs
Recovery
Memory Effects
Selectivity
Limit of Quantitation
Repeatability
Reproducibility
358a b 9-99 3-4
-------
PARCCS Parameters
Precision, Accuracy, and Sensitivity
- Quantitative measures
Representativeness, Comparability, and Completeness
- Qualitative measures
Accuracy (or Bias), Precision, Completeness and
Comparability should be addressed in QAPP Section
A7.3, Specifying Measured Performance Criteria.
Representativeness should be discussed in QAPP
Section B4.2, Subsampling, and B1, Sampling
Design.
Sensitivity should be discussed in Section B4,
Analytical Methods.
Precision
Definition: measure of agreement among replicates
of the same property, under prescribed similar conditions
Options for Measure Reported:
Range = X_max - X_min
Relative Range = (X_max - X_min) / X-bar
Standard Deviation: s = sqrt( sum over i of (X_i - X-bar)^2 / (n - 1) )
Relative Standard Deviation = 100% x s / X-bar
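A minimal sketch (Python) of these precision measures for a set of replicate results; the replicate values are made up for illustration:

    import statistics

    replicates = [9.8, 10.4, 10.1, 9.6, 10.2]   # illustrative replicate measurements

    x_bar = statistics.mean(replicates)
    rng = max(replicates) - min(replicates)     # Range
    rel_range = rng / x_bar                     # Relative Range
    s = statistics.stdev(replicates)            # Standard Deviation (n-1 denominator)
    rsd = 100.0 * s / x_bar                     # Relative Standard Deviation (%)

    print(rng, round(rel_range, 3), round(s, 3), round(rsd, 1))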
358a b 9-99 5-6
-------
Precision (cont.)
Components of precision:
~ Sampling error
~ Field instrument measurement variation
~ Laboratory measurement variation
~ Temporal/spatial variation
~ Seasonality
~ Data preparation variation
Key driver for number of samples required, type of
methods used to select and analyze samples
Relative importance of components of precision
should be considered (see Guidance QA/G-5i, Data Quality Indicators)
Accuracy (Bias)
Accuracy is comprised of both precision and bias, and use of this term is archaic.
Definition: systematic or persistent distortion of a measurement process that causes errors in one direction.
Options for Measure Reported:
Bias = X-bar - T, where X-bar is the average value of a set of measurements of a standard and T is the reference value of the standard.
Percent Bias = 100% x (X-bar - T) / T
358a b 9-99 7-8
-------
Accuracy (cont.)
Shooting at a Target
[Figure: four shooting-at-a-target diagrams]
(a) High bias + low precision = low accuracy
(b) Low bias + low precision = low accuracy
(c) High bias + high precision = low accuracy
(d) Low bias + high precision = high accuracy
Representativeness
Definition: measure of the degree to which data represent a characteristic of:
- A population,
- A process condition, or
- An environmental condition.
Measure Reported: a qualitative statement of degree of representativeness; difficult to define precisely.
Statement may include justification of sample collection and handling techniques, description of sampling population vs. target population, and justification of sampling design used.
Only PARCC parameter that is defined outside of the laboratory management process.
358a b 9-99 9-10
-------
Representativeness (cont.)
Sources of Error
- Sample not representative of population
  - unknown or unequal sample selection probabilities
  - small sample volumes cannot capture full diversity of population units
- Sample selection method does not accurately capture material from natural setting
If data are not representative in some way, little
significance can be assigned to other PARCC
parameters.
Lack of agreement during planning (in DQO Process)
on the decision to be made may result in
unrepresentative data.
Comparability
Definition: measure of the degree to which two data
sets can contribute to a common analysis and
interpretation, and are equivalent with respect to
measurement of a specific variable or group of
variables.
Measure Reported: a qualitative statement of degree
of comparability needing expert opinion.
Description of similarity/dissimilarity of variable(s)
measured units, QA/QC methods used, time frame,
sampling methods, analysis methods, and sampling
designs.
358a b 9-99 11-12
-------
Completeness
Definition: a measure of the amount of valid data
obtained from a measurement system. Data may be
lost, found invalid, or sampling design may be
infeasible to implement.
The basis for determining the invalid data can have a
large impact on bias.
Measure Reported:
(number of valid samples) / (number of samples planned to be collected)
Where the invalid data are located with respect to the valid data
Sensitivity
Definition: The capability of a method or instrument to
discriminate between measurement responses
representing different levels of a variable of interest.
Measure reported: The standard deviation of values at
different concentration levels accompanied, when
possible, by a probabilistic statement on the
differentiation between adjacent values in the range of
concentration of concern.
358a b 9-99 13-14
-------
Where do targets for DQIs come from?
Regulations, Statement of Work, Record of Decision
(Superfund)
Laboratory or contractual requirements
Answer: From risk versus cost tradeoffs
established in the DQO Process
Agreement established through Performance Based Measurement Systems
Quality Level Tradeoffs
Tradeoffs:
- Large number of lower quality samples vs. small number of high quality samples
- Spend more resources on better (more precise) methods
- Increase duplicates, replicates, split samples, etc. to reduce component variabilities
Considerations:
- If both options meet decision performance criteria, select the one with minimum cost
- If decision performance criteria can't be met, must modify the criteria or the sampling design
- Budget constraints, sample collection method
358a b 9-99 15-16
-------
Components of Total Variability
σ²(TOTAL) = σ²(FIELD) + σ²(PHYSICAL SUPPORT) + σ²(LAB) + σ²(INSTRUMENT REPEATABILITY)
Field variability sometimes can't be decreased, only estimated more accurately; the other components can be decreased by spending resources or by selecting among options.
Two Principal Sources of Variability
LABORATORY
Term to describe: precision
Source: random errors in collection and measurement equipment
Issue: since multiple measurements show a range of values, what value to report?
FIELD
Term to describe: variability
Source: natural variability (spatial, temporal) in the population; value varies from sample to sample. Major component of variability.
Issue: since multiple measurements show a "different view" of the population, what decision to make regarding the parameter of the population?
358a b 9-99 17-18
-------
Reporting for Two Sources of Variability
(Also Called Random Errors, Precision, Uncertainty)
[Table comparing what labs mean by "uncertainty" with what statisticians mean by "uncertainty"]
Step 7: Optimize the Design
Sample Size Formula
n = (z_(1-alpha) + z_(1-beta))² σ² / Δ² + (z_(1-alpha))² / 2, assuming normality
358a b 9-99 19-20
-------
Use Sample Size Formula to Help
Make Tradeoffs
n = (z_(1-alpha) + z_(1-beta))² (σ²_POP + σ²_SAMPLE + σ²_LAB) / Δ² + (z_(1-alpha))² / 2
n is set by tradeoffs made in Step 7
Prior estimates of the variance components are needed
Equipment/Method Selection Using an Iterative Method with the Sample Size Formula
Step 1: Calculate the sample size formula using the best quality estimate for σ²_T = σ²_POP + σ²_SAMPLE + σ²_LAB. Call this n1.
Step 2: Compare the cost for n1 to the sampling budget. If cost < budget, stop. If not, go to Step 3.
Step 3: Put the negotiated n2 into the sample size formula and solve for σ²_LAB. If equipment is available and within budget, stop. If not, iterate.
Guidance Document QA/G-5i, Data Quality Indicators, addresses this problem
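A rough sketch (Python) of this iterative method; the sample size formula follows the slide above, while the variance estimates, unit cost, and budget are illustrative placeholders, not project values:

    import math
    from scipy.stats import norm

    alpha, beta, delta = 0.05, 0.10, 0.5
    z_a, z_b = norm.ppf(1 - alpha), norm.ppf(1 - beta)

    def sample_size(var_total):
        # n = (z_{1-a} + z_{1-b})^2 * sigma_T^2 / Delta^2 + z_{1-a}^2 / 2
        return math.ceil((z_a + z_b)**2 * var_total / delta**2 + z_a**2 / 2)

    var_pop, var_sample, var_lab = 0.8, 0.2, 0.5    # prior estimates (placeholders)
    cost_per_sample, budget = 350.0, 15000.0        # placeholders

    n1 = sample_size(var_pop + var_sample + var_lab)        # Step 1
    if n1 * cost_per_sample <= budget:                      # Step 2
        print("n1 =", n1, "is within budget")
    else:                                                   # Step 3: fix n2 from the budget,
        n2 = int(budget // cost_per_sample)                 # then solve for the lab variance
        var_lab_max = (n2 - z_a**2 / 2) * delta**2 / (z_a + z_b)**2 - var_pop - var_sample
        print("n2 =", n2, "requires var_lab <=", round(var_lab_max, 2))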
358a b 9-99 21-22
-------
Conclusions
DQOs are source for DQIs targets.
Precision and bias have different interpretations
depending on field vs. lab setting.
Can't forget population variability and
representativeness when considering overall effect
of DQIs on decision making.
No simple answers for DQIs. Sequential search
techniques, tradeoffs, and performance evaluations
are useful tools.
Team Exercise
Identify significant sources of variability.
Order them by relative magnitude.
Identify ways to reduce the magnitude of
variability.
358a b 9-99 23-24
-------
Quality Assurance Project Plan:
Measurement/Data Acquisition
Integrating QA Processes with
Documentation Requirements
PROCESS DOCUMENTATION
INFORMATION FLOW
354b. pre-4/12/99-1-2
-------
QAPP Structure
B. Measurement/Data Acquisition
B1 - Sampling Process Design
B2 - Sampling Methods Requirements
B3 - Sampling Handling and Custody Requirements
B4 - Analytical Methods Requirements
B5 - Quality Control Requirements
B6 - Instrument/Equipment Testing, Inspection, and
Maintenance Requirements
B7 - Instrument Calibration and Frequency
B8 - Inspection/Acceptance Requirements for
Supplies and Consumables
B9 - Data Acquisition Requirements (Non-direct
Measurements)
B10- Data Management
DQO and QAPP Linkage
[Matrix mapping QAPP Part B elements B1-B10 to the DQO Process Steps (1-7) each draws on]
Source: EPA QA/G-5
354b. pre-4/12/99-3-4
-------
B1: Sampling Process Design
(Experimental Design)
Type and number of samples required
DQO Step 7, Optimize the Design
DQI: Precision, Completeness, Representativeness
Sampling design and rationale
DQO Step 5, Develop a Decision Rule;
DQO Step 6, Specify Tolerable Limits on Decision Errors;
DQO Step 7, Optimize the Design
Sampling location and frequency
DQO Step 7, Optimize the Design;
DQO Step 4, Define the Boundaries
Schedules for collection and laboratory analysis
Document key assumptions underlying design
B2: Sampling Methods Requirements
Describe/reference the methods by which samples will
be collected.
DQO Step 3, Identify Inputs into the Decision;
DQO Step 7, Optimize the Design
Describe physical sample, including sample mass,
sample volume, and/or sample matrix
DQO Step 7, Optimize the Design
What area or volume the sample will represent
DQO Step 3, Identify inputs to the Decision;
DQO Step 7, Optimize the Design
DQI: Representativeness
354b. pre-4/12/99-5-6
-------
B2: Sampling Methods Requirements
(continued)
Corrective actions for failure/inability to obtain
samples
DQO Step 7, Optimize the Design
Procedures to prevent contamination / deterioration
of the sample after collection
B3: Sample Handling and
Custody Requirements
Provisions for ensuring samples are handled by
authorized personnel
Provisions for ensuring sample integrity is
maintained
Procedures for maintaining written records of all
phases of sample analysis
Procedures for maintenance of chain-of-custody
DQO Step 3, Identify Inputs to the Decision
354b. pre-4/12/99-7-8
-------
B4. Analytical Methods Requirements
Identify the analytical methods to be used
(for PBMS, the performance characteristics)
Procedures for sub-sampling necessary for the
proposed analytical method(s)
Procedures for sample preparation
Sensitivity requirements/standards
Identify all needed Standard Operating Procedures
DQO Step 3, Identify Inputs to the Decision
B5. Quality Control Requirements
Specify the technical activities necessary to produce
data of the quality established by DQOs
Other QAPP elements where QC data is specified (to
be cross-referenced in this element):
- B1, Sampling Process Design
- B2, Sampling Methods Requirements
- B3, Sample Handling and Custody
- B4, Analytical Methods Requirements
- B7, Instrument Calibration
Include information on blanks, spikes, replicates
used to control variability
DQO Step 3, Identify Inputs to the Decision
DQI: Precision, Bias
354b. pre-4/12/99-9-10
-------
B6. Instrument/Equipment Testing,
Inspection, and Maintenance Requirements
Identify the method(s) and procedures for inspection
and testing
Specify the schedule and procedures for
maintenance of QC performance measures
Identify personnel responsible for maintenance and
testing
DQO Step 3, Identify Inputs to the Decision
B7. Instrument Calibration
and Frequency
Identify procedures for and frequency of calibration
Identify the documentation required for:
~ Calibration apparatus
~ Calibration standards
v Calibration frequency
DQO Step 3, Identify Inputs to the Decision
354b.pre-4/12/99-11-12
-------
B8. Inspection/Acceptance Requirements
for Supplies and Consumables
Identify items/supplies requiring inspection
Document acceptance criteria
Identify tracking procedures and frequency of
inspection of supplies and consumables
DQO Step 3, Identify Inputs to the Decision
B9. Data Acquisition Requirements
(Non-direct Measurements)
Identify types of data needed for project that are
obtained from non-measurement sources
Document rationale and relevance to project
objectives
Define acceptance criteria for and limitations on the
use of data resulting from uncertainty in their quality
DQO Step 1, State the Problem;
DQO Step 3, Identify Inputs to the Decision
354b. pre-4/12/99-13-14
-------
B10. Data Management
Describe project data management scheme,
including:
* Data entry checks
* Data transmittal procedures
~ Data tracking
Data storage and retrieval
DQO Step 2, Identify the Decision
-------
QAPP Part B Activity
1. As a team, consider elements B2 through B10 of the
sample QAPP.
2. Assign each team member to review at least one of
these elements in the sample QAPP.
3. Review your assigned element(s) and document
your comments/findings on your QAPP review form.
4. In your group, discuss the questions listed on the
exercise instruction sheet.
354C-4/12/99-1-2
-------
Quality Assurance Project Plan:
Assessment/Oversight
Data Validation and Usability
C. Assessment/Oversight
D. Data Validation and Usability
Assessing the effectiveness of the implementation of
the project and associated QA/QC
~ C1 - Assessments and Response Actions
* C2 - Reports to Management
Determining whether or not the data conform to the
specified criteria
~ D1 - Data Review, Validation, and Verification
~ D2 - Validation and Verification Methods
ป D3 - Reconciliation with User Requirements
354d.prz-8/2000- 1-2
-------
C1: Assessment and Response Actions
Surveillance
The observation of project implementation activities
Technical Systems Audit (TSA)
Formal audit of facilities, personnel, equipment, and record-keeping
Performance Evaluation (PE)
Independent evaluation for proficiency in analytical work
Audit of Data Quality (ADQ)
How the data were handled; were uncorrected mistakes present?
Peer Review
Refer to the Agency's peer review policy and guidance
Data Quality Assessment (DQA)
Application of statistics to the data
DQA Process
Were the DQOs achieved?
Should parts of the DQOs be changed?
Were the assumptions made during planning viable?
Were the statistical tests powerful enough?
What supplemental information is needed?
354d.prz - 8/2000 - 3-4
-------
Steps of the DQA Process
But is it really necessary to address
DQA in a QAPP?
NO! How can you include statistical analyses if the data have yet to be collected? A QAPP must be approved before data collection.
YES! How else can you know if the data appear likely to meet the project objectives (DQOs)?
354d.prz - 8/2000 - 5-6
-------
Resolution
C1 should contain an outline of the proposed
methods of DQA that will be used during the
implementation phase to check:
ป key data assumptions
- whether the DQOs are likely to be met
C1 should also include a cross-reference to D3,
which specifies the assessment-phase DQA
procedures and where the complete DQA
report/documentation may be obtained.
C2. Reports to Management
Project status
Results of Performance Evaluations and Technical
Systems Audits
Results of Data Quality Assessment
Significance of QA problems and recommended
solutions
354d.prz-8/2000-7-8
-------
D1. Data Review, Validation,
and Verification Requirements
Purpose of this element:
To synthesize previously conducted activities and
describe how deviations from the requirements
specified in the QAPP will be addressed.
D2. Validation and Verification Methods
Define verification and validation, e.g.,
Verification: Have the procedures outlined in
the QAPP been carried out properly?
Validation: Were the procedures used to
generate the data consistent with the intended
use of the data?
Discuss the process for validation and verification of
the data
Describe how data verification and validation issues
will be resolved and conveyed
354d.prz-8/2000-9-10
-------
D3. Reconciliation with DQOs
Describe how the results will be reconciled with the
DQOs established by the data user(s)
~ What types of calculations will be needed to
check the DQO assumptions and draw
conclusions?
Define how the DQA results will be documented
Discuss how limitations of the data will be reported
to decision-makers
EPA Quality System - Project Level
354d.prz-8/2000-11-12
-------
QAPP Part C Journal Activity
1. Turn to page J-8 in your journal and review the
activity instructions.
2. Read an excerpt from EPA QA/G-5 about element C1
and an overview of technical assessments.
3. Write your responses to the questions listed on J-8.
4. In your group, discuss your individual responses.
-------
The Data Quality
Assessment Process
Objectives
Participants will understand how DQOs are essential
to DQA
Participants will understand the meaning and value
of verifying assumptions made when selecting the
statistical test
Participants will be able to produce various displays
of data and understand the impact the visual display
has on interpreting the results
Participants will learn how to use and will be given
the opportunity to conduct DQA on their own data
sets.
392.prz- 8/2000 -1-2
-------
What is the Data Quality
Assessment Process?
The statistical analysis of environmental data to
determine whether the quality of data is sufficient to
support the decision.
Decisions (based on sample data collected) are made
during the DQA Process.
Does data provide "sufficient evidence" to draw
conclusions about the site?
EPA Quality System - Project Level
392.prz - 8/2000 - 3-4
-------
DQA Process: Answers it can provide
Do the data violate the conceptual site model or test assumptions?
Did I collect enough data?
- Were the data quantity and quality consistent with the DQO assumptions and limits on decision errors?
What do I conclude about the state of contamination at the site?
DQA Process: Answers it cannot provide
Did I make a decision error? (good decision -- bad outcome)
What are the "true" site conditions?
Do I need different types of data? (may require another iteration through the DQO Process)
392.prz - 8/2000 - 5-6
-------
The Data Quality Assessment Process
1. Review DQOs and Sampling Design
2. Conduct Preliminary Data Review
3. Select the Statistical Test
4. Verify the Assumptions
5. Draw Conclusions From the Data
The Data Quality Assessment Process
Guidance for the Data Quality Assessment Process (G-9)
Written for non-statisticians
Supplements Agency guidance
Does not replace statistical texts
Regular supplements
~ Current examples
ป Updated techniques
ปShared information
Data Quality Evaluation Statistical Toolbox (G-9D)
Runs on most IBM-compatible personal computers
Interactive and easy to use
Implements tools from G-9
392.prz - 8/2000 - 7-8
-------
DQA Step 1:
Review DQOs and Sampling Design
If DQO were previously developed, verify
hypotheses, limits on decision errors
If DQOs were not previously developed, consult the
data user(s) to develop a hypothesis and limits on
decision errors retrospectively.
Review sampling design
ป Small deviations vs. major deviations
ป Was the design correctly implemented?
Tolerable Limits on Decision Errors
[Figure: tolerable limits on decision errors plotted over the mean PCB concentration (in ppm) range]
392.prz- 8/2000 -9-10
-------
Investigation of Assumptions
Conceptual model
Sampling design
- Independence of data
- Variability estimates
- Equal probabilities for sampling units
Statistical test
- Standard (parametric) versus non-parametric
- Most powerful for data
Performance Goal Diagram
- Range of values for true parameter
- Severity of consequences
Primary Statistical Hypotheses
If the mean concentration of total PCBs in surface soil
(top 1 inch) over the Northeast quadrant (96,000 sq. ft.)
decision unit exceeds 1 ppm, then investigate the area
further; otherwise, take no further action. Baseline
condition or assumption is that the true mean level of
contamination is at least 1 ppm.
Baseline Condition (Null Hypothesis): Mean > 1 ppm
vs.
Alternative Condition (Alternative Hypothesis): Mean < 1 ppm
392.prz- 8/2000- 11-12
-------
Discussion Question
Given that we hold onto the baseline condition
unless we have clear and convincing evidence
otherwise, what would be clear and convincing
numerical evidence?
Where Did the Samples Come From?
The sampling area in the Northeast quadrant has been divided into 10 ft x 10 ft sampling units.
The entire Northeast quadrant is 40 sampling units
across by 24 sampling units down, creating a total of
960 sampling units.
60 sampling units will be selected on a rectangular
grid with a random start.
Each selected sampling unit will be further
subdivided into a 3x3 grid (with a random start
location). Nine minisamples will be collected and
composited to form a single sample for analysis.
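A minimal sketch (Python) of a systematic grid selection with a random start for this 40 x 24 layout; the 4 x 4 spacing is inferred from 960/60 = 16 and is an assumption about the actual design:

    import random

    NCOLS, NROWS, SPACING = 40, 24, 4      # 960 sampling units, every 4th unit each way

    random.seed(1)
    start_col = random.randrange(SPACING)  # random start location
    start_row = random.randrange(SPACING)

    selected = [(col, row)
                for col in range(start_col, NCOLS, SPACING)
                for row in range(start_row, NROWS, SPACING)]

    print(len(selected))    # 60 selected sampling units
    print(selected[:3])     # first few (column, row) indices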
392. prz- 8/2000- 13-14
-------
Assuming approximate normality,
n = (z_(1-alpha) + z_(1-beta))² σ² / Δ² + (z_(1-alpha))² / 2
  = (1.64 + 1.28)² (1.3)² / (0.5)² + (1.64)² / 2
  = 60
Note: alpha = .05, beta = .10
DQA Step 2:
Conduct Preliminary Data Review
Review quality assurance reports for anomalies
Calculate standard statistical quantities
Display the data using graphical representations
392.prz-8/2000-15-16
-------
Data (N=60)
PCB concentration levels were measured (in ppm) from 60 surface
soil samples (top one inch of soil) from the area of concern. Each
soil sample consists of 9 minisamples composited together.
2.9714918
0.3944508
0.0636396
0.0135851
0.2141057
0.8270809
0.4256530
0.3517509
7.4508734
0.0100000
0.2305239
0.0100000
0.1370552
0.3279608
2.0771547
0.0823492
0.5161426
2.4257983
0.4162389
0.4958090
0.2064603
2.0352892
0.2907452
0.0116283
0.3611203
0.0111995
0.1429569
0.5941579
0.4145329
0.0294908
0.1116369
0.5943270
0.5467844
0.0586851
0.0929529
0.4828711
0.1715708
0.0901670
0.0345738
0.0921064
0.2474780
0.1591530
0.0428163
0.0521538
3.1877187
0.0424898
0.0939190
4.9384590
0.3583117
0.0472357
0.1374893
0.0196463
2.2771285
0.2121811
0.0526599
0.1130365
0.4680810
2.2941993
0.1808077
0.0240088
Summary Statistics
Number of Observations: 60
Minimum: 0.010    Maximum: 7.451
Mean: 0.670       Median: 0.209
Variance: 1.680   Standard Deviation: 1.296
Range: 7.441      Interquartile Range: 0.428
Coefficient of Variation: 1.935
Coefficient of Skewness: 3.357
Coefficient of Kurtosis: 12.542
Percentiles:
1st: 0.010   5th: 0.011   10th: 0.022   25th: 0.061
75th: 0.489  90th: 2.286  95th: 3.080   99th: 7.451
392.prz- 8/2000-17-18
-------
Cleaning-up the Data
Data rounded to the closest hundredth:
2.97
0.39
0.06
0.01
0.21
0.83
0.43
0.35
7.45
0.01
0.23
0.01
0.14
0.33
2.08
0.08
0.52
2.43
0.42
0.50
0.21
2.04
0.29
0.01
0.36
0.01
0.14
0.59
0.41
0.03
0.11
0.59
0.55
0.06
0.09
0.48
0.17
0.09
0.03
0.09
0.25
0.16
0.04
0.05
3.19
0.04
0.09
4.94
0.36
0.05
0.14
0.02
2.28
0.21
0.05
0.11
0.47
2.29
0.18
0.02
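A quick sketch (Python) of DQA Step 2 on the rounded values above; small differences from the slide's summary statistics (computed on the full-precision data) are expected:

    import statistics

    pcb = [2.97, 0.39, 0.06, 0.01, 0.21, 0.83, 0.43, 0.35, 7.45, 0.01,
           0.23, 0.01, 0.14, 0.33, 2.08, 0.08, 0.52, 2.43, 0.42, 0.50,
           0.21, 2.04, 0.29, 0.01, 0.36, 0.01, 0.14, 0.59, 0.41, 0.03,
           0.11, 0.59, 0.55, 0.06, 0.09, 0.48, 0.17, 0.09, 0.03, 0.09,
           0.25, 0.16, 0.04, 0.05, 3.19, 0.04, 0.09, 4.94, 0.36, 0.05,
           0.14, 0.02, 2.28, 0.21, 0.05, 0.11, 0.47, 2.29, 0.18, 0.02]

    print("n =", len(pcb))
    print("mean =", round(statistics.mean(pcb), 3))
    print("median =", round(statistics.median(pcb), 3))
    print("std dev =", round(statistics.stdev(pcb), 3))
    print("min, max =", min(pcb), max(pcb))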
Viewing the Data (Frequency Histogram)
[Histogram of the 60 PCB concentration data points]
Care should be taken to ensure that the box sizes have contextual meaning.
G-9, Section 2.3.1; G-9D - Graphs, 1
-------
Graphical Displays
Multiple histograms can be similarly useful:
Boxplot (Box and Whiskers)
[Schematic boxplot showing the quartiles, the median, and an extreme (large) value along the sample value axis]
-------
Boxplot Examples
Boxplot for normal data: [figure]
Boxplot for lognormal data: [figure]
- Data transformation may be useful
Graphical Representations
Boxplot for EMCA data: [figure]
G-9, Section 2.3.3; G-9D - Graphs, 3
-------
DQA Step 3:
Select the Statistical Test
Select the statistical test based on the data user's
objectives and the preliminary data review
Identify assumptions underlying the statistical test:
ป Distributional form
- Independence
ป Dispersion characteristics
ป Homogeneity
- Basis for randomization
Statistical Tests
One-sample tests
- Standard (parametric) vs. non-parametric
- one-sample t-test, Wilcoxon signed rank test, sign test, one-sample proportion test
Two-sample tests
- Student's two-sample t-test (equal variances)
- Satterthwaite's two-sample t-test (unequal variances)
- Wilcoxon rank sum test, slippage test, quantile test, two-sample proportion test
392.prz - 8/2000 - 25-26
-------
Select Test and Identify Assumptions
e.g., ONE-SAMPLE t-TEST
No outliers (sample mean and standard deviation are very sensitive to outliers)
Sample mean is approximately normally distributed
Random sample (independence of the data values)
Has difficulty in dealing with less-than values, e.g., values below the detection limit
G-9, Section 3.2.1.1
DQA Step 4: Verify the Assumptions
of the Statistical Test
Determine approach for verifying assumptions
Perform tests of assumptions
If necessary, determine any corrective actions
392.prz- 8/2000 -27-28
-------
Identifying Outliers
Is the extreme value of 7.45 a statistical outlier?
Extreme Value Test will be used
- Test assumes data without the outlier are normally distributed
Discordance Test for Outliers
Null Hypothesis: The value 7.45 belongs to the rest of the data.
Alternative: The value 7.45 is an outlier.
Value Tested: 7.451
Sample Value: 5.231
Tabled Value: 2.956
For this test, reject the Null if the sample value exceeds the tabled value.
Conclude 7.451 is an outlier at a 1% significance level.
G-9, Section 4.4.4; G-9D - Tools, 3, 2
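A sketch (Python) reproducing the discordance statistic, (largest value - sample mean) / standard deviation; the 1% critical value of 2.956 for n = 60 is taken from the slide:

    # Inputs from the Summary Statistics slide.
    x_max, mean, sd = 7.451, 0.670, 1.296
    critical = 2.956                     # tabled value, n = 60, 1% level (from the slide)

    t = (x_max - mean) / sd
    print(round(t, 2), t > critical)     # ~5.23, True -> 7.451 is flagged as an outlier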
392.prz - 8/2000 - 29-30
-------
Testing Data for Normality
Filliben Test
Null Hypothesis: Data are normally distributed.
Alternative: Data are not normally distributed.
Sample Value: 0.736
Tabled Value: 0.970
For this test, reject the Null if the sample value is less than
tabled value.
Conclude that non-normality has been detected at a 1%
significance level.
G-9, Section 4.2.2; G-9D - Tools, 1,1,1
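A sketch (Python) of a Filliben-type probability-plot correlation check; scipy's probplot returns the correlation coefficient of the normal probability plot. The data below are simulated for illustration, and the 1% critical value of 0.970 for n = 60 is taken from the slide:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    data = rng.lognormal(mean=-1.6, sigma=1.6, size=60)   # illustrative skewed data

    (_, _), (_, _, r) = stats.probplot(data, dist="norm")
    print(round(r, 3), r < 0.970)          # typically well below 0.970 -> reject normality

    (_, _), (_, _, r_log) = stats.probplot(np.log(data), dist="norm")
    print(round(r_log, 3), r_log < 0.970)  # typically above 0.970 -> consistent with lognormality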
Not Normally Distributed -
What Should We Do?
Data appear to be skewed in the histogram, which
may indicate a lognormal distribution.
So, apply Filliben Test to natural logarithms of the
data to test for lognormality. If logged data are
normally distributed, then untransformed data are
lognormally distributed.
Original data value 0.011 becomes -1.951, 7.451
becomes 0.872, etc.
392.prz- 8/2000 -31-32
-------
Summary Statistics (natural log transformed data)
Number of Observations: 60
Minimum: -4.605    Maximum: 2.008
Mean: -1.620       Median: -1.564
Variance: 2.607    Standard Deviation: 1.615
Range: 6.614       Interquartile Range: 2.080
Coefficient of Variation: -0.997
Coefficient of Skewness: 0.107
Coefficient of Kurtosis: -0.451
Percentiles:
1st: -4.605   5th: -4.473   10th: -3.830   25th: -2.795
75th: -0.715  90th: 0.827   95th: 1.124    99th: 2.008
Viewing the Transformed Data
[Histogram of the 60 data points, PCB concentration (Ln transform)]
Care should be taken to ensure that the box sizes have contextual meaning.
G-9, Section 2.3.1; G-9D - Graphs, 1
392.prz - 8/2000 - 33-34
-------
Box and Whiskers Plot
Natural log transformed data
[Boxplot]
Testing Data for Lognormality
Filliben Test
Null Hypothesis: Data are normally distributed.
Alternative: Data are not normally distributed.
Sample Value: 0.978
Tabled Value: 0.970
For this test, reject the Null if the sample value is less than
tabled value.
There is not enough evidence to reject the assumption of
lognormality with a 1% significance level.
G-9, Section 4.2.3; G-9D - Tools, 1,1,1
392.prz - 8/2000 - 35-36
-------
Testing Data for Lognormality
Cannot reject the assumption that the data are
lognormally distributed and that the logs of the data
are normally distributed.
Apply test for outliers on the logged data.
Extreme value of 7.451 becomes 2.008.
G-9, Section 4.6; G-9D - Tools, 5
Discordance Test for Outliers:
Transformed data
Null Hypothesis: The value 2.008 belongs to the rest of the data.
Value Tested: 2.008
Sample Value: 2.247
Tabled Value: 2.956
For this test, reject the Null if the sample value exceeds the tabled value.
There is not enough evidence to conclude that 2.008 is an outlier at a 1% significance level.
G-9, Section 4.4.4; G-9D - Tools, 3, 2
i
392.prz - 8/2000 - 37-38
-------
Are Assumptions Satisfied?
Outliers? None; high values can be expected with lognormally distributed data.
Sample mean approximately normally distributed? Invoke the Central Limit Theorem (or perform tests on natural logs of the data).
Random sample? Rectangular grid with a random start location and composite samples.
Data below the detection limit? Only 2 non-detects in this example. Detection limit is much smaller than the action level, so impact is minimal.
DQA Step 5:
Draw Conclusions from the Data
Perform the calculations for the statistical
hypothesis test
Evaluate the statistical test results and draw
conclusions
Evaluate the performance of the sampling design if
the design is to be used again
392.prz - 8/2000 - 39-40
-------
Perform Calculations - Natural Log Scale
Student's t-Test for a One-Sample Mean
In the natural log scale, the action level is 0 (i.e., ln(1) = 0)
Null Hypothesis H0: mean > 0.0
Alternative: mean < 0.0
Sample Value (t) = -7.770
Tabled Value = -1.671
For this test, reject the Null if the sample value is less than
the tabled value.
Reject the null hypothesis at a 5% significance level.
G-9, Section 3.2.1.1; G-9D - Hypothesis, 1
Perform Statistical Test
Calculate the t test statistic for the transformed data:
t = (xbar - mu0) / (s / sqrt(n)) = -1.620 / (1.615 / sqrt(60)) = -7.77
For H0: muT > 0 ppm (decide site contaminated), reject H0 if t < -t(alpha)
and decide the site is clean.
Since -7.77 < -1.67, reject H0.
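A compact Python sketch of the same calculation follows; the array name logged is a hypothetical stand-in for the natural-log-transformed results.

import numpy as np
from scipy import stats

def lower_tail_t_test(logged, mu0=0.0, alpha=0.05):
    # One-sample t-test of H0: mean >= mu0 against HA: mean < mu0.
    x = np.asarray(logged, dtype=float)
    n = x.size
    t = (x.mean() - mu0) / (x.std(ddof=1) / np.sqrt(n))  # e.g., -1.620 / (1.615 / sqrt(60)) = -7.77
    t_crit = stats.t.ppf(alpha, df=n - 1)                # about -1.671 for n = 60, alpha = 0.05
    return t, t_crit, t < t_crit                         # True in the last slot means reject H0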
392.prz- 8/2000 -41-42
-------
Interpreting Statistical Results
"Significant at 5%"
If the null hypothesis is actually true, sampling results
extreme enough to reject it at a 5% significance level
will occur by chance less than one time in twenty.
Even though the sample mean fell in the "gray
region," the test provided sufficient evidence to
conclude that the true mean was below the action
level.
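The one-in-twenty interpretation can be checked with a small simulation (purely illustrative; the parameter values below are arbitrary placeholders): generate many data sets for which the null hypothesis is true and count how often a 5%-level test rejects it.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, alpha, runs = 60, 0.05, 10000
t_crit = stats.t.ppf(alpha, df=n - 1)  # lower-tail critical value

rejections = 0
for _ in range(runs):
    x = rng.normal(loc=0.0, scale=1.6, size=n)  # true mean equals the null boundary, so H0 is true
    t = x.mean() / (x.std(ddof=1) / np.sqrt(n))
    rejections += t < t_crit

print(rejections / runs)  # close to 0.05, i.e., about one run in twenty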
Conclusions
First 2 steps of DQA involve "getting to know" your
data
- how they were generated and why
- how they look from a variety of perspectives
Next 2 steps of DQA involve exploring appropriate
methods and assumptions
- Some tests of assumptions require their own
verification of assumptions
- Logic and context can be employed to support
verification of assumptions
You will make some decision based on data. You
can know what the data are "doing to you," or be
blind to it. DQA puts you in control.
392. prz - 8/2000 - 43-44
-------
DQA Exercise
1. Open exercise packet, read instructions, and
confirm other contents of packet as directed.
2. Confirm that the data files for each DU are on your diskette.
Start DataQUEST and specify a file name for the DU
your team will analyze.
3. Review and record summary statistics.
4. View graphical representation of data.
5. Test assumption for the one-sample t-test.
6. Conduct hypothesis test.
7. Record results on DQA worksheet.
8. Open the next file and repeat steps 3-7 until data from all
DUs have been analyzed.
-------
3
-------
Think about the following questions and write your answer in the space below.
1. What are the one or two most pressing or important QA
issues that you have to deal with in your work?
-------
[Diagram: the Quality System and its components: Quality System Documentation (e.g., Quality
Management Plan); Supporting System Elements (e.g., Procurements, Computer Hardware/Software);
Training/Communication (e.g., Training Plan, Conferences); Annual Review and Planning (e.g., QA
Annual Report and Work Plan); Systematic Planning (e.g., DQO Process); QA Project Plan; Acquire
Data; Data Verification & Validation; Standard Operating Procedures; Technical Assessments. The
components span the PLANNING and IMPLEMENTATION phases of a project and lead to Defensible
Products and Decisions.]
1. Circle the component(s) of the Quality System with which you have the most experience
or for which you have the most responsibility.
2. Think about your answer to question 1 on page J-1 and draw a box around the
components of the Quality System that are involved in addressing the issue(s) you
identified.
Integrating QA into Project Development
J-2
8/2000
-------
Consequences of Decision Errors
Proceed with remedial design
when the true mean [PCB] < 1
Health Risks
Ecological Risks
Political Risks
Social Risks
Resource Risks
Take no further action when the
true mean [PCB] > 1
Health Risks
Ecological Risks
Political Risks
Social Risks
Resource Risks
Integrating QA into Project Development
J-3
8/2000
-------
Exercise: Setting Quantitative Limits on Decision Errors
With your team, work through the following 5 steps in order to complete a decision performance
goal diagram for the Artificial Site scenario. Document your results on the flip chart at your
table in the same form as the decision performance goal diagram shown below. All question
marks should be replaced when you are finished.
1. Confirm the action level and the baseline (i.e., null hypothesis).
2. Set the parameter range of concern (i.e., the range of mean PCB contamination
that is possible at the site).
3. Establish the gray region (i.e., the range of mean PCB concentrations where the
consequences of a decision error are relatively minor).
4. Specify your team's tolerable probability limits for making a type I error (i.e.,
reject the null hypothesis when it is true).
5. Specify your team's tolerable probability limits for making a type II error (i.e., do
not reject the null hypothesis when it is false).
Your team will use these quantitative outputs later in this training course, but you will have the
opportunity to make revisions. Remember to consider the consequences of your choices. (A rough
sample-size sketch follows the diagram below.)
Setting Quantitative Limits on Decision Errors
[Decision performance goal diagram: the vertical axis is the probability of taking remedial action
(from 0 to 1); the horizontal axis is the true mean [PCB] (ppm); the baseline condition is
Mean [PCB] > 1. The "??" entries mark the gray-region bounds and tolerable decision error
probabilities that your team must supply.]
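To see how the choices above interact, the sketch below applies one common sample-size approximation for a one-sample t-test of the kind given in the DQO guidance; it is a planning illustration only, and sigma, the gray-region width delta, alpha, and beta are placeholder values for your team to replace with its own outputs.

from scipy.stats import norm

def approx_sample_size(sigma, delta, alpha, beta):
    # Approximate number of samples for a one-sample t-test with an estimated
    # standard deviation sigma, gray-region width delta, and tolerable error
    # probabilities alpha (type I) and beta (type II).
    z_a = norm.ppf(1.0 - alpha)
    z_b = norm.ppf(1.0 - beta)
    n = ((z_a + z_b) ** 2) * (sigma ** 2) / (delta ** 2) + 0.5 * (z_a ** 2)
    return int(n) + 1  # round up to the next whole sample

print(approx_sample_size(sigma=1.5, delta=0.5, alpha=0.05, beta=0.20))  # example: 58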
Integrating QA into Project Development
J-4
8/2000
-------
[Figure: Conceptual Site Model of OU2 PCB Contamination at the EMCA/ECC Superfund Site.
PCB source area: PCB-contaminated waste oil sprayed on dirt road; PCBs migrate from the road
and accumulate as stream sediment; minimal vertical migration of PCBs due to low solubility and
high sorption properties. The surficial aquifer (silty sand) is separated from the principal aquifer
(sandy silt) by a semi-confining layer (silty clay); groundwater is not affected by PCB soil
contamination. Groundwater in the surficial aquifer discharges to streams during periods of high
water table.]
Integrating QA into Project Development
J-5
8/2000
-------
[Figure 2. Site Map, EMCA/ECC Superfund Site. The map shows the storage shed, manufacturing
building, warehouse, and office building. Site legend: unimproved roads, paved roads, streams,
topographic contours (ft. above MSL), and buildings; land use categories of industrial, residential,
site property, and wellfield. Scale: one inch = 700 feet; north arrow.]
Integrating QA into Project Development J-6 8/2000
-------
[Figure 4. Total PCB Concentrations Reported for Preliminary Soil Samples, EMCA/ECC
Superfund Site. Soil sampling locations in Operable Unit 1 and Operable Unit 2 are posted with
their total PCB concentrations (mg/Kg); "ND" marks locations where PCBs were not detected
above the reporting limit (0.01 mg/Kg). The site legend matches Figure 2. Scale: one inch =
700 feet; north arrow.]
-------
QAPP Part C Activity
This exercise involves individual reading and reflection, followed by discussion
at your table.
1. Turn to the next page in this journal for reading assignments A and B:
Reading Assignment A: Section C1, Assessments and Response Actions (an excerpt
from EPA QA/G-5, EPA Guidance for Quality Assurance Project Plans, Final -
EPA/600/R-98/018, February 1998), which identifies the requirements for documenting
assessments and response actions in a QAPP.
Reading Assignment B: Overview of Technical Assessments, which addresses readiness
reviews, technical systems audits, surveillance, performance evaluations, and audits of
data quality.
2. For each of these five types of assessments, think about the following questions and write
down some brief notes to capture your thoughts:
a. What kind of information does this type of assessment generate?
b. What kinds of issues or problems is the assessment designed to detect?
c. What kinds of response actions could result from this assessment?
d. How would you document the assessment procedures in the QAPP to ensure that
the assessment detects problems and triggers response actions in a timely manner?
3. Discuss your notes with the others at your table.
Integrating QA into Project Development J-8
8/2000
-------
QAPP Part C Activity
Reading Assignment A: Section C1, Assessments and Response Actions (from
EPA QA/G-5, EPA Guidance for Quality Assurance Project Plans, Final 2/98)
C ASSESSMENT/OVERSIGHT
C1 ASSESSMENTS AND RESPONSE ACTIONS
Identify the number, frequency, and type of assessment activities needed for this project.
List and describe the assessments to be used in the project. Discuss the information expected and the
success criteria for each assessment proposed. List the approximate schedule of activities, identify
potential organizations and participants. Describe how and to whom the results of the assessments
shall be reported.
Define the scope of authority of the assessors, including stop work orders. Define explicitly the
unsatisfactory conditions under which the assessors are authorized to act and provide an
approximate schedule for the assessments to be performed.
Discuss how response actions to non-conforming conditions shall be addressed and by whom.
Identify who is responsible for implementing the response action and describe how response actions
shall be verified and documented.
Integrating QA into Project Development J-9 8/2000
-------
QAPP Part C Activity
Reading Assignment B: Overview of Technical Assessments
Readiness Review
A readiness review is performed prior to the initiation of data collection to verify that the project
personnel have brought the facility to a state of readiness. Readiness means achieving a configuration in
which the right people are in the right places at the right times working with the right hardware, software,
and materials according to the right procedures and management controls.
Technical Systems Audits (TSA)
A TSA is a qualitative on-site evaluation of all components of the measurement system,
including technical and QA management personnel. Assessors travel to the site, gather evidence in
person, and produce a report. The main function of a TSA is to determine that project personnel and
equipment are physically in place and functioning as stated in the QAPP. It includes an evaluation of
both field and laboratory staff, equipment and procedures. The optimal time for performance of a TSA is
during the first few days of the project, after all measurement systems are operational, but before
significant amounts of data have been collected. TSAs should be performed on a regular schedule
throughout the project. Checklists are the basis of a TSA and are prepared based on the QAPP.
Surveillance
Surveillance is the real-time observation of a specific activity of an ongoing project. It may be
done on multiple occasions during a project. Its objective is to provide confidence that the activity is
being performed in accordance with approved methods and procedures. It allows for immediate
identification of any problems and initiation of corrective action. Surveillance offers the opportunity for
the assessor to develop a close working relationship with the project team and to encourage the work to
be performed correctly, rather than just pointing out errors or deficiencies after they occur.
Performance Evaluation (PE)
A PE is a quantitative assessment in which analytical results are generated by a measurement
system for a sample that originates outside of the project. A PE sample mimics actual samples in all
possible aspects, except that its composition is unknown to the analyst and is known to the assessor. In
the context of the Quality System, a PE is used to determine if the measurement system's results are within
data quality goals specified in the QAPP. PE results are often used to estimate the degree of bias in the
measurement system.
Audit of Data Quality (ADQ)
An ADQ is an examination of data after they have been collected. It is done to determine how
well the measurement system performed with respect to the data quality goals specified in the QAPP.
ADQs entail tracing data through processing steps and duplicating intermediate calculations. The focus is
on identifying a clear, logical connection between the steps. The product is a report which details
custody tracing, data transfers, recalculations, incidents which resulted in lost data, and a review of QA
data and summary statistics.
Integrating QA into Project Development
J-10
8/2000
-------
Application Planning
Think about the following questions and write your answers and ideas in the space provided.
1) Turn back to question 1 on page J-1 in this journal and review the QA issues you
identified yesterday morning as the most pressing to you. What concepts, skills, or
tools have you learned over the past two days that will help you begin to address
these issues?
2) What are some specific actions you could take in the upcoming days or weeks to
begin to apply the concepts, skills, or tools you've learned that will help you address
the important QA issues and problems that you face?
Integrating QA into Project Development
J-11
8/2000
-------
3) Identify a current or upcoming project where you might be able to apply the tools,
skills, and information you learned in this workshop.
Project:
Write down some specific actions you would like to take in the upcoming days or weeks to
apply what you've learned here, and identify the kind of support or resources you need to
accomplish these tasks.
4) Are there any topics in project QA that came up during this workshop that you
think you want to learn more about? If so, how could you access appropriate
training or other learning opportunities?
Integrating QA into Project Development
J-12
8/2000
-------
4
-------
U.S. EPA Headquarters Library
Mail code 3201
1200 Pennsylvania Avenue NW
Washington DC 20460
Revision 1
QUALITY
ASSURANCE
PROJECT PLAN
For a Preliminary Remedial
Investigation of Operable
Unit 2 PCB Contamination
at the EMCA/ECC
Superfund Site
March 31, 1998
Prepared for:
U. S. Environmental Protection Agency
Region IV Superfund Section
This is an example Quality Assurance
Project Plan. The referenced Superfund
site and contractors are fictitious.
Prepared by:
Sandra Lowem & Associates
Environmental Consultants, Inc.
-------
QUALITY ASSURANCE PROJECT PLAN
FOR A PRELIMINARY REMEDIAL INVESTIGATION OF
OPERABLE UNIT 2 PCB CONTAMINATION
AT THE EMCA/ECC SUPERFUND SITE
REVISION 1
March 31, 1998
Prepared by:
Sandra Lowem & Associates Environmental Consultants, Inc.
Prepared for:
U.S. Environmental Protection Agency, Region 4 Superfund Section
Document Approval Signatures:
Andrew Miller, EPA Remedial Project Manager Date
Joseph Braswell, EPA Remedial Site Manager Date
Elizabeth Wall, EPA Quality Assurance Officer Date
James Boyd, Sandy Lowem & Associates Project Manager Date
Susan Davis, Sandy Lowem & Associates QA Manager Date
Mark Roberts, Bunse & Burner Laboratory Project Manager Date
Richard Allison, Bunse & Burner Laboratory QA Officer Date
This is an example Quality Assurance Project Plan. The referenced Superfund site and contractors are fictitious.
-------
EMCA/ECC Superfund Site
OU2 PCB Contamination RI
QAPP Revision No. 1
March 31, 1998
Page i of x
QUALITY ASSURANCE PROJECT PLAN
FOR A PRELIMINARY REMEDIAL INVESTIGATION OF
OPERABLE UNIT 2 PCB CONTAMINATION
AT THE EMCA/ECC SUPERFUND SITE
A1 - A3 DOCUMENT INTRODUCTION
A1 EXECUTIVE SUMMARY
The U.S. Environmental Protection Agency (EPA) is conducting a remedial investigation
(RI)/feasibility study (FS) under the Comprehensive Environmental Response, Compensation, and
Liability Act (CERCLA or Superfund) program at the Electronic Manufacturing Corporation of
America (EMCA)/Energy Components Company, Inc. (ECC), Superfund site. EPA is conducting
this work in a phased approach. The RI activities described in this Quality Assurance Project Plan
(QAPP) are designed to assess only polychlorinated biphenyl (PCB) contamination in shallow soil
within operable unit 2 (OU2). OU2 has been defined as approximately 70 acres of land that
includes roughly 1,200 feet of dirt road on which PCB-contaminated waste oil was sprayed as a
dust suppressant by EMCA from the 1970s through 1985.
The sampling program involves collection of composite surface soil samples using handheld tools
within 54 delineated decision areas (DAs). Each decision area is either: (1) a linear segment of
the dirt road; (2) a plot of land 0.5 to 4.5 acres in size that may have received PCB contamination
from overspray, air-blown particles, or stormwater runoff; or (3) a reach of ephemeral stream that
may contain deposits of PCB-contaminated soil. The samples will be submitted to Bunse &
Burner Laboratory for analysis of individual PCBs (congeners) by SW-846 Method 8082. Total
PCB concentrations will be calculated by summing the congener concentrations. The laboratory
analytical reports will contain most of the elements required for an EPA Contract Laboratory
Program data package. After the data are validated, data of acceptable quality will be statistically
evaluated using a robust generalized version of the Student's t-test called the Chen test. A soil
screening level of 1.0 ppm total PCBs will apply. The results of these analyses will indicate which
DAs will: (1) be characterized as not posing an unacceptable risk to human health or the
environment and dismissed from further RI/FS activities, (2) be included in the FS to evaluate
remedial alternatives for surface soil PCB contamination cleanup and targeted for characterization
of subsurface soil contamination in a subsequent RI phase, or (3) require additional surface soil
This is an example Quality Assurance Project Plan. The referenced Superfund site and contractors are fictitious.
-------
EMCA/ECC Superfund Site
OU2 PCB Contamination RI
QAPP Revision No. 1
March 31, 1998
Page ii of x
PCB data before a determination can be made within the established decision error limits as to
which of the first two categories applies.
This QAPP presents the Data Quality Objectives (DQOs) established for the project; the
prescribed data collection methods and procedures; project management structure and protocol;
quality assurance and quality control procedures to be implemented during the project, including
use of a data management system; and the prescribed data assessment methodology. This QAPP
contains each of the elements presented in EPA Requirements for Quality Assurance Project
Plans for Environmental Data Operations, EPA QA/R-5 (EPA, 1994b) and described in EPA
Guidance for Quality Assurance Project Plans, EPA QA/G-5 (EPA, 1997). The DQOs
presented in Section A3 have been established in accordance with Guidance for the Data Quality
Objectives Process, EPA QA/G-4 (EPA, 1994c) and Data Quality Objectives Process for
Superfund, EPA 540-R-93-071 (EPA, 1993). Following the requirements of EPA QA/R-5 and
the guidance of EPA QA/G-5, this QAPP contains the essential elements of a CERCLA RI
sampling and analysis plan; field sampling methods and procedures are presented in Section B in
sufficient detail that a companion field sampling plan would be redundant.
This is an example Quality Assurance Project Plan. The referenced Superfund site and contractors are fictitious.
-------
EMCA/ECC Superfund Site
OU2 PCB Contamination RI
QAPP Revision No. 1
March 31, 1998
Page iii of x
A2 TABLE OF CONTENTS
A1 - A3 DOCUMENT INTRODUCTION i
A1 EXECUTIVE SUMMARY i
A2 TABLE OF CONTENTS iii
List of Figures vii
List of Tables vii
List of Acronyms/Abbreviations viii
A3 DISTRIBUTION LIST x
A4 - A9 PROJECT INTRODUCTION AND MANAGEMENT 1
A4 PROJECT/TASK ORGANIZATION 1
A4.1 Management Responsibilities 1
A4.1.1 EPA Region 4 Remedial Project Manager 1
A4.1.2 EPA Region 4 Remedial Site Manager 2
A4.1.3 Sandy Lowem & Associates Project Manager 2
A4.2 Quality Assurance Responsibilities 3
A4.2.1 EPA Quality Assurance Officer 3
A4.2.2 Sandy Lowem & Associates Quality Assurance Manager 3
A4.3 Field Responsibilities 4
A4.3.1 Field Team Leader 4
A4.3.2 Sandy Lowem & Associates Field Technical Staff 4
A4.4 Laboratory Responsibilities 5
A4.4.1 Laboratory Project Manager 5
A4.4.2 Laboratory Operations Manager 5
A4.4.3 Laboratory Quality Assurance Officer 6
A4.4.4 Laboratory Sample Custodian 6
A4.4.5 Laboratory Technical Staff 7
A5 HISTORICAL AND BACKGROUND INFORMATION 7
A5.1 Site Description 8
A5.2 Site Topography and Drainage 8
A5.3 Site Geology and Hydrogeology 9
A5.4 Past Data Collection Activities 9
A5.5 Applicable PCB Standards and Criteria 10
A6 PROJECT DESCRIPTION 11
A6.1 Project Tasks 11
A6.2 Work Schedule 13
A7 DATA QUALITY OBJECTIVES 13
A7.1 DQO Step 1: Statement of the Problem 14
A7.1.1 Historical and Background Information 14
A7.1.2 Conceptual Site Model 14
This is an example Quality Assurance Project Plan. The referenced Superfund site and contractors are fictitious.
-------
EMCA/ECC Superfund Site
OU2 PCB Contamination RI
QAPP Revision No. 1
March 31, 1998
Page iv of x
A7.1.3 Involved Parties, Resources, and Deadlines 15
A7.2 DQO Step 2: Decision Statement 16
A7.3 DQO Step 3: Inputs into the Decision 16
A7.4 DQO Step 4: Study Boundaries 17
A7.4.1 Sampling Depth 17
A7.4.2 Decision Areas 17
A7.4.3 Temporal Study Boundaries 18
A7.5 DQO Step 5: The Decision Rule 19
A7.6 DQO Step 6: Limits on Decision Error 19
A7.7 DQO Step 7: Design Optimization 21
A7.7.1 Composite Sampling 22
A7.7.2 Sampling Pattern 22
A7.7.3 Numbers of Samples 22
A8 SPECIAL PERSONNEL TRAINING REQUIREMENTS 23
A9 DOCUMENTATION, RECORDS, AND REPORTS 24
A9.1 Field Documentation 24
A9.2 Laboratory Documentation 25
A9.3 Management and QA Reports 25
A9.3.1 Monthly Progress Reports 25
A9.3.2 Audit Reports 25
A9.3.3 Data Validation Reports 26
A9.4 Final Report 26
B DATA ACQUISITION 27
B1 EXPERIMENTAL DESIGN 27
B1.1 Sample Matrix and Target Analytes 27
B1.2 Types, Numbers, and Locations of Samples 27
B1.3 Criticality of Measurements 28
B2 FIELD SAMPLING METHODS AND PROCEDURES 28
B2.1 Preparation for Field Work 28
B2.2 Support Organizations 29
B2.3 Presampling Survey of Decision Areas 30
B2.4 Selection and Surveying of Sampling Locations 30
B2.5 Sample Containers, Preservation, and Maximum Holding Times 32
B2.6 Field Quality Control Samples 33
B2.6.1 Field Duplicate Samples 33
B2.6.2 Performance Evaluation Samples 33
B2.7 Soil Sampling Procedures 34
B2.8 Field Documentation 37
B2.8.1 Sample Numbering System 37
B2.8.2 Field Forms 38
This is an example Quality Assurance Project Plan. The referenced Superfund site and contractors are fictitious.
-------
EMCA/ECC Superfund Site
OU2 PCB Contamination RI
QAPP Revision No. 1
March 31, 1998
Page v of x
B2.9 Handling Investigation-Derived Waste 39
B2.10 Field Corrective Action 40
B2.11 Health and Safety 40
B3 SAMPLE HANDLING AND CUSTODY 40
B4 ANALYTICAL METHOD REQUIREMENTS 43
B4.1 List of Target Analytes 43
B4.2 Method Sensitivity Requirements 44
B4.3 Required Equipment and Reagents 44
B4.4 Corrective Action Process for Analytical System Failure 45
B4.5 Laboratory Turnaround Time Requirements 46
B4.6 Safety and Hazardous Material Disposal Requirements 46
B4.7 Laboratory Data Report 47
B5 LABORATORY QUALITY CONTROL ELEMENTS 48
B5.1 Quality Control Checks and Procedures 48
B5.2 Quality Control Acceptance Criteria for Measurement Data 49
B5.2.1 Precision 49
B5.2.2 Accuracy 50
B5.2.3 Representativeness 51
B5.2.4 Comparability 51
B5.2.5 Completeness 52
B5.2.5.1 Completeness of Field and Laboratory Activities .... 52
B5.2.5.2 Data Quality Assessment Using the Chen Test 52
B6 INSTRUMENT EQUIPMENT TESTING, INSPECTION, AND MAINTENANCE
REQUIREMENTS 52
B7 INSTRUMENT CALIBRATION AND FREQUENCY 53
B8 INSPECTION/ACCEPTANCE REQUIREMENTS FOR SUPPLIES AND
CONSUMABLES 53
B9 DATA ACQUISITION REQUIREMENTS FOR NON-DIRECT
MEASUREMENTS 54
B10 DATA MANAGEMENT 54
B10.1 Data Recording 54
B10.2 Data Quality Assurance Checks 55
B10.3 Data Transformations 55
B10.4 Data Transmittal 56
B10.5 Data Analysis 57
B10.6 Data Tracking 57
B10.7 Data Storage and Retrieval 58
C ASSESSMENT/OVERSIGHT 59
C1 ASSESSMENT ACTIVITIES 59
C1.1 Technical Systems Audits 59
This is an example Quality Assurance Project Plan. The referenced Superfund site and contractors are fictitious.
-------
EMCA/ECC Superfund Site
OU2 PCB Contamination RI
QAPP Revision No. 1
March 31, 1998
Page vi of x
C1.2 Data Validation 60
C1.3 Data Quality Assessment 60
C1.4 Corrective Action Process and Responsibility 60
C1.4.1 Field Corrective Action 61
C1.4.2 Laboratory Corrective Action 61
C2 ASSESSMENT DOCUMENTATION AND REPORTS 62
C2.1 Corrective Action Request and Tracking Form 62
C2.2 Audit Reports 62
C2.3 Data Validation Reports 63
C2.4 Monthly QA Summaries 63
D DATA VALIDATION AND USABILITY 64
D1 DATA REVIEW, VALIDATION, AND VERIFICATION 64
D1.1 Sampling Design 64
D1.2 Sample Collection Procedures 65
D1.3 Sample Handling 65
D1.4 Analytical Procedures 65
D1.5 Quality Control 66
D1.6 Calibration 66
D2 VALIDATION AND VERIFICATION METHODS 67
D3 RECONCILIATION WITH DATA QUALITY OBJECTIVES 68
E REFERENCES 70
APPENDIXES
A Example Project Documentation Forms
Chain of Custody Form
Sample Jar Label and Custody Seal
Soil Sampling Data Sheet
Corrective Action Request and Tracking Form
B Example Worksheet for Establishing Soil Specimen Sampling Locations
C Chen Test Procedures for Data Quality Assessment
This is an example Quality Assurance Project Plan. The referenced Superfund site and contractors are fictitious.
-------
LIST OF FIGURES
EMCA/ECC Superfund Site
OU2 PCB Contamination RI
QAPP Revision No. 1
March 31, 1998
Page vii of x
1 Project Organization Chart
2 Site Map
3 Conceptual Site Model of OU2 PCB Contamination
4 Total PCB Concentrations Reported for Preliminary Soil Samples
5 Project Schedule
6 Data Quality Objective Process
7 Operable Unit 2 Decision Areas
8 Data Management Process
LIST OF TABLES
1 Health and Environmental Risks from PCBs
2 Sampling Plan Summary
3 Specific PCB Congeners in Aroclors
4 Project-Specific List of Target Analytes and Reporting Limits
5 Quality Control Sample Analyses and Acceptance Criteria
6 Internal Quality Assurance Assessment Activities
7 Explanation of Data Validation Qualifiers
This is an example Quality Assurance Project Plan. The referenced Superfund site and contractors are fictitious.
-------
EMCA/ECC Superfund Site
OU2 PCB Contamination RI
QAPP Revision No. 1
March 31, 1998
Page viii of x
LIST OF ACRONYMS/ABBREVIATIONS
bgs - Below Ground Surface
CAS - Chemical Abstract Service
CERCLA - Comprehensive Environmental Response, Compensation, and Liability Act (Superfund)
CERCLIS - CERCLA Information System
CFR - Code of Federal Regulations
CLP - Contract Laboratory Program
COC - Chain of Custody
CV - Coefficient of Variation
DA - Decision Area
DQA - Data Quality Assessment
DQO - Data Quality Objective
ECC - Energy Components Company, Inc.
ECD - Electron Capture Detector
EDD - Electronic Data Deliverable
EMCA - Electronic Manufacturing Corporation of America
EPA - U.S. Environmental Protection Agency
FS - Feasibility Study
GALP - Good Automated Laboratory Practices
GC - Gas Chromatograph
GPS - Global Positioning System
H&SP - Health and Safety Plan
HRS - Hazard Ranking System
IDW - Investigation-Derived Waste
IUPAC - International Union of Pure and Applied Chemistry
LIMS - Laboratory Information Management System
LOQ - Limit of Quantitation
MDL - Method Detection Limit
MS - Matrix Spike
MSD - Matrix Spike Duplicate
msl - Mean Sea Level
NIST - National Institute of Standards and Technology
NPL - National Priorities List
OSHA - Occupational Safety and Health Administration
OU1 - Operable Unit 1
OU2 - Operable Unit 2
PA - Preliminary Assessment
PCB - Polychlorinated Biphenyl
PE - Performance Evaluation
This is an example Quality Assurance Project Plan. The referenced Superfund site and contractors are fictitious.
-------
EMCA/ECC Superfund Site
OU2 PCB Contamination RI
QAPP Revision No. 1
March 31, 1998
Page ix of x
%R - Percent Recovery
PM - Project Manager
ppm - Parts Per Million
QA - Quality Assurance
QAM - Quality Assurance Manager
QAPP - Quality Assurance Project Plan
QC - Quality Control
RI - Remedial Investigation
RPD - Relative Percent Difference
RPM - Remedial Project Manager
RSM - Remedial Site Manager
SI - Site Inspection
SOP - Standard Operating Procedure
SSL - Soil Screening Level
TSA - Technical Systems Audit
TSCA - Toxic Substances Control Act
USGS - U.S. Geological Survey
VOC - Volatile Organic Compound
This is an example Quality Assurance Project Plan. The referenced Superfund site and contractors are fictitious.
-------
EMCA/ECC Superfund Site
OU2 PCB Contamination RI
QAPP Revision No. 1
March 31, 1998
Page x of x
A3. DISTRIBUTION LIST
EPA Region 4 Superfund Section
Joseph Braswell
Andrew Miller
Elizabeth Wall
Sandy Lowem & Associates
James Boyd
Susan Davis
Scott Michael
Sandy Lowem & Associates RI Project File
mailing address:
4 Hadley Way
Malcolm, VA 20151
Bunse & Burner Laboratory
Richard Allison
Bob O'Neill
Mark Roberts
Peter Rogers
mailing address:
12 Alison Lane
Helen, GA 30071
Other
City Environmental Department
Concerned Citizen Coalition
City Library (public depository)
This is an example Quality Assurance Project Plan. The referenced Superfund site and contractors are fictitious.
-------
EMCA/ECC Superfund Site
OU2 PCB Contamination RI
QAPP Revision No. 1
March 31, 1998
Page 1 of 70
QUALITY ASSURANCE PROJECT PLAN
FOR A PRELIMINARY REMEDIAL INVESTIGATION OF
OPERABLE UNIT 2 PCB CONTAMINATION
AT THE EMCA/ECC SUPERFUND SITE
A4 - A9 PROJECT INTRODUCTION AND MANAGEMENT
A4 PROJECT/TASK ORGANIZATION
The U.S. Environmental Protection Agency (EPA) Region 4 Superfund Section has overall
responsibility for the remedial investigation (RI) of polychlorinated biphenyl (PCB) contamination
at the Electronic Manufacturing Corporation of America (EMCA)/Energy Components Company,
Inc. (ECC) site. EPA's contractor, Sandra Lowem & Associates Environmental Consultants, Inc.
(Sandy Lowem & Associates), will perform the field investigation, evaluate the data, and prepare
project deliverables, including the RI report. The various quality assurance (QA) and
management responsibilities are divided between EPA and Sandy Lowem & Associates key
project personnel as defined below. The lines of authority between key personnel for this project
are shown on the project organization chart, Figure 1.
A4.1 Management Responsibilities
Project management responsibilities are divided among the EPA Region 4 Superfund Section
personnel and Sandy Lowem & Associates personnel described below.
A4.1.1 EPA Region 4 Remedial Project Manager
The EPA Region 4 Remedial Project Manager (RPM), Andrew Miller, has overall responsibility
for the investigation. He is responsible for granting final approval of project plans and reports and
seeing that plans are implemented according to schedule, and he has the authority to commit the
resources necessary to meet project objectives and requirements.
This is an example Quality Assurance Project Plan. The referenced Superfund site and contractors are fictitious.
-------
EMCA/ECC Superfund Site
OU2 PCB Contamination RI
QAPP Revision No. 1
March 31, 1998
Page 2 of 70
A4.1.2 EPA Region 4 Remedial Site Manager
The EPA Region 4 Remedial Site Manager (RSM), Joseph Braswell, has the responsibility to
ensure that technical, financial, and scheduling objectives are achieved successfully. The RSM
reports directly to the RPM and is the major point of contact and control for matters concerning
the project. The RSM performs the following tasks:
Define project objectives and develop a detailed work plan schedule
Establish project policy and procedures to address the specific needs of the project as
a whole and the needs of each task
Evaluate project and/or task staffing requirements and acquire EPA or contractor
resources as needed to ensure performance within budget and schedule constraints
Orient contractor personnel concerning the project's special considerations
Review work progress for each task to ensure that budgets and schedules are met
Review and analyze overall task performance with respect to task goals and objectives
Approve all plans and reports before their submission to the RPM for final approval
Represent the project team at meetings and public hearings
A4.1.3 Sandy Lowem & Associates Project Manager
The Sandy Lowem & Associates Project Manager (PM), James Boyd, is responsible for task
implementation and technical quality control (QC). As requested, the PM will assist the RSM in
carrying out appropriate RSM responsibilities listed above. In addition, the PM is responsible for
monitoring and directing the field teams and the Field Team Leader, preparing monthly progress
reports, updating and distributing revisions of this Quality Assurance Project Plan (QAPP) as
necessary, and performing or overseeing data evaluation activities and RI report preparation.
This is an example Quality Assurance Project Plan The referenced Superfund site and contractors are fictitious.
-------
EMCA/ECC Superfund Site
OU2 PCB Contamination RI
QAPP Revision No. 1
March 31, 1998
Page 3 of 70
A4.2 Quality Assurance Responsibilities
QA responsibilities are divided among the EPA Region 4 Superfund Section personnel and Sandy
Lowem & Associates personnel described below.
A4.2.1 EPA Quality Assurance Officer
The EPA QA Officer, Elizabeth Wall, will remain independent of direct job involvement and
day-to-day operations and will be available to resolve any QA issues that may arise. Specific
functions and duties of the EPA QA Officer include approving the contents of this QAPP and
subsequent revisions; reviewing QA reports prepared by Sandy Lowem & Associates, including
QA evaluations and discussions presented in the final RI report; and providing QA technical
assistance to the RPM and RSM.
A4.2.2 Sandy Lowem & Associates Quality Assurance Manager
The Sandy Lowem & Associates QA Manager (QAM), Susan Davis, reports directly to the PM
and will be responsible for ensuring that the QA/QC procedures described in this QAPP are
followed. In addition, the Sandy Lowem & Associates QAM will:
Maintain regular communication with the EPA QA Officer regarding QA issues
Report on the adequacy, status, and effectiveness of the QA program on a regular
basis to the PM (see Section C2.4)
Conduct two audits of field activities and two audits of laboratory activities (see
Section C1.1) and prepare audit reports
Validate each laboratory data report (see Section D2) and prepare data validation
reports
Ensure that corrective action, if necessary, is properly implemented and documented
(see Section C1.4).
This is an example Quality Assurance Project Plan. The referenced Superfund site and contractors are fictitious.
-------
EMCA/ECC Superfund Site
OU2 PCB Contamination RI
QAPP Revision No. 1
March 31, 1998
Page 4 of 70
A4.3 Field Responsibilities
The responsibilities of the Sandy Lowem & Associates field technical staff and Field Team Leader
are described below.
A4.3.1 Field Team Leader
The PM will be supported by the Field Team Leader, Scott Michael, who is responsible for
leading and coordinating day-to-day field activities. The Field Team Leader also will:
Coordinate and oversee the efforts of the subcontracted land surveyor (see Section
B2.3)
Ensure that each field team is properly equipped to execute the field sampling
methods and procedures described in Section B2 and the sample handling and custody
procedures described in Section B3
Prepare the tables and figures described in Section B2.4 for selecting and surveying
soil specimen sampling locations
Package coolers for shipment to the analytical laboratory as described in Section B3
Identify problems at the field team level, resolve difficulties in consultation with the
PM and QAM, implement and document corrective action procedures, and provide
communication between the field teams and upper management
Prepare sections of the final RI report that document field activities.
A4.3.2 Sandy Lowem & Associates Field Technical Staff
The field technical staff for this project will be drawn from Sandy Lowem & Associates' pool of
corporate resources. All of the designated technical team members are experienced professionals
who possess the degree of specialization and technical competence required to perform the
This is an example Quality Assurance Project Plan. The referenced Superfund site and contractors are fictitious.
-------
EMCA/ECC Superfund Site
0U2 PCB Contamination RI
QAPP Revision No. 1
March 31, 1998
Page 5 of 70
required work effectively and efficiently and to meet the training requirements described in
Section A8.
A4.4 Laboratory Responsibilities
For this project, Sandy Lowem & Associates has subcontracted Bunse & Burner Laboratory. The
responsibilities of Bunse & Burner Laboratory personnel are described below.
A4.4.1 Laboratory Project Manager
The Bunse & Burner Laboratory Project Manager, Mark Roberts, will:
Ensure resources of the laboratory are available on an as-needed basis
Carry out liaison activities and scheduling with the Sandy Lowem & Associates PM
Review and approve final analytical reports prior to submission to Sandy Lowem &
Associates.
A4.4.2 Laboratory Operations Manager
The Bunse & Burner Laboratory Operations Manager, Bob O'Neill, will report to the Laboratory
Project Manager and will:
Coordinate laboratory analyses
Supervise in-house chain-of-custody
Schedule sample analyses
Oversee data review
Oversee preparation of analytical reports.
This is an example Quality Assurance Project Plan. The referenced Superfund site and contractors are fictitious.
-------
EMCA/ECC Superfund Site
OU2 PCB Contamination RI
QAPP Revision No. 1
March 31, 1998
Page 6 of 70
A4.4.3 Laboratory Quality Assurance Officer
The Bunse & Burner Laboratory QA Officer, Richard Allison, has the overall responsibility for
data quality. The Laboratory QA Officer will:
Oversee laboratory quality assurance activities
Prepare laboratory standard operating procedures (SOPs) and see that they are
implemented
Conduct detailed review of analytical data and QA/QC documentation
Identify when laboratory corrective action is warranted and oversee its implementation
and documentation
Review analytical reports prior to submission to the Bunse & Burner Laboratory
Project Manager.
A4.4.4 Laboratory Sample Custodian
The Bunse & Burner Laboratory Sample Custodian, Peter Rogers, will report to the Laboratory
Operations Manager. The Laboratory Sample Custodian will:
Receive and inspect the incoming coolers, sample containers, and custody seals
Record the condition of the incoming coolers, sample containers, and custody seals
Sign the chain-of-custody (COC) form and other appropriate documents
Verify chain-of-custody and its correctness
Notify Laboratory Project Manager and Operations Manager of sample receipt and
inspection results
This is an example Quality Assurance Project Plan. The referenced Superfund site and contractors are fictitious.
-------
EMCA/ECC Superfund Site
OU2 PCB Contamination RI
QAPP Revision No. 1
March 31, 1998
Page 7 of 70
Assign a unique identification number and customer number to each sample and enter
each into the sample receiving log
With the help of the Laboratory Operations Manager, transfer samples to appropriate
laboratory sections
Control and monitor access/storage of samples and extracts.
A4.4.5 Laboratory Technical Staff
The Bunse & Burner Laboratory technical staff will be responsible for analyzing samples and
notifying the Laboratory QA Officer when the need for corrective actions is identified. The
laboratory technical staff will report directly to the Laboratory Operations Manager.
A5 HISTORICAL AND BACKGROUND INFORMATION
The subject property is a 275-acre site located in the southeastern United States. The property
was occupied from 1965 until 1985 by EMCA for production of electronic parts and
manufacturing equipment. During these 20 years of operation, EMCA used large quantities of
chlorinated solvents in the manufacturing process and in cleaning the products. Additionally,
EMCA began recycling substation transformers in the 1970s to recover copper. The transformers
each contained approximately 200 to 300 gallons of contaminated waste oil, which was composed
of a mixture of mineral oil and PCBs. To dispose of this waste oil, EMCA used it as a dust
suppressant and sprayed it over approximately 1,200 feet of a north-south oriented dirt road in the
northwest portion of the site. In 1985, EMCA relocated its operation to the Midwest and sold
the subject property to ECC. ECC operated a battery recycling/lead recovery business on the
property for 5 years before declaring bankruptcy in 1990. EMCA also went out of business in
1990. In 1991, the city detected volatile organic compounds (VOCs) in water supply wells to the
east of the site that primarily provide water to nearby light industry, and the city contacted EPA
with this finding.
This is an example Quality Assurance Project Plan. The referenced Superfund site and contractors are fictitious.
-------
EMCA/ECC Superfund Site
OU2 PCB Contamination RI
QAPP Revision No. 1
March 31, 1998
Page 8 of 70
After reviewing the site history and in response to the city's detection of VOC groundwater
contamination, EPA added the site to the Comprehensive Environmental Response,
Compensation, and Liability Act (CERCLA or Superfund) Information System (CERCLIS). EPA
then performed a Preliminary Assessment (PA) followed by a Site Inspection (SI) that included
limited soil sampling and analysis. Using the results of these activities, EPA evaluated the site
using the Hazard Ranking System (HRS), and the site was added to the National Priorities List
(NPL) as a Superfund site.
Prior to EPA's adding the site to the Superfund NPL, the local media widely publicized the
probable connection between EMCA's prior use of chlorinated solvents at the subject property
and the VOC groundwater contamination detected in the city's water supply wells. In addition, a
land developer had expressed interest in the subject property for residential development, and the
city encouraged this redevelopment effort as part of their brownfields initiative. Subsequently,
several newspaper articles focused on the potential exposure of future residents to PCB and VOC
contamination in surface soil and to VOC vapors. In response to the media attention and the
shutdown of several of the city's water supply wells, community interest and concern remain
elevated.
A5.1 Site Description
Figure 2 is a map of the EMCA/ECC Superfund site and surrounding area. The EMCA/ECC site
occupies approximately 275 acres in the center of a light industrial corridor. Active industrial
areas are located just south and north of the site, and the industrial well field that is owned and
operated by the city borders the EMCA/ECC site to the east. The remaining property east and
west of the EMCA/ECC site includes residential subdivisions with private lots that typically
occupy several acres. The EMCA/ECC property had been logged several times before EMCA
occupied the site, and the property currently is covered by a canopy of young pine trees with little
undergrowth.
This is an example Quality Assurance Project Plan. The referenced Superfund site and contractors are fictitious.
-------
EMCA/ECC Superfund Site
OU2 PCB Contamination RI
QAPP Revision No. 1
March 31, 1998
Page 9 of 70
A5.2 Site Topography and Drainage
Topographic contours and streams also are shown on Figure 2. There is approximately 20 feet of
relief in the study area with the ground surface generally sloping toward the east. Ground
elevations range from a high of just over 40 feet above mean sea level (msl) in the western portion
of the study area to below 20 feet msl in the northeast corner of the site. An unnamed stream flows
across the eastern portion of the study area from the south to the north. Three small tributaries to
this stream extend to the west into the central portion of the EMCA/ECC site. These tributaries
typically are dry and flow only during storm events and seasonal periods (typically spring) of high
water table. Other than the stream incisions, the northeastern third of the EMCA/ECC site is
relatively flat with elevations typically between 22 and 32 feet msl.
A5.3 Site Geology and Hydrogeology
Figure 3 is a conceptual site model of the EMCA/ECC site presenting the site geology and
hydrogeology on a schematic cross section. The unconsolidated sediments underlying the subject
property consist of silty sand deposits. These sediments typically are loose and have a relatively
low organic component. Underlying these sediments at a depth of approximately 10 to 30 feet
below ground surface (bgs) is a silty clay deposit that ranges in thickness from 0 to 15 feet. The
sediment beneath the clay consists of a sandy silt that is at least 80 feet thick.
Where the clay exists, it serves as a semiconfining layer that separates a surficial water table
aquifer and a semiconfined principal aquifer. The water table in the surficial aquifer exists within
the unconsolidated sandy sediments described above. The water table is situated at a depth of
about 10 feet bgs at the western part of the EMCA/ECC site. Surficial-aquifer groundwater flows
due east and, during periods of high water table, discharges to the streams described in Section
A5.2. Groundwater in the principal aquifer flows to the east-northeast.
A5.4 Past Data Collection Activities
A county groundwater appraisal report prepared by the U.S. Geological Survey (USGS) in 1954
provides the basic hydrogeologic framework of the study area. Several observation wells were
installed in the study area by the city in the early 1960s to evaluate water supply capacity of the
This is an example Quality Assurance Project Plan. The referenced Superfund site and contractors are fictitious.
-------
EMCA/ECC Superfund Site
OU2 PCB Contamination RI
QAPP Revision No. 1
March 31, 1998
Page 10 of 70
principal aquifer for the planned industrial corridor. In addition, several foundation geotechnical
investigations were performed in 1964 and 1965 for the EMCA manufacturing building and the
former EMCA/ECC administrative building. The information from these older geotechnical soil
test borings, plus those from the surrounding industrial sites, was used by the EPA contractor,
Sandy Lowem & Associates, to help define the basic stratigraphy and hydrogeologic
characteristics of the site described in Section A5.3.
Limited soil contamination information was developed by EPA in performing the SI. The findings
suggest that VOC soil contamination is limited to an area surrounding the manufacturing building
located in the southern part of the property. This southern area of VOC contamination has been
defined by EPA as operable unit 1 (OU1) for the Superfund RI/feasibility study (FS). The SI
analytical results also confirmed elevated levels of PCB contamination in surface soil along the
dirt road located in the north portion of the site. Additionally, PCBs were detected east
(downhill) from the dirt road in surface soil and in stream sediment. This northern area of PCB
soil contamination has been defined by EPA as operable unit 2 (OU2) for the CERCLA RI/FS.
The total PCB concentrations reported for the SI soil samples are shown in Figure 4.
A5.5 Applicable PCB Standards and Criteria
PCBs in surface soil present a potential risk to the site biota and to humans who may be exposed
to the contamination. Known and suspected risks of PCB exposure are summarized in Table 1.
Given the potential future residential use of the EMCA/ECC site, EPA's soil screening
methodology is applicable as documented in Soil Screening Guidance: Technical Background
Document (EPA, 1996a) and Soil Screening Guidance: User's Guide (EPA, 1996b). The soil
screening level (SSL) for total PCBs presented in the Technical Background Document is 1.0
ppm. Therefore, 1.0 ppm has been selected as an appropriate SSL for data assessment.
The disposal of PCBs is governed by the Toxic Substances Control Act (TSCA). Title 40, Part
761, Subpart G of the Code of Federal Regulations (CFR) contains EPA's PCB Spill Cleanup
Policy of 1987. However, the 1987 TSCA Spill Cleanup Policy does not apply to the
EMCA/ECC site PCB contamination because the policy applies only to releases of PCBs
occurring after May 4, 1987. Nevertheless, since 1990, the Superfund program has adopted an
This is an example Quality Assurance Project Plan. The referenced Superfund site and contractors are fictitious.
-------
EMCA/ECC Superfund Site
OU2 PCB Contamination RI
QAPP Revision No. 1
March 31, 1998
Page 11 of 70
approach to cleanup of PCBs that relies heavily on the TSCA policy. The TSCA PCB Spill
Policy at Section 761.120 recommends PCB spills be cleaned up to 1 ppm total PCBs on the
surface to a depth of 10 inches in the case of remediation for residential land use. Therefore, use
of EPA's Soil Screening Guidance and selection of 1.0 ppm as the SSL for data assessment are
consistent with the 1987 TSCA Spill Cleanup Policy.
A6 PROJECT DESCRIPTION
This section briefly describes the project tasks and the work schedule.
A6.1 Project Tasks
The following RI tasks have been established to address the Data Quality Objectives (DQOs)
presented in Section A7:
Task 1. Project planning and QAPP preparation. This QAPP represents the results
of initial project planning as summarized by the DQOs (Section A7). However,
the project planning/DQO process is iterative, and the DQOs and this QAPP
will be revised if warranted by information developed through execution of this
project.
Task 2. Health & Safety Plan and Community Relations Plan preparation. A
written Health & Safety Plan (H&SP) is required for hazardous site
investigations according to the Occupational Safety and Health Administration
(OSHA), CFR 1910.120(b). A project-specific H&SP currently is being
prepared. Section B2.11 discusses the minimum requirements of an H&SP. A
Community Relations Plan also is being prepared for this project. The
requirements of the Community Relations Plan will include the public meetings
listed below as Task 10.1.
Task 3. Survey Decision Areas. Before soil sampling begins, the boundaries of each of
the 54 decision areas (DAs) will be surveyed and marked as described in
Section B2.3. The DAs to be surveyed are described in Section A7.4.2.
This is an example Quality Assurance Project Plan. The referenced Superfund site and contractors are fictitious.
-------
EMCA/ECC Superfund Site
OU2 PCB Contamination RI
QAPP Revision No. 1
March 31, 1998
Page 12 of 70
Task 4. Collect soil samples. As described in Section A7.7, six composite soil samples
will be collected from each DA, and each composite sample will be formulated
from five soil specimens. Detailed soil sampling procedures are presented in
Section B2.7.
Task 5. Laboratory analysis of soil samples. The composite soil samples will be
submitted to Bunse & Burner Laboratory for analysis of individual PCBs
(congeners) by SW-846 Method 8082. The laboratory method and laboratory
requirements are described in Sections B4 through B6.
Task 6. Data validation. The laboratory analytical results will be subject to validation
to assess for bias and to review for completeness, representativeness, and
acceptable levels of precision and accuracy. The acceptance criteria for
measurement data are described in Section B5.2. Data validation procedures
are presented in Section D.
Task 7. Data quality assessment. The validated analytical results will be assessed
using the Chen test to evaluate whether decision error limits have been met
(Section A7.6) or whether additional valid sample results are required from
certain DAs in order to meet the decision error limits. Use of the Chen test for
data quality assessment (DQA) is further described in Section B5.2.5.2. If
necessary to meet decision error limits, a second round of sample collection,
analysis, and validation may be implemented.
Task 8. Data analysis and RI report preparation. After the DQA process has
verified that enough valid data have been generated to meet the decision error
limits, maps and tables will be prepared to illustrate those DAs that are to be
included in a subsequent RI phase to characterize subsurface soil contamination
and included in the FS to evaluate remedial alternatives for surface soil PCB
contamination cleanup. These tables and maps will be included in an RI report
that also documents field activities and the results of laboratory analyses, data
validation, and DQA.
Task 9. Auditing. Field and laboratory activities will be audited twice throughout the
project. These technical systems audits (TSAs) are further described in Section
C1.1.
This is an example Quality Assurance Project Plan. The referenced Superfund site and contractors are fictitious.
-------
EMCA/ECC Superfund Site
OU2 PCB Contamination RI
QAPP Revision No. 1
March 31, 1998
Page 13 of 70
Task 10. Project support
Task 10.1 Public meetings. As discussed in Section A7.1.3, public
participation is an important aspect of this project, and several
public meetings are planned throughout the project. The projected
timing of public meetings is presented in Section A6.2.
Task 10.2 Data management. Data management is a critical activity that
begins upon conception of the DQOs and continues through and
after the duration of the project. Data management procedures are
discussed in detail in Section BIO.
Task 10.3 Progress reports. Monthly progress reports will be prepared
throughout the duration of the project as discussed in Section
A9.3.1.
A6.2 Work Schedule
As discussed in Sections A7.1.3 and A7.4.3, EPA has made a commitment to the city to report
the results of this phase of the RI within 6 months of the issuance of Revision 1 of this QAPP.
The tasks described in Section A6.1 are shown in the project schedule, Figure 5, along with the
duration of each task. Because the laboratory turnaround time is 3 weeks and there is the
potential for a second round of surface soil sampling, there is little flexibility in the project
schedule if the 6-month deadline is to be met. Figure 5 also presents the anticipated timing of
TSAs and public meetings.
This is an example Quality Assurance Project Plan The referenced Superfund site and contractors are fictitious.
-------
EMCA/ECC Superfund Site
OU2 PCB Contamination RI
QAPP Revision No. 1
March 31, 1998
Page 14 of 70
A7 DATA QUALITY OBJECTIVES
DQOs are qualitative and quantitative statements derived from the output of the first six steps of
the DQO process shown in Figure 6. The DQO process is an iterative, strategic planning
approach designed to ensure that the type, quality, and quantity of environmental data used in
decision making are appropriate for the intended application. Once established, the DQOs are
used to develop a scientific and resource-effective data collection design.
A7.1 DQO Step 1: Statement of the Problem
This section presents historical and background information about the project, describes the
conceptual site model, and lists the involved parties, project resources, and deadlines.
A7.1.1 Historical and Background Information
Historical and background information relevant to the problem addressed by this QAPP is
presented in Section A5. In summary:
Waste oil contaminated with PCBs was sprayed for dust suppression along a
dirt road in the northern part of the property (OU2).
Preliminary sampling performed by EPA confirmed PCB contamination in soil
along the road, in soil downhill from the road, and in onsite stream sediment.
Preliminary sampling also suggests that most of the property may be free of
PCB contamination.
The site is under consideration for redevelopment as a residential neighborhood.
Community interest and concern are high.
A7.1.2 Conceptual Site Model
Figure 3 is a schematic cross section through the EMCA/ECC Superfund site showing the
conceptual model of OU2 PCB contamination. This figure displays the following concepts and
assumptions:
This is an example Quality Assurance Project Plan. The referenced Superfund site and contractors are fictitious.
-------
EMCA/ECC Superfund Site
OU2 PCB Contamination RI
QAPP Revision No. 1
March 31, 1998
Page 15 of 70
The highest concentrations of PCBs in soil (i.e., PCB source area) are along the
dirt road where PCB-contaminated waste oil was sprayed.
PCB-contaminated soil has migrated from the source to adjacent areas via
stormwater runoff and as windblown dust.
Adjacent areas contaminated with PCBs are predominantly downhill of the dirt
road.
PCB-contaminated soil has accumulated as sediment in streams.
Due to their low solubility and high sorption properties, PCBs have not
migrated into subsurface soil.
Groundwater flowing to the east in the surficial aquifer and, seasonally,
discharging to surface streams has not been affected by PCB soil contamination.
Groundwater flowing to the east in the principal aquifer has not been affected
by PCB soil contamination.
A7.1.3 Involved Parties, Resources, and Deadlines
The principal organizations involved in performing this PCB investigation include EPA; EPA's
contractor, Sandy Lowem & Associates; and Sandy Lowem & Associates' subcontracted
analytical laboratory, Bunse & Burner Laboratory. Specific roles and responsibilities of each team
member are described in Section A4. In addition to these organizations, this QAPP reflects
comments received from the city's environmental department and from representatives of the
concerned citizens coalition. Furthermore, several public meetings have been scheduled to be
held before, during, and after the activities described in this QAPP are executed. A project-
specific Community Relations Plan is currently in preparation.
A Superfund budget has been authorized for this phase of the RI that will support the scope of
services described in this QAPP plus a 20 percent contingency. EPA has made a commitment to
the city and to the concerned citizens coalition to report the results of this investigation within 6
months of the date of issuance of Revision 1 of this QAPP. The project schedule is discussed
further in Section A6.2.
A7.2 DQO Step 2: Decision Statement
The decision to be made from this investigation is to:
Determine whether PCB contamination in surface soils exceeds an acceptable
risk-based soil concentration.
EPA plans to perform a focused followup phase of the RI to characterize subsurface soil PCB
contamination and to evaluate whether groundwater quality is impacted or threatened by
subsurface soil PCB contamination. As part of the followup subsurface investigation, EPA will
confirm that areas screened out by the above decision statement do not contain PCBs in
subsurface soils. A separate QAPP and set of DQOs will be generated to address the subsurface
soil investigation.
A7.3 DQO Step 3: Inputs into the Decision
The following informational inputs are required to resolve the decision statement presented in
Section A7.2:
PCB concentrations in surface soil. This information will be gathered
through the sampling and analysis activities described in this QAPP.
Future land use scenario. As described in Section A5, a land developer has
expressed interest in the subject property for residential development.
Soil screening level. As discussed in Section A5.5, EPA's soil screening
guidance (EPA, 1996a and 1996b) presents a soil screening level of 1.0 ppm for
total PCBs. This soil screening level has been adopted because the residential
exposure scenario described in the soil screening guidance potentially applies to
this site.
Additional information will be required for the followup phase of the RI, which will be designed
to characterize subsurface soil PCB contamination. For example, soil characteristics such as soil
texture, dry bulk density, soil organic carbon, and pH will be assessed to evaluate whether
groundwater quality is threatened by subsurface soil PCB contamination.
A7.4 DQO Step 4: Study Boundaries
This section describes the planned soil sampling depths, the derivation and configuration of 54
DAs, and the temporal study boundaries of the project.
A7.4.1 Sampling Depth
As included in the conceptual site model (Section A7.1.2), PCBs are not expected to have
migrated into subsurface soil due to their low solubility and high sorption properties. EPA's soil
screening guidance (EPA, 1996a and 1996b) considers surface soil as the top 2 centimeters.
Considering the relatively loose, sandy soils at the EMCA/ECC Superfund site, however, it is
reasonable to assume that PCBs may be present somewhat deeper. Therefore, the sampling depth
selected for this investigation is 2 inches. As included in the DQO decision statement (Section
A7.2), subsurface soil (deeper than 2 inches) PCB contamination will be characterized in a
followup phase of the RI.
A7.4.2 Decision Areas
Figure 7 shows OU2 subdivided into 54 DAs. The DAs have been established based on the
likelihood of contamination as inferred from the conceptual site model (Figure 3) and from
previous analytical results (Figure 4). DAs are defined as one of the following:
A linear segment of the dirt road approximately 200 feet long
A plot of land 0.5 to 4.5 acres in size that may have received PCB
contamination from overspray, air-blown particles, and/or stormwater runoff
A reach of ephemeral stream approximately 300 to 400 feet long that may
contain deposits of PCB-contaminated soil.
Several DAs are larger than 0.5 acre, which is the suggested maximum size of "exposure areas"
presented in EPA's soil screening guidance (EPA, 1996a and 1996b). These larger DA sizes were
selected due to the funding and time constraints imposed on the project (Section A7.1.3).
However, the sizes of these DAs are considered appropriate because, as suggested by the
conceptual site model, the variability of PCB concentrations within each area is expected to be
low. If the variability within a DA is found to be greater than the acceptance criteria (see Section
A7.6, Limits on the Decision Error), then additional samples will be collected from the area or
from subdivisions of the area. Furthermore, if PCB contamination is found within any DA at the
perimeter of OU2, then the boundary of OU2 will be expanded by creating additional DAs for
subsequent sampling and analysis (see Section A6.2, Work Schedule).
A7.4.3 Temporal Study Boundaries
The latest date that PCBs are known to have been disposed of at the site is 1985. The low
volatility and low solubility of PCBs in soil and the fact that this contamination has been present for at
least 13 years provide temporal flexibility for executing this investigation, subsequent RI/FS
activities, and remediation. Nevertheless, EPA has made a commitment to the city to report the
results of this phase of the RI within 6 months of the date of issuance of Revision 1 of this QAPP.
Because the laboratory turnaround time is 3 weeks and there is the potential for a second round of
surface soil sampling (see Section A6.2 and Figure 5, Project Schedule), there is little flexibility in
the project schedule if the 6-month deadline is to be met.
An additional time constraint concerning collection of sediment samples from the ephemeral
streams is that the sampling must take place when the streams are not flowing. Other than during
large storm events, the streams flow during seasonal periods of high water table, which is typically
February through April. Therefore, sampling within the ephemeral stream DAs may have to be
conducted toward the end of field activities since field activities are scheduled to commence in
April.
A7.5 DQO Step 5: The Decision Rule
The following statements describe the decision rule to apply to this investigation:
If the mean concentration of total PCBs in surface soil (top 2 inches) averaged
over each DA exceeds the action level, then the area will be targeted for
characterization of subsurface soil contamination in a subsequent RI phase and
included in the FS to evaluate remedial alternatives for surface soil PCB
contamination cleanup.
Otherwise, the area will be characterized as not posing an unacceptable risk to
human health or the environment and will be dismissed from further RI/FS
activities.
The action level to be used in implementing this decision rule differs from the screening level
value of 1.0 ppm because of the way the limits on decision errors have been specified in the EPA
soil screening guidance (EPA, 1996a and 1996b) and adopted for this project. This is explained
in the next section.
A7.6 DQO Step 6: Limits on Decision Error
The default decision errors presented in EPA's soil screening guidance (EPA, 1996a and 1996b)
have been selected for this investigation. Before describing the probability limits on decision
errors, the issue of how the action level described in the decision rule for this project differs
from the soil screening level (SSL) must be addressed. EPA's soil screening guidance (EPA, 1996a and
1996b) identifies a default decision-making gray region of one-half to two times the SSL. Within
this gray region, relatively large decision error rates are considered tolerable with minor
consequences. Considering that the SSL identified for this project is 1.0 ppm, the lower bound
(one-half the SSL) of the gray region is 0.5 ppm, and the upper bound (two times the SSL) of the
gray region is 2.0 ppm. The upper bound of the gray region is where EPA specifies a tolerable
limit on the probability of making a decision error that would result in mischaracterizing a DA that
truly poses an unacceptable risk to human health and the environment. EPA believes that setting
the upper bound of the gray region at two times the SSL is appropriate because the SSLs are
sufficiently conservative.
The baseline condition (null hypothesis) adopted for this site is that the true mean contaminant
concentration for each DA is less than or equal to one-half the SSL (lower bound of the gray
region). The conceptual site model and existing soil analytical results suggest that this condition
should be valid for most of OU2. Moreover, this will allow the use of a robust statistical
procedure called the Chen test, which is described in the soil screening guidance. From a
statistical perspective, the "action level" therefore is 0.5 ppm, or one-half the SSL. That is, if the
data demonstrate convincingly that the true mean is significantly greater than 0.5 ppm, then the
baseline condition (null hypothesis) will be rejected and the DA will be subject to further
investigation and remediation. There is a chance that this will be an erroneous decision (a "false
positive" or Type I error), but the consequences are merely further investigation of a DA that in
truth does not pose an unacceptable risk.
From a site management perspective, EPA wants to ensure that whenever the results show that
the baseline condition cannot be rejected (i.e., the data do not provide conclusive evidence that
the mean is significantly above 0.5 ppm), the data provide sufficient evidence to allow EPA to
characterize that DA as not posing an unacceptable risk to human health or the environment.
Most of the time this will be clear (such as when most or all of the data values are "non-detects").
However, sometimes the baseline condition cannot be rejected because the data are inconclusive;
for example, the average of the data values is greater than 0.5 ppm, but not "significantly" greater.
Under these conditions, EPA wants to ensure that sufficient data have been collected so that there
is only a small chance of mischaracterizing a DA that could, in truth, pose an unacceptable risk.
Therefore, from a site management perspective, it may be useful to think of the upper bound of
the gray region as a "threshold" for controlling the chance of misclassifying DAs that truly pose an
unacceptable risk. By specifying a maximum tolerable probability of making a "false negative"
(Type II) error at 2 times the SSL and by ensuring that a proper DQA is performed whenever the
baseline condition is not rejected, EPA's concern for mischaracterizing DAs is addressed. The soil
screening guidance provides direction on how to perform calculations to ensure that the chance of
mischaracterizing a DA is sufficiently small, which avoids mistakes due to inconclusive data.
These two decision errors and the probability limits adopted for each can be summarized as
follows:
1. The Type I decision error occurs when the null hypothesis is incorrectly rejected
(false positive). With respect to the Chen test, the Type I error occurs when a DA
is incorrectly included in followup phases of the RI and evaluated in the FS for
remedial alternatives when, in actuality, the PCB concentrations do not pose an
unacceptable risk to human health or the environment. The limit set on the
probability that the Type I decision error will occur is 0.2 (20 percent) at 0.5 ppm,
the lower end of the gray region. Following the Chen test procedures ensures that
the Type I decision error limit is met.
2. The Type II decision error occurs when the null hypothesis is incorrectly accepted
(false negative). With respect to the Chen test, the Type II error occurs when a
DA is incorrectly dismissed from further RI/FS activities when, in actuality, the
PCB concentrations warrant further study and remediation. The limit set on the
probability that the Type II decision error will occur is 0.05 (5 percent) at 2.0 ppm,
the upper end of the gray region. As opposed to the Type I decision error,
hypothesis test procedures do not ensure that the Type II decision error limit is
met. The DQA statistical protocol must be used for each DA for which the
baseline condition was not rejected (see DQA activities under Section B5.2.5.2) to
demonstrate whether enough valid data have been generated to meet the Type II
decision error limit.
A7.7 DQO Step 7: Design Optimization
This section presents design-optimization details including the rationale for collecting composite
soil samples and the rationale for the sampling pattern and the number of samples. In general, the
soil screening guidance was used to guide professional judgments about the type of designs that
would be appropriate. Time constraints and design cost considerations precluded a more
exhaustive search for optimal designs.
A7.7.1 Composite Sampling
Because the objective of surface soil sampling is to estimate the mean contaminant concentration
for each DA, the physical "averaging" that occurs during compositing is consistent with the
intended use of the data. The PCB concentration in each composite sample should represent an
estimate of the mean PCB concentration for the DA because individual soil specimens that make
up a composite sample are collected from across the DA.
A7.7.2 Sampling Pattern
Each composite sample will be composed of five soil specimens. Within each DA, soil specimen
locations will be selected using a stratified random sampling procedure, and each composite
sample will be formulated using a random compositing scheme. Specific protocols for identifying
soil specimen locations and creating composite samples are presented in Section B2.4.
A7.7.3 Numbers of Samples
Six composite samples will be collected from each of the 54 DAs. This number of samples was
developed by assuming a conservatively high coefficient of variation (CV) of 2.5 for the soil
sample analytical results to be generated for each DA. This CV is thought to be conservative
because each of the composite samples should represent an estimate of the mean PCB
concentration for the DA, and each DA was constructed to avoid straddling areas that the
conceptual site model suggests would have different degrees of contamination. Specifying five
soil specimens per composite sample and using the Chen test with a CV of 2.5, Table 26 in EPA's
Soil Screening Guidance Technical Background Document (EPA, 1996a) indicates that six
samples per DA is the minimum sample size needed to achieve the decision errors prescribed in
Section A7.6.
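For illustration only, the following Python sketch simulates the decision-error framework described in Sections A7.6 and A7.7.3. It substitutes a simple one-sided t-test for the Chen test (which is defined in the soil screening guidance and is not reproduced here) and assumes a lognormal distribution of composite-sample results with a CV of 2.5; the gray-region bounds (0.5 and 2.0 ppm), the Type I limit of 0.20, and the sample size of six per DA come from the text above.

    # Illustrative sketch only. A one-sided t-test stands in for the Chen test,
    # and composite-sample results are assumed lognormal; neither assumption is
    # part of this QAPP.
    import numpy as np
    from scipy import stats

    RNG = np.random.default_rng(0)
    N_SAMPLES = 6              # composite samples per DA (Section A7.7.3)
    CV = 2.5                   # assumed coefficient of variation (Section A7.7.3)
    ALPHA = 0.20               # Type I error limit at 0.5 ppm (Section A7.6)
    LOWER, UPPER = 0.5, 2.0    # gray-region bounds in ppm (Section A7.6)

    def flag_da(results, action_level=LOWER, alpha=ALPHA):
        """Flag a DA for further RI/FS work if the sample mean is significantly
        above the action level (simplified one-sided test, NOT the Chen test)."""
        n = len(results)
        t_stat = (np.mean(results) - action_level) / (np.std(results, ddof=1) / np.sqrt(n))
        return t_stat > stats.t.ppf(1 - alpha, df=n - 1)

    def flag_rate(true_mean, n_trials=20000):
        """Estimate the fraction of DAs with the given true mean that get flagged."""
        sigma = np.sqrt(np.log(1 + CV**2))         # lognormal shape parameter from CV
        mu = np.log(true_mean) - 0.5 * sigma**2    # so the distribution mean equals true_mean
        data = RNG.lognormal(mu, sigma, size=(n_trials, N_SAMPLES))
        return np.mean([flag_da(row) for row in data])

    print("Estimated Type I rate at 0.5 ppm (limit 0.20):", flag_rate(LOWER))
    print("Estimated Type II rate at 2.0 ppm (limit 0.05):", 1 - flag_rate(UPPER))

Because this simplified test does not include the adjustments for highly skewed data that the Chen test provides, the simulated Type II rate will generally not meet the 5 percent goal. This is consistent with the statement in Section A7.6 that hypothesis test procedures do not by themselves ensure the Type II limit, which is why a DQA must be performed whenever the baseline condition is not rejected.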
A8 SPECIAL PERSONNEL TRAINING REQUIREMENTS
In addition to studying the methods and procedures described in Section B of this QAPP, each
field team member must be experienced or have received proper training on this project's
requirements for soil sampling, sample handling and custody, and field documentation. Each field
team member will have received the OSHA-required 40-hour hazardous waste site worker
training and will be current on the required 8-hour refresher training, medical monitoring, and
first-aid/CPR training. The Field Team Leader is required to have attended the OSHA 8-hour
hazardous waste site worker Supervisor Training Course. At least one field team member must
have received the training mandated by the U.S. Department of Transportation in association with
the International Air Transport Association for shipping hazardous materials. Sandy Lowem
& Associates has a rigorous training program and, as of the date of this QAPP, each of these
requirements has been satisfied.
The method chosen for analysis of PCBs in the laboratory (SW-846 Method 8082) is restricted to
use by, or under the supervision of, analysts experienced in the use of a gas chromatograph (GC)
and skilled in the interpretation of gas chromatograms. Each analyst must demonstrate his or her
ability to generate acceptable results with the method.
Bunse & Burner Laboratory has been subcontracted by Sandy Lowem & Associates to conduct
the analyses. This laboratory has performed Method 8082 or its predecessor method for over 10
years, has a comprehensive laboratory QA plan, and possesses the required sample preparation
and analytical equipment. Furthermore, Bunse & Burner Laboratory has experienced staff
members who have demonstrated continuous proficiency in analysis of PCBs in water and soil
matrices. Each analyst is required to satisfactorily analyze a PCB standard reference material in
the appropriate matrix before beginning a project. Bunse & Burner Laboratory has participated in
EPA-sponsored performance evaluation studies for 5 years and has consistently achieved high
ratings with respect to PCB identification and quantification.
A9 DOCUMENTATION, RECORDS, AND REPORTS
This section identifies the documents and reports to be generated throughout the investigation and
the information to be included in these documents and reports. A description of the data
management system established for this project, including a description of the types of data that
will be collected in this effort and their relationship to the final report, is presented in Section B10.
A9.1 Field Documentation
Field documentation requirements are fully described in Section B2.8. Examples of selected
forms are included in Appendix A. In summary, the field team will be responsible for maintaining
the following field documents:
Soil sampling data sheet
Sample container labels
COC forms
Health and safety documentation
Photograph log
Daily diary of activities in a bound field notebook.
Section B2.4 describes the methods to be followed by the field team to determine soil sampling
locations. As shown in Appendix A on the example soil sampling data sheet, the location where
each soil specimen is collected will be recorded as feet north and feet east of the southwest corner
of each DA. As backup documentation to these measurements, the field team will operate a
Global Positioning System (GPS) at each soil-specimen sampling location. At each location, the
field team will enter the soil specimen identification number into the GPS data logger, thereby
associating a sampling time and location with the specific soil specimen.
A9.2 Laboratory Documentation
The laboratory data reports will be consistent with current EPA Contract Laboratory Program
(CLP) documentation requirements (CLP forms not required). Each laboratory data report will
include a case narrative, an analytical results package, a copy of the completed COC form, and an
Electronic Data Deliverable (EDD). Section B4.7 provides a full description of the required
components of each of these four elements of the laboratory data reports.
A9.3 Management and QA Reports
Management and QA reports include monthly progress reports, audit reports, data validation
reports, and the final RI report.
A9.3.1 Monthly Progress Reports
Sandy Lowem & Associates will prepare a monthly progress report and submit it to EPA no later
than the 15th of the month following the period being reported. This report will state technical
and financial progress for the duration of the reporting period (1 month) and cumulatively for the
entire project. Narrative descriptions of work accomplished, problems encountered, and
projected work in the next reporting period will be included for each task in progress. The
monthly progress report also will contain a QA summary. The QA summary provides an
overview of the QA observations and findings presented in the QA reports and forms described in
the next two subsections. The QA summary also will indicate the status of corrective action
documentation and implementation (see Section C2.1), if any.
A9.3.2 Audit Reports
As described in Section C1, two TSAs will be conducted of field activities and two will be
conducted of laboratory activities. The auditor will prepare an audit report summarizing the
observations and findings of each of these audits. As needed, the audit reports will be
supplemented by a Corrective Action Request and Tracking Form(s) to correct each observation
and finding. These forms are further discussed in Section C2.1, and audit reports are further
discussed in Section C2.2.
A9.3.3 Data Validation Reports
A data validation report will be prepared for each laboratory data report generated. The data
validation report will identify biases inherent in the data including assessment of laboratory
performance and overall precision, accuracy, representativeness, and completeness. Data
validation flags will be applied to those sample results that fall outside of specified tolerance limits
and, therefore, do not meet the program's QA objectives. The data validation report will address
whether the quality of the flagged data affects the ability to use the data as intended. As needed,
data validation reports will be supplemented by a Corrective Action Request and Tracking
Form(s). These forms are further discussed in Section C2.1, and data validation reports are
further discussed in Section C2.3.
A9.4 Final Report
Following data validation and DQA activities, a final report on OU2 PCB contamination will be
prepared for this phase of the EMCA/ECC site RI. The final report will document field activities
and summarize the results of QA and DQA activities. The final RI report will specifically indicate
which DAs: (1) are characterized as not posing an unacceptable risk to human health or the
environment and dismissed from further RI/FS activities, and (2) are to be targeted for
characterization of subsurface soil contamination in a subsequent RI phase and included in the FS
to evaluate remedial alternatives for surface soil PCB contamination cleanup. The final report will
include maps and tables that illustrate those DAs that fall into each of these two categories.
Appendixes to the final report will include the laboratory analytical reports, data validation
reports, audit reports, and corrective action documentation.
The final RI report also will describe whether a second round of field activities was conducted.
The results of the DQA process may indicate that some DAs require additional surface soil PCB
data in order for decision error limits to be met. As discussed in Section A7.4.2, a second round
of field activities also would be warranted if PCB contamination is found within any DA at the
perimeter of OU2. In that situation, the boundary of OU2 would be expanded by creating
additional DAs for subsequent sampling and analysis.
B DATA ACQUISITION
B1 EXPERIMENTAL DESIGN
The experimental design and rationale were established through the DQO process as documented
in Section A7. The following sections present implementation details regarding that design.
B1.1 Sample Matrix and Target Analytes
Composite soil samples will be analyzed for concentrations of PCBs by SW-846 Method 8082.
The laboratory will report analytical results for at least 22 individual PCBs (congeners) as further
discussed in Section B4.1. Five soil specimens will make up each composite sample. Each soil
specimen will be collected at a unique location from the ground surface to 2 inches bgs. The site
soils typically are relatively loose silty sand.
B1.2 Types, Numbers, and Locations of Samples
Six composite soil samples will be collected from each of the 54 DAs shown in Figure 7. Several
categories of DAs have been established as follows:
Seven DAs are linear segments of dirt road.
Nine DAs are reaches of ephemeral stream.
Nine DAs are 0.5-acre plots that have the dirt road running through them.
Thirteen DAs are 1.0-acre plots that are adjacent to 0.5-acre plots.
Eight DAs are 2.0-acre plots that are adjacent to 1.0-acre plots.
Eight DAs are 4.5-acre plots that cumulatively make up the eastern half of OU2.
Within each DA, a random compositing scheme will be implemented to create six composite
samples, each made up of five soil specimens. The procedures for identifying soil specimen
sampling locations are detailed in Section B2.4.
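As a bookkeeping aid only, the short sketch below tallies the samples implied by this design, using the DA counts listed above, the one-per-twelve field duplicate frequency in Section B2.6.1, and the four PE samples in Section B2.6.2; any second round of sampling (Section A9.4) is excluded.

    # Bookkeeping sketch of planned sample counts; excludes any second-round sampling.
    DAS = {"road segments": 7, "stream reaches": 9, "0.5-acre plots": 9,
           "1.0-acre plots": 13, "2.0-acre plots": 8, "4.5-acre plots": 8}
    COMPOSITES_PER_DA = 6
    SPECIMENS_PER_COMPOSITE = 5

    total_das = sum(DAS.values())                               # 54
    primary_samples = total_das * COMPOSITES_PER_DA             # 324 composite samples
    soil_specimens = primary_samples * SPECIMENS_PER_COMPOSITE  # 1,620 specimen locations
    field_duplicates = primary_samples // 12                    # 27 (one per 12 primary samples)
    pe_samples = 4                                              # Section B2.6.2

    print(total_das, primary_samples, soil_specimens, field_duplicates, pe_samples)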
B1.3 Criticality of Measurements
Three types of measurements or observations will be made in the field and laboratory activities
described in this QAPP: (1) laboratory measurements of PCB concentrations in soil, (2) field
observations of soil appearance, and (3) field sampling locations measured using a GPS. Of these
three types of information, only the first, soil PCB concentrations, is considered critical to achieve
project objectives and limits on decision errors. Field observations of soil appearance are to be
recorded for information purposes only. Measurement of field sampling locations using a GPS is
to be performed essentially to validate that, within each DA, the field team successfully occupied
the general vicinity of each randomly selected soil specimen sampling location.
B2 FIELD SAMPLING METHODS AND PROCEDURES
This section describes the field procedures for collecting composite soil samples.
B2.1 Preparation for Field Work
Before field work begins, Sandy Lowem & Associates will establish field headquarters in the
onsite office building (see Figure 2). The headquarters will serve as the central point of
communication for project personnel as well as the temporary storage area for field equipment,
completed field documentation, soil samples not yet delivered to Bunse & Burner Laboratory, and
investigation-derived waste (IDW).
One room in the office building will be designated for clean field supplies to include an ample
stock of the following consumable equipment:
Laboratory-supplied, wide-mouth, 4-ounce glass sample jars with labels
Heavy-duty plastic spoons (large serving spoons)
Paper buckets
Personal protective equipment as required by the project H&SP
Custody seals and blank forms described in Section B2.8.2
Paper towels
Zip-sealing plastic bags, several sizes
Plastic garbage bags
Bubble wrap packing material
Clear packing tape and duct tape.
A separate room will contain a locking temperature-monitored refrigerator/freezer and reusable
equipment that may come in contact with site soils. As samples are collected and until they are
shipped to Bunse & Burner Laboratory, they will be stored in the refrigerator according to the
custody procedures described in Section B3. Reusable equipment includes the following:
Portable ice chests (coolers)
Cold packs (blue ice) and bags of water ice (regular ice)
Wooden forms (see Section B2.7)
Yard sticks or other flat rigid object longer than the wooden form.
An additional area within the office building will be used to calibrate, maintain, and store field
instruments required by the project H&SP (e.g., photoionization detector and calibration gas) and
the GPS instrumentation.
A staging area will be established outside the office building for storage of IDW. The staging area
will be set off with caution tape and will consist of covered 55-gallon drums that are labeled and
stored on plastic sheeting. The IDW will remain at this staging area until implementation of soil
remediation activities.
B2.2 Support Organizations
In addition to the organizations discussed in Section A4, other organizations that will provide
support on this project include:
Sample courier: Frederick's Express Service (Fred Ex)
Performance evaluation (PE) sample vendor: Amount Known, Inc.
Land Surveyor: Shootit Wright Surveyors.
B2.3 Presampling Survey of Decision Areas
The four corners of each DA will be identified and marked by a licensed land surveyor. The
surveyor will drive a labeled wooden stake into the ground that indicates the direction of each
adjacent DA. For example, one stake in the northwest portion of OU2 will read "SE of DA1; NE
of DA5; SW of DA8; NW of DA9." In addition, the surveyor will record the locations of these
stakes using GPS equipment and will present the survey data both in digital format and on survey
drawings. The staked corners of the DAs will be surveyed again using GPS equipment by the
field sampling team as each DA is occupied for sampling. The two sets of GPS measurements
will be compared so that the GPS measurements of soil sampling locations measured by the field
team can be calibrated to the land surveyor's records.
B2.4 Selection and Surveying of Sampling Locations
Before sampling begins in a DA, the Field Team Leader will prepare a schematic map and a table
of 30 pairs of sampling coordinates (six composite samples, five soil specimens per sample) for
the DA. A pin flag then will be set at each soil sampling location. Appendix B presents an
example worksheet for establishing soil specimen sampling locations. The procedures to be
followed for each DA are as follows (an illustrative coordinate-generation sketch appears after this list):
1. Using a random-number-generating computer program or preprinted table of
random numbers, obtain 60 random numbers ranging from 0 to 99. For 0.5- to
4.5-acre DAs that have a segment of dirt road or stream segment within, obtain an
additional 10 contingency random numbers.
2. On a piece of graph paper, draw a schematic of the DA and divide the DA along
its long dimension into five sectors of equal length and width. Label the sectors 1
through 5 beginning with the south sector for DAs elongated north-south or the
west sector for DAs elongated east-west.
3. Convert the first 30 random numbers into sampling coordinates along the long axis
of the DA. For DAs elongated north-south, these coordinates will be expressed as
feet north of the southwest corner of the DA. For DAs elongated east-west, these
coordinates will be expressed as feet east of the southwest corner of the DA. Six
random numbers will be converted into coordinates for each of the five DA sectors
using the following equation:
X = [(R/100)*(D/5)] + [(D/5)*(N-1)]
where
X = long-axis coordinate (feet from southwest corner of DA)
R = random number
D = long dimension of DA (feet)
N = sector number (1 through 5).
4. Following the format shown in Appendix B, create a table that assigns each of the
long-axis coordinates to a composite sample.
5. Convert the second 30 random numbers into sampling coordinates along the short
axis of the DA. For DAs elongated north-south, these coordinates will be
expressed as feet east of the southwest corner of the DA. For DAs elongated east-
west, these coordinates will be expressed as feet north of the southwest corner of
the DA. Six random numbers will be converted into coordinates for each of the
five DA sectors using the following equation:
Y = (R/100)*d
where
Y = short-axis coordinate (feet from southwest corner of DA)
R = random number
d = short dimension of DA (feet).
6. Assign each of the short-axis coordinates to a composite sample by filling in the
table created in Step 4 (see example, Appendix B).
7. Plot each of the soil-specimen sampling locations on the schematic figure of the
DA and label the locations within each sector 1 through 6 corresponding to the
assigned composite sample as listed on the generated coordinate table.
8. For those 0.5- to 4.5-acre DAs that have a segment of dirt road or stream segment
within, soil specimens are not to be collected from the dirt road or stream segment
(these features are identified as separate DAs). Coordinates that fall on these
features are to be discarded and replaced with a set of coordinates generated from
the contingency random numbers mentioned in Step 1.
9. Starting at the wooden stake identifying the southwest corner of the DA, place a
tape measure on the ground along the long axis of the DA. At each long-axis
coordinate generated in Step 3, identify each soil specimen sampling location using
the short-axis coordinates generated in Step 5 and a second tape measure placed
perpendicular to the first tape measure. Set a pin flag in the ground at each
location and, using an indelible marker, label the pin flag with the assigned
composite sample number identified in Step 6.
10. Record each of the 30 pin flag locations using a GPS instrument. The GPS
instrument will be operated according to the standard operating procedures
established by the manufacturer.
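The coordinate-generation procedure above can be scripted. The Python sketch below implements Steps 1 through 7 for one rectangular DA; the DA dimensions in the example call are hypothetical, the assignment of coordinates to composite samples is an assumed random scheme standing in for the Appendix B worksheet, and the road/stream exclusions in Step 8 and the field layout in Steps 9 and 10 are not handled.

    # Sketch of Section B2.4, Steps 1 through 7, for one rectangular DA.
    # The example dimensions are hypothetical; Step 8 exclusions are not handled,
    # and the composite-sample assignment is an assumed random scheme (Appendix B
    # is not reproduced here).
    import random

    def generate_locations(long_dim_ft, short_dim_ft, seed=None):
        """Return {composite number: [(long-axis ft, short-axis ft), ...]} with one
        soil specimen location per sector for each of six composite samples."""
        rng = random.Random(seed)
        locations = {c: [] for c in range(1, 7)}       # six composite samples
        for sector in range(1, 6):                     # five equal-length sectors (Step 2)
            composites = list(range(1, 7))
            rng.shuffle(composites)                    # random compositing scheme (Steps 4 and 6)
            for composite in composites:
                r_long = rng.randrange(100)            # random numbers 0-99 (Step 1)
                r_short = rng.randrange(100)
                # Step 3: long-axis coordinate within this sector
                x = (r_long / 100) * (long_dim_ft / 5) + (long_dim_ft / 5) * (sector - 1)
                # Step 5: short-axis coordinate anywhere across the DA width
                y = (r_short / 100) * short_dim_ft
                locations[composite].append((round(x, 1), round(y, 1)))
        return locations

    # Example: a hypothetical 1.0-acre DA roughly 400 feet by 110 feet
    for composite, coords in generate_locations(400, 110, seed=1).items():
        print("Composite", composite, coords)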
B2.5 Sample Containers, Preservation, and Maximum Holding Times
Table 2 summarizes the sampling plan, showing the types and numbers of samples to be
collected. This table also shows required sample containers, preservation, and maximum holding
times.
Each composite soil sample collected for laboratory analysis will be submitted in two 4-ounce
glass jars. The field sampling protocol requires that the soil be homogenized before the jars are
filled so that the contents of each of the two jars equally represents the composite soil sample.
Although only 2 to 30 grams of soil are required for a single PCB analysis by SW-846 Method
8082, two soil-filled jars are required for this project to: (1) protect against complete loss of a
sample in the event that one jar breaks during shipment, and (2) enable the laboratory to have
ample soil for samples selected for matrix spike (MS) and MS duplicate (MSD) analyses in case
the sample also has to be re-extracted.
As shown in Table 2, the maximum holding time for soil samples to be analyzed for PCBs is 14
days from the time the sample is collected to the time that it is extracted, and 40 additional days
from the time it is extracted to the time the extract is analyzed. Within the initial 14-day holding
period, the soil samples will not be affected by being stored in glass; plastic containers are not
appropriate for storage of the samples because long-term contact could result in container-
induced phthalate ester contamination of the soil, and the presence of phthalate ester compounds
could interfere with the PCB analysis.
The only requirement for preserving soil samples to be analyzed for PCBs is to maintain the
samples in a chilled state. As further described in Section B3, collected soil samples will be
placed in clean coolers that contain sufficient coolant to chill the samples to 4 ± 2 °C. Once
transferred to the onsite refrigerator and while stored at the analytical laboratory, the samples will
be stored and maintained at 4 ± 2 °C.
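Because these holding times and preservation requirements are also data validation criteria (Section A9.3.3), a simple date check of the kind sketched below could be applied to each laboratory data report; the dates shown are hypothetical and the check is illustrative only.

    # Holding-time check using the limits in Table 2 / Section B2.5:
    # 14 days collection-to-extraction, 40 days extraction-to-analysis.
    from datetime import date

    EXTRACTION_LIMIT_DAYS = 14
    ANALYSIS_LIMIT_DAYS = 40

    def holding_times_ok(collected, extracted, analyzed):
        """Return True if both holding-time limits are met."""
        return ((extracted - collected).days <= EXTRACTION_LIMIT_DAYS
                and (analyzed - extracted).days <= ANALYSIS_LIMIT_DAYS)

    print(holding_times_ok(date(1998, 4, 6), date(1998, 4, 17), date(1998, 5, 20)))  # True
    print(holding_times_ok(date(1998, 4, 6), date(1998, 4, 24), date(1998, 5, 20)))  # False (18 days to extraction)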
B2.6 Field Quality Control Samples
The following field QC samples will be collected to assess laboratory and field precision and
laboratory accuracy.
B2.6.1 Field Duplicate Samples
Field duplicate samples will be collected and analyzed to evaluate sampling and analytical
precision. Field duplicates will be prepared by filling two sets of sample bottles with the
homogenized soil. One of the bottle sets will be labeled as the primary sample and one will be
submitted to the laboratory blind with a fictitious sample identification number. The blind field
duplicate will be analyzed in the same manner as the primary samples. One field duplicate will be
collected for every 12 primary samples; that is, half of the DAs will have a blind field duplicate
included with their sets of six primary soil samples.
B2.6.2 Performance Evaluation Samples
A total of four double-blind solid-matrix PE samples will be submitted to the analytical laboratory
to evaluate analytical accuracy. The first PE sample will be submitted along with the first sample
shipment, and the remaining PE samples will be interspersed throughout the project at regular
intervals; i.e., the second, third, and fourth PE samples will be submitted after 18, 36, and 54 DAs
have been sampled, respectively. Double-blind PE samples will be prepared using National
Institute of Standards and Technology (NIST) traceable standards. The PE samples will contain
known concentrations of PCBs. Blind laboratory results will be evaluated against the Certificates
of Analyses by the Sandy Lowem & Associates QAM to ensure that the laboratory maintains
good performance. Double-blind PE samples will be obtained from a commercial vendor,
Amount Known, Inc. The PE samples will be shipped from the field to the analytical laboratory in
4-ounce glass jars identical to those used for the field samples. Although the amount of PE
sample provided by the vendor (50 grams) will not fill two glass jars, the PE sample will be split
into two jars. The resultant headspace will not affect the integrity of the sample. The PE samples
will be kept chilled and under custody until they are submitted to the analytical laboratory for
analysis of PCBs by EPA Method 8082. In addition to making an entry on the COC form under a
fictitious name, an entry will be made on the soil sampling data sheet described in Section B2.8.2
for each PE sample.
B2.7 Soil Sampling Procedures
Procedures for collection of six composite soil samples within each DA are as follows:
1. Begin the sampling procedure at each DA as follows:
1.1 Label six paper buckets (disposable paint buckets) with the identification
names to be applied to each of the six composite soil samples (see Section
B2.8.1). Place each bucket in a separate clean plastic bag (wastepaper
basket liner) and fold the bag over the bucket to protect the bucket from
cross-contamination. Place the six protected buckets into coolers equipped
with blue ice so that the buckets will not tip over. Onto the lid of each
cooler, tape a sketch that indicates the position of each bucket in the cooler
(bucket identification synonymous with soil sample identification).
1.2 Place a mark on the handle of an unused, rigid, large, plastic spoon (that is
dedicated to the composite sample) at a measured distance from its end
equal to 2 inches plus the thickness of the wooden form described in Step
2.1.
2. Conduct the following procedures at each of the six soil specimen sampling
locations within the first DA sector (one soil specimen is collected from each of
the five sectors for each composite soil sample):
2.1 Put on a clean pair of latex gloves (and/or other personal protective
equipment prescribed by the H&SP). Clear any pine needles or other loose
nonsoil material from the sampling location and place an untreated wooden
form on the cleared area. The form is to have an opening approximately 12
inches by 12 inches.
2.2 Using the plastic spoon that is dedicated to the composite sample (see Step
1.2), scrape soil from an approximately 6-inch diameter area in the center
of the form and place the soil into a clean quart-size zip-sealing plastic bag.
2.3 Continue scraping soil from the area and placing the soil into the quart-size
bag until a concave excavation has been created with a maximum depth of
2 inches. Measure the depth of the excavation by placing a yard stick (or
other flat rigid object longer than the wooden form) across the top of the
wooden form. Then place the handle of the plastic spoon into the deepest
point of the excavation and slide the yard stick against the handle. The
excavation is 2 inches deep when the mark placed on the spoon handle in
Step 1.2 is level with the bottom of the yard stick. To avoid cross-
contamination, do not measure the depth of the excavation with any object
other than the spoon that is dedicated to the corresponding composite soil
sample.
2.4 Using an indelible pen, mark the soil-filled level on the outside of the quart-
size bag.
2.5 Empty the contents of the quart-size bag into the paper bucket dedicated to
the composite sample. As much as possible, break apart any soil clods with
the plastic spoon.
2.6 Place the spoon into the bucket and place the bucket back into the larger
storage bag. Place the quart-size bag next to the bucket in the larger
storage bag, fold the storage bag over its contents, and return the bag to its
previous position in the chilled cooler.
2.7 Remove the pin flag from the ground.
3. Conduct the following procedures at each of the remaining 24 soil specimen
sampling locations within the DA:
3.1 At each soil specimen sampling location, follow procedures 2.1 through 2.3
being careful to use the correct set of disposable, dedicated equipment for
each of the six composite samples. As closely as possible, achieve the 2-
inch excavation depth just as the quart-size bag is filled to the mark placed
on the bag while sampling in the first sector.
3.2 Add the contents of the quart-size bag to the soil already in the paper
bucket. As much as possible, break apart any soil clods with the plastic
spoon and thoroughly homogenize the soil with the spoon.
3.3 Repeat Steps 2.6 and 2.7.
4. Conduct the following procedures for each of the six composite samples after a
soil specimen has been collected from each of the five sectors:
4.1 Fill two (four if a duplicate sample is being collected) unused laboratory-
provided 4-ounce glass jars with the homogenized soil using the plastic
spoon dedicated to that sample. Fill no more than one-quarter of each jar
at a time, alternating between the jars.
4.2 Once the jars are filled, follow the sample labeling and handling procedures
described in Section B3 and the waste handling procedures described in
Section B2.9.
B2.8 Field Documentation
Field documents will be kept by each field sampling team. Entries will be made in blue or black
indelible ink. Multiple-page documents will be consecutively numbered. Corrections will consist
of a single line-out deletion that is initialed and dated. If only part of a page or form is used, the
remainder of the page or form will have an "X" drawn across it and it will be initialed and dated.
B2.8.1 Sample Numbering System
Sample identification numbers will be assigned to each soil specimen and composite sample
collected. Each identification number will be unique and will consist of an alphanumeric string as
follows:
[ZZAA]-[BB]-[C]
where
ZZ = one- or two-character abbreviation that identifies the type of DA, as follows:
DA = 0.5- to 4.5-acre DA
R = dirt road segment DA
ST = stream segment DA
AA = two-character decision-area number as shown in Figure 7
BB = two-character sequential number differentiating the composite samples collected
from the same DA
C = one-character sequential number differentiating the soil specimens that make up each
composite sample.
For example, identification number ST09-04-5 represents the fifth soil specimen of the fourth
composite soil sample collected from stream segment decision area 09. The composite soil
sample associated with this soil specimen would have the identification number ST09-04.
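For illustration, the short sketch below constructs and parses identifiers in the [ZZAA]-[BB]-[C] format described above; the helper function names are hypothetical, and only the format itself comes from this section.

    # Sketch of the Section B2.8.1 numbering scheme; helper names are hypothetical.
    import re

    DA_TYPES = {"DA": "0.5- to 4.5-acre plot", "R": "dirt road segment", "ST": "stream segment"}
    ID_PATTERN = re.compile(r"^(DA|R|ST)(\d{2})-(\d{2})(?:-(\d))?$")

    def specimen_id(da_type, da_number, composite, specimen):
        """Build a soil specimen identification number."""
        return "{}{:02d}-{:02d}-{}".format(da_type, da_number, composite, specimen)

    def parse_id(sample_id):
        """Split a sample ID into its parts; specimen is None for composite-level IDs."""
        match = ID_PATTERN.match(sample_id)
        if not match:
            raise ValueError("unrecognized sample ID: " + sample_id)
        da_type, da_number, composite, specimen = match.groups()
        return {"da_type": DA_TYPES[da_type], "da_number": int(da_number),
                "composite": int(composite),
                "specimen": int(specimen) if specimen else None}

    print(specimen_id("ST", 9, 4, 5))   # ST09-04-5, the example given above
    print(parse_id("ST09-04"))          # composite sample ID (no specimen digit)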
As discussed in Section B2.6.1, field duplicate samples will be submitted to the laboratory as blind
QC samples. Therefore, fictitious sample identification numbers will be recorded on the sample
label and COC form (COC forms discussed in Sections B2.8.2 and B3). These fictitious numbers
will follow the same general format described above, but will include a sequential DA number that
is not presented in Figure 7. As discussed in Section B2.8.2, the actual depth and sequential
sampling location number associated with the field duplicate will be recorded on the soil sampling
data sheet, which will not be submitted to the analytical laboratory.
B2.8.2 Field Forms
Each field sampling team will be responsible for maintaining the following field forms:
An entry will be made on a soil sampling data sheet for each sample collected. The
intent of the soil sampling data sheet is to document the time that each soil
specimen is collected, any known deviation from the planned sampling location,
and other pertinent field observations associated with the soil specimen. For
samples submitted to the laboratory as blind QC samples with fictitious
identification numbers, the sampling data sheet provides documentation of the
primary sample associated with the blind QC sample. Appendix A of this QAPP
includes an example soil sampling data sheet.
Sample container labels, custody seals, and COC forms will be maintained as
described in Section B3. Appendix A of this QAPP includes an example container
label, custody seal, and COC form.
Health and safety documentation will be submitted as required by the project-
specific H&SP.
A photograph log will be kept that describes each subject image and the time and
date that the photograph was taken.
A field notebook will serve as a diary of field activities and record of pertinent data
not included on the other forms described above. Recorded information will
include general site conditions, daily weather, equipment used onsite, equipment
problems, description of field QC samples, handling and disposal of IDW, and
other relevant information.
B2.9 Handling Investigation-Derived Waste
Two types of IDW will be generated during this investigation: (1) excess soil remaining in paper
buckets after the sample jars are filled, and (2) disposable materials that have come in contact with
site soils. The latter category includes the dedicated sampling equipment described in Section
B2.7 as well as disposable personal protective equipment. These two categories of IDW will be
kept separated and stored in the 55-gallon drums described in Section B2.1. The IDW will be
treated and/or disposed of as a site remediation activity. Because all sampling equipment is
disposable, decontamination is not prescribed in this QAPP. If decontamination is required by the
H&SP, then the decontamination fluids will be contained in separate 55-gallon drums and also
treated and/or disposed of as a site remediation activity.
B2.10 Field Corrective Action
Corrective actions will be initiated if the field team is not adhering to the prescribed sampling or
documentation procedures or if laboratory analyses are experiencing interference or systemic
contamination due to field sampling procedures or sample handling protocol. Field corrective
action responsibilities and documentation requirements are discussed in further detail in Sections
C1.4.1 and C2.1.
B2.11 Health and Safety
Health and safety training requirements are discussed in Section A8. A written H&SP is required
for hazardous site investigations according to OSHA regulations (29 CFR 1910.120(b)). A project-specific
H&SP is being prepared and will be completed before field activities begin. The H&SP will
include the following elements:
Overview of the site history, project objectives and scope of work, and health and
safety responsibilities
Hazard assessment
Safety procedures
Field decontamination
Emergency response plan
Maps showing the work areas and route to hospital
Tables summarizing potential chemical hazards, action levels, and emergency
telephone numbers.
B3 SAMPLE HANDLING AND CUSTODY
Immediately after each sample jar is filled, the threads and the outside of the jar will be wiped
clean with a paper towel, the lid will be tightly screwed onto the jar, and the jar will be labeled.
The label to be affixed to each sample jar will indicate the sample identification number, the
sampling date and time, the sampler's initials, and the requested analysis. Additionally, a set of
three bar-code stickers will be applied to each sample: one bar-code sticker will be applied to
each of the two sample bottles and one will be applied to the COC form next to the entry for the
sample. A custody seal will be placed on each sample jar extending from the lid onto the glass
(not covering the label or bar-code sticker). Custody seals provide assurance that the samples have
not been tampered with before they are opened at the laboratory.
The glass jars will be securely packed in plastic bubble wrap and then sealed in zip-sealing plastic
bags. The samples then will be placed in a clean cooler and kept chilled until they are transferred
to the onsite refrigerator or packed for shipment to the laboratory. The coolers will contain
sufficient coolant to chill the samples to 4 ± 2 °C. Once transferred to the onsite refrigerator and
while stored at the analytical laboratory, the samples will be stored and maintained at 4 ± 2 °C.
The onsite refrigerator will be equipped with a high-low alarm system and temperature-recording
device.
For each sample to be submitted to the laboratory for analysis, an entry will be made on a COC
form. Information to be recorded includes sampling date and time, sample identification number,
requested analytes and methods, and sampler's name. The COC also will contain a bar-code
sticker for each sample that matches the bar-code stickers applied to the sample jars. Appendix A
includes a sample COC form.
Sampling team members will maintain custody of the samples until they are transferred to the
onsite refrigerator or the sample courier service. The COC form will accompany the samples
from the time of collection until they are received by the laboratory. Each party in possession of
the samples (except the professional courier service) will sign the COC form signifying receipt. A
copy of the original completed form will be provided by the laboratory along with the report of
results. The onsite refrigerator will be kept locked at all times when samples are in storage. The
COC form will indicate the dates and times that the samples were placed into the refrigerator and
retrieved from the refrigerator. If the refrigerator is opened when samples are in storage, the
responsible individual will sign a refrigerator custody log, enter onto the log the date and time the
refrigerator was unlocked and relocked, and list the group of samples that were stored when the
refrigerator was opened.
Samples will be shipped via overnight delivery service to Bunse & Burner Laboratory
approximately every other day. It is important that coolers are packed properly to prevent
breakage of sample containers and to maintain proper sample temperature. Standardized
procedures for packing sample coolers for shipment are as follows:
1. Place blue ice in the bottom of the sample cooler. Place layers of bubble wrap
over the blue ice. Line the cooler with an open plastic garbage bag, place the
samples upright inside the garbage bag and seal the bag.
2. Double-bag and seal loose ice in sealing plastic bags. Place the sealed bags of ice
outside the garbage bags containing the samples.
3. Pack any extra space in the cooler with packing material so that contents cannot
shift during handling, even after the ice used in the cooler loses its shape after
melting.
4. Enclose COC forms in a zip-sealing plastic bag and tape the bag to the inside of
the cooler lid. If more than one cooler is being shipped, note on the COC form
whether the contained information applies only to the samples within the individual
cooler or to those shipped in several coolers.
5. Seal the cooler with signed and dated custody seals so that the cooler cannot be
opened without breaking the custody seal. Place clear packing tape over the
custody seal to prevent incidental damage to the seal.
6. Tape the cooler shut with packing tape. Place duct tape over the cooler drain
plug, if there is one.
7. To ensure that the cooler does not run out of coolant while in the custody of the
overnight delivery service, the samples must be shipped for delivery on the next
calendar day. If a weekend or holiday will prevent delivery of the samples on the
next calendar day, retain custody of the samples in the onsite refrigerator until after
the weekend or holiday.
Upon receipt of the samples, the laboratory shall immediately notify the Sandy Lowem &
Associates PM if conditions or problems are identified that require immediate resolution. Such
conditions include container breakage, missing or improper COC forms, holding time
exceedances, custody seals that indicate potential tampering, or missing or improper sample
labeling.
B4 ANALYTICAL METHOD REQUIREMENTS
The analytical methods selected for this investigation are described in the December 1996 Update
III of Test Methods for Evaluating Solid Waste; Physical/Chemical Methods, SW-846 (EPA,
1996c). SW-846 Method 8082, "Polychlorinated Biphenyls by Gas Chromatography," will be
used to identify and quantify individual PCB congeners in the soil samples. The internal standard
calibration method will be used. A dual-column GC configuration will be used to allow
confirmation of target analyte identifications. Method 8082 will be used in conjunction with
Method 3541, "Automated Soxhlet Extraction," and Method 8000B, "Determinative
Chromatographic Separations." Method 8000B gives procedures for multiconcentration
calibrations, evaluating linearity of the calibration, establishing retention time windows, and
various QC aspects of analysis. Cleanup Methods 3660B and 3665A will be used as needed to
remove interfering elemental sulfur and phthalate ester contaminants, respectively, should they be
present.
B4.1 List of Target Analytes
Oil/PCB mixtures were sprayed on the soil at the EMCA/ECC site beginning about 25 years ago
and continuing until 13 years ago. Because the PCB compounds have weathered over time, the
relative distributions of the individual PCB congeners initially present in the Aroclor(s) (recognizable
groupings of congeners) have changed. Thus, identification and quantification schemes based on
recognition of Aroclor patterns will not be reliable and cannot be used. For such cases, Method 8082
recommends that individual congeners be identified instead. Their concentrations will be summed
to give a "total PCB" value in terms of ppm by weight.
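Conceptually, the "total PCB" computation is a simple sum over the reported congener concentrations. The short Python sketch below illustrates only the arithmetic; the congener names and concentrations are hypothetical, and the treatment of non-detects (excluded from the sum here) is an assumption for the example rather than a requirement stated in this QAPP.

    # Illustrative only: sum reported congener concentrations (ppm by weight)
    # into a "total PCB" value. Congener IDs and values are hypothetical.
    congener_results_ppm = {
        "PCB-28": 0.12,    # detected and quantified
        "PCB-118": 0.05,
        "PCB-153": None,   # None = not detected above the LOQ (assumed handling)
    }

    # Assumption: non-detects contribute zero to the total.
    total_pcb_ppm = sum(v for v in congener_results_ppm.values() if v is not None)
    print(f"Total PCB concentration: {total_pcb_ppm:.2f} ppm")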
Table 3 lists the PCB congeners present in the seven Aroclors that make up the Method 8082
Aroclor target analyte list. All but three of these congeners (IUPAC Numbers 12, 28, and 118)
are listed in SW-846 as having been tested by Method 8082, but SW-846 further states that
Method 8082 may be appropriate for additional congeners. Therefore, Bunse & Burner
Laboratory has performed method validation studies for these three congeners. Bunse & Burner
Laboratory will begin the project by providing analytical results for the three recently validated
congeners plus the 19 congeners found on the Method 8082 target analyte list. This project-
specific list of 22 target congeners is presented in Table 4. To improve the robustness of the "total
PCB" estimate, Bunse & Burner Laboratory will carefully review chromatograms for unidentified
PCB congeners and attempt to identify and quantify peaks with heights greater than 10 percent of
the nearest internal standard peak height. As necessary, Bunse & Burner Laboratory will perform
additional method validation studies for congeners not listed in Table 4 that are manually
identified in several samples, and these congeners will then be added to the project-specific target
analyte list.
B4.2 Method Sensitivity Requirements
Method 8082 is very sensitive and is appropriate for meeting the project DQOs. As described in
Section A7.5, the intent of the current project is to identify DAs for remediation by comparing
concentrations of "total PCBs" in surface soil to a soil screening level of 1 ppm and an associated
statistical gray region of 0.5 to 2.0 ppm. Soils with total PCB concentrations above the "action
level" of 0.5 ppm (see Section A7.6) would require remediation if the land were to be used for
residences.
The limit of quantitation (LOQ) is the lowest concentration that can be reliably determined within
specified limits of precision and accuracy. Analytical laboratories identify the LOQ for each
analyte using the method detection limit (MDL) and the procedures given in Section 5.0 of
Chapter 1, Quality Control, of SW-846. The Bunse & Burner Laboratory Method 8082 MDLs
and LOQs vary by congener and are shown in Table 4.
B4.3 Required Equipment and Reagents
The following equipment and reagents will be required to conduct soil sample preparation and
analyses for PCB congeners:
- Gas chromatograph. A dedicated analytical system complete with a GC suitable for
on-column and split-splitless injection and all required accessories including
syringes, autoinjectors, analytical columns, gases, electron capture detectors
(ECDs), and recorder/integrator or data system.
- Narrow-bore GC columns for dual-column analysis. Columns are specified in
SW-846 Method 8082, Section 4.2.1.
- Automated Soxhlet extraction system. This system is described in SW-846 Method
3541.
- Analytical balance, readable to the nearest 0.1 mg.
- Explosion-proof refrigerator to store extracts of soil samples awaiting analysis.
- Reagent-grade solvents (hexane and acetone) for extraction.
- Commercial calibration standards for each PCB congener that has been well-
characterized by Method 8082. Standards for other PCB congeners as necessary,
based on the findings of the initial analyses.
- Internal and surrogate standards (decachlorobiphenyl and tetrachloro-meta-xylene,
respectively).
- Miscellaneous laboratory glassware and supplies.
B4.4 Corrective Action Process for Analytical System Failure
Analytical system upsets caused by sample contaminants will be handled by the analyst in
consultation with the Bunse & Burner Laboratory Operations Manager and QA Officer. For
failures of the GC's mechanical, electronic, or thermal subsystems, Bunse & Burner Laboratory
technical staff will inform the Operations Manager who will in turn call on the manufacturer's
service representative for assistance in repairing and/or replacing failed components. Laboratory
corrective action responsibilities and documentation requirements are discussed in further detail in
Sections C1.4.2 and C2.1.
B4.5 Laboratory Turnaround Time Requirements
Although the maximum recommended holding time for soil samples to be extracted is 14 days
(see Table 2), Bunse & Burner Laboratory will strive to extract each sample within 2 days of
receipt. Similarly, the sample extracts may be refrigerated and stored out of light for up to 40
days before analysis (Method 8082, Section 6.2), but Bunse & Burner Laboratory is contractually
required to issue a final laboratory data report containing the components described in Section
B4.7 within 3 weeks of sample receipt.
B4.6 Safety and Hazardous Material Disposal Requirements
All employees of Bunse & Burner Laboratory will follow the safety and industrial hygiene rules
specified in the company's safety manual and standard operating procedures. Additionally, Bunse
& Burner Laboratory has strict protocols for handling and disposal of potentially contaminated
samples and has contracts with permitted waste transportation and waste disposal facilities.
These protocols are documented in Bunse & Burner Laboratory SOPs and are in conformance
with all federal, state, and local requirements. The company's safety manual and written SOPs are
available for review at the laboratory facility.
B4.7 Laboratory Data Report
The laboratory data reports will be consistent with current EPA CLP documentation requirements
(CLP forms not required). Portions of these reports are produced under software control, thus
enabling reproducibility of outputs. Full traceability is provided through sample codes: specific
raw data are fully traceable to sample identity and location through instrument identification and
time stamps. The laboratory data reports will include the following four
elements:
1. Case Narrative. It is the policy of Bunse & Burner Laboratory to fully document
any difficulties encountered during sample preparation and analysis. Case
narratives will be prepared from the laboratory notebook entries and from
information entered into the laboratory information management system (LIMS).
Case narratives will include the following information plus a complete description
of any difficulties encountered during sample handling and analysis:
- Date the laboratory data report is issued
- Laboratory analyses performed
- Deviations from intended analytical strategy
- Laboratory batch number
- Numbers of samples and respective matrices
- QC procedures used and references to the acceptance criteria
- Laboratory report contents
- Project name and number
- Condition of samples "as received"
- Discussion of whether or not sample holding times were met
- Discussion of technical problems or other observations that may
have created analytical difficulties
- Discussion of laboratory QC checks that failed to meet project
criteria, corrections made, and effectiveness of corrective actions
- Signature of the Bunse & Burner Laboratory QA Officer.
2. Analytical Results Package. The analytical results package will include the
following data and summary forms:
- Summary page indicating dates of analyses for samples and
laboratory QC checks
- Cross-reference of laboratory sample to project sample
identification numbers
- Description of data qualifiers
- Sample preparation and analysis methods
- Sample results
- Raw data for sample results and laboratory QC samples
- Results of (dated) initial and continuing calibration checks and GC
tuning results
- Results of laboratory QC analyses listed in Table 5
- Labeled (and dated) chromatograms/spectra of sample results and
laboratory QC checks.
3. Completed Chain-of-Custody Form.
4. Electronic Data Deliverable. The EDD will be formatted according to the
requirements of the data management system described in Section B10.
B5 LABORATORY QUALITY CONTROL ELEMENTS
B5.1 Quality Control Checks and Procedures
A number of QC checks will be required to ensure the quality of data generated by SW-846
Method 8082. In all cases, a second-column technique will be used to confirm the identities and
concentrations of PCB congeners in the soil samples. In the dual-column analysis technique, a
single injection of sample extract is split between two columns that are mounted in the same GC.
The chromatograph is dedicated to the project. Once the operating conditions are established for
the two columns, the same conditions will be used for analysis of samples and standards.
Agreement of retention-time-based identification of any PCB congener by both columns will be
required in order to report a value.
The laboratory also will evaluate blanks, calibration check standards, QC reference standards,
internal standards, surrogate standards, laboratory control standards, MSs, and MSDs. These QC
samples will be analyzed at various points in the analytical process, often as part of a sample
analysis batch of 20 samples or less. Table 5 describes each of these QC samples and lists their
frequency of use, control limits, and required corrective actions if control limits are exceeded.
Section B5.2 further discusses QC acceptance limits. Refer to Section B4.4 and to the methods
themselves for additional detail on laboratory corrective action.
B5.2 Quality Control Acceptance Criteria for Measurement Data
The QC limits set as project acceptance limits for measurement data are presented in the following
sections and listed in Table 5. These criteria will be used in data validation to assess whether the
program's QA objectives have been met and whether the quality of the flagged data affects the
ability to use the data as intended. Data validation procedures are presented in Section D2.
B5.2.1 Precision
Precision measures the reproducibility of repetitive measurements. It is strictly defined as the
degree of mutual agreement among independent measurements resulting from repeated
application of the same process under similar conditions. Precision acceptance limits for the QC
analyses discussed below are shown in Table 5.
Analytical precision is a measure of the variability associated with duplicate or replicate analyses
of the same sample in the laboratory and is evaluated by analysis of laboratory QC samples, such
as duplicate control samples, MSDs, and sample duplicates. If the recoveries of analytes in the
specified control samples are comparable within established control limits, then precision is within
limits.
Total precision is a measure of the variability associated with the entire sampling and analytical
process. It is evaluated by analysis of duplicate or replicate field samples and measures variability
introduced by both the laboratory and field operations. Field duplicate samples are analyzed to
assess field and analytical precision. One field duplicate will be collected for every 12 primary
samples; that is, half of the DAs will have a blind field duplicate included with their sets of six
primary soil samples.
Duplicate results will be assessed using the relative percent difference (RPD) between duplicate
measurements. RPD limits for laboratory MSD analyses will be 30 percent. If the RPD for
laboratory QC samples exceeds the established limit, data will be qualified as described in the
applicable validation procedure. If the RPD between primary and duplicate field samples exceeds
50 percent, data will be qualified as described in the applicable validation procedure (see Section
D2). The RPD will be calculated as follows:
RPD = (200) (X1 - X2) / (X1 + X2)
where X1 is the larger of the two observed values, and X2 is the smaller of the two observed
values.
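As an illustration of the RPD calculation and the project limits cited above (30 percent for laboratory MSD pairs, 50 percent for field duplicate pairs), a minimal Python sketch follows; the input values are hypothetical.

    def relative_percent_difference(x1, x2):
        """RPD = 200 * (X1 - X2) / (X1 + X2), where X1 is the larger value."""
        larger, smaller = max(x1, x2), min(x1, x2)
        if larger + smaller == 0:
            return 0.0
        return 200.0 * (larger - smaller) / (larger + smaller)

    # Hypothetical duplicate results (ppm) checked against the project limits.
    lab_msd_rpd = relative_percent_difference(1.10, 0.85)
    field_dup_rpd = relative_percent_difference(0.60, 1.10)
    print(f"MSD RPD = {lab_msd_rpd:.1f}% (limit 30%)")
    print(f"Field duplicate RPD = {field_dup_rpd:.1f}% (limit 50%)")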
B5.2.2 Accuracy
Accuracy is a statistical measure of correctness and includes components of random error
(variability due to imprecision) and systematic error. It reflects the total error associated with a
measurement. A measurement is accurate when the value reported does not differ from the true
value or known concentration of the spike or standard. Accuracy acceptance limits for the QC
analyses discussed below are given in Table 5.
Accuracy of laboratory analyses will be assessed by initial and continuing calibrations of
instruments and analysis of blanks, laboratory control samples, surrogate and internal standards,
MSs, and blind PE samples. Laboratory accuracy is expressed as the percent recovery (%R). If
the percent recovery is calculated to be outside of acceptance criteria, data will be qualified as
described in the applicable validation procedure (see Section D2). Percent recovery will be
calculated as follows:
%R = (100) (Xs - X) / T
where Xs is the measured value of the spiked sample, X is the measured value of the unspiked
sample, and T is the true value of the spike solution added.
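The percent recovery calculation can be sketched the same way; the spiked and unspiked results and the spike amount below are hypothetical, and the applicable acceptance window would come from Table 5.

    def percent_recovery(spiked_result, unspiked_result, true_spike_value):
        """%R = 100 * (Xs - X) / T."""
        return 100.0 * (spiked_result - unspiked_result) / true_spike_value

    # Hypothetical matrix spike: 2.0 ppm spike added to a sample measured at 0.4 ppm.
    r = percent_recovery(spiked_result=2.1, unspiked_result=0.4, true_spike_value=2.0)
    print(f"Percent recovery = {r:.0f}%")  # compare to the Table 5 acceptance limits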
Field accuracy often is assessed through the analysis of trip blanks, field blanks, and field
equipment blanks. Analysis of blanks monitors errors associated with the sampling process when
volatile compounds are under investigation or when sampling equipment is decontaminated
between samples and reused. Field accuracy will not be evaluated for this project because PCBs
are not volatile and, as described in Section B2, sampling tools will be previously unused and will
be disposed of after each use.
B5.2.3 Representativeness
Representativeness is the degree to which data accurately and precisely represent a characteristic
of a population, parameter variations at a sampling point, a process condition, or an
environmental condition.
Representativeness of data collection will be addressed by careful preparation of sampling and
analysis programs. This QAPP addresses representativeness by specifying sufficient and proper
numbers and locations of samples and incorporating appropriate sampling methodologies; the
sampling network has been designed to provide data representative of site conditions by
considering past waste disposal practices, existing analytical data, and physical setting and
processes. This QAPP further addresses representativeness by specifying appropriate laboratory
methods for the preparation and analysis of samples and establishing and following proper QA/QC
procedures.
B5.2.4 Comparability
Comparability is an expression of the confidence with which one data set can be compared to
another. The objective of comparability is to ensure that soil PCB data developed during the
investigation may be readily compared to other soil PCB data collected at this or other sites and
to applicable criteria or standards. This QAPP addresses comparability by specifying appropriate
field and laboratory methods that are consistent with the current standards of practice as approved
by EPA.
B5.2.5 Completeness
Completeness is the amount of valid data obtained compared to the amount that was planned.
The number of valid results divided by the number of planned results, expressed as a percentage,
determines the completeness of the data set. Completeness is calculated as follows:
%completeness = (100) (number of valid results / number of planned results).
B5.2.5.1 Completeness of Field and Laboratory Activities
The completeness acceptance criterion for collection of field samples is 100 percent; there are no
foreseeable obstacles to collecting six composite soil samples from each DA, with each composite
sample comprising five soil specimens. The percent completeness of laboratory performance will
be calculated upon completion of data validation and compared to a contractual acceptance
criterion of 95 percent or greater.
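The completeness check is a single ratio; the sketch below applies the 95 percent laboratory criterion from this section to hypothetical counts.

    def percent_completeness(valid_results, planned_results):
        """%completeness = 100 * (number of valid results / number of planned results)."""
        return 100.0 * valid_results / planned_results

    # Hypothetical example: 118 of 120 planned laboratory results survive validation.
    c = percent_completeness(valid_results=118, planned_results=120)
    print(f"Completeness = {c:.1f}% (laboratory criterion: >= 95%)")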
B5.2.5.2 Data Quality Assessment Using the Chen Test
The analytical results that are found to be of acceptable quality through data validation will be
used in the DQA process using the Chen test. The Chen test directions to be followed for the
valid data generated for each DA are presented in Appendix C. DQA using the Chen test is
considered an evaluation of whether the completeness acceptance criteria have been met in that
the Chen test identifies for a given data set whether or not the minimum sample size has been
obtained to achieve the decision error limits established in Section A7.6.
B6 INSTRUMENT/EQUIPMENT TESTING, INSPECTION, AND MAINTENANCE
REQUIREMENTS
The Bunse & Burner Laboratory technical staff analyst will be responsible for maintaining the GC
and related extraction equipment. Routine maintenance items may involve capillary GC column
rinsing, cleaning the metal GC injector body, and servicing the splitter connections. Other
maintenance will be performed as needed. Major maintenance will be conducted by the
manufacturer's service representative through the laboratory's maintenance and servicing
contract.
B7 INSTRUMENT CALIBRATION AND FREQUENCY
The GC system will be initially calibrated using calibration standards for individual PCB
congeners. A minimum of five different concentrations covering the expected working range will
be employed. Their concentrations will be related to the internal standard as described in SW-846
Methods 8000B and 8082. Each sample analysis session will be bracketed by an acceptable initial
calibration, calibration check standard(s) (each 12-hour shift), or calibration check standards
interspersed within the samples. Sample injection may continue for as long as the calibration
verification standards and standards interspersed with the samples meet QC requirements.
Standards will be analyzed after no more than 20 samples have been analyzed. The sequence will
end when the set of samples has been injected or when qualitative or quantitative QC criteria are
exceeded, at which time corrective action must be taken.
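The internal standard calibration and linearity evaluation described in Methods 8000B and 8082 can be sketched as computing a response factor for each calibration level and checking the spread of those factors. The sketch below is an assumed illustration only: the response factor form, the 20 percent RSD threshold, and all numbers are stated here as assumptions, not as project acceptance criteria, which are controlled by the cited methods and Table 5.

    import statistics

    def response_factor(area_analyte, conc_analyte, area_is, conc_is):
        # Relates analyte response to the internal standard response
        # (form commonly used for internal standard calibration).
        return (area_analyte * conc_is) / (area_is * conc_analyte)

    # Hypothetical five-point calibration for one congener:
    # (analyte area, analyte conc, internal std area, internal std conc).
    levels = [
        (10500, 0.05, 52000, 0.10),
        (21500, 0.10, 51800, 0.10),
        (54000, 0.25, 52500, 0.10),
        (108500, 0.50, 52100, 0.10),
        (215000, 1.00, 51900, 0.10),
    ]
    rfs = [response_factor(*lvl) for lvl in levels]
    rsd = 100.0 * statistics.stdev(rfs) / statistics.mean(rfs)
    print(f"Mean RF = {statistics.mean(rfs):.3f}, RSD = {rsd:.1f}%")
    # Assumed linearity check: an RSD at or below ~20% would allow use of the mean RF.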
B8 INSPECTION/ACCEPTANCE REQUIREMENTS FOR SUPPLIES
AND CONSUMABLES
The Bunse & Burner Laboratory technical staff analyst will be responsible for inspecting incoming
equipment and supplies before placing them in service. The manufacturer's specifications for
product performance and purity will be used as criteria for acceptance or rejection of supplies and
consumables.
B9 DATA ACQUISITION REQUIREMENTS FOR NON-DIRECT
MEASUREMENTS
No data obtained from non-measurement sources, such as computer databases, programs,
literature files, or historical databases, are needed for project implementation or decision
making.
B10 DATA MANAGEMENT
This section identifies the activities and processes planned for documenting the traceability of the
conclusions and information in the final report and data package to the data collected in this
project. This process is shown schematically in Figure 8.
B10.1 Data Recording
Data for this project will be collected by computer and by handwritten entries. Field observations
and records such as sample collection information and shipping data will be primarily recorded
manually using the forms described in Section B2.8.2 and shown in Appendix A. Additional field-
generated data include the coordinates of each soil specimen sampling location and of the
corners of each DA. These locations will be recorded by the GPS data logger. After they are
recorded by hand or by the GPS data logger, the field observations and records will be entered
into the Sandy Lowem & Associates computer data management system for subsequent
integration with other project data. This computer data management system was certified in 1995
by Snoopit Consulting, Inc., to satisfy EPA's Good Automated Laboratory Practices (GALP)
guidelines and has undergone no major revision since then.
Computer-generated data are primarily associated with laboratory activities and will be managed
under the control of the automated LIMS used by Bunse & Burner Laboratory. This data
management system is maintained to the manufacturer's current revision under contract to the
manufacturer and is certified to be compliant with EPA's GALP guidelines. This data system is
used to produce a majority of the components of the laboratory data reports described in Section
B4.7.
The data generated by the licensed land surveyor, Shootit Wright Surveyors, will be recorded
both manually (traditional survey records) and by the GPS data logger. Of these records, the only
pieces of information to be entered into the Sandy Lowem & Associates data management system
are the manually calculated coordinates and the GPS-measured coordinates (state plane
coordinate system) of the corners of each DA.
Integration of manually recorded and computer-recorded data will be done by the Sandy Lowem
& Associates data management system to produce data summary output for evaluation and for the
final RI report.
B10.2 Data Quality Assurance Checks
QA checks of data as early as possible in a project are essential to provide early warning of
potential problems. Several levels of QA checks are routinely performed by Sandy Lowem &
Associates staff, according to the type of data collected.
First, range checks are performed, as appropriate for computerized data operations; this permits
prompt alerts when out-of-range values are recorded. Range checks are automated at Bunse & Burner Laboratory
in that the LIMS immediately notifies the technical staff analyst of out-of-range data for each
monitored variable. Range checks also are programmed into the GPS equipment to be used on
this project. Manual field observations to be made during this project do not have value-range
limitations and, therefore, the project-specific field forms do not specify acceptable variable
ranges.
After the range checks, the LIMS performs appropriate relational checks (e.g., comparing
sampling records and analysis data against the required sample turnaround time) and warns the
operator so that appropriate corrective action can be taken as soon as possible.
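A minimal sketch of these two kinds of automated checks follows: a range check (here, a stored-sample temperature reading against the 4 ± 2 °C requirement in Section B3) and a relational check (collection versus extraction date against the 14-day holding time noted in Section B4.5). The function names and data are illustrative assumptions, not the LIMS implementation.

    from datetime import date

    # Range check: flag values outside an acceptable range (2 to 6 degrees C).
    def range_check(value, low, high):
        return low <= value <= high

    # Relational check: compare collection and extraction dates against the
    # 14-day extraction holding time (see Table 2 for the governing values).
    def holding_time_ok(collected, extracted, max_days=14):
        return (extracted - collected).days <= max_days

    print(range_check(value=4.5, low=2.0, high=6.0))             # True
    print(holding_time_ok(date(1998, 4, 1), date(1998, 4, 20)))  # False -> warn operator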
Additional QA assessment activities and data validation procedures established for this project are
discussed in detail in Sections C and D.
B10.3 Data Transformations
Conversions, also termed transformations, established in the data management system are all
reversible. Manually recorded observations and data are entered into the Sandy Lowem &
Associates data system as recorded; data entry screens match the respective field form. For this
project, transformations of manually recorded field records performed under control of the data
system will be restricted to reformatting and tabulation routines that do not involve mathematical
transformation calculations.
For the computerized data acquisition performed by the LIMS, sensor voltages are transformed to
chemical concentrations using a series of relationships. The relationships themselves are not subject to
change; the applicable parameters depend on the specific experimental conditions. These parameters are
specified by the operator or are established on the basis of routine calibration and then are
combined to produce the reported PCB concentrations. The relationships and parameter values
will be delivered as part of the final data package.
The GPS equipment uses transformations to convert satellite-based measurements to state plane
coordinates. These transformation routines are validated procedures built into the GPS data
loggers and associated software by the manufacturers.
B10.4 Data Transmittal
Observations and data manually recorded on the field forms described in Section B2.8.2 will be
faxed daily to Sandy Lowem & Associates' office by the Field Team Leader, and the originals will
be stored temporarily in a locked file cabinet in the field office. Data will be entered into the
Sandy Lowem & Associates data system daily for ongoing management.
The manually calculated coordinates and the GPS-measured coordinates of the corners of each
DA will be transmitted from Shootit Wright Surveyors to Sandy Lowem & Associates in the
surveyor's final report. This transmission will take place within 1 week of completion of field
survey activities, at which time the data will be entered into the Sandy Lowem & Associates data
management system.
Data entered into Bunse & Burner Laboratory's system are managed by the LIMS, beginning with
sample check-in on the sample-receiving data terminal. Certain data in the LIMS, such as sample
tracking information and final QA-checked analytical results in the proper EDD format, are
transmitted nightly to the Sandy Lowem & Associates data system. This transmission not only
ensures that project records are kept current but also provides a de facto data backup
capability. Checksums used in the commercially procured telecommunications software verify the
correctness of each packet of a transmission. The full laboratory data reports described in Section
B4.7 will be delivered to Sandy Lowem & Associates within 3 weeks of the laboratory's receipt
of the associated samples.
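The packet-level checksums are handled internally by the commercial telecommunications software; purely as an illustration of the idea, the sketch below verifies a transferred EDD file against a previously computed digest. The file name and the use of SHA-256 are assumptions for the example.

    import hashlib

    def file_digest(path):
        """Compute a SHA-256 digest of a file read in binary mode."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    # Hypothetical usage: compare the digest computed after transmission
    # with the digest recorded before transmission.
    # sent_digest = file_digest("edd_19980415.dat")      # at the laboratory
    # received_digest = file_digest("edd_19980415.dat")  # at Sandy Lowem & Associates
    # assert sent_digest == received_digest, "Retransmit the EDD file"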
B10.5 Data Analysis
Data analysis in this project will be performed by Sandy Lowem & Associates project staff using
add-on routines to the production version of the data management system. These add-on routines
follow the Chen test data analysis procedures discussed in Section D3. Additionally, the data
management system produces data summary tables showing user-selected parameters that can be
formatted according to numerous data comparison and sorting procedures. Data analysis results
such as mean PCB concentrations for each DA will be superimposed on the site map to assist with
interpretation of project data.
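As a sketch of the summary step described above, the following groups hypothetical validated total PCB results by DA and reports the mean for each; the DA labels and concentrations are invented for illustration.

    from collections import defaultdict

    # Hypothetical validated results: (decision area, total PCB concentration in ppm).
    results = [
        ("DA-01", 0.31), ("DA-01", 0.45), ("DA-01", 0.52),
        ("DA-02", 1.80), ("DA-02", 2.40), ("DA-02", 1.95),
    ]

    by_da = defaultdict(list)
    for da, conc in results:
        by_da[da].append(conc)

    for da, values in sorted(by_da.items()):
        mean_conc = sum(values) / len(values)
        print(f"{da}: mean total PCB = {mean_conc:.2f} ppm (n = {len(values)})")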
B10.6 Data Tracking
The Sandy Lowem & Associates data management system will include the milestones of planned
project activity and the numbers of samples to be collected. Routines in the data system will use
this information to assist the PM and the field teams by monitoring the progress of sample
collection and processing to ensure that samples are indeed collected as planned. This monitoring
will continue throughout laboratory sample analysis by tracking the specified turn-around times
for each sample.
B10.7 Data Storage and Retrieval
Once delivered to the Sandy Lowem & Associates office, the hard-copy originals of field forms
containing manually recorded information will be bound. The completed forms and notebooks
will be stored in the custody of the PM for the duration of the project, and the full laboratory data
reports submitted to Sandy Lowem & Associates will be stored in the custody of the Sandy
Lowem & Associates QAM. Bunse & Burner Laboratory will maintain possession of original
laboratory hard-copy documents and magnetic tape backups of GC data.
The project records entered into the Sandy Lowem & Associates data management system will be
downloaded weekly to a CD-ROM disk. This disk is stored in a locked fireproof cabinet under
custody of the Sandy Lowem & Associates PM.
Following the management policy of Sandy Lowem & Associates, project files will be archived
offsite at a secure facility for a minimum of 10 years following delivery of the final report. The
records will not be destroyed without written approval from EPA.
C ASSESSMENT/OVERSIGHT
CI ASSESSMENT ACTIVITIES
EPA has conducted a thorough management systems review of Sandy Lowem & Associates as
part of the contractor selection process. Similarly, Sandy Lowem & Associates implements an
ongoing management evaluation program of each regularly used subcontractor, including Bunse
& Burner Laboratory. These evaluations have established that the QA management structure,
policies, practices, and procedures of these organizations are adequate for ensuring that the type
and quality of data needed for this investigation can be obtained.
In addition to the management system reviews, the QA assessment activities presented below are
planned for this project. Table 6 lists these assessment activities and, for each activity, indicates
the frequency, number of assessments, timing, and responsible personnel.
Cl.l Technical Systems Audits
Two TSAs will be conducted of field activities and two will be conducted of laboratory
operations. The audits will be conducted by the Sandy Lowem & Associates QAM. The first
laboratory and field TSAs will be conducted within the first week of activity. The second audit
will be conducted approximately halfway through the program. Additional TSAs will be
scheduled if warranted by audit observations and findings.
A TSA is a thorough, systematic, onsite, qualitative audit of project systems. For both the field
TSA and the laboratory TSA, the QAM will develop a checklist to guide the audit that is based on
the requirements included in this QAPP. Field TSAs focus on the appropriateness of personnel
assignments and expertise; availability and proper use of field equipment; adherence to project-
controlling documents for sample collection, identification, handling, and transport; proper
collection and handling of QC samples; and adherence to established COC, equipment
decontamination, and documentation procedures. Laboratory TSAs include reviews of sample
handling procedures, internal sample tracking, SOPs, analytical data documentation, QA/QC
protocols, and data reporting.
C1.2 Data Validation
The laboratory analytical results will be subject to validation to assess for bias and to review for
completeness, representativeness, and acceptable levels of precision and accuracy. The
acceptance criteria for measurement data are described in Section B5.2. Data validation
procedures are presented in Section D2. The validation of data quality is, in part, based on the
analytical results of field duplicate soil samples submitted to the laboratory blind (laboratory is
unaware that the sample is a QC duplicate) and the analytical results of PE soil samples submitted
to the laboratory as double-blind samples (laboratory is unaware that the sample is a QC sample
and also does not know the spiked concentration in the sample). In addition to the blind field QC
samples, the validation of data quality is based on the results of laboratory QC procedures
discussed in Section B5 and shown in Table 5.
CI.3 Data Quality Assessment
As discussed in Section B5.2.5.2, the analytical results that are found to be of acceptable quality
through data validation will be used in the DQA process using the Chen test. The Chen test is a
statistical tool that evaluates whether the minimum sample size has been obtained to achieve the
decision error limits established in Section A7.6. The Chen test directions to be followed to
assess the valid data generated for each DA are further discussed in Section D3. The results of
the DQA process will be presented in the final RI report and, therefore, will not trigger the mid-
project corrective action or QA documentation described in the next two sections.
C1.4 Corrective Action Process and Responsibility
The first level of responsibility for identifying the need for corrective action lies with the field and
laboratory technical staff during routine sampling and analysis activities. The second level of
responsibility lies with any person observing deviations during field audits, while reviewing field
documentation, or while reviewing laboratory results (e.g., field observations made by the Field
Team Leader, deficiencies identified by the Bunse & Burner Laboratory QA Officer during
preparation and review of laboratory data reports, or observations and findings made by the
Sandy Lowem & Associates QAM during TSAs or data validation).
Each time the need for corrective action is identified, the problem will be documented on the
Corrective Action Request and Tracking Form as described in Section C2.1. The form indicates
the person(s) responsible for identifying, implementing, and assessing the effectiveness of the
corrective action. It is the responsibility of the Sandy Lowem & Associates QAM to track the
progress of the corrective action and update management on the progress (see Section C2.4).
Cl.4.1 Field Corrective Action
Corrective actions will be initiated if the field team is not adhering to the prescribed sampling or
documentation procedures or if laboratory analyses are experiencing interference or systemic
contamination due to field sampling procedures or sample handling protocol. Corrective actions
begin with identifying the source of the problem. Corrective action responses may include more
intensive staff training, modification of field procedures, or removal of the source of systemic
contamination. Once resolved, the corrective action procedure will be fully documented as
described in Section C2.1. In an extreme situation, a revision of this QAPP will be prepared and
distributed for implementation.
Cl.4.2 Laboratory Corrective Action
Analytical system upsets caused by sample contaminants will be handled by the analyst in
consultation with the Bunse & Burner Laboratory Operations Manager and QA Officer. Sections
3.0 and 7.11 of SW-846 Method 8000B discuss methods to reduce interferences, improve
performance, and maintain the GC system. Potential problems include carryover contamination
due to samples with unexpectedly high concentrations, elevated baseline problems, contamination
with high-boiling materials, and carrier gas contaminants. Additional cleanup of the sample
extract may be required. Changeout of columns and detectors may be required. All corrective
actions will be recorded in the laboratory notebook and reviewed periodically by the Bunse &
Burner Laboratory Project Manager and QA Officer. Once resolved, the corrective action
procedure will be fully documented as described in Section C2.1. In an extreme situation, this
QAPP will be revised and distributed for implementation.
For failures of the GC's mechanical, electronic, or thermal subsystems, Bunse & Burner
Laboratory technical staff will inform the Laboratory Operations Manager who will, in turn, call
on the manufacturer's service representative for assistance in repairing and/or replacing failed
components. This procedure also applies to failures (crashes) of the computerized data
acquisition system. Bunse & Burner Laboratory has multiple analytical systems available as
backup. Repair and replacement activities will be documented in the instrument logbook
maintained with each analytical or data acquisition system.
C2 ASSESSMENT DOCUMENTATION AND REPORTS
This section describes documentation and reporting requirements for the assessment activities
described in Section CI.
C2.1 Corrective Action Request and Tracking Form
Anyone who identifies the need for corrective action will document the nature of the problem on
a Corrective Action Request and Tracking Form. An example form is included in Appendix A.
The initiator will submit the form to the Sandy Lowem & Associates QAM. The QAM will
identify appropriate parties responsible for recommending, implementing, and evaluating a
corrective action strategy. Each of these corrective action steps will be documented on the form.
It is the overall responsibility of the Sandy Lowem & Associates QAM to track the progress of
the corrective action and update management on the progress (see Section C2.4). Once the
corrective action is deemed successful, the issue is closed out as indicated by the signatures of the
RSM, EPA QA Officer, Sandy Lowem & Associates PM, and Sandy Lowem & Associates QAM.
C2.2 Audit Reports
The Sandy Lowem & Associates QAM will prepare an audit report summarizing the observations
and findings, if any, of each of the TSAs. The audited group will be allowed to comment on the
audit reports before they are finalized. Each audit report will identify any observations or
findings that warrant initiation of a Corrective Action Request and
Tracking Form (see Section C2.1). The distribution list for audit reports will be the same as the
distribution list for this QAPP, except that Bunse & Burner Laboratory personnel will not receive
reports of field TSAs. Audit reports will be included as appendixes to the final RI report.
C2.3 Data Validation Reports
A data validation report will be prepared for each laboratory data report generated. Data
validation procedures are discussed in Section D2. The data validation report will address
whether the quality of the data is appropriate for the intended use of the data.
Each data validation report will include a tabulation of QA/QC issues, findings, and deficiencies.
The Sandy Lowem & Associates QAM will issue a draft report to the Bunse & Burner Project
Manager and QA Officer. These individuals will address QA/QC issues, findings, and deficiencies
that are found to be correctable (e.g., omitted information). After these corrections are made, the
final data validation report will be prepared. The final report will identify any
QA/QC issues, findings, and deficiencies that warrant initiation of a Corrective Action
Request and Tracking Form (see Section C2.1). The distribution list for data validation reports
will be the same as the distribution list for this QAPP. Data validation reports will be included as
appendixes to the final RI report.
C2.4 Monthly QA Summaries
As described in Section A9.3.1, Sandy Lowem & Associates will prepare a monthly progress
report and submit it to EPA no later than the 15th of the month following the period being
reported. The monthly progress report will contain a QA summary prepared by the Sandy
Lowem & Associates QAM. The QA summary will indicate the status of each deficiency,
if any, that is being tracked on a Corrective Action Request and Tracking Form. The QA summary also
will provide an overview of other QA/QC observations and findings identified within the reporting
period that have not warranted corrective action.
D DATA VALIDATION AND USABILITY
D1 DATA REVIEW, VALIDATION, AND VERIFICATION
The purpose of this section is to describe the process for documenting the degree to which the
collected data meet the project objectives, individually and collectively, and to estimate the effect
of any deviations on the ability to use the data for addressing the decision rule described in
Section A7.5.
Dl.l Sampling Design
For this project, the critical sampling design variable is that each composite sample is
representative of the intended DA; that is, that the state plane coordinates reported for each soil
specimen fall within the intended DA sector (see Section B2.4). Considering that the intent of the
sampling plan is to collect each soil specimen from a randomly selected location within a DA
sector, specimen sampling location errors are acceptable as long as the location still falls within
the sector boundaries and the errors are random.
As described in Section B2.3, the GPS data collected by the field sampling teams at each soil
specimen sampling location will be calibrated to the boundaries of each DA reported by the
licensed land surveyor. After this calibration is performed, a routine built into the Sandy Lowem
& Associates data management system will check whether each specimen sampling
location falls within the intended DA sector. The Sandy Lowem & Associates PM will review
this documentation, evaluate the suitability of each sample for use in the project, and accept or
reject each sample. The rationale of the PM's decision will be noted in the data management
system for any sample that contained at least one soil specimen falling outside the intended DA
sector.
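The location check can be reduced to a point-in-polygon test on the state plane coordinates. The ray-casting routine below is a generic sketch with hypothetical sector corners, not the routine built into the Sandy Lowem & Associates data management system.

    def point_in_polygon(x, y, corners):
        """Ray-casting test: return True if (x, y) lies inside the polygon
        defined by corners, a list of (x, y) vertices in order."""
        inside = False
        n = len(corners)
        for i in range(n):
            x1, y1 = corners[i]
            x2, y2 = corners[(i + 1) % n]
            if (y1 > y) != (y2 > y):
                x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
                if x < x_cross:
                    inside = not inside
        return inside

    # Hypothetical DA sector corners (state plane feet) and specimen locations.
    sector = [(100.0, 100.0), (200.0, 100.0), (200.0, 200.0), (100.0, 200.0)]
    print(point_in_polygon(150.0, 150.0, sector))   # True: specimen falls in the sector
    print(point_in_polygon(250.0, 150.0, sector))   # False: flag for PM review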
D1.2 Sample Collection Procedures
Deviations from the prescribed sampling procedures are noted on the sample collection forms
and/or in TSA reports. Such deviations may include inappropriate soil specimen sampling depths
or inconsistent soil specimen volumes that are homogenized to form a composite (see Section
B2.7). The documented deviations will be included in the computerized data management system.
The PM will review the rationale for the deviations, evaluate the suitability of each sample for use
in the project, and accept or reject each sample. The rationale of the PM's decision will be noted
in the data management system for any sample with noted deviations.
D1.3 Sample Handling
Deviations from the planned sample handling procedures (e.g., preservation, custody, and
transport as described in Section B3) will be noted on the COC forms and in the field notebooks.
These sample handling deviations will be included in the computerized data management system.
The PM will review the deviations, evaluate the suitability of each sample for use in the project,
and accept or reject each sample. The rationale of the PM's decision will be noted in the data
management system for any sample with noted deviations.
Sample handling information relevant to laboratory issues will also be supplied to the Sandy
Lowem & Associates data management system from Bunse & Burner Laboratory's LIMS.
Deviations noted by Bunse & Burner Laboratory are to be noted in the LIMS and in the
laboratory data report case narrative. Once integrated into the master data management system,
sample handling information is compared against specified variables (e.g., turnaround time) by the
data system software. Deviations are reviewed by the Sandy Lowem & Associates QAM as part
of the data validation process described in Section D2.
D1.4 Analytical Procedures
Deviations from SW-846 Method 8082 will be noted in the LIMS by the Bunse & Burner
Laboratory technical staff analyst or QA Officer and will be discussed in the laboratory data
report case narrative. Such deviations will include any change of conditions from the published
method (e.g., change in temperature of the capillary column). The LIMS entry and case narrative
will also contain the Bunse & Burner Laboratory QA Officer's recommended use of the
associated data. This information will be transferred into the Sandy Lowem & Associates data
management system and will be reviewed by the QAM as part of the data validation process
described in Section D2.
D1.5 Quality Control
Laboratory QC analyses are monitored by the LIMS; the specified QC samples must be analyzed
in response to the directives of the LIMS (work cannot continue until the specified QC sample is
analyzed). Since all samples and standards are uniquely identified by bar-code labels, the LIMS is
able to track each. In addition, the sample sequence in the autosampler is verified by the system
before the system "accepts" a batch of samples for processing.
Field QC samples submitted to the laboratory blind will be identified by the field teams on soil
sampling data sheets and in field notebooks. The true identity of these samples will be entered
into the Sandy Lowem & Associates data management system. Once the finalized analytical
results of primary soil samples, blind QC samples, and laboratory QC samples are transmitted
from the LIMS to the master data management system, the software will calculate applicable
parameters such as RPD and %R and compare the calculations to the QC acceptance limits
discussed in Section B5.2 and shown in Table 5. These data and QC calculations will be reviewed
by the Sandy Lowem & Associates QAM as part of the data validation process described in
Section D2.
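To illustrate how the blind field QC results feed these comparisons, the sketch below matches a hypothetical blind duplicate to its primary sample through a lookup of true identities and checks the pair against the 50 percent field-duplicate RPD limit from Section B5.2.1; the sample IDs and results are invented.

    # Hypothetical table of blind QC identities held in the master data system:
    # blind sample ID -> ID of the primary sample it duplicates.
    blind_identity = {"SS-1043": "SS-1017"}

    # Hypothetical validated total PCB results (ppm) keyed by sample ID.
    results_ppm = {"SS-1017": 0.92, "SS-1043": 1.35}

    def rpd(x1, x2):
        larger, smaller = max(x1, x2), min(x1, x2)
        return 200.0 * (larger - smaller) / (larger + smaller) if (larger + smaller) else 0.0

    for blind_id, primary_id in blind_identity.items():
        pair_rpd = rpd(results_ppm[blind_id], results_ppm[primary_id])
        status = "within limit" if pair_rpd <= 50.0 else "exceeds limit; qualify per Section D2"
        print(f"{primary_id}/{blind_id}: RPD = {pair_rpd:.1f}% ({status})")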
D1.6 Calibration
Field GPS instruments have been calibrated using validated procedures built into the GPS data
loggers and associated software by the manufacturers.
Laboratory calibration will be monitored by the LIMS, which has a record of the sample analysis
plan including the number, sequence, and acceptable concentration(s) of calibration samples. The
LIMS will monitor the sample loading of the autosampler for conformance to the planned
sequence and will "accept" only batches of samples that conform to the plan. The results of initial
and continuing calibrations will be transmitted from the LIMS to the master data management
system and will be included in the laboratory data report. The calibration data will be reviewed by
the Sandy Lowem & Associates QAM as part of the data validation process described in Section
D2.
D2 VALIDATION AND VERIFICATION METHODS
This section describes the process for verifying project data (i.e., determining that the data were
collected in a way that meets at least the specified QC acceptance criteria) and validating project
data (i.e., determining that the results are suitable for use in making the specified decision).
Data validation will be performed largely under the Sandy Lowem & Associates data management
system and will be reviewed and interpreted by the Sandy Lowem & Associates QAM. The
validation results will be presented in reports as described in Section C2.3. Data will be verified
by the PM through review of the validated data reports.
Each analytical laboratory report will be reviewed for compliance with the applicable method and
for the quality of the data reported. The EPA Contract Laboratory Program National Functional
Guidelines for Organic Data Review (EPA, 1994a) provides general data validation guidelines
that will be applied to the generated data. The data validation procedures described in the
Functional Guidelines are designed to review each data set and identify biases inherent in the data
including assessment of laboratory performance, overall precision and accuracy,
representativeness, and completeness. Data validation flags presented in these guidelines will be
applied to those sample results that fall outside of the QC acceptance criteria presented in Section
B5.2 and in Table 5. An explanation of the data flags is provided in Table 7. The following areas
of data validation will be applied to every laboratory data report using the automated routines in
the Sandy Lowem & Associates data management system:
- Data completeness
- Holding times
- Blanks
- Initial and continuing calibrations
- QC reference and internal standards
- Laboratory control samples
- MS/MSDs
- Surrogates
- Field and laboratory duplicates
- PE samples.
In addition to the above validation areas, a manual comparison of the EDD to the laboratory
hardcopy report will be performed, and the raw data for 25 percent of the laboratory analytical
results packages will be scrutinized according to the procedures in the Functional Guidelines to
verify compound identification and quantification.
D3 RECONCILIATION WITH DATA QUALITY OBJECTIVES
After the data are validated, data of acceptable quality will be statistically evaluated using the
Chen test (see Section A7.6). The Chen test directions to be followed for the valid data that were
generated for each DA are presented in Appendix C. This data analysis procedure provides an
overall reconciliation with the project DQOs because: (1) the only data used in the Chen test
analysis are those that, through extensive QC procedures and the data validation process, have
been demonstrated to meet project QA/QC acceptance criteria, and (2) it ensures that the limits
on the decision error established in DQO Step 6 (Section A7.6) have been met. The limit set on
the probability that the Type I decision error will occur is 0.2 (20 percent) at 0.5 ppm, the lower
end of the gray region. Following the Chen test procedures ensures that the Type I decision error
limit is met. The limit set on the probability that the Type II decision error will occur is 0.05 (5
percent) at 2.0 ppm, the upper end of the gray region. As opposed to the Type I decision error,
hypothesis test procedures do not ensure that the Type II decision error limit is met. The Chen
test statistical protocol must be used for each DA for which the baseline condition was not
rejected to demonstrate whether enough valid data have been generated to meet the Type II
decision error limit.
The results of the Chen test analyses will indicate which DAs will be: (1) characterized as not
posing an unacceptable risk to human health or the environment and dismissed from further RI/FS
activities, (2) targeted for characterization of subsurface soil contamination in a subsequent RI
phase and included in the FS to evaluate remedial alternatives for surface soil PCB contamination
cleanup, or (3) characterized as requiring additional surface soil PCB data before a determination
can be made within the established decision error limits as to which of the first two categories
applies. For the DAs falling into the third category, additional composite soil samples will be
collected until the total set of valid data indicate that the decision error limits are met. If
necessary, the DA will be subdivided into smaller DAs for separate evaluation in order to meet the
decision error limits.
E REFERENCES
U.S. EPA, 1993. Data Quality Objectives Process for Superfund, Interim Final Guidance. Office
of Emergency and Remedial Response. Washington, DC. EPA 540-R-93-071.
U.S. EPA, 1994a. Contract Laboratory Program National Functional Guidelines for Organic Data
Review. Office of Emergency and Remedial Response. Washington, DC. EPA-540/R-94-012.
U.S. EPA, 1994b. EPA Requirements for Quality Assurance Project Plans for Environmental Data
Operations, EPA QA/R-5. Office of Research and Development. Washington, DC.
U.S. EPA, 1994c. Guidance for the Data Quality Objectives Process, EPA QA/G-4. Office of
Research and Development. Washington, DC. EPA/600/R-96/055.
U.S. EPA, 1996a. Soil Screening Guidance: Technical Background Document. Office of Solid
Waste and Emergency Response. Washington, DC. EPA/540/R95/128.
U.S. EPA, 1996b. Soil Screening Guidance: User's Guide. Office of Solid Waste and Emergency
Response. Washington, DC. Publication 9355.4-23.
U.S. EPA, 1996c. Test Methods for Evaluating Solid Waste, Physical/Chemical Methods, SW-846,
Third Edition. Office of Solid Waste and Emergency Response. Washington, DC.
U.S. EPA, 1997. EPA Guidance for Quality Assurance Project Plans, EPA QA/G-5. Office of
Research and Development. Washington, DC. EPA/600/R-96/055.
U.S. Geological Survey, 1954. Ground Water Resources of Whoostano County. U.S.
Government Printing Office. Alexandria, VA.
-------
Figures
-------
Figure 1. Project Organization Chart
OU2 PCB Contamination RI, EMCA/ECC Superfund Site
-------
[Figure 2 is a site map of the EMCA/ECC Superfund Site showing the storage shed, manufacturing building, warehouse, and office building. Legend: unimproved roads, paved roads, streams, topographic contours (ft above MSL), land use (industrial, residential, site property, wellfield), and buildings. Scale: one inch = 700 feet.]
Figure 2. Site Map
EMCA/ECC Superfund Site
-------
[Figure 3 is a cross-section diagram of the conceptual site model: PCB-contaminated waste oil sprayed on a dirt road (the PCB source area); migration of PCBs via windblown dust; PCBs accumulating as stream sediment; minimal vertical migration of PCBs into the surficial aquifer (silty sand) due to low solubility and high sorption properties; a semiconfining layer (silty clay); and groundwater in the principal aquifer (sandy silt) not affected by PCB soil contamination. Groundwater in the surficial aquifer discharges to streams during periods of high water table.]
Figure 3. Conceptual Site Model of OU2 PCB Contamination at the EMCA/ECC Superfund Site
-------
[Figure 4 is a site map distinguishing Operable Unit 1 and Operable Unit 2 and showing preliminary soil sampling locations labeled with total PCB concentrations (mg/Kg); locations marked "nd" had no PCBs detected above the reporting limit (0.01 mg/Kg). Legend: unimproved roads, paved roads, streams, topographic contours (ft above MSL), land use (industrial, residential, site property, wellfield), and buildings. Scale: one inch = 700 feet.]
Figure 4. Total PCB Concentrations Reported for Preliminary Soil Samples
EMCA/ECC Superfund Site
-------
[Figure 5 is a Gantt-style schedule spanning February through September 1998 for the following tasks:
Task 1. Project planning and QAPP preparation
Task 2. Preparation of Health & Safety/Community Relations Plans
Task 3. Surveying decision areas
Task 4. Soil sample collection
Task 5. Laboratory analysis of soil samples
Task 6. Data validation
Task 7. Data quality assessment
If necessary to meet decision error limits:
Task 4a. Additional soil sampling
Task 5a. Additional laboratory analysis
Task 6a. Additional data validation
Task 7a. Additional data quality assessment
Task 8. Data analysis and RI report preparation
Task 9. Technical systems auditing (Task 9.1. Field audit 1; Task 9.2. Field audit 2; Task 9.3. Laboratory audit 1; Task 9.4. Laboratory audit 2)
Task 10. Project support (Task 10.1. Public meetings 1 through 3; Task 10.2. Data management; Task 10.3. Progress reporting)]
Figure 5. Project Schedule - OU2 PCB Contamination RI, EMCA/ECC Superfund Site
-------
Figure 6. Data Quality Objective Process
-------
[Figure 7 is a map of the Operable Unit 2 decision areas. Legend: unimproved roads, paved roads, streams, topographic contours (ft above MSL), and buildings; decision area types: 0.5-acre to 4.5-acre decision areas (e.g., DA07), dirt road segment decision areas, and stream segment decision areas. Scale: one inch = 400 feet.]
Figure 7. Operable Unit 2 Decision Areas
OU2 PCB Contamination RI, EMCA/ECC Superfund Site
-------
[Figure 8 depicts the data management process, relating samples, actions, and records.]
Figure 8. Data Management Process
-------
Tables
-------
Table 1. Health and Environmental Risks from PCBs
Human Health
Some evidence (limited human data) for increased breast cancer rates
Accumulation in adipose tissue, especially in fatty deposits of the liver
Known carcinogen in rats and mice (liver); nonmutagenic
Group 2A classification: limited human data and sufficient animal data to classify as a
carcinogen
Human evidence that it is a female teratogen
Diffusion of PCB-containing compounds through skin can cause irritation and
sensitization through biotransformation (acute topical exposures)
Hepatic microsomal enzyme disrupters not directly linked to hepatic carcinoma but
directly associated with promotion of thyroidal carcinoma
Immunosuppressive effect (especially on HIV, HBV, and other CMI-related
infections)
Environmental
Bioaccumulation in terrestrial vertebrates, especially second-order avian populations
(documented teratogenic effects)
-------
Table 2. Sampling Plan Summary

Sample type: Primary composite soil sample consisting of five homogenized soil specimens
  Frequency of collection: 6 from each of the 54 decision areas
  Number of samples: 324
  Parameters/analytical methods: 22 project-specific PCB congeners by SW-846 Method 8082
  Sample container: (2) 4-ounce glass jars
  Preservation*: 4 ± 2 °C
  Maximum holding time: 14 days from sample collection to extraction, 40 days from extraction to analysis

Sample type: Blind duplicate composite soil sample
  Frequency of collection: 1 per 12 primary samples (1 for every other decision area)
  Number of samples: 27
  Parameters/analytical methods: 22 project-specific PCB congeners by SW-846 Method 8082
  Sample container: (2) 4-ounce glass jars
  Preservation*: 4 ± 2 °C
  Maximum holding time: 14 days from sample collection to extraction, 40 days from extraction to analysis

Sample type: Double blind performance evaluation soil sample
  Frequency of collection: 1 upon project startup, and 1 at approximate 25%, 50%, and 75% completion points
  Number of samples: 4
  Parameters/analytical methods: 22 project-specific PCB congeners by SW-846 Method 8082
  Sample container: (2) 4-ounce glass jars
  Preservation*: 4 ± 2 °C
  Maximum holding time: 14 days from receipt by laboratory to extraction, 40 days from extraction to analysis

* Upon sample collection, samples will be maintained in a cooler with sufficient coolant to chill the samples to 4 ± 2 °C until placed in
the onsite refrigerator or delivered to the analytical laboratory. While in the onsite refrigerator and in the laboratory, samples will be
stored and maintained at 4 ± 2 °C.
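As a simple illustration of how the holding-time limits in Table 2 might be tracked, the sketch below compares collection, extraction, and analysis dates against the 14-day and 40-day limits. The function name and the example dates are hypothetical.

from datetime import date

# Hypothetical holding-time check based on the limits in Table 2:
# 14 days from sample collection to extraction, 40 days from extraction to analysis.
# (For the double blind PE samples, the 14-day clock starts at receipt by the laboratory.)
def holding_times_met(collected: date, extracted: date, analyzed: date) -> bool:
    days_to_extraction = (extracted - collected).days
    days_to_analysis = (analyzed - extracted).days
    return days_to_extraction <= 14 and days_to_analysis <= 40

# Example: a composite sample collected April 6, extracted April 17, and analyzed May 20, 1998.
print(holding_times_met(date(1998, 4, 6), date(1998, 4, 17), date(1998, 5, 20)))  # True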
-------
Table 3. Specific PCB Congeners in Aroclors
Congener
IUPAC
No.
Aroclor
1016
1221
1232
1242
1248
1254
1260
Biphenyl
-
X
2-CB
1
X
X
X
X
23-DCB
5
X
X
X
X
X
34-DCB
12
X
X
X
X
244'-TCB
28
X
X
X
X
X
22'35'-TCB
44
X
X
X
X
X
23'44'-TCB
66
X
X
X
233'4'6-PCB
110
X
23'44'5-PCB
118
X
X
22'44'55'-HCB
153
X
22'344'5'-HCB
138
X
22'344'55'-HpCB
180
X
22'33'44'5-HpCB
170
X
IUPAC = International Union of Pure and Applied Chemistry
-------
Table 4. Project-Specific List of Target Analytes and Reporting Limits

Congener                                   CAS Registry No.   IUPAC Number   Laboratory LOQ (ppb)   Laboratory MDL (ppb)
2-Chlorobiphenyl                           2051-60-7          1              3                      1.7
2,3-Dichlorobiphenyl                       16605-91-7         5              2                      0.8
3,4-Dichlorobiphenyl                       2974-92-7          12             0.7                    0.3
2,2',5-Trichlorobiphenyl                   37680-65-2         18             0.7                    0.3
2,4,4'-Trichlorobiphenyl                   7012-37-5          28             0.3                    0.2
2,4',5-Trichlorobiphenyl                   16606-02-3         31             0.3                    0.2
2,2',3,5'-Tetrachlorobiphenyl              41464-39-5         44             0.3                    0.2
2,2',5,5'-Tetrachlorobiphenyl              35693-99-3         52             0.3                    0.2
2,3',4,4'-Tetrachlorobiphenyl              32598-10-0         66             2                      0.8
2,2',3,4,5'-Pentachlorobiphenyl            38380-02-8         87             0.3                    0.2
2,2',4,5,5'-Pentachlorobiphenyl            37680-73-2         101            0.3                    0.2
2,3,3',4',6-Pentachlorobiphenyl            38380-03-9         110            0.3                    0.2
2,3',4,4',5-Pentachlorobiphenyl            31508-00-6         118            0.3                    0.2
2,2',3,4,4',5'-Hexachlorobiphenyl          35065-28-2         138            0.3                    0.2
2,2',3,4,5,5'-Hexachlorobiphenyl           52712-04-6         141            0.3                    0.2
2,2',3,5,5',6-Hexachlorobiphenyl           52663-63-5         151            2                      0.8
2,2',4,4',5,5'-Hexachlorobiphenyl          35065-27-1         153            0.3                    0.2
2,2',3,3',4,4',5-Heptachlorobiphenyl       35065-30-6         170            0.3                    0.2
2,2',3,4,4',5,5'-Heptachlorobiphenyl       35065-29-3         180            0.7                    0.3
2,2',3,4,4',5',6-Heptachlorobiphenyl       52663-69-1         183            0.3                    0.2
2,2',3,4',5,5',6-Heptachlorobiphenyl       52663-69-1         187            0.3                    0.2
2,2',3,3',4,4',5,5',6-Nonachlorobiphenyl   41086-72-9         206            0.3                    0.2

CAS = Chemical Abstract Service
IUPAC = International Union of Pure and Applied Chemistry
LOQ = Limit of quantitation
MDL = Method detection limit
-------
Table 5. Quality Control Sample Analyses and Acceptance Criteria

Method blank. PCB-free soil extracted and cleaned up with regular samples. Checks for contaminants in total analytical system.
  Frequency of application: At least once per batch of 20 samples.
  Acceptance criteria: Each target analyte not detected above the limit of quantitation.
  Laboratory corrective action: Reanalyze method blank once; reprepare and reanalyze samples that showed similar detections.

Solvent blank. Pure solvent or solvent mixture. Checks for column carryover of analytes.
  Frequency of application: Once per batch of 20 samples, immediately following the calibration standard.
  Acceptance criteria: Each target analyte not detected above the limit of quantitation.
  Laboratory corrective action: Clean system and reanalyze affected samples.

Calibration check standard. Solution of one or more target congeners at midpoint of calibration range. Also contains the internal standard compound.
  Frequency of application: After each batch of 20 samples.
  Acceptance criteria: Response factor within 15% of initial calibration.
  Laboratory corrective action: Stop analysis, make corrections, recalibrate, and reanalyze associated samples.

Quality control reference sample. Independently prepared mixture of congeners, including the internal standard. Checks accuracy of calibration standards responses.
  Frequency of application: Minimum of 1 per 20 samples, or 1 per batch if batch is less than 20 samples.
  Acceptance criteria: Compound recovery between 80% and 120%.
  Laboratory corrective action: Evaluate system and prepare/analyze a new set of calibration standards.

Check of internal standard response. An evaluation of the response of the internal standard (decachlorobiphenyl).
  Frequency of application: Each sample.
  Acceptance criteria: Area of internal standard peak within 50% of average calculated during calibration.
  Laboratory corrective action: Reanalyze all samples outside this limit on a different instrument to verify matrix effects.

Surrogate standard. Tetrachloro-meta-xylene added to each soil sample prior to extraction. Allows continual evaluation of congener recoveries.
  Frequency of application: Each sample.
  Acceptance criteria: Bunse & Burner Laboratory acceptance range is 60% to 130%.
  Laboratory corrective action: Reanalyze all samples outside this limit.
-------
Table 5 (continued)

Laboratory control standard (LCS). A clean matrix similar to the sample matrix and of same weight and volume. Spiked identically to matrix spike.
  Frequency of application: Once per batch of 20 samples.
  Acceptance criteria: Bunse & Burner Laboratory acceptance range is 80% to 120%.
  Laboratory corrective action: Reanalyze the LCS, evaluate extraction/cleanup procedure, reextract and reanalyze associated samples.

Matrix spike. A project soil sample spiked with one or more congeners to evaluate effect of soil matrix on recovery.
  Frequency of application: Once per batch of 20 samples.
  Acceptance criteria: Bunse & Burner Laboratory acceptance range is 75% to 125%.
  Laboratory corrective action: Evaluate extraction/cleanup to improve.

Matrix spike duplicate. Duplicate of matrix spike process using the same project soil sample.
  Frequency of application: Once per batch of 20 samples.
  Acceptance criteria: Compare to matrix spike results; relative percent difference less than 30%.
  Laboratory corrective action: Evaluate extraction/cleanup to improve.

Performance evaluation samples (blind from field). Evaluates analytical accuracy.
  Frequency of application: Four for entire project.
  Acceptance criteria: Limits provided by vendor, typically 75% to 125%.
  Laboratory corrective action: Evaluated during data validation. No immediate corrective action possible.

Field duplicate samples (blind from field). Evaluates overall precision.
  Frequency of application: One per 12 primary samples (one for every other decision area).
  Acceptance criteria: Relative percent difference less than 50%.
  Laboratory corrective action: Evaluated during data validation. No immediate corrective action possible.
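Several of the acceptance criteria in Table 5 reduce to simple recovery-window and relative-percent-difference checks. The sketch below illustrates two of them; the function names are hypothetical, the example concentrations are invented, and the relative percent difference formula shown is the conventional one (it is not spelled out in the table itself).

# Hypothetical helpers illustrating two of the Table 5 acceptance checks.
def recovery_within(percent_recovery: float, low: float, high: float) -> bool:
    """Check a recovery against an acceptance window, e.g., 60-130% for the
    tetrachloro-meta-xylene surrogate or 80-120% for the LCS."""
    return low <= percent_recovery <= high

def relative_percent_difference(result_1: float, result_2: float) -> float:
    """Conventional RPD: absolute difference divided by the mean of the pair, times 100."""
    mean = (result_1 + result_2) / 2.0
    return abs(result_1 - result_2) / mean * 100.0

# Example: a blind field duplicate pair of 0.62 and 0.88 ppm total PCBs.
rpd = relative_percent_difference(0.62, 0.88)
print(round(rpd, 1), rpd < 50)  # 34.7 True -- meets the <50% field duplicate criterion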
-------
Table 6. Internal Quality Assurance Assessment Activities

Type of assessment: Management Systems Review, EPA Contractor
  Frequency of assessment: None specific to this project
  Number of assessments: Review of Sandy Lowem & Associates performed as part of contractor selection
  Approximate date: March 1997
  Responsible personnel: Various EPA personnel

Type of assessment: Management Systems Review, Analytical Laboratory
  Frequency of assessment: None specific to this project
  Number of assessments: Review of Bunse & Burner Laboratory performed routinely by Sandy Lowem & Associates
  Approximate date: January 1998
  Responsible personnel: Various Sandy Lowem & Associates personnel

Type of assessment: Technical Systems Audits
  Frequency of assessment: Upon project startup and approximately at project midpoint
  Number of assessments: Two of field activities and two of laboratory activities
  Approximate date: April and May 1998*
  Responsible personnel: Sandy Lowem & Associates QA Manager

Type of assessment: Laboratory Analysis of QC Samples
  Frequency of assessment: See Table 5
  Number of assessments: See Table 5
  Approximate date: April through June 1998*
  Responsible personnel: Internal review by Bunse & Burner Laboratory QA Officer

Type of assessment: Data Validation
  Frequency of assessment: Once per laboratory data report
  Number of assessments: 10
  Approximate date: May through June 1998*
  Responsible personnel: Sandy Lowem & Associates QA Manager

Type of assessment: Data Quality Assessment Chen Test Analyses
  Frequency of assessment: Once per decision area
  Number of assessments: 54
  Approximate date: May through July 1998*
  Responsible personnel: Sandy Lowem & Associates PM

* If necessary to meet decision error limits, additional activities may be scheduled for July through August 1998.
-------
Table 7. Explanation of Data Validation Qualifiers

U - The analyte was analyzed for, but was not detected above the reported sample quantitation limit.
J - The analyte was positively identified; the associated numerical value is the approximate concentration of the analyte in the sample.
N - The analysis indicates the presence of an analyte for which there is presumptive evidence to make a "tentative identification."
NJ - The analysis indicates the presence of an analyte that has been "tentatively identified," and the associated numerical value represents its approximate concentration.
UJ - The analyte was not detected above the reported sample quantitation limit. However, the reported quantitation limit is approximate and may or may not represent the actual limit of quantitation necessary to accurately and precisely measure the analyte in the sample.
R - The sample results are rejected due to serious deficiencies in the ability to analyze the sample and meet quality control criteria. The presence or absence of the analyte cannot be verified.
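During validation, each reported result receives at most one of the qualifiers in Table 7. A minimal sketch of how a validator might attach the U flag to non-detects is shown below; the record layout and function name are hypothetical, and the other flags (J, N, NJ, UJ, R) would be assigned from the Functional Guidelines review in a similar manner.

# Hypothetical sketch: attach the "U" qualifier of Table 7 to results reported
# below the sample quantitation limit.
def qualify_nondetect(result_ppb: float, quantitation_limit_ppb: float) -> dict:
    qualified = {"result_ppb": result_ppb, "flag": None}
    if result_ppb < quantitation_limit_ppb:
        # Report the quantitation limit with a "U" flag rather than the raw value.
        qualified["result_ppb"] = quantitation_limit_ppb
        qualified["flag"] = "U"
    return qualified

print(qualify_nondetect(0.1, 0.3))  # {'result_ppb': 0.3, 'flag': 'U'}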
-------
Appendix A
-------
APPENDIX A
Example Project Documentation Forms
- Chain of Custody Form
- Sample Jar Label and Custody Seal
- Soil Sampling Data Sheet
- Corrective Action Request and Tracking Form
-------
Chain of Custody
[Example chain of custody form. The form provides blocks for the project manager, company, address, and billing information; an analysis request matrix; sampler signature and phone number; sample ID, date, time, matrix, lab ID, and number of containers for each sample; project information (project number, project name, PO number); relinquished-by and received-by signatures with printed names, dates, times, and companies; chain of custody seal condition; shipment method and condition on receipt; turnaround time (24 hours, 48 hours, 1 week, 2 weeks); lab number; sample disposal instructions (dispose, return, or pick up); and comments.]
-------
[Example sample jar label and custody seal. The I-CHEM jar label provides spaces for client/source; grab, composite, or other sample type; site name; date; sample number; time; analysis; preservative; and collector. The custody seal provides spaces for the date collected, time collected, the signature of the person collecting the sample, and the sample number.]
Sample Jar Label and Custody Seal
-------
Soil Sampling Data Sheet
EMCA/ECC Superfund Site, OU2 PCB Contamination RI
[Example data sheet with spaces for the sampling team members and page number, and columns for: decision area, composite sample number, and soil specimen number; composite sample identification number (type* and number); specimen coordinates in feet north and east of the SW corner of the decision area; sampling date and time; noted deviations from the planned sampling location; soil appearance and other observations; COC ID for the primary sample; and COC ID for the blind duplicate sample. A space is also provided for the COC ID of the blind PE sample.]
* DA = 0.5-, 1.0-, 2.0-, or 4.5-acre decision area; ST = stream segment decision area; R = dirt road segment decision area
-------
Corrective Action Request and Tracking Form
EMCA/ECC Superfund Site, OU2 PCB Contamination Rl
Problem
Date(s) Problem Identified:
Nature of Problem:
Originator
Name (print or type) Signature Date
Recommendation
Recommended Action and Timing:
Recommending Party
Name (print or type) Signature Date
Implementation
Date Corrective Action Began:
Implementing Party's Observations/Comments:
Implementing Party
Name (print or type) Signature Date
Evaluation
Evaluation of Corrective Action Effectiveness/Implementability:
Evaluator
Name (print or type) Signature Date
Approval
Sandy Lowem & Associates Project Manager
Name (print or type) Signature Date
Sandy Lowem & Associates QA Manager
Name (print or type) Signature Date
EPA Remedial Site Manager
Name (print or type) Signature Date
EPA QA Officer
Name (print or type) Signature Date
Corrective action tracking number.
-------
Appendix B
-------
APPENDIX B
Example Worksheet for Establishing Soil Specimen Sampling Locations
-------
Decision Area: DA17
Long Dimension Orientation: North-South
Long Dimension Length in feet (D): 418
Short Dimension Orientation: East-West
Short Dimension Length in feet (d): 209

X = [(R/100)*(D/5)] + [(D/5)*(N-1)]
Y = (R/100)*d

Random Numbers (R): 29, 41, 30, 22, 47, 39, 40, 40, 66, 91, 60, 40, 55, 3, 62, 42, 15, 67, 3, 15, 54, 87, 76, 46, 66, 18, 59, 61, 87, 40, 50, 84, 50, 99, 91, 25, 15, 52, 89, 78, 63, 3, 10, 4, 61, 67, 78, 52, 81, 79, 81, 11, 36, 98, 19, 13, 3, 29, 19, 95

Contingency Random Numbers for DAs with Intervening Road or Stream: 80, 94, 87, 31, 39, 53, 58, 79, 54, 25

DA Sector (N)   Composite Sample No.   R    X       R    Y
1               1                      29   24.2    22   46.0
1               2                      40   33.4    91   190.2
1               3                      55   46.0    42   87.8
1               4                      3    2.5     87   181.8
1               5                      66   55.2    61   127.5
1               6                      50   41.8    99   206.9
2               1                      15   96.1    78   163.0
2               2                      10   92.0    67   140.0
2               3                      81   151.3   11   23.0
2               4                      19   99.5    29   60.6
2               5                      41   117.9   47   98.2
2               6                      40   117.0   60   125.4
3               1                      3    169.7   15   31.4
3               2                      15   179.7   76   158.8
3               3                      18   182.2   87   181.8
3               4                      84   237.4   91   190.2
3               5                      52   210.7   63   131.7
3               6                      4    170.5   78   163.0
4               1                      79   316.8   36   75.2
4               2                      13   261.7   19   39.7
4               3                      30   275.9   39   81.5
4               4                      66   306.0   40   83.6
4               5                      62   302.6   67   140.0
4               6                      54   295.9   46   96.1
5               1                      59   383.7   40   83.6
5               2                      50   376.2   25   52.3
5               3                      89   408.8   3    6.3
5               4                      61   385.4   52   108.7
5               5                      81   402.1   98   204.8
5               6                      3    336.9   95   198.6

(N = sector number along the long dimension of the decision area.)
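The worksheet follows directly from the two formulas at its top. As an illustration, the short sketch below reproduces two of the DA17 entries from the same random numbers; the function name is hypothetical.

# Sketch of the Appendix B worksheet formulas for locating soil specimens:
#   X = (R/100) * (D/5) + (D/5) * (N - 1)   distance along the long dimension (ft)
#   Y = (R/100) * d                         distance along the short dimension (ft)
# D = long dimension of the decision area, d = short dimension,
# N = sector number (1-5), R = two-digit random number (0-99).
def specimen_coordinates(r_x: int, r_y: int, sector: int, long_dim: float, short_dim: float):
    sector_length = long_dim / 5.0
    x = (r_x / 100.0) * sector_length + sector_length * (sector - 1)
    y = (r_y / 100.0) * short_dim
    return round(x, 1), round(y, 1)

# DA17 (D = 418 ft, d = 209 ft): Sector 1, composite sample 1 uses R = 29 and R = 22.
print(specimen_coordinates(29, 22, 1, 418, 209))  # (24.2, 46.0)
# Sector 2, composite sample 1 uses R = 15 and R = 78.
print(specimen_coordinates(15, 78, 2, 418, 209))  # (96.1, 163.0)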
-------
[Worksheet figure: plan view of Decision Area DA17 showing the five sampling sectors along the 418-foot long dimension, the randomly located specimen points for composite samples 1 through 6 in each sector, the DA reference point at the SW corner, and the short dimension d = 209 feet.]
-------
Appendix C
-------
APPENDIX C
Chen Test Procedures for Data Quality Assessment
-------
Directions for the Chen Test
(from: U.S. EPA, 1996. Soil Screening Guidance: Technical Background Document. Office of Solid Waste and Emergency Response. Washington, DC. EPA/540/R-95/128)

Let x1, x2, ..., xN represent concentration measurements for N random sampling points or N pseudo-random sampling points (i.e., from a design that can be analyzed as if it were a simple random sample). The following describes the steps for a one-sample test of H0: μ ≤ 0.5 SSL at the 100α% significance level that is designed to achieve a 100β% chance of incorrectly accepting H0 when μ ≥ 2 SSL.

STEP 1: Calculate the sample mean:
  x̄ = (1/N) * sum(xi, i = 1 to N)

STEP 2: Calculate the sample standard deviation:
  s = sqrt[ (1/(N-1)) * sum((xi - x̄)^2, i = 1 to N) ]

STEP 3: Calculate the sample skewness:
  b = [ N / ((N-1)*(N-2)*s^3) ] * sum((xi - x̄)^3, i = 1 to N)

STEP 4: Calculate the Chen test statistic, t2, as follows:
  a = b / (6*sqrt(N))
  t = (x̄ - 0.5 SSL) / (s/sqrt(N))
  t2 = t + a*(1 + 2*t^2) + 4*a^2*(t + 2*t^3)

STEP 5: If t2 ≥ zα, the null hypothesis is rejected and the EA needs further investigation. If t2 < zα, there is insufficient evidence to reject the null hypothesis; proceed to Step 6 to determine if the sample size is sufficient to achieve a 100β% or less chance of incorrectly accepting H0 when μ ≥ 2 SSL.

STEP 6: Let C represent the number of specimens composited to form each of the N samples, where each of x1, x2, ..., xN is a composite sample consisting of C specimens selected so that each composite is representative of the EA as a whole. (If each of x1, x2, ..., xN is an individual random or pseudo-random sampling point, then C = 1.) If Max(x1, x2, ..., xN) < SSL/2, then no further data quality assessment is needed and the EA needs no further investigation. Otherwise proceed to Step 7.

STEP 7: Calculate the sample estimate of the coefficient of variation, CV, for individual concentration measurements from across the EA:
  CV = (s * sqrt(C)) / x̄
NOTE: This calculation ignores measurement error, which results in conservatively large sample size requirements.

STEP 8: Use the value of the sample CV calculated in Step 7 as the true CV of concentrations in Tables 25 through 30 to determine the minimum sample size, N*, necessary to achieve a 100β% or less chance of incorrectly accepting H0 when μ ≥ 2 SSL. If N ≥ N*, the EA needs no further investigation; if N < N*, additional samples are needed.
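For readers who prefer to see the arithmetic of Steps 1 through 4 in one place, the sketch below computes the sample mean, standard deviation, skewness, and the quantities a and t from a set of composite results. It is an illustration only: the function name and data are invented, SSL is taken as 1 ppm (the value implied by the 0.5 ppm to 2.0 ppm gray region in Section D3), and the combination of a and t into t2 follows Step 4 as reconstructed above and should be checked against the cited guidance.

import math

# Sketch of Steps 1-4 of the Chen test directions, applied to invented composite
# sample results (ppm total PCBs) for one decision area.
def chen_statistic(x, ssl):
    n = len(x)
    mean = sum(x) / n                                                # Step 1: sample mean
    s = math.sqrt(sum((xi - mean) ** 2 for xi in x) / (n - 1))       # Step 2: std deviation
    b = n * sum((xi - mean) ** 3 for xi in x) / ((n - 1) * (n - 2) * s ** 3)  # Step 3: skewness
    a = b / (6 * math.sqrt(n))                                       # Step 4
    t = (mean - 0.5 * ssl) / (s / math.sqrt(n))
    return t + a * (1 + 2 * t ** 2) + 4 * a ** 2 * (t + 2 * t ** 3)

# Invented data: six composite results from one DA. At the project's 20 percent
# significance level, the comparison value z_0.20 is about 0.84 (Step 5).
results = [0.21, 0.35, 0.18, 0.44, 0.29, 0.52]
print(round(chen_statistic(results, ssl=1.0), 2))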
-------
Table 25. Minimum Sample Size for Chen Test at 10 Percent Level of Significance to Achieve a 5 Percent Chance of "Walking Away" When EA Mean is 2.0 SSL, Given Expected CV for Concentrations Across the EA

Number of specimens      Coefficient of variation (CV)a
per compositeb          1.0    1.5    2.0    2.5    3.0
        2                7      9     >9     >9     >9
        3                5      7      9     >9     >9
        4                4      6      8     >9     >9
        5                4      5      6      8     >9
        6                4      4      5      7      9

a The CV is the coefficient of variation for individual, uncomposited measurements across the entire EA and includes measurement error.
b Each composite consists of points from a stratified random or systematic grid sample across the entire EA.
NOTE: Sample sizes are based on 1,000 simulations that assume that each composite is representative of the entire EA, that half the EA has concentrations below the limit of detection, and that half the EA has concentrations following a gamma distribution (a conservative distributional assumption).
Table 26. Minimum Sample Size for Chen Test at 20 Percent Level of Significance to Achieve a 5 Percent Chance of "Walking Away" When EA Mean is 2.0 SSL, Given Expected CV for Concentrations Across the EA

Number of specimens      Coefficient of variation (CV)a
per compositeb          1.0    1.5    2.0    2.5    3.0    3.5
        1                9     >9     >9     >9     >9     >9
        2                5      7     >9     >9     >9     >9
        3                4      5      7      9     >9     >9
        4                4      4      6      7     >9     >9
        5                4      4      4      6      8     >9
        6                4      4      4      5      8      9

a The CV is the coefficient of variation for individual, uncomposited measurements across the entire EA and includes measurement error.
b Each composite consists of points from a stratified random or systematic grid sample across the entire EA.
NOTE: Sample sizes are based on 1,000 simulations that assume that each composite is representative of the entire EA, that half the EA has concentrations below the limit of detection, and that half the EA has concentrations following a gamma distribution (a conservative distributional assumption).
-------
Table 27. Minimum Sample Size for Chen Test at 40 Percent Level of Significance to Achieve a 5 Percent Chance of "Walking Away" When EA Mean is 2.0 SSL, Given Expected CV for Concentrations Across the EA

Number of specimens      Coefficient of variation (CV)a
per compositeb          1.0    1.5    2.0    2.5    3.0    3.5    4.0
        1                5      9     >9     >9     >9     >9     >9
        2                4      4      8      9     >9     >9     >9
        3                4      4      5      7     >9     >9     >9
        4                4      4      4      5      8     >9     >9
        5                4      4      4      5      6      9     >9
        6                4      4      4      4      5      8      9

a The CV is the coefficient of variation for individual, uncomposited measurements across the entire EA and includes measurement error.
b Each composite consists of points from a stratified random or systematic grid sample across the entire EA.
NOTE: Sample sizes are based on 1,000 simulations that assume that each composite is representative of the entire EA, that half the EA has concentrations below the limit of detection, and that half the EA has concentrations following a gamma distribution (a conservative distributional assumption).
Table 28. Minimum Sample Size for Chen Test at 10 Percent Level of Significance to Achieve a 10 Percent Chance of "Walking Away" When EA Mean is 2.0 SSL, Given the Expected CV for Concentrations Across the EA

Number of specimens      Coefficient of variation (CV)a
per compositeb          1.0    1.5    2.0    2.5    3.0    3.5
        2                6      7     >9     >9     >9     >9
        3                4      5      7     >9     >9     >9
        4                4      4      6      7     >9     >9
        5                4      4      5      6      8     >9
        6                4      4      4      5      7      9

a The CV is the coefficient of variation for individual, uncomposited measurements across the entire EA and includes measurement error.
b Each composite consists of points from a stratified random or systematic grid sample across the entire EA.
NOTE: Sample sizes are based on 1,000 simulations that assume that each composite is representative of the entire EA, that half the EA has concentrations below the limit of detection, and that half the EA has concentrations following a gamma distribution (a conservative distributional assumption).
-------
Table 29. Minimum Sample Size for Chen Test at 20 Percent Level of Significance to Achieve a 10 Percent Chance of "Walking Away" When EA Mean is 2.0 SSL, Given Expected CV for Concentrations Across the EA

Number of specimens      Coefficient of variation (CV)a
per compositeb          1.0    1.5    2.0    2.5    3.0    3.5    4.0
        1                7      9     >9     >9     >9     >9     >9
        2                4      5      8     >9     >9     >9     >9
        3                4      4      5      8     >9     >9     >9
        4                4      4      4      5      8     >9     >9
        5                4      4      4      5      6      8     >9
        6                4      4      4      4      5      7      9

a The CV is the coefficient of variation for individual, uncomposited measurements across the entire EA and includes measurement error.
b Each composite consists of points from a stratified random or systematic grid sample across the entire EA.
NOTE: Sample sizes are based on 1,000 simulations that assume that each composite is representative of the entire EA, that half the EA has concentrations below the limit of detection, and that half the EA has concentrations following a gamma distribution (a conservative distributional assumption).
Table 30. Minimum Sample Size for Chen Test at 40 Percent Level of Significance to Achieve a 10 Percent Chance of "Walking Away" When EA Mean is 2.0 SSL, Given Expected CV for Concentrations Across the EA

Number of specimens      Coefficient of variation (CV)a
per compositeb          1.0    1.5    2.0    2.5    3.0    3.5    4.0
        1                4      7      9     >9     >9     >9     >9
        2                4      4      5      8      9     >9     >9
        3                4      4      4      5      7      9     >9
        4                4      4      4      4      5      7     >9
        5                4      4      4      4      5      6      8
        6                4      4      4      4      4      5      6

a The CV is the coefficient of variation for individual, uncomposited measurements across the entire EA and includes measurement error.
b Each composite consists of points from a stratified random or systematic grid sample across the entire EA.
NOTE: Sample sizes are based on 1,000 simulations that assume that each composite is representative of the entire EA, that half the EA has concentrations below the limit of detection, and that half the EA has concentrations following a gamma distribution (a conservative distributional assumption).
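Because this project's DQOs set a 20 percent Type I limit and a 5 percent Type II limit, Table 26 (20 percent level of significance, 5 percent chance of "walking away") is the applicable lookup. The sketch below encodes Table 26 as a dictionary; the names are hypothetical, the example CV is invented, and ">9" entries are represented as None to signal that more than nine composite samples would be required.

# Table 26 (20% significance, 5% chance of "walking away" at 2.0 SSL),
# keyed by (specimens per composite, expected CV). None stands for ">9".
TABLE_26 = {
    1: {1.0: 9, 1.5: None, 2.0: None, 2.5: None, 3.0: None, 3.5: None},
    2: {1.0: 5, 1.5: 7,    2.0: None, 2.5: None, 3.0: None, 3.5: None},
    3: {1.0: 4, 1.5: 5,    2.0: 7,    2.5: 9,    3.0: None, 3.5: None},
    4: {1.0: 4, 1.5: 4,    2.0: 6,    2.5: 7,    3.0: None, 3.5: None},
    5: {1.0: 4, 1.5: 4,    2.0: 4,    2.5: 6,    3.0: 8,    3.5: None},
    6: {1.0: 4, 1.5: 4,    2.0: 4,    2.5: 5,    3.0: 8,    3.5: 9},
}

def minimum_sample_size(specimens_per_composite: int, expected_cv: float):
    """Hypothetical lookup: round the expected CV up to the next tabulated value."""
    row = TABLE_26[specimens_per_composite]
    for cv in sorted(row):
        if expected_cv <= cv:
            return row[cv]
    return None  # CV larger than tabulated; more than nine samples would be needed

# This project composites C = 5 specimens per sample; for example, with an expected
# CV of 2.5, at least 6 valid composite samples per decision area are needed.
print(minimum_sample_size(5, 2.5))  # 6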
-------
EPA QA/G-4: Guidance for the Data
Quality Objectives Process
(Peer Review Draft, 1/00)
EPA QA/G-5: Guidance on Quality
Assurance Project Plans
( Final - EPA/600/R-98/018, 2/98)
Contact: EPA QAD (202) 564-6830
http://www.epa.gov/quality1/
-------
EPA QA/G-9: Guidance for Data
Quality Assessment: Practical
Methods for Data Analysis
(Final - EPA/600/R-96/084, 1/98)
Contact: EPA QAD (202) 564-6830
http://www.epa.gov/quality1/
-------
EPA QA/G-4D: Decision Error Feasibility
Trials (DEFT) Software for the
Data Quality Objectives Process
(Windows Beta 1.0 Version, 2/00)
EPA QA/G-9D: Data Quality Assessment
Statistical Toolbox (DataQUEST)
(Windows Beta 1.0 Version, 4/99)
Contact: EPA QAD (202) 564-6830
http://www.epa.gov/quality1/
To run programs, click on Deftbeta.exe or
Questbeta.exe in each respective folder. User
guides are the corresponding *.pdf files.
-------