EPA 910/9-90-024
Puget Sound Estuary Program
A Project Manager's Guide
to Requesting and Evaluating
Chemical Analyses
August 1991
PTI
ENVIRONMENTAL SERVICES
15375 SE 30th Place
Suite 250
Bellevue, Washington 98007
A PROJECT MANAGER'S GUIDE TO REQUESTING
AND EVALUATING CHEMICAL ANALYSES
Prepared for
U.S. Environmental Protection Agency
Region 10, Office of Coastal Waters
1200 Sixth Avenue
Seattle, Washington 98101
EPA Contract No. 68-D8-0085
PTI Contract C74426
August 1991
CONTENTS
LIST OF FIGURES v
LIST OF TABLES vi
LIST OF ACRONYMS vii
ACKNOWLEDGMENTS viii
EXECUTIVE SUMMARY ix
INTRODUCTION 1
I. DEFINING ANALYTICAL OBJECTIVES 6
STEP 1: DEFINING CHEMICAL ANALYSES ACCORDING
TO DATA USE 6
Selecting the Appropriate Level of Analysis 7
Meeting Levels of Concern 9
STEP 2: ASSESSING HISTORICAL RESULTS 12
STEP 3: DECIDING WHICH CONTAMINANTS REQUIRE
ANALYSIS 15
Securing Analytical Cost Savings 16
Exploratory or Broad-Spectrum Analyses 17
Focused Analyses 18
Deciding Between Broad-Spectrum and Focused Analyses 19
II. PLANNING FOR QUALITY ASSURANCE 21
DRAFTING A QUALITY ASSURANCE PROJECT PLAN 21
Category I QA Project Plans 22
Category II QA Project Plans 25
Category III QA Project Plans 26
Category IV QA Project Plans 26
STAFFING FOR QUALITY ASSURANCE 27
III. ASSURING QUALITY DURING SAMPLE COLLECTION 28
DEVELOPMENT OF A SAMPLING PLAN 28
SAMPLE CUSTODY 29
IV. CHOOSING ANALYTICAL METHODS AND QUALITY
CONTROL SAMPLES 36
PREPARING AND ANALYZING SAMPLES 38
Digestion and Distillation of Inorganic Samples 38
Purging of Volatile Organic Samples 38
Extraction of Semivolatile Organic Samples 40
Cleanup of Organic Samples 40
SUMMARY OF CALIBRATION METHODS AND QUALITY
CONTROL SAMPLES 41
SPECIFYING QUALITY CONTROL LIMITS 45
V. WORKING WITH AN ANALYTICAL LABORATORY 48
SELECTING A QUALIFIED LABORATORY 48
DRAFTING A STATEMENT OF WORK 53
VI. EVALUATING DATA FROM THE LABORATORY 59
STEP 1: CHECKING DATA COMPLETENESS 60
STEP 2: SELECTING AN APPROPRIATE LEVEL OF
DATA VALIDATION 61
Level 1 Validation 63
Level 2 Validation 63
Level 3 Validation 63
Level 4 Validation 63
Review Options 64
STEP 3: ASSESSING DATA QUALITY 64
STEP 4: ASSIGNING DATA QUALIFIERS AND TAKING
FINAL ACTIONS 71
///'
-------
STEP 5: WRAPPING UP THE PROJECT 73
Summarizing the Results of the Quality Assurance Review 73
Storage and Disposal of Samples 74
REFERENCES 76
GLOSSARY OF TERMS 80
APPENDIX A - EPA Priority Pollutants and Additional Hazardous
Substance List Compounds
APPENDIX B - Description of Calibration, Quality Control Samples,
and Widely Used Analytical Methods
APPENDIX C - Example Statements of Work for the Laboratory
APPENDIX D - Example Quality Assurance Report
APPENDIX E - Example Forms
LIST OF FIGURES
Figure 1. Overall quality assurance perspective
Figure 2. Guidance for data assessment and evaluation of data
quality
LIST OF TABLES
Table 1. Summary of analytical levels 8
Table 2. Levels of data quality for historical data 13
Table 3. Categories of EPA projects requiring QA project plans 23
Table 4. Collection, preparation, and QA/QC requirements for
Puget Sound environmental samples analyzed for
metals 30
Table 5. Collection, preparation, and QA/QC requirements for
Puget Sound environmental samples analyzed for organic
compounds 31
Table 6. Example warning and action limits for calibration and
quality control samples 46
Table 7. Checklist of laboratory deliverables for the analysis of
organic compounds 54
Table 8. Checklist of laboratory deliverables for the analysis of
metals 56
Table 9. Levels of data validation 62
Table 10. Example data qualifier codes 70
LIST OF ACRONYMS
AAS      atomic absorption spectrometry
BFB      bromofluorobenzene
CRM      certified reference material
DFTPP    decafluorotriphenylphosphine
EPA      U.S. Environmental Protection Agency
GC/ECD   gas chromatography/electron capture detection
GC/MS    gas chromatography/mass spectrometry
GC       gas chromatography
GPC      gel permeation chromatography
HPLC     high-performance liquid chromatography
ICP      inductively coupled plasma-atomic emission spectrometry
LIMS     Laboratory Information Management System
PCB      polychlorinated biphenyl
QA       quality assurance
RF       response factors
RRM      regional reference material
SAP      sampling and analysis plan
SRM      standard reference material
VOA      volatile organic analysis
ACKNOWLEDGMENTS
This document was prepared by PTI Environmental Services under the direction
of Mr. Robert Barrick. The document was prepared for the U.S. Environmental
Protection Agency (EPA) in partial fulfillment of Contract No. 68-D8-0085. Dr.
Thomas Ginn of PTI Environmental Services was the Program Manager. Dr.
John Armstrong of EPA was the Project Monitor. Review comments provided
by numerous federal, state, and local agencies and other interested individuals
during preparation of this manual are gratefully acknowledged. Primary authors
were Mr. Barrick, Ms. Wendy Graham, and Mr. David Gordon.
EXECUTIVE SUMMARY
The purpose of this manual is to help less-experienced project managers from
governmental agencies, industry, and environmental groups in requesting
appropriate chemical analyses and in making an informed evaluation of the
results. Many project managers are not chemists, but most may need to plan for,
request, discuss, or evaluate chemical analyses. Even after the results have been
received and interpreted, many managers must still defend the project data or
critical decisions made by themselves or staff. This manual is designed to guide
the nonchemist. Earlier chapters provide background on terms and concepts that
are used in later chapters. Strategies are presented throughout the manual for
choosing options ranging from simple to more complex plans, requirements,
analyses, or evaluations. When applicable, the relative cost consequence of these
options, ranging from inexpensive to more expensive, is also described.
Errors in requesting analyses or using chemical results of unknown quality or
poor design may waste thousands or millions of dollars. In addition, these errors
could result in inappropriate decisions that are either environmentally harmful or
overly restrictive. Similarly, excessive analyses or exhaustive reviews of
reasonable results can also be wasteful. The guidance in this manual should help
to eliminate the "no questions asked" approach of accepting and using chemical
results without the appropriate level of review. In turn, management effective-
ness should increase because project managers can gain a better technical basis
for using funds available for chemical analysis. Moreover, through application
of the approach provided in this manual, project managers should be able to
increase the level of confidence they have in the results of chemical analyses and
in the subsequent use of these data to make environmental decisions.
The manual is not intended to take the place of technical experts, whose advice
may be needed at times to assist with problems specific to each analytical effort.
Such specialized advice is not easily given within a manual of this format and
scope. However, by using the detailed information and checklists provided in this
manual, and by seeking the advice of a chemist or experienced quality assurance
specialist where needed, project managers should be better able to make analytical
requests and to evaluate the general quality of results received from chemical
laboratories. For example, the preliminary evaluation of results described in
Chapter VI provides guidance on determining when results are likely to be clearly
acceptable, clearly unacceptable, or in need of a more detailed review by a
specialist. This preliminary evaluation is made using six major criteria for data
completeness and laboratory performance, including analytical accuracy and
sensitivity. Response measures are described for common deficiencies in analyses
to provide both a better sense of what can be done easily by the manager and
what questions should be asked of the laboratory or a specialist, if needed.
Examples of strategies for making analytical decisions are provided from existing
programs involving procurement of analyses and review of chemical results. One
valuable source of examples is the U.S. Environmental Protection Agency's
Superfund program, including guidance produced for the Contract Laboratory
Program. Although these particular examples reference hazardous waste
investigations, the underlying principles have broad application as guidance for
obtaining chemical services and evaluating chemical data in almost every field of
environmental studies, including monitoring or permit programs, reconnaissance
surveys, and research studies. The Puget Sound Estuary Program is another
primary source of examples that have been used in this manual. Again, the intent
of this manual is not to focus on any one program but to combine available
information from well-documented programs with professional experience to
provide generally applicable guidance.
The manual is also designed to enable project managers who are familiar with any
of the topics addressed in a particular section to skip ahead to areas where they
desire specific guidance. The key components of the manual are described in the
following sections.
I. DEFINING ANALYTICAL OBJECTIVES
Chapter I focuses on the following three steps that will help in formulating
analytical needs:
1. Defining and describing chemical analyses, including the required
accuracy and sensitivity of results, according to the expected use of
the data
2. Assessing the quality and usefulness of historical data
3. Determining which contaminants require analysis.
Specific examples of strategies for defining analytical objectives are addressed in
this chapter. In Step 1, a general summary of accuracy, sensitivity, and level of
documentation associated with analyses conducted in the field, laboratory
screening methods, and specialized analytical methods is presented. The intended
use of the data will affect the kind of chemical analysis that is requested by the
project manager. For example, a preliminary reconnaissance of a large area may
only require data from simple and quick checks performed in the field. In
contrast, a complete characterization of contamination in a sensitive area may
require specialized laboratory methods and considerable documentation of results.
Routine methods to screen samples for a standard list of contaminants typically
require an intermediate level of effort.
The project manager makes important tradeoffs in balancing data quality and
project costs. That is, when the level of data quality needed is low, project costs
can be minimized but the data may be of limited use in other projects that require
high quality results. On the other hand, by increasing the accuracy or sensitivity
of analysis, the resulting data may be useful for more and different projects. As
a general rule, it may be appropriate to request the laboratory analyses that will
result in the highest quality of data possible within the constraints of the available
budget. In addition to maximizing the possible uses of the data, this approach can
also increase a manager's confidence in the reliability of the data and, therefore,
in decisions made for the project at hand.
Guidance is also presented on how to ensure that the sensitivity of analyses will
be sufficient to evaluate levels of concern that may be specified in governmental
regulations addressing environmental hazards. When the focus of a project is
simply to characterize chemical contamination without making comparisons with
regulatory levels of concern, the project manager may want to consider the
following options for deciding the appropriate sensitivity of analysis:
1. Stringent—The analytical results should be adequately sensitive to
accurately compare conditions at the site with natural background
conditions or, in some cases, to evaluate even lower concentrations
that may be required for risk assessment.
2. Moderate—The analytical results should be adequately sensitive to
accurately characterize local conditions near the site or conditions
at a more distant reference site that is relatively uncontaminated.
3. Screening—The analytical results should be adequately sensitive to
identify high chemical concentrations usually found at or near
sources of contamination. However, these screening analyses may
not be sufficient to make conclusions concerning the absence of
contamination because environmentally significant concentrations of
contaminants may be present that are just below the detection limits
of the screening analysis.
Specific guidance is provided for defining limits of detection for chemical
analyses and for defining a higher, more confidently established concentration
called the quantification limit. The relationship between these two limits and
chemical levels of concern that may be established for a project is described,
and recommendations are made on how to report concentration data that fall
below, between, and above these limits.
In Step 2, a strategy is presented for classifying historical results according to
different levels of data quality. The quality of historical results can vary
considerably and will affect the usefulness of these data for comparisons or
integration with other studies. Factors that may influence the classification of
historical results for a particular project are discussed, including the following:
1. The analytical methods used and their associated detection limits
2. Quality assurance and quality control procedures and documentation
3. The amount of time that has passed since the historical data were
gathered.
In Step 3, guidance is provided on how to decide which contaminants require
analysis. The use of indicator chemicals and tradeoffs in specifying a broad-
spectrum analysis for many contaminants vs. focused analyses for selected
contaminants are discussed in terms of cost, quality of information, and overall
usefulness of the data. Several alternative strategies for ensuring that adequate
chemical analyses are conducted within project budget constraints are also
presented.
II. PLANNING FOR QUALITY ASSURANCE
Experienced managers who are already familiar with defining analytical objectives
may wish to move ahead to Chapter II, which focuses on the mechanics of
documenting project goals, responsibilities, and procedures in a written plan.
Four categories of quality assurance project plans are addressed in this chapter,
ranging from plans that are appropriate for large projects that may be subjected
to legal challenge to plans for small projects that are designed only to provide
intermediate data to test preliminary assumptions concerning the general
presence or absence of contamination.
III. ASSURING QUALITY DURING SAMPLE COLLECTION
A brief summary of sampling considerations is provided in Chapter III; this
summary is most appropriate for managers seeking references for sampling
guidance and recommendations for how sampling procedures are documented in
a written plan. The sampling plan should include a description of the site,
sampling objectives, and sampling design; a description of how each indepen-
dently collected sample will be identified; and sampling and handling procedures.
In addition, guidance on sample custody is provided in Chapter III. Although not
appropriate for all projects, procedures for maintaining stringent custody of
project samples are described.
IV. CHOOSING ANALYTICAL METHODS AND QUALITY CONTROL CHECKS
In this chapter, major quality control procedures that are typically performed by
the laboratory are described and guidance on their relative priority for use is
summarized. Important information that a laboratory will need to help in making
decisions on analytical methods is also summarized, including:
1. A list of chemicals for analysis
2. Required detection limits
3. Expected use of the data and data quality objectives
4. Requirements for comparability with other projects.
Following a description of basic procedures for preparing and analyzing samples,
a description of recommended calibration methods and quality control checks is
provided. The relative importance, rationale, and relative frequency of calibra-
tion and each of the quality control checks are discussed in Appendix B. A
summary of priorities, rationales, and suggested frequency of use for each
procedure is provided at the end of Chapter IV. These procedures fall into the
following categories:
1. Calibration of analytical instruments is a top priority and is always
required for any project requiring quantitative data (even if only
estimated quantities are necessary). Calibration is essential because
it is the means by which instrument responses are properly trans-
lated into chemical concentrations.
2. High priority checks on quality control include essential checks on
laboratory contamination and the bias and precision of chemical
measurements.
3. Medium priority checks on quality control include supplemental
checks on laboratory bias and precision that are strongly recom-
mended but are sometimes eliminated if the analytical budget is
tight.
4. Low priority checks on quality control include supplemental checks
on sampling variability (in addition to laboratory variability) that,
although critical in some studies, are generally discretionary.
Finally, two kinds of quality control limits are described in this chapter. Warning
limits are values that serve to warn the project staff that the analytical system,
instrument, or method may not be performing normally and that data should be
qualified as "estimated" before using the results for technical analysis. Action
limits are limits placed on the acceptability of data from the analysis of quality
control samples. Exceedance of action limits informs the analyst and the project
manager that the analytical system or instrument is performing abnormally and
needs to be corrected. Action limits should be contractually binding on laborato-
ries, and statements of work should provide the project manager with sole
discretion in enforcing the limits. Data that exceed action limits are often
rejected and excluded from a project database, although, as discussed later in
Chapter VI, there may be special circumstances that warrant acceptance of the
data as estimated values.
V. WORKING WITH AN ANALYTICAL LABORATORY
In Chapter V, procedures for selecting and contracting with laboratories are
described. The contents of a laboratory statement of work, which defines the
terms of analyses, are also described. Information that should be included in
these documents includes the following items:
1. A summary of analyses to be performed by the laboratory
2. Acceptable procedures for sample delivery and storage
3. A list of products to be delivered by the laboratory, specifying the
maximum time that may elapse between the submittal of samples to
the laboratory and the delivery of data reports to the agency,
organization, or industry requesting the analyses
4. Methods to be followed for processing and analyzing samples,
including any modifications of standard procedures
5. Quality assurance and quality control requirements, including the
data quality objectives specified in the quality assurance project
plan and warning and action limits
6. Requirements that each laboratory submit a quality assurance
manual for review and approval by the agency, organization, or
industry requesting or funding the analyses
7. Progress report requirements (usually necessary only for large
projects)
8. Circumstances under which the laboratory must notify project
personnel of problems with performance
9. Notice that scheduled or unannounced laboratory visits by the
project manager or designated staff may be conducted
10. Storage time for records and samples
11. Terms for payments to the laboratory, including a requirement that
the quality of data must be acceptable (pending the outcome of the
quality assurance review) before payment is made.
VI. EVALUATING DATA FROM THE LABORATORY
Project managers who are experienced in the planning, sampling, and analysis
phases of projects may wish to focus on Chapter VI, which presents guidance on
how to evaluate data returned by laboratories. Examples and specific recom-
mendations for resolving concerns are provided for evaluating data packages
according to the following steps:
1. Checking data completeness
2. Selecting an appropriate level of data validation
3. Evaluating data quality
4. Assigning data qualifiers and taking final actions
5. Wrapping up the project.
These five steps enable a project manager to determine the amount of review that
the laboratory results may require before they are accepted and the level of
expertise that may be needed to accomplish this review. Guidance is also
provided for a stepwise check that the project manager can complete as part of
an overall assessment of data quality. For some laboratory results, this level of
review may be all that is needed to make a final decision as to whether the results
are acceptable for use in a project.
APPENDICES
Additional material useful for establishing data quality is contained in the
appendices of the manual. This material includes information on the chemicals
on the U.S. Environmental Protection Agency's priority pollutant and hazardous
substance lists (Appendix A); simplified descriptions of calibration methods,
quality control checks, and widely used analytical methods (Appendix B); an
example statement of work for contracting with or directing an analytical
laboratory (Appendix C); a summary report of a detailed quality assurance review
of data (Appendix D); and examples of miscellaneous forms used for sampling
and analysis (Appendix E).
INTRODUCTION
With each passing year, environmental planning and assessment activities require
government agencies, industry, and environmental groups to spend larger and
larger amounts of money to analyze the chemical content of water, air, soil,
sediment, and tissue samples. The projects for which these samples are analyzed
range from small to very large, and the experience of project managers and staff
in requesting chemical analyses and in evaluating the resulting data ranges from
essentially none to substantial. For project managers and staff with minimal to
average experience in this area, conducting these tasks may be formidable. In
fact, a project manager may simply accept laboratory results as definitive with
"no questions asked" or must rely on the expertise of in-house or contracted
quality assurance specialists, if available.
The purpose of this manual is to help less-experienced project managers from
governmental agencies, industry, and environmental groups in requesting
appropriate laboratory services and in making an informed evaluation of the
results of chemical analyses. Most managers of environmental projects are not
chemists, but many will need to plan for, request, discuss, or evaluate chemical
analyses. Even after the results have been returned and interpreted, many
managers must still defend the project data or critical decisions made by
themselves or staff that led to the final results. This manual is designed to guide
the nonchemist. Earlier chapters provide background on terms and concepts that
are used in later chapters. Strategies are presented throughout the manual for
choosing options ranging from simple to more complex plans, requirements,
analyses, or evaluations. When applicable, the relative cost consequence of these
options, ranging from inexpensive to more expensive, is also described.
Errors in requesting analyses or using chemical results of unknown quality or
poor design may waste thousands or millions of dollars and may result in
inappropriate environmental decisions. Similarly, excessive analyses or exhaus-
tive reviews of reasonable results can also be wasteful. The guidance provided
here should help to eliminate the "no questions asked" approach of accepting and
using chemical data without review. As a result, project managers should be
more effective in using funds available for chemical analysis. By following the
approach outlined in this manual, project managers nationwide should be able to
increase the level of confidence they have in the results of chemical analyses and
in the subsequent use of these data to make environmental decisions.
This report guides the project manager from the point of determining which
chemicals to measure through defining analysis objectives and desired detection
limits, developing sampling and quality assurance plans, contracting and working
with an analytical laboratory, and finally to determining the completeness and
quality of the data received. Project managers familiar with any of these steps
may skip certain sections of this manual, going directly to the areas where they
desire specific guidance.
The manual is organized as follows:
■ Basic information on project planning is addressed in Chapters I
and II. Examples of guidance provided in these chapters include
identifying tradeoffs between using field screening tests or labora-
tory analyses, evaluating the usefulness of a 15-year-old data set,
deciding whether to add two more metals to the analytical request,
and determining what information should be in a quality assurance
plan for a simple reconnaissance project.
■ Basic information on data collection activities in the field and
laboratory is addressed in Chapters III and IV. Examples include
finding guidance on sampling organisms, choosing between mini-
mum and enforcement-level requirements for sample custody, and
selecting action limits for laboratory performance.
■ Procedures for contracting with an analytical laboratory, including
laboratory selection criteria and checklists of laboratory deliver-
ables to include in a statement of work, are addressed in Chap-
ter V.
■ Step-by-step procedures for conducting a preliminary review of
data and determining implications for data use are addressed in
Chapter VI.
This manual covers all of the major elements of project planning and implemen-
tation of quality assurance activities shown in Figure 1. The focus of Chapter I
is on defining the appropriate analytical effort (e.g., field screening techniques
vs. sophisticated laboratory analysis), the usefulness of existing data, which
contaminants should be analyzed, and how sensitive analyses should be to meet
project goals.
Experienced managers who are already familiar with this aspect of project
planning may wish to move ahead to Chapter II, which focuses on the mechanics
of documenting project goals, responsibilities, and procedures in a written plan.
The brief summary of sampling considerations in Chapter III is most appropriate
for managers seeking references for sampling guidance and recommendations for
how sampling procedures are documented in a written plan.
The purpose of Chapter IV is to acquaint project managers with basic terms,
procedures, and quality controls used in laboratory analyses. The experienced
manager may wish to focus on the section Summary of Calibration Methods and
Quality Control Samples, which provides recommendations on the relative priority
[Figure 1. Overall quality assurance perspective. A flow chart of the project
stages: project planning (define objectives and chemicals of concern; develop
quality assurance project plan; choose laboratory; review, approve, and
finalize contract/statement of work; see Chapters I, II, and V), data
collection in the field and laboratory (service and calibrate field and
laboratory equipment; collect samples and field quality control samples;
prepare laboratory quality control samples; analyze; audit field operations
and laboratory facilities; see Chapters III and IV), data review (review data
quality; expert review, if necessary; implement corrective action; see
Chapter VI), and data use (determine if data are acceptable for the primary
use and for all uses).]
of different kinds of quality controls that can be requested from laboratories, and
the final section, on specifying performance limits for chemical analyses.
Experienced managers may also focus their review on recommendations for
laboratory statements of work in Chapter V. These documents are critical in
contractually specifying procedures and expectations for laboratory performance
before laboratory analyses begin. Recommendations for selecting laboratories are
also provided in Chapter V.
Project managers who are experienced in the planning, sampling, and analysis
phases of projects may concentrate on Chapter VI, which presents guidance on
how to evaluate data returned by laboratories. The steps addressed in this chapter
enable a project manager to conduct a preliminary assessment of the data quality.
Based on this assessment, the project manager can determine the amount of
detailed review that the laboratory results may require before they are accepted
and the level of expertise that may be needed to accomplish the review. For
some laboratory results, this preliminary check that the project manager can
complete may be all that is needed to make a final decision as to whether the
results are acceptable or unacceptable for use in a project.
Additional material useful for establishing data quality is contained in the
appendices of the manual. This material includes information on the chemicals
on EPA's priority pollutant and hazardous substance lists (Appendix A), simpli-
fied descriptions of calibration methods, quality control checks, and widely used
analytical methods (Appendix B), an example statement of work for contracting
with or directing an analytical laboratory (Appendix C), a summary report of a
detailed quality assurance review of data (Appendix D), and examples of
miscellaneous forms used for sampling and analysis (Appendix E).
The chemical analysis of environmental samples can be time-consuming and
costly. Regardless of the amounts of time and money spent on analyses, any data
produced will be useful to decision-makers only if the quality of data (specifically
the accuracy and sensitivity of the analyses producing the data) can be estab-
lished. Throughout this manual, emphasis is placed on strategies for selecting
among different levels of effort or quality associated with laboratory analyses and
quality assurance activities. In general, these different levels represent trade-offs
in time, cost, amount of documentation, and/or confidence in results. Examples
of strategies addressed in this manual include the following:
■ The kinds of analyses needed to meet project goals may range from
relatively simple field screening techniques to complex and non-
standard laboratory procedures. Strategies for choosing an appro-
priate analysis are discussed in the section Defining Chemical
Analyses According to Data Use, Chapter I (see p. 6). A general
summary of accuracy, sensitivity, and amount of documentation
that can be expected with these different kinds of analyses is also
presented (see Table 1 on p. 8).
■ The quality of historical results can vary considerably and will
affect the usefulness of these data for new projects. A strategy for
classifying historical results according to different levels of data
quality is summarized in Assessing Historical Results, also a
section of Chapter I (see Table 2 on p. 13).
■ The need to document requirements for quality assurance in a
planning document will depend on the expected use of the data.
Recommendations for choosing an appropriate level of detail for
different categories of projects are summarized in Chapter II (see
Table 3 on p. 23).
■ The intended use of new data also affects how much evaluation of
data quality may be appropriate. Recommendations for selecting
among different levels of effort for this assessment of data quality
are presented in Chapter VI (see Table 9 on p. 62).
■ A strategy for assigning levels of quality to analytical data is also
presented in Chapter VI (see pp. 64-71 and Figure 2 on p. 65).
These levels should help the project manager decide what actions
are needed after conducting a brief assessment of the data package
provided by the laboratory.
Providing instructions for conducting a thorough technical review of project data
is beyond the scope of this manual. Examples of detailed technical guidance of
this nature can be found in a pair of publications, Laboratory Data Validation:
Functional Guidelines for Evaluating Inorganics Analyses (U.S. EPA 1988c) and
Laboratory Data Validation: Functional Guidelines for Evaluating Organics
Analyses (U.S. EPA 1988d). These guidelines are usually applied by quality
assurance specialists rather than by managers.
The manual is not intended to take the place of technical experts, whose advice
may be needed at times to assist with problems specific to each analytical effort.
Such specialized advice is not easily given within a manual of this format and
scope. However, by using the information presented in this manual, and by
seeking the advice of a chemist or experienced quality assurance specialist where
needed, project managers should be able to make analytical requests and to
evaluate the general quality of data received from chemical laboratories.
I. DEFINING ANALYTICAL OBJECTIVES
Decisions regarding the selection and requesting of chemical analyses should
focus on producing results that are reliable and that meet the needs of their users.
This goal can be achieved by establishing data quality objectives1, or perfor-
mance criteria, and defining project data needs early in project planning. The
request for laboratory services and the subsequent evaluation of data from
analyses are then based on these project-specific objectives. In this chapter, three
steps are described that will help in formulating analytical needs:
Step 1. Defining and describing chemical analyses, including the
required accuracy and sensitivity of results, according to the
expected use of the data
Step 2. Assessing the quality and usefulness of historical data
Step 3. Determining contaminants requiring analysis.
In addition to these steps, project managers will need to define how much effort
should be expended in assessing the quality of results reported by the laboratory.
The process of data assessment and making decisions on final data quality for
laboratory results is discussed in Chapter VI, Evaluating Data from the Labora-
tory.

1 Incorporating data quality objectives into project plans is discussed in Chapter II, Planning for
Quality Assurance.
STEP 1: DEFINING CHEMICAL ANALYSES ACCORDING TO DATA USE
The intended use of data will affect the kinds of chemical analyses that are
requested by the project manager. For example, a preliminary reconnaissance of
a large area may only require data from simple and quick checks performed in
the field. In contrast, a complete characterization of contamination in a sensitive
area may require specialized laboratory methods and considerable documentation
of results. Routine methods to screen samples for a standard list of contaminants
typically require an intermediate level of effort.
Different analytical methods are capable of detecting different concentrations of
a chemical in a sample (referred to as a detection limit). For example, a highly
sensitive technique can detect a much lower chemical concentration than can a
screening technique for the same chemical. The accuracy of measurements also
differs among analytical techniques. Accuracy is defined in terms of bias (how
close the measurement is to the true value) and precision (how variable the
measurements are when repeated). In general, as the sensitivity and accuracy of
a technique increases, so does the cost. By carefully formulating data quality
needs early in project planning, a manager can ensure that the quality of data
required for a particular project will be produced in a cost-effective manner.
Selecting the Appropriate Level of Analysis
The categories of field checks, routine screening methods, and specialized
methods of analysis span a range from general field reconnaissance to research-
type analyses. Table 1 describes several analytical levels for assessing contami-
nated sites, the types of chemical analyses associated with each level, and their
associated accuracy, sensitivity, and level of documentation. These levels are
used in some U.S. Environmental Protection Agency (EPA) programs (U.S. EPA
1987b).
Data of acceptable quality can be obtained at each level shown in Table 1.
However, different data uses, such as reconnaissance surveys, routine environ-
mental monitoring, remedial response activities, and environmental research, have
varying needs for analytical effort and documentation of results that correspond
to one or more levels shown in Table 1. In choosing an appropriate level of
analysis for each project, the project manager should consider the general
analytical levels presented in Table 1 and the following major tradeoffs:
■ Analyses conducted at Levels I and II are usually limited in terms
of the contaminants that may be detected and the accuracy of
results. However, these analyses are typically performed quickly
and at low cost.
■ The documentation provided for analyses at Levels I, II, and III is
less complete than that provided by Level IV and Level V. Less
documentation is available at Levels I and II because few checks
on performance may be made during analysis. Although additional
performance checks may be made at Level III, only final sample
results may be provided by the laboratory. This limited documen-
tation of performance will limit the assessment of data quality that
is possible (see Chapter VI, Evaluating Data from the Laboratory)
and, consequently, the final use of the data.
For example, it may be critical to have enough documentation to
conduct an independent review of laboratory performance and to
confirm that unintended errors in reporting results have been
corrected before using the data in costly modeling or design
projects, or in controversial risk assessments or legal actions.
■ The time and level of expertise required for analysis and detailed
documentation of results generally increases from Level I through
Level V. These factors may affect the project schedule or increase
the required budget.

TABLE 1. SUMMARY OF ANALYTICAL LEVELS

Field Check (EPA Level I)

    Type of analysis(a): Total organic/inorganic vapor detector using
    portable instruments; field test kits

    Accuracy: Low; provides general indication of contamination

    Sensitivity: Low to moderate; at least sufficient to screen for the
    presence of moderately high concentrations

    Level of documentation: Low; often digital readout of final result only
    or visual indication of concentration range (e.g., by change in color)

Routine Screening (EPA Level II)

    Type of analysis(a): Primarily analyses of volatile organic compounds by
    gas chromatography; metals by atomic absorption (some analyses may be
    conducted in a mobile laboratory)

    Accuracy: Moderate; provides data typically as concentration ranges

    Sensitivity: Moderate to high; sufficient to document presence or
    absence of selected contaminants

    Level of documentation: Low; often only the final quantitative result
    without supporting quality assurance data

Routine Screening (EPA Level III)

    Type of analysis(a): Organic/inorganic chemicals using a variety of
    standard EPA procedures

    Accuracy: High; provides data of known bias and precision for an overall
    accuracy level that is useful for most applications

    Sensitivity: Moderate to high; sufficient to document presence or
    absence of a wide range of contaminants

    Level of documentation: Low to moderate; summary of quality assurance
    results is provided but is usually not adequate for an independent
    verification of results

Program-Specific (EPA Level IV)

    Type of analysis(a): Standard analyses of organic/inorganic chemicals by
    gas chromatography/mass spectrometry, atomic absorption, inductively
    coupled plasma

    Accuracy: High; similar accuracy as Level III with a focus on
    confirmation of results

    Sensitivity: Moderate to high; similar sensitivity as Level III but most
    standardized protocols focus on characterization of waste materials

    Level of documentation: Rigorous; standardized data package of sample
    and quality assurance results is sufficient for independent verification
    of results

Program-Specific (EPA Level V)

    Type of analysis(a): Nonstandard analyses, including modified EPA
    procedures and research-grade techniques. (For substances not typically
    analyzed for, Level V may be the only alternative, but accuracy,
    sensitivity, and documentation are not necessarily different from those
    of standard analyses.)

    Accuracy: High; accuracy equals and sometimes exceeds that of Levels III
    and IV because interferences may be more completely eliminated

    Sensitivity: High; sufficient to characterize a wide range of waste and
    uncontaminated background materials

    Level of documentation: Rigorous; can request documentation comparable
    to that for Level IV (or less if not needed)

(a) See Chapter IV (Choosing Analytical Methods and Quality Control Checks)
and Appendix B for a description of these analyses.

Reference: Adapted and modified from EPA's Data Quality Objectives for
Remedial Response Activities (U.S. EPA 1987b).
One strategy to maximize data usability and minimize project costs is to request
a more sensitive laboratory analysis but to also request that the laboratory provide
only that documentation recommended2 for making a preliminary assessment of
data quality. More extensive evaluations can then be performed if a preliminary
assessment of results suggests a problem with their quality. In addition, the
project manager may decide to limit the amount of data and supporting documen-
tation that will be completely reviewed3 for quality. The elements of a complete
quality control program are described in Chapter IV, Choosing Analytical
Methods and Quality Control Checks.

2 Recommended documentation is described in Chapter VI, Evaluating Data from the Laboratory.

3 Different levels of effort associated with detailed reviews of data quality are discussed in Chapter
VI, Evaluating Data from the Laboratory.
An alternative strategy involves requesting simple or routine tests for all samples
but also specifying that a percentage of the samples be analyzed a second time
using special methods as a check. For example, a field screening technique could
be used to provide order-of-magnitude estimates of oil contamination in all
samples. Ten percent of the samples could also be analyzed using a more
accurate and sensitive laboratory procedure for comparison. A detailed evaluation
of both sets of results could then be used to estimate the reliability of the field
screening technique or to determine if additional analyses are required.
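
To illustrate this comparison, the sketch below (Python) computes the relative
percent difference between paired field-screening and laboratory results and
flags pairs that disagree; the sample values and the 50 percent agreement
limit are hypothetical, not prescribed by this manual:

    def relative_percent_difference(a, b):
        """RPD between two measurements, as a percentage of their mean."""
        return abs(a - b) / ((a + b) / 2.0) * 100.0

    # (sample ID, field-screening result, laboratory result) in mg/kg;
    # all values are hypothetical
    paired_results = [
        ("S-01", 120.0, 150.0),
        ("S-02", 15.0, 60.0),
        ("S-03", 800.0, 740.0),
    ]

    AGREEMENT_LIMIT = 50.0  # assumed project-specific RPD limit, percent

    for sample_id, field_value, lab_value in paired_results:
        rpd = relative_percent_difference(field_value, lab_value)
        status = "acceptable" if rpd <= AGREEMENT_LIMIT else "review further"
        print(f"{sample_id}: RPD = {rpd:.0f} percent ({status})")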
The project manager makes important tradeoffs in balancing data quality and
project costs. That is, when the level of data quality needed is low, project costs
can be minimized but the data may be of limited use in other projects that require
high quality results. On the other hand, by increasing the accuracy or sensitivity
of analysis, the resulting data may be useful for a range of projects with varying
requirements. As a general rule, it may be appropriate to request the laboratory
analyses that will result in the highest quality of data possible within the con-
straints of the available budget. In addition to maximizing the possible uses of
the data, this approach can also increase a manager's confidence in the reliability
of the data and, therefore, in decisions made for the project at hand.
Meeting Levels of Concern
The project manager should always request analyses that are sensitive enough to
support subsequent decisions concerning chemical measurements. For example,
if the purpose of the study is simply to determine if a particular contaminant is
present at its probable source, the least sensitive analysis capable of making this
determination would be appropriate. On the other hand, if the purpose is to
determine if the contaminant is present at a low concentration known to be
harmful to human health, a more sensitive analysis would be necessary.
Likewise, careful determinations of the extent of contamination at levels slightly
above background concentrations typically require sensitive analyses.
Contaminant concentrations that may trigger an environmental quality decision
have been formalized as levels of concern in state and federal guidelines, criteria,
and standards. For example, EPA has established water quality criteria that set
forth maximum allowable contaminant concentration levels in water that are
considered to be acceptable for various uses (e.g., drinking water, agricultural
use). When applicable to a particular project, the project manager must,
therefore, request analyses sufficiently sensitive to detect contaminants at levels
of concern established by these authorities. When specific guidance on levels of
concern is not available for the purposes of a particular study, the required
sensitivity of analysis must be determined on a case-by-case basis.
The reliability of a chemical measurement generally increases as the contaminant
concentration increases above its detection limit. Near the detection limit, the
presence of the contaminant may be obscured by a complex mixture of chemicals
or not distinguished from random electronic signals in the analytical instrument.
For this reason, methods are often requested that can provide detection limits that
are 5-10 times lower than regulatory levels of concern. In this way, increased
confidence can be placed in chemical concentrations that are measured near the
level of concern. Precision of approximately +30-50 relative percent difference
between measurements (the random error of measurement) and bias of no more
than ±50 percent of the true value (the systematic error of measurement) are
adequate in many programs for making comparisons with regulatory limits.
Special analyses may be required to meet some levels of concern. For example,
the EPA water quality criterion for polychlorinated biphenyls (PCBs) is 0.03 µg/L
for the protection of marine organisms from chronic effects. However, routine
methods for PCBs in water samples may only achieve a sensitivity of 0.1 µg/L.
To meet the water quality criterion, a staff chemist or the laboratory should be
consulted to determine appropriate methods.
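
To make these checks concrete, the sketch below (Python) applies the 5-10
times detection limit margin and the ±50 percent bias allowance described
above, using the PCB numbers from the preceding paragraph; the function names
and the more sensitive detection limit are illustrative assumptions:

    def meets_level_of_concern(detection_limit, level_of_concern, margin=5.0):
        """True if the detection limit sits at least `margin` times below
        the level of concern (the 5-10x rule of thumb described above)."""
        return detection_limit <= level_of_concern / margin

    def percent_bias(measured, true_value):
        """Systematic error of a measurement, as a percentage of the true value."""
        return (measured - true_value) / true_value * 100.0

    # PCB example from the text: criterion 0.03 ug/L, routine method 0.1 ug/L
    print(meets_level_of_concern(0.1, 0.03))    # False; method is too insensitive
    print(meets_level_of_concern(0.005, 0.03))  # True; limit is 6x below criterion

    # Bias check against the +/-50 percent allowance mentioned above
    print(abs(percent_bias(measured=60.0, true_value=50.0)) <= 50.0)  # True (+20%)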
When the focus of a project is simply to characterize chemical contamination
without making comparisons with regulatory levels of concern, the project
manager may want to consider the following options for deciding the appropriate
sensitivity of analysis:
1. Stringent—The analytical results should be adequately sensitive to
accurately compare conditions at the site with natural background
conditions or, in some cases, to evaluate even lower concentrations
that may be required for risk assessment.
2. Moderate—The analytical results should be adequately sensitive to
accurately characterize local conditions near the site or conditions
at a more distant reference site that is relatively uncontaminated.
3. Screening—The analytical results should be adequately sensitive to
identify high chemical concentrations usually found at or near
sources of contamination. However, these screening analyses may
not be sufficient to make conclusions concerning the absence of
contamination. Environmentally significant concentrations of
contaminants may be present just below the detection limit of the
screening analysis.
Two different limits on analytical sensitivity are usually defined to establish the
degree of certainty associated with chemical concentrations: limits of detection4
and quantification limits5. Limits of detection indicate the level at which a small
amount of a contaminant can be "seen" by an analysis. Quantification limits
indicate the level at which a contaminant can be acceptably measured. In
regulatory compliance projects, methods are often selected so that the quantifica-
tion limit is no higher than the regulatory limit. However, useful information can
still be provided when concentrations fall between the limit of detection and the
quantification limit. The acceptability of these data for a project depends on the
degree of uncertainty that can be reasonably factored into project conclusions.
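
For illustration, the sketch below (Python) shows one common statistical
treatment for setting these limits, patterned on the replicate-based procedure
used for EPA method detection limits; the replicate values are hypothetical,
and a given project may define its limits differently (see footnotes 4 and 5):

    import statistics

    def estimate_limit_of_detection(replicates, t_value=3.143):
        """Limit of detection as t * s from replicate low-level analyses.
        t = 3.143 is the one-tailed 99 percent Student's t for seven
        replicates (six degrees of freedom)."""
        return t_value * statistics.stdev(replicates)

    # Seven hypothetical replicate analyses of a low-level standard (ug/L)
    replicates = [0.21, 0.18, 0.24, 0.19, 0.22, 0.17, 0.23]

    lod = estimate_limit_of_detection(replicates)
    ql = 5.0 * lod  # a factor of 5 (sometimes 10) is often applied; see footnote 5
    print(f"limit of detection ~ {lod:.3f} ug/L, quantification limit ~ {ql:.3f} ug/L")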
The Puget Sound Estuary Program provides additional guidance for the use of
limits of detection and quantification limits. Reporting requirements for both
kinds of limits are summarized in the data reporting requirements sections of
analysis guidelines published by this program (PSEP 1989a,b). Based on these
guidelines and the discussion in this section, the following specifications for
analytical sensitivity are recommended when requesting analyses that will be used
to make comparisons with levels of concern:
■ The quantification limit should not exceed the levels of concern
defined for the project and, to the extent feasible, the limit of
detection should be 5 to 10 times lower than these levels of con-
cern
■ The limit of detection reported by laboratories for all samples
without significant chemical interferences should be no higher than
the quantification limit

■ No results should be reported below the limit of detection deter-
mined for a project because of the high potential for random error

■ Concentrations reported between the limit of detection and the
quantification limit should be used as estimates6.

■ Concentrations above the quantification limit may be used as firm
values unless a specific quality assurance concern has been identi-
fied in the assessment of data quality (see Chapter VI).

4 The limit of detection is the lowest amount of a contaminant that can be reliably detected, based
on the variability of either the blank response of a method or that of a low-level standard. Limits of
detection are contaminant-specific and instrument-specific. They can be determined by statistical
treatment of multiple analyses, in which the lowest observable amount of a contaminant is determined
relative to the level of random response.

5 The quantification limit is considered the lowest level at which a contaminant may be accurately
measured and reported without qualification as an estimated quantity. Because of the irregular nature of
instrument or method noise, reproducible quantification of a contaminant is not always possible at the
limit of detection. Often, a factor of 5 (sometimes 10) is applied to this value to estimate the quantifi-
cation limit.

6 These estimated low concentrations should be acceptable for many purposes (including use in risk
assessment) but have a higher potential for being affected by random error in the analytical procedure
than concentrations that measure above the quantification limit. These estimated concentrations are
typically less than the lowest concentration of standards used by the laboratory to calibrate the analytical
instrument (see Chapter IV, Choosing Analytical Methods and Quality Control Checks).
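
The reporting rules above lend themselves to a simple mechanical check. Below
is a minimal sketch (Python); the U (not detected) and J (estimated) qualifier
codes follow common EPA usage (see the example codes in Table 10), and the
limits shown are hypothetical:

    def report_result(measured, lod, ql):
        """Apply the reporting rules above to one measured concentration.
        Returns the value to report and a qualifier code."""
        if measured < lod:
            return lod, "U"       # not detected; report the limit of detection
        if measured < ql:
            return measured, "J"  # detected, but only an estimated quantity
        return measured, ""       # firm value, usable without qualification

    # Hypothetical limits: lod = 0.1 ug/L, ql = 0.5 ug/L
    for value in (0.05, 0.3, 1.2):
        print(value, "->", report_result(value, lod=0.1, ql=0.5))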
STEP 2: ASSESSING HISTORICAL RESULTS
Reviewing the results of chemical analyses performed in past studies can help
focus plans for new analyses. In particular, analytical costs can be reduced if
historical results can substitute for new analyses. Collecting these data is only the
initial step, however. Assessing their usefulness to the current project should
always be performed before substantial effort is spent on incorporating historical
results into a project database.
The quality of historical results covers a continuum from excellent to poor. By
classifying historical results into different levels of quality along this continuum,
the project manager can then decide to use only those data that meet project
objectives. Examples of four levels of data quality that can be assigned to
historical results are summarized in Table 2. Labelling each set of results with
a data quality level is also a simple way to summarize the evaluation of the
relative quality of the data set for future use. This classification provides an easy
summary of data quality when making conclusions and writing up the results of
the project. The example classification in Table 2 considers the following factors
when determining the suitability of historical results for a particular project:
■ The analytical methods used and their associated limits of detec-
tion—Available technologies have changed over time. For exam-
ple, as late as the 1970s, concentrations of many organic com-
pounds in sediment samples were difficult to measure routinely,
accurately, or sensitively. However, as better preparation methods
and more sensitive analytical techniques have been made available,
the ability to distinguish these compounds from other substances
and the overall sensitivity of analysis have been substantially
improved.
■ Quality assurance and quality control procedures and documenta-
tion—The usefulness of data will depend on whether appropriate
quality assurance procedures have been used during analysis and if
the data have been properly validated7 and documented. Because
more rigorous methods to analyze samples and document data
quality have been required by environmental scientists over the past
decade, only data that have been produced and well-documented by
laboratories using acceptable data quality controls should be consid-
ered to have no limitations. Historical data produced by even the
best laboratories often may lack complete documentation, or the
documentation may be difficult to obtain.

7 Validation of data refers to a detailed review of quality control information to ensure that the
results are of known quality. Data validation is discussed in Chapter VI, Evaluating Data from the
Laboratory.

TABLE 2. LEVELS OF DATA QUALITY FOR HISTORICAL DATA

Level 1  Data are acceptable for all project uses.
         The data are supported by appropriate documentation that confirms
         their comparability to data that will be generated in the current
         project.

Level 2  Data are acceptable for most project uses.
         Appropriate documentation may not be available to confirm conclu-
         sions on data quality or to support legal defensibility. These data
         are supported by a summary of quality control information, and the
         environmental distribution of contamination suggested by these
         data is comparable to the distribution suggested by an independent
         analytical technique. The data are thus considered reliable and
         potentially comparable to data that will be produced in the project.

Level 3  Data are acceptable for reconnaissance-level analyses.
         The data can be used to estimate the nature and extent of contami-
         nation. No supporting quality control information is available, but
         standard methods were used, and there is no reason to suspect a
         problem with the data based on 1) an inspection of the data, 2)
         their environmental distribution relative to data produced by an
         independent analytical technique, or 3) supporting technical reports.
         These data should be considered estimates and used only to
         provide an indication of the nature and possible extent of contami-
         nation.

Level 4  Data are not acceptable for use in the current project.
         The data may have been acceptable for their original use. How-
         ever, little or no supporting information is available to confirm the
         methods used, no quality control information is available, or there
         are documented reasons in technical reports that suggest the data
         may not be comparable to corresponding data to be collected in the
         current project.
WILL NEW DATA BE COMPARABLE TO HISTORICAL DATA?

Making sure that newly collected data are comparable to acceptable data from
previous analyses is another important issue that should be addressed early
in the planning process. For project managers, this may mean choosing only
those methods that will produce data of a quality that meets or exceeds those
levels established during previous analytical efforts. Even when the same
kind of analytical instrument is used, the same procedures should be followed
if comparability to past data is essential. For example, the way that a soil
or water sample is processed by the laboratory into a form that can be
analyzed may affect how much of a contaminant is quantified. Alternatively,
the project manager may determine that less stringent accuracy or sensitivity
is appropriate for the new data. For example, initial analyses may have been
conducted at low limits of detection that are no longer needed to assess high
levels of contamination at the source of the discharge. In this case, any
comparisons that are made between the two data sets should consider the
potential limitations of the less sensitive new data.
The amount of time that has passed since the historical data were gathered can
also be an important consideration for some projects. Depending on project
goals, historical data of acceptable quality may or may not be useful. If it can
be clearly established that environmental conditions have not changed over time,
then historical data collected years ago may be directly comparable with newly
collected data. However, contaminant concentrations in an environmental
medium can change with time either because of chemical degradation or changes
in the contribution of contaminants from their sources. For example, concentra-
tions of the pesticide DDT in samples of tissue or sediment collected in the early
1970s may be substantially greater than those reported after use of this pesticide
was banned. Analysis of these changes over the last 15-20 years would be
valuable in assessing how rapidly the environment has recovered following the
cessation of active DDT use. For other projects focusing only on existing
environmental conditions, however, use of such historical data could be mislead-
ing.
STEP 3: DECIDING WHICH CONTAMINANTS REQUIRE ANALYSIS
The decision of which contaminants to analyze for will depend first on the results
of the historical data review conducted under Step 2 above. At a minimum,
project managers should request analyses for identified contaminants of concern
for which historical data are lacking or are insufficient. For example, limited
data may be available to address health concerns for fish and wildlife feeding at
the site or for other animals or humans that may eat them, or for other uses of
affected resources (such as irrigation water that has been contaminated by soils
at a site).
Development of a conceptual model of environmental risks based on the fate and
transport of potential chemical contamination also can be useful in identifying
other chemicals for analysis. For example, surface soil contamination at a site
may leach into groundwater through infiltration of rain. The groundwater, in
turn, may be collected in wells and drunk. In addition, an organic chemical of
low concern at the site may be partially degraded by sunlight into a more harmful
form that is easily carried into the groundwater. By examining the possible
degradation products and probable routes of exposure, the project manager can
decide to analyze for all of the chemicals that pose unacceptable environmental
or human health risks, even if the original chemical of interest does not pose a
direct risk through soil contamination.
By limiting the scope of investigation to documented contaminants of concern,
project managers may lose the opportunity to increase the usefulness of the results
of any sampling efforts. For example, there may be no historical information on
additional chemicals that are only suspected of being present. In poorly charac-
terized areas there may also be value in conducting a broader investigation to
document both the presence and absence of a wide range of chemicals. In
addition, omitting apparently irrelevant chemicals one at a time from a long,
standard list of contaminants will do little to minimize analytical expenses. A
strategy for addressing this concern is provided in the next section.
DECIDING WHICH CONTAMINANTS TO ANALYZE:
ONE APPROACH
A list of 100 potential contaminants
of concern in Puget Sound has been
compiled during ongoing development
of the Puget Sound Estuary Program's
Pollutants of Concern. Potential con-
taminants on this list were chosen
from EPA's priority pollutant list, other
lists compiled specifically for Puget
Sound, and recommendations by field
investigators and experts in a number
of related disciplines. To date, 64
high-priority contaminants have been
selected from this list and data on
their characteristics and environmental
distribution in Puget Sound have been
summarized (PSEP 1991).
The following criteria were used to select these potential contaminants:
¦ High toxicity—The contaminant is poisonous, carcinogenic (cancer-
causing), or otherwise directly harmful to life
¦ High environmental persistence—The contaminant is not easily
broken down into harmless components by natural physical,
chemical, or biological processes
¦ High bioaccumulation potential—The contaminant is easily accu-
mulated in the tissues of animals and plants
¦ High measured water column concentration—The contaminant has
been detected at high levels in fresh or salt water
¦ High concentrations relative to samples from reference areas—The
contaminant has been detected at levels higher than those of sam-
ples from reference areas that are removed from direct sources of
contamination
¦ Existence of known sources—The contaminant stems from known
land use activities and natural occurrences
¦ Widespread distribution—The contaminant is found at many loca-
tions and in many different media around the study area
¦ Public or agency concern.
Securing Analytical Cost Savings
Substantial savings are usually realized only when entire groups of chemically
related contaminants are removed. For example, deletion of the analysis for
volatile organic compounds at a site may decrease analytical costs by $150-$300
for each sample analyzed. For large sites, these savings could amount to
thousands of dollars. However, if only some volatile organic chemicals are
omitted from the analysis, the total cost would still be incurred because of the
need to analyze for the other chemically related compounds and the associated
quality control samples. Because of various expenses associated with nonstandard
organic analyses, the laboratory will often simply omit the reporting of data for
compounds that have not been requested even though data have been generated
for all of the standard compounds.
The cost of organic analyses should always be treated as a fixed price for an
entire group of contaminants rather than as a cost that can be broken down into
incremental prices for each compound requested. The cost of metals analyses is
more dependent on the number of metals requested, but unless data are only
needed for a few metals, there will usually still be cost savings by requesting a
standard group for analysis.
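
The group-based pricing described above can be illustrated with a short
calculation. In the following sketch, the group prices and the per-metal price
are hypothetical figures chosen from within the ranges quoted in this section,
not actual laboratory quotes:

    # Illustrative sketch of group-based analytical pricing. All prices are
    # hypothetical placeholders within the ranges quoted in the text.
    GROUP_PRICES = {             # fixed price per sample for an entire group
        "volatiles": 225,        # within the $150-$300 range cited above
        "semivolatiles": 400,    # assumed figure for illustration only
        "pesticides_pcbs": 300,  # assumed figure for illustration only
    }
    PRICE_PER_METAL = 15         # within the $10-$20 per-metal range cited later

    def cost_per_sample(organic_groups, n_metals):
        """Organic analyses are priced by group; metals are priced per metal."""
        return sum(GROUP_PRICES[g] for g in organic_groups) + n_metals * PRICE_PER_METAL

    # Dropping individual volatile compounds does not change the group price;
    # only deleting the entire volatiles group changes the total.
    full = cost_per_sample(["volatiles", "semivolatiles", "pesticides_pcbs"], 10)
    no_voc = cost_per_sample(["semivolatiles", "pesticides_pcbs"], 10)
    print(f"Savings per sample: ${full - no_voc}")                  # $225
    print(f"Savings across 100 samples: ${(full - no_voc) * 100}")  # $22,500

Across a large site, deleting an entire analytical group in this way yields the
savings of thousands of dollars noted above, whereas deleting individual
compounds yields essentially none.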
INDICATOR CONTAMINANTS
Often, when land use or health risks
are not well known, the project man-
ager may be left with a long list of
contaminants, some of which may or
may not be present in the media to be
analyzed. In these cases, it may be
most effective to identify a small
number of contaminants that can be
used as indicators of the presence of
several other chemicals. For example,
polynuclear aromatic hydrocarbons
may be used as indicator chemicals
for oil contamination, although oil may
contain hundreds of other kinds of
chemicals. Processes for selecting
contaminants as human health indi-
cators are described in detail in EPA's
Risk Assessment Guidance for Super-
fund, Volume I: Human Health Evalua-
tion Manual (U.S. EPA 1989d) and
Assessing Human Health Risks from
Chemically Contaminated Fish and
Shellfish: A Guidance Manual (U.S.
EPA 1989a). Whether analyzing for
all contaminants of concern or for
indicator contaminants, the process of
contaminant selection should be fo-
cused, so that project resources are
not spread too broadly and so the
contaminant ranges identified will
enable relevant environmental ques-
tions to be addressed for the investi-
gation.
In general, the project manager should consider requesting analyses for all
chemicals related to the chemicals of concern when the cost of analysis remains
constant. Assessment of the quality of these additional data (see Chapter VI) and
subsequent entry of the data into the project database are options that can be
decided on later, so that these additional costs will only be incurred if warranted
by the preliminary results.
Additional factors that a project manager should consider in selecting chemicals
or groups of chemicals for analysis are discussed in the following sections.
Exploratory or Broad-Spectrum Analyses
For a reconnaissance survey or detailed investigation at a site with an unknown
history of contamination, project managers may choose to analyze for as many
contaminants as the project budget will permit and that meet the objectives of the
project. In this case, an exploratory or broad-spectrum analysis covering several
kinds of chemical groups (for example, toxic metals and volatile and semivolatile
organic acid, base, and neutral compounds) is often warranted. A reasonably
thorough investigation of contamination in environmental samples might involve
analyzing samples for the 24 inorganic contaminants on the Target Analyte List
and the 125 organic contaminants on the Target Compound List compiled and
presented in the statements of work for EPA's Contract Laboratory Program
(U.S. EPA 1990a,b).
In addition, because environmental samples may contain hundreds of organic
contaminants that are not on these lists, laboratory analysis of other major organic
compounds identified in the samples may be appropriate. Because analytical
standards8 are usually not available for these compounds, their identification is
only tentative and their concentrations are estimated. A project manager may
want to request analysis of up to 20 tentatively identified compounds with the
highest concentrations. Alternatively, a specific list of tentatively identified
compounds could be requested for each sample analyzed so that the presence or
absence of a consistent set of chemicals is evaluated for all samples. The cost for
data on these additional chemicals may range from no cost to $100 or more,
depending on the laboratory, the analyses requested, and the nature of the specific
project.
Even if only a few chemicals of concern are identified at a site, information from
a more comprehensive analysis could be of value in making environmental
interpretations of the results. For example, perhaps only polynuclear aromatic
hydrocarbons are of concern for a remediation investigation of a wood treatment
facility. Supplemental data on other compounds (such as nitrogen-containing
aromatic hydrocarbons or nonaromatic hydrocarbons) could be useful to distin-
guish different sources of contamination at the site or to identify contributions
from sources away from the site. In addition to providing relevant information
for the current project, these supplemental data may be useful for other monitor-
ing or characterization studies.
Focused Analyses
Focused investigations for a few contaminants are sometimes appropriate. For
example, analyses of contaminants in fish muscle tissue might be limited to those
toxic metals and chlorinated organic compounds that have high potential for
uptake and that are not metabolized in the fish's liver (for example, see Malins
et al. 1985). More comprehensive historical studies may also be used to reduce
the list of contaminants requiring focused investigation. When in doubt, it may
be appropriate to analyze only known contaminants of concern in all samples and
a broader spectrum of contaminants in a subset of samples. This subset may be
composed of, for example, 10 percent of all samples taken at random or only
those samples collected in certain areas, such as next to a specific source or
critical resource. Finally, by archiving sufficient material9, a decision to analyze
samples for additional contaminants can also be made after initial analyses are
performed.

8 Analytical standards are chemically pure solutions of known concentration used by the laboratory
to relate the analytical signal generated by an environmental sample to a known quantity of the chemical.
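
When a random subset of samples will receive the broader analysis, the draw
should be reproducible so that it can be documented in the QA records. The
following minimal sketch assumes hypothetical sample identifiers and uses the
10 percent fraction mentioned above:

    # Minimal sketch: select a reproducible random 10 percent of samples for
    # broad-spectrum analysis. Sample IDs and the fixed seed are illustrative.
    import math
    import random

    def choose_broad_spectrum_subset(sample_ids, fraction=0.10, seed=42):
        """Return a reproducible random subset for broad-spectrum analysis."""
        rng = random.Random(seed)  # fixed seed so the draw can be documented
        n = max(1, math.ceil(len(sample_ids) * fraction))
        return sorted(rng.sample(sample_ids, n))

    samples = [f"SED-{i:03d}" for i in range(1, 41)]  # 40 hypothetical samples
    print(choose_broad_spectrum_subset(samples))      # 4 samples selected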
Deciding Between Broad-Spectrum and Focused Analyses
A broad spectrum analysis may require funds that are not available for the
project. For example, quantitative analyses for individual chlorinated dioxins and
furans at low parts-per-trillion concentrations10 are highly specialized and can
cost hundreds to thousands of dollars more than analyses for other inorganic and
organic contaminants. Managers should consider several alternative strategies to
ensure that adequate chemical analyses are conducted within project budget
constraints. In this effort, the following factors should be considered:
¦ Each metal specified for analysis usually results in increased
analytical costs. However, the added cost per sample may only be
$10-$20 per metal, so a broader analysis may be possible at only
a minimal increase in cost. Techniques in this cost range provide
low limits of detection for individual metals but take more time and
require more sample volume for analysis than alternative analytical
techniques that have higher limits of detection. The advantage of
the alternative techniques is that they enable simultaneous analysis
of nearly all toxic metals at an overall reduced cost. Often, a
variety of analytical techniques may be specified to balance limit
of detection requirements and costs for specific metals. This
approach is reasonable because the difference in costs among these
different techniques for metals analyses is small, especially com-
pared with the cost of organic analyses. Both kinds of techniques
are discussed in Appendix B, including how each technique works
and what special variations are most useful for individual metals.
¦ Concentrations of organic compounds that have related chemical
structures (e.g., a group of hydrocarbons or a group of phenolic
compounds) can usually be measured in a single analysis, even
when a high degree of sensitivity and accuracy is required.

9 Archived samples for chemical analysis are usually stored frozen (-18°C or lower) to minimize the
potential for microbial degradation, volatilization, or other processes that may occur at faster rates at
higher temperatures. Holding times for frozen samples are discussed in Chapter III, Assuring Quality
During Sample Collection.

10 Low parts-per-trillion concentrations of chlorinated dioxins and furans may be needed to meet
levels of concern which, although low, may still be associated with substantial human health risks.
¦ Only three separate analyses are needed to provide useful data on
nearly all major organic contaminants of concern routinely analyzed
by EPA11. These analyses include broad spectrum analyses for
volatile organic compounds, semivolatile organic compounds
(including organic acid, base, and neutral compounds), and chlori-
nated pesticides and PCBs. At moderate or high detection limits,
these analyses are often referred to as screening level analyses.
Laboratories should be consulted to determine if concentrations of
organic compounds of concern that have dissimilar chemical
structures can be measured in the same analysis. It is also possible
to obtain low detection limits for these broad spectrum analyses,
primarily by increasing the amount of sample analyzed and by
adding steps to remove interferences12. Each such "cleanup"
step may add $25-$50 to the cost of analysis; however, project
managers should consult with the laboratory because some cleanup
steps may not significantly add to costs while others may be
prohibitively expensive.
¦ In some cases, focused analyses for different groups of organic
contaminants may be necessary to increase the sensitivity or
accuracy of measurements (that is, to reduce limits of detection,
increase precision, or reduce the bias of results). This approach
may cost more than a single analysis for a broad spectrum of
contaminants because several separate analyses are usually required
to address all contaminants. However, if data for only a few
organic contaminants are needed, then focused analyses for a
limited number of related groups of chemicals (e.g., aromatic
hydrocarbons or chlorinated phenols) can help keep project costs
down. Laboratories should be consulted to determine if they can
supply standard lists of contaminants for which data reports are
easily prepared. In some cases, additional charges or delays may
be incurred if the laboratory must customize a data report to
provide certain requested data.
11 A list of EPA priority pollutants cited in 40 CFR 136 (Code of Federal Regulations) and additional
hazardous substances is contained in Appendix A of this manual.

12 Interfering substances are chemicals that may mask the presence of a chemical of concern in the
analytical procedure. A further discussion of interferences is contained in Chapter IV, Choosing
Analytical Methods and Quality Control Samples.
II. PLANNING FOR QUALITY ASSURANCE
Quality assurance, or QA, activities are intended to provide an independent
system of checks on the sample gathering and laboratory analysis that the project
manager is supervising13. These QA activities begin before and continue after
samples are collected and laboratory analyses are completed, requiring ongoing
coordination and oversight. The initial planning task of identifying specific
project data needs and establishing data quality objectives is addressed in
Chapter I. The second planning task of staffing and writing project plans for QA
activities is addressed in this chapter. Related planning activities of writing
sampling plans and specifying details of analytical methods are addressed in
Chapters III and IV.
DRAFTING A QUALITY ASSURANCE PROJECT PLAN
After overall project goals and data quality objectives have been formulated, a
formal strategy can be developed to assure data quality. When the sample
gathering and laboratory analysis effort is small, this strategy may be quite
straightforward, requiring only a minor QA planning effort. However, when the
sample gathering and laboratory analysis efforts are significant, the assurance of
data quality may require the formulation of a formal and often quite detailed
document called a quality assurance project plan (known as a QA project plan or
QAPjP). The QA project plan is both a planning and an operational document.
Although a formal document may not be necessary for all projects, a plan should
be developed for each project that involves monitoring and measurement
activities.
The QA project plan contains specific guidelines and procedures for sample
collection, data analysis, and reporting. In particular, the plan addresses required
quality control checks, performance and system audits, QA reports to manage-
ment, corrective actions, and assessment of data precision, bias, and complete-
ness. These procedures are explained in later chapters of this manual and specify
performance expectations that must be closely tied to the objectives for data use.
QA project plans developed for major programs where little precedent exists may
require significant detail and the formation of a QA team. Other plans may be
brief, with references to other documents (such as a detailed sampling plan or an
existing statement of work for laboratory analyses) taking the place of lengthy
detail. EPA has recently provided guidance on preparing QA project plans for
different categories of projects (U.S. EPA 1989c). These categories of projects
are summarized in Table 3 and discussed in the following sections.

13 Occasionally, final decisions must be made by individuals other than the project manager. For
example, joint studies and cooperative agreements may require the oversight of QA personnel from EPA
or other federal and state agencies.
Category I QA Project Plans
Category I projects may be subjected to legal challenge and thus require rigorous
and detailed quality assurance documentation. Sixteen quality assurance elements
have been identified by EPA's Office of Emergency and Remedial Response for
use in such projects (U.S. EPA 1988a). These elements, which can serve as a
completeness checklist (see the sketch following this list), include:
1. Title and signature page—The signature page should be signed by
those persons responsible for approving and implementing the plan
(the project manager is usually included in these signatures even if
other persons are primarily responsible for QA activities).
2. Table of contents
3. Project description—The goals and objectives of the study project
are outlined in this section.
4. QA organization and responsibilities—The relationship of the
personnel on the QA project team and their responsibilities for
implementing the QA program are identified in this section. It is
also useful to identify the major items to be included in negotiated
statements of work for laboratories (see Chapter V, Working With
an Analytical Laboratory).
5. Quality assurance objectives and sampling strategy—Data quality
objectives are addressed in this section. The specific criteria for
precision, bias, completeness, and comparability of data should be
established, as well as criteria for the maximum allowable time that
samples can be held prior to analysis by a laboratory. It is helpful
to describe the strategy for selecting sampling sites and sample
variables in this section to clarify the selection of data quality
objectives.
6. Sampling procedures—The specific procedures for collecting each
kind of sample are described in detail in this section (or are cross-
referenced to a separate detailed sampling plan). Sample contain-
ers and special conditions for the preparation of sampling equip-
ment are also identified.
TABLE 3. CATEGORIES OF EPA PROJECTS
REQUIRING QA PROJECT PLANS
Category I Those projects producing results that can stand alone. These
projects are of sufficient scope and substance that their
results could be used directly, without additional support, for
compliance or other litigation. Such projects are of critical
importance to EPA goals and must be able to withstand legal
challenge. Accordingly, the quality assurance requirements
will be the most rigorous and detailed to ensure that such
goals are met.
Category II Those projects producing results that complement information
from other projects. These projects are of sufficient scope
and substance that their results could be combined with the
results of other projects of similar scope to produce narratives
that would be used for making rules, regulations, or policies.
In addition, projects that do not fit this pattern, but have high
public visibility, would also be included in this category.
Category III Those projects producing results for the purpose of evaluating
and selecting basic options, or performing feasibility studies
or reconnaissance of unexplored areas that might lead to
further work.
Category IV Those projects producing intermediate results used in testing
assumptions.
Reference: Preparing Perfect Project Plans (U.S. EPA 1989c).
7. Sample custody procedures—Recordkeeping procedures are de-
scribed in detail in this section, including specific procedures to
document the physical possession and condition of samples during
their transport and storage (see Chapter III, Assuring Quality
During Sample Collection). This section also describes how
samples will be preserved, shipped, and stored prior to analysis and
how excess samples will be disposed of at the end of the project.
8. Calibration procedures—Procedures for properly maintaining the
accuracy and precision of each piece of equipment to be used in the
field or laboratory are detailed in this section. Procedures are also
described for obtaining, using, and storing chemical standards of
known purity used to quantify analytical results.
9. Analytical procedures—Sample analysis procedures are identified
in this section by reference to established methods; any modifica-
tions to these procedures and any specialized, nonstandard proce-
dures are described in detail (see Chapter IV, Choosing Analytical
Methods and Quality Control Samples, and Appendix B).
10. Data reduction, validation, and reporting—Procedures are described
in this section for how data are compiled and verified as suitable
for making technical conclusions. This description may include
special equations used to make calculations, models used in data
analysis, criteria used to validate the integrity of data that support
final conclusions, and methods used to identify and treat data that
may not be representative of environmental conditions. A descrip-
tion of the data management scheme is also necessary because it
documents how data are handled at each stage of the project.
11. Internal quality control checks—The various control samples that
will be used internally by the laboratory or sample collection team
to assess quality are described in this section (see Chapter IV,
Choosing Analytical Methods and Quality Control Samples).
12. Quality assurance audits—Procedures to determine the effectiveness
of the QA program and its implementation are summarized in this
section. For example, on-site audits may be conducted of the field
or laboratory operations, and results from analyses of performance
evaluation samples may be compared with the results from indepen-
dent laboratories.
13. Quality assurance reports to management—The content of reports
summarizing the data quality review and the schedule for submit-
ting these reports to management are included in this section (see
Chapter VI, Evaluating Data from the Laboratory).
14. Preventive maintenance procedures—Procedures for maintaining
field and laboratory equipment in a ready state are described in this
section, including identification of critical spare parts that must be
available to ensure that data completeness will not be jeopardized
by equipment failure.
15. Routine procedures for data assessment—Specific equations or
procedures used to assess precision, accuracy, representativeness,
comparability, and completeness during data quality assessment are
identified in this section.
16. Corrective actions—Major problems that could arise during field or
laboratory operations, predetermined corrective actions for these
problems, and the individual responsible for each corrective action
are identified in this section.
A list of references cited in the QA project plan is also included at the end of the
plan.
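
For tracking purposes during plan preparation, the sixteen elements above can
be treated as a simple completeness checklist. The sketch below is an
illustrative convention for flagging sections still missing from a draft plan,
not an EPA-prescribed format:

    # Illustrative completeness check for the 16 Category I QA project plan
    # elements listed above; the review logic is a convention for this sketch.
    CATEGORY_I_ELEMENTS = [
        "Title and signature page", "Table of contents", "Project description",
        "QA organization and responsibilities",
        "QA objectives and sampling strategy", "Sampling procedures",
        "Sample custody procedures", "Calibration procedures",
        "Analytical procedures", "Data reduction, validation, and reporting",
        "Internal quality control checks", "Quality assurance audits",
        "QA reports to management", "Preventive maintenance procedures",
        "Routine procedures for data assessment", "Corrective actions",
    ]

    def missing_elements(plan_sections):
        """Return the required elements not yet addressed in a draft plan."""
        done = {s.lower() for s in plan_sections}
        return [e for e in CATEGORY_I_ELEMENTS if e.lower() not in done]

    draft = ["Title and signature page", "Table of contents"]
    for element in missing_elements(draft):
        print("Missing:", element)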
Category II QA Project Plans
The level of detail for Category I projects is reduced in the QA plans for other
categories of projects for which data will not be used for enforcement and so are
less likely to be subject to legal challenge. EPA suggests that for its purposes,
data from less stringent Category II projects may be combined with data from
other projects for use in making rules, regulations, or policies. For these
projects, U.S. EPA (1989c) recommends omitting or combining information in
Category I project plans as follows:
¦ The section on sample custody can be combined with the sampling
procedures section and reduced in scope to focus on the field
portion of sample custody rather than both laboratory and field
procedures. However, the flow of data should still be addressed
in the data reduction, validation, and reporting section.
¦ The sections on analytical procedures and calibration can be
combined.
¦ The section documenting preventive maintenance is not required.
¦ A list of references is not mandatory.
Category III QA Project Plans
Data from Category III projects (e.g., reconnaissance surveys) are quantitative but
more preliminary in nature than data in the previous categories. Information
from these projects will likely be used to support subsequent, more comprehen-
sive projects. Category III projects may require less rigorous documentation of
quality assurance, including:
¦ The section on QA organization and responsibilities can be omitted
and information on project responsibilities and procedures for
communicating between personnel involved in QA activities can be
briefly summarized in the project description section.
¦ The level of detail in sections describing the purpose of the project
and QA objectives will usually be less than for Category I or
Category II projects (for example, reconnaissance surveys usually
do not have a quantitative purpose, but instead are designed to
answer a general question on environmental conditions).
¦ Discussion of sample custody procedures in the sampling proce-
dures section is still needed, but examples of custody forms (see
Sample Custody in Chapter III) are not required.
¦ The section on data reduction, validation, and reporting does not
need to identify critical control points for validation and reporting
because data for Category III projects will not be used to make
critical decisions.
Category IV QA Project Plans
Data from Category IV projects are only used by EPA to test assumptions that
may be only semiquantitative in nature, not to make final decisions or to
characterize sites. An example of such a project may be to make simple tests for
the presence or absence of contamination before undertaking more costly quantita-
tive analyses in a reconnaissance project. The QA plans for these projects briefly
summarize basic information concerning the project objectives, sampling and
analysis procedures, and QA procedures. Major differences between these plans
and plans for Category III projects include:
¦ Discussion on why specific sampling sites are selected is not
necessarily included, and the sections on sampling and analysis are
combined
¦ All additional sections on QA procedures are replaced with a single
section on the approach to quality assurance.
Additional information on the content of QA project plans for each of these
categories of projects is summarized in U.S. EPA (1989c). Copies of the pocket
guide for preparing these QA project plans are available from EPA's Risk
Reduction Engineering Laboratory in Cincinnati, Ohio.
STAFFING FOR QUALITY ASSURANCE
A project manager has overall responsibility for assuring the quality of data
generated for a project. In large projects, the project manager is typically not
directly involved in actual QA activities. However, the project manager should
ensure the implementation of any corrective actions that are called for during
sampling, analysis, or data assessment. The writing of QA plans can usually be
done by one person with assistance as needed from technical specialists for details
of methods or quality control criteria. One person should also have primary
responsibility for coordinating the oversight of all sampling activities (including
completion of all documentation for samples sent to the laboratory). Coordinating
laboratory interactions before and during sample analysis is also best performed
by one person to avoid confusion. Subsequent interactions that may be necessary
with the laboratory during a QA review of the data (see Chapter VI) usually
involve the person actually doing the review.
Additional QA tasks and responsibilities during sampling and analysis are often
assigned to technicians who collect samples, record field data, and operate and
maintain sampling and analytical equipment. These technicians perform a number
of essential day-to-day activities, which include calibrating and servicing
equipment, checking field measurements and laboratory results, and approving
modifications to field or laboratory procedures. These individuals should have
training to perform these functions and follow established protocols and guidelines
for each of these tasks.
III. ASSURING QUALITY DURING
SAMPLE COLLECTION
Before any analysis of samples can produce quality data, procedures, methods,
and safeguards for field activities should be developed and adopted so that
samples are properly collected, handled, and stored. This chapter provides basic
guidance for assuring sample quality from collection to delivery to the laboratory.
References are also provided for more detailed guidance in other documents.
DEVELOPMENT OF A SAMPLING PLAN
Quality assurance for sampling and field measurement of air, water, soil,
sediment, and tissue begins during creation of a field sampling plan (or a detailed
sampling section of a QA project plan). Sometimes the description of both field
and laboratory activities is combined into one sampling and analysis plan (SAP).
The QA project plan may either be an appendix of this plan, or specific sampling
details may be provided as an appendix of the QA project plan. For smaller
projects, a single planning document may be created that combines a work plan
(project rationale and schedule for each task), sampling plan (how project tasks
are implemented), and the QA project plan.
The purpose of the sampling plan is to provide guidance for all fieldwork by
defining in detail the appropriate sampling and data gathering methods. The plan
should be written so that a field sampling team unfamiliar with the site would be
able to gather the necessary samples and field information. However, to assure
sampling quality, at least one individual familiar with the study area should
always be present during sampling activities.
Addressing quality assurance in the sampling plan includes designating field
samples to be collected and used for assessing the quality of sampling and
analysis and ensuring that quality assurance is included in standard operating
procedures for field measurement. As with the drafting of the QA project plan,
participation by several individuals may be necessary when developing these
elements of the sampling plan. The sampling plan should include sections that
address:
¦ Site background, including all sampling media
¦ Sampling objectives
¦ Sample location and collection frequency
¦ Sample designation (i.e., a description of how each independently
collected sample will be identified)
¦ Sampling equipment and procedures
¦ Methods for sample handling and analysis.
To ensure an adequate level of confidence in the data produced and in the
comparability of the data to information collected by other sampling teams, the
sampling plan must adhere to published sampling protocols and guidance.
Descriptions of widely used sampling methods can be found in several EPA
publications, many of which are cited in this chapter and the bibliography of this
manual. Methods specific to Puget Sound studies have also been identified in
Recommended Guidelines for Measuring Organic Compounds in Puget Sound
Sediment and Tissue Samples (PSEP 1989a) and Recommended Protocols for
Measuring Metals in Puget Sound Water, Sediment, and Tissue Samples (PSEP
1989b). These documents are recently revised chapters of a larger volume of
protocols and guidelines for the determination of environmental variables
(chemical, biological, and physical) in Puget Sound (PSEP 1986). It is recom-
mended that these guidelines be used in all Puget Sound sampling surveys.
Sample handling methods must be specified in the sampling plan. These methods
include any special sample preparation techniques, container types and cleaning
procedures, and methods for preserving and transporting samples. The Puget
Sound Estuary Program has developed guidelines for appropriate container
materials, sample sizes, preservation techniques, and holding times for metals in
water, sediment, and tissue samples (Table 4) and for organic compounds in
sediment and tissue samples (Table 5).
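
Holding-time guidelines such as those in Tables 4 and 5 lend themselves to a
simple automated check when samples are logged. The sketch below copies a few
sediment entries from those tables; the record format and dates are assumptions
for illustration:

    # Illustrative holding-time check against the guidelines in Tables 4 and 5.
    from datetime import date

    MAX_HOLDING_DAYS = {
        ("sediment", "metals (except Hg)"): 2 * 365,  # 2 years, frozen (Table 4)
        ("sediment", "Hg"): 28,                       # Table 4
        ("sediment", "semivolatiles"): 365,           # 1 year, frozen (Table 5)
        ("sediment", "volatiles"): 14,                # Table 5
    }

    def holding_time_exceeded(matrix, analyte, collected, analyzed):
        """True if elapsed days exceed the recommended maximum holding time."""
        return (analyzed - collected).days > MAX_HOLDING_DAYS[(matrix, analyte)]

    print(holding_time_exceeded("sediment", "volatiles",
                                date(1991, 6, 1), date(1991, 6, 20)))  # True: 19 > 14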
SAMPLE CUSTODY
The possession of samples should be documented from sample collection through
laboratory analysis, unless there will be no need to verify handling procedures at
any time in the future. Recording basic information during sample handling is
good scientific practice even if formal custody procedures are not required.
Sample custody procedures, including examples of forms to be used, are
described in the QA project plan. Minimum requirements for documentation of
sample handling and custody on simple projects should include the following
information (a minimal record sketch follows the list):
¦ Sample location, project name, and unique sample number
¦ Sample collection date (and time if more than one sample may be
collected at a location in a day)
TABLE 4. COLLECTION, PREPARATION, AND QA/QCa REQUIREMENTS
FOR PUGET SOUND ENVIRONMENTAL SAMPLES
ANALYZED FOR METALS

                                                                    Sample Storage
Analyte                        Matrix     Containerb   Size          Preservative   Holding Time

Total or dissolved (except     H2O        P,Fd         1 liter       HNO3 pH <2     6 months
  Hg)c
Total or dissolved Hgc         H2O        G,F          500 mL        HNO3 pH <2     28 days
Particulate metals             H2O        P,G          4 liters      --e            --
All metals (except Hg)         Sediment   P,G          50 gramsf     Freeze         2 yearsg
Hg                             Sediment   P,G          1 gramf       Freeze         28 daysg
Elutriate studyh               H2O        P,G          12 liters     4°C            --
                               Sediment   P,G          3 liters      4°C            --
Fractionation studyh           Sediment   P,G          1 liter       4°C            --
All metals (except Hg)         Tissue     P,G          6 gramsf,i    Freezej        2 yearsg
Hg                             Tissue     P,G          0.2 gramsf,i  Freezej        28 daysg

a QA/QC - quality assurance and quality control.
b P - linear polyethylene
  G - borosilicate glass
  F - fluoropolymers [e.g., PTFE (Teflon®)].
c U.S. EPA (1990a). Samples to be analyzed for dissolved metals must be filtered before preservation.
d If aliquot for mercury is taken from this 1-liter sample, linear polyethylene containers cannot be used.
e Samples should be filtered as soon as possible after collection and always within 24 hours.
f Wet weight.
g Suggested holding time. No U.S. Environmental Protection Agency criteria exist for sediment and tissue. Maximum
  holding time for mercury in water is 28 days.
h Storage time "as short as possible," analyses to be completed "within 1 week of sample collection" (Plumb 1981).
i Weight is a minimum for one sample. Studies using specific organs may require more tissue.
j Postresection.

Source: Recommended Protocols for Measuring Metals in Puget Sound Water, Sediment, and Tissue Samples (PSEP 1989b).
TABLE 5. COLLECTION, PREPARATION, AND QA/QCa REQUIREMENTS
FOR PUGET SOUND ENVIRONMENTAL SAMPLES
ANALYZED FOR ORGANIC COMPOUNDS

                                             Sample                       Maximum Sample   Maximum Extract
Variable                   Sample Sizeb      Containerc   Preservation    Holding Time     Holding Time

Sediments
  Semivolatile compounds   50-100 grams      G            Freeze (-18°C)  1 yeard          40 days
  Volatile compounds       50 grams (40 mL)  Ge,f         Cool (4°C)      14 days          --g
Tissues (whole)h           --                A            Freeze (-18°C)  1 yeari          40 days
Tissues (after resection)
  Semivolatile compounds   25 grams          G,T          Freeze (-18°C)  1 yeari          40 days
  Volatile compounds       5 grams           G,T          Freeze (-18°C)  14 daysi         --g

a QA/QC - quality assurance and quality control.
b Recommended field sample sizes for one laboratory analysis. If additional laboratory analyses are required (e.g.,
  replicates), the field sample size should be adjusted accordingly. Smaller sample sizes may be used if comparable
  sensitivity can be obtained by adjusting instrumentation, extract volume, or other factors of the analysis.
c G - glass
  A - wrapped in aluminum foil, placed in watertight plastic bags
  T - PTFE (Teflon®).
d This is a suggested maximum holding time, although some compounds (e.g., polycyclic aromatic hydrocarbons) have
  been shown to be stable for longer periods. A holding time of 1 year is beyond the U.S. Environmental Protection
  Agency criterion of 14 days for water samples. Every effort should be made to analyze the sample as soon as possible.
e No headspace or air pockets should remain; this criterion will be difficult to achieve if the sample has a low
  moisture content.
f Freezing these samples will probably cause breakage of the sample container because no airspace for expansion is
  provided.
g Extracts are not provided in the analysis of volatile compounds.
h Whole tissues are not generally recommended for analysis.
i No U.S. Environmental Protection Agency criterion exists for holding times of tissue samples; this is a maximum
  suggested holding time.

Source: Recommended Guidelines for Measuring Organic Compounds in Puget Sound Sediment and Tissue Samples (PSEP
1989a).
¦ Any special notations on sample characteristics or problems
¦ Initials of the person collecting the sample
¦ Date sample sent to laboratory.
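
The fields listed above translate naturally into a simple record structure. A
minimal sketch, with field names and example values that are illustrative rather
than prescribed:

    # Minimal sketch of a sample documentation record; not a prescribed format.
    from dataclasses import dataclass, field
    from datetime import date
    from typing import Optional

    @dataclass
    class SampleRecord:
        project_name: str
        sample_number: str                       # unique within the project
        location: str
        collection_date: date
        collection_time: Optional[str] = None    # needed if >1 sample/location/day
        collector_initials: str = ""
        date_sent_to_lab: Optional[date] = None
        notes: list = field(default_factory=list)  # characteristics or problems

    rec = SampleRecord("Hypothetical Bay Survey", "SED-001", "Station 3",
                       date(1991, 8, 12), "09:40", "JQP", date(1991, 8, 13),
                       ["slight sheen on surface"])
    print(rec.sample_number, rec.collection_date)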
For large projects or sensitive projects that may result in enforcement actions or
other litigation, a strict system for tracking sample custody should be used to
assure that one individual has responsibility for a set of samples at all times. For
these projects, only data that have clear documentation of custody can be accepted
without qualification.
A strict system of sample custody implies the following conditions:
¦ The sample is possessed by an individual and secured so that no
one can tamper with it
¦ The location of the sample is known and documented at all times
¦ Access to the sample is restricted to authorized personnel only.
Chain-of-custody forms are often used to document the transfer of a sample from
collection to receipt by the laboratory (or between different facilities of one
laboratory). Although not always required, these forms provide an easy means
of recording information that may be useful weeks or months after sample
collection. When these forms are used, they are provided to field technicians at
the beginning of a project. The completed forms accompany the samples to the
laboratory and are signed by the relinquisher and receiver every time the samples
change hands. After sample analysis, the original chain-of-custody form is
returned by the laboratory. The form is filed and becomes part of the permanent
project documentation. An example of a chain-of-custody form is reproduced in
Appendix E. Additional custody requirements for field and laboratory operations
should be described in the QA project plan, when appropriate.
When in doubt about the level of documentation required for sampling and
analysis, a strict system of documentation using standard forms should be used.
Excess documentation can be discarded; lack of adequate documentation in even
simple projects sometimes creates the unfortunate impression that otherwise
reasonable data are unusable or limited. Formal chain-of-custody procedures are
outlined briefly in the statements of work for laboratories conducting analyses of
organic and inorganic contaminants under EPA's Contract Laboratory Program
(U.S. EPA 1990a,b).
FIELD CONTAMINATION: A MAJOR THREAT
TO ANALYTICAL ACCURACY
Because limits of detection for an
analytical method can be dramatically
affected by sample contamination,
every precaution should be taken to
avoid contamination during sample
collection, handling, storage, prepa-
ration, and analysis.
The most common contamination
threats include exposure to airborne
dust, inadequate cleaning of sample
containers and equipment, contact
with potentially contaminating materi-
als (such as rubber O-rings, aluminum-
lined caps, or steel containers), and
the use of contaminated solvents,
preservatives, or reagents. Field
personnel who touch samples or
equipment with dirty hands or gloves
also frequently contribute to sample
contamination.
The following steps should be taken to minimize these sources of sample
contamination:
¦ To reduce exposure of samples to airborne dust, capped containers
should be used and physical sample handling should be kept to a
minimum. Containers should be protected from contamination
during storage and transportation by enclosing them in polyethy-
lene bags.
¦ Appropriate container materials and cleaning methods should be
used to minimize sample contamination from storage containers.
¦ Established national or regional protocols for cleaning sampling
containers and equipment should be followed to greatly reduce
contamination from these sources. Surgical removal (resection) of
tissue samples should be performed in a contaminant-controlled
environment. Organisms should not be frozen prior to resection if
analysis will include internal organs, because freezing may cause
some internal organs to rupture and spill onto other tissues.
¦ Solvents, preservatives, and reagents should be of an ultrapure or
equivalent grade and should never be returned to their stock
containers once removed.
¦ Efforts should be made to eliminate any unnecessary handling of
samples by field technicians.
In addition to field operations overseen by the project manager, a strict system
of sample custody for laboratory operations should include the following items:
¦ Appointment of a sample custodian, authorized to check the condi-
tion of and sign for incoming field samples, obtain documents of
shipment, and verify sample custody records
¦ Separate custody procedures for sample handling, storage, and
disbursement for analysis in the laboratory
¦ A sample custody log consisting of serially numbered, standard
laboratory tracking report sheets.
MEDIA-SPECIFIC SAMPLING GUIDANCE
Soil Sampling Guidance
U.S. EPA. 1986a. Test Methods for Evaluating Solid Waste (SW-846): Physical/
Chemical Methods
U.S. EPA. 1986b. Field Manual for Grid Sampling of PCB Spill Sites to Verify
Cleanups
U.S. EPA. 1987a. A Compendium of Superfund Field Operations Methods
U.S. EPA. 1989e. Review Draft. Soil Sampling Quality Assurance Guide
Groundwater Sampling Guidance
U.S. EPA. 1985c. Practical Guide to Groundwater Sampling
U.S. EPA. 1987c. Handbook: Groundwater
U.S. EPA. 1988b. Guidance on Remedial Actions for Contaminated Groundwater at
Superfund Sites
U.S. EPA. 1988f. Statistical Methods for Evaluating Groundwater from Hazardous
Waste Facilities
U.S. EPA. 1989b. Groundwater Sampling for Metals Analyses
Surface Water and Sediment Sampling Guidance
Plumb, R.H., Jr. 1981. Procedures for Handling and Chemical Analysis of Sediment
and Water Samples
PSEP. 1986 (and 1989 revisions). Recommended Protocols for Measuring Selected
Environmental Variables in Puget Sound
U.S. EPA. 1984. Sediment Sampling Quality Assurance User's Guide
U.S. EPA. 1985b. Methods Manual for Bottom Sediment Sample Collection
Air Sampling Guidance
U.S. EPA. 1983. Technical Assistance Document for Sampling and Analysis of Toxic
Organic Compounds in Ambient Air
U.S. EPA. 1988e. Procedures for Dispersion Modeling and Air Monitoring for
Superfund Air Pathway Analysis
Biota Sampling Guidance
PSEP. 1989a,b. Recommended Guidelines and Protocols for Measuring Organic
Compounds and Metals in Puget Sound Water, Sediment, and Tissue Samples
U.S. FDA. 1977. Pesticide Analytical Manual: Volume 1
U.S. FDA. 1986. Pesticides and Industrial Chemicals in Domestic Foods
U.S. EPA. 1985a. Cooperative Agreement on the Monitoring of Contaminants in
Great Lakes Sport Fish for Human Health Purposes
U.S. EPA. 1989a. Assessing Human Health Risks from Chemically Contaminated
Fish and Shellfish: A Guidance Manual
Full citations for each of these documents are contained in the References section of
this manual.
IV. CHOOSING ANALYTICAL METHODS
AND QUALITY CONTROL SAMPLES
In this chapter, guidance is provided on specifying the quality control procedures
that will be performed by the laboratory. In most instances the project manager
does not determine the specific details of chemical analyses. Instead, the
laboratory may propose appropriate methods when supplied with the following
information (see related discussions on analytical requirements in Chapter I; a
brief example follows the list):
1. A list of chemicals for analysis
2. Required detection limits
3. Expected use of the data and data quality objectives
4. Requirements for comparability with other projects.
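
As a brief example, the four items above might be assembled into a simple
structured request for the laboratory. The chemicals, limits, and wording below
are hypothetical placeholders:

    # Hypothetical example of the information package a laboratory needs in
    # order to propose methods; every value is an illustrative assumption.
    analysis_request = {
        "chemicals": ["naphthalene", "phenanthrene", "pyrene"],
        "required_detection_limits_ug_per_kg": {
            "naphthalene": 20, "phenanthrene": 20, "pyrene": 20,
        },
        "data_use": "screening-level remediation investigation",
        "comparability": "methods must match the site's earlier survey",
    }

    for item, value in analysis_request.items():
        print(f"{item}: {value}")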
Technical considerations that should be taken into account before any analytical
method is included in a laboratory's statement of work are described in Appen-
dix B, Description of Calibration, Quality Control Samples, and Widely Used
Analytical Methods. However, even when relying on advice from the laboratory
or other chemists regarding appropriate methods, the project manager should
consider the following questions:
¦ How will the data be used? Guidance for selecting the appropriate
chemicals for analysis and for determining appropriate limits of
detection is discussed in Chapter I, Defining Analytical Objectives.
Both of these issues will be influenced by the anticipated use of the
resulting data. In addition, the choice of analytical method(s) will
depend on whether the data results will be compared to previously
collected data from the same site, or to similar data collected at
another location. If so, the sample preparation and analytical
methods used in each project may need to be the same. If different
analytical methods are used for data that may be compared, then it
is essential to include adequate measures, or quality control
samples (QC samples), to provide a basis for comparison.
Selection of certain analytical methods may also be important to
produce legally defensible data. For example, absolute confirma-
tion of a contaminant's identity may be needed to determine a
responsible party's liability in a court of law.

¦ How sensitive do the analyses need to be? Greater instrument
sensitivity (and lower detection limits) may be needed when the
project objective is to establish the level of risk associated with a
specific contaminant at low concentrations.
¦ What are the relative costs of analyses? Costs can vary substan-
tially among different analytical methods. Generally, methods
capable of achieving lower detection limits require more sensitive
instruments and are therefore more costly than those that yield data
on a screening level. In addition, the cost of analyzing a limited
set of contaminants at low detection limits can be less than the cost
of analyzing a broad range of contaminants at high detection limits.
Differences in costs for the same level of data quality may also be
significant among different laboratories. When comparing costs,
it is important to secure bids from several laboratories. The
process of choosing a laboratory is discussed in greater detail in
Chapter V, Working With an Analytical Laboratory. If all of the
bids are too high for the project budget, the number of analyses
may be reduced. If the cost is still too high, the objectives of the
project may have to be changed to stay within the budget.
¦ What are the costs of sample preparation? The costs of isolating
contaminants for analysis from soil, water, tissue, or other sample
media are included as part of the total cost of each analysis per-
formed by the laboratory. Extra costs may be incurred in cases
where extensive sample preparation is required. For example, with
regard to sensitive analyses, it may be required that parts-per-
trillion levels of chlorinated dioxins in tissue samples be docu-
mented for risk assessment. Tissues contain fat and other sub-
stances that can mask the presence of chlorinated dioxins if all
chemicals present in a tissue sample are analyzed as a mixture.
However, this mixture of chemicals can be subjected to a series of
analytical steps that concentrate chlorinated dioxins with similar
chemicals prior to analysis. These additional laboratory techniques
may increase the time and cost of cleanup. Because these tech-
niques may improve the ability to detect the presence of a contami-
nant or to quantify its concentration, it may be necessary to incur
these extra expenses to meet the goals of a project. On the other
hand, the use of these techniques may result in a cost savings
during data review and evaluation. The most commonly used
sample preparation techniques are described in Appendix B.
¦ What other options are available? In general, it is preferable to
select "standard" analyses such as EPA methods (listed in the
Federal Register), specialized analyses in regional documents such
as PSEP (1986, 1989), or other recognized guidelines. Depending
on the funding or sponsorship of a project, the use of a specific
method may, in fact, be required. However, nonstandard methods
may prove appropriate when conducting certain types of analyses.
The appropriateness of any alternative method should be demon-
strated through laboratory tests described in the QA project plan to
ensure that it meets the performance criteria established for a
project. Selection of any nonstandard method is best done only
after consulting with the laboratory supervisor and a qualified
consulting chemist.
PREPARING AND ANALYZING SAMPLES
Before samples can be analyzed by the laboratory, they must be transformed into
more usable forms. The sample preparation techniques used for this purpose are
briefly described in this section. The most common sample preparation tech-
niques are digestion, distillation, purging, and extraction of the sample matrix14.
Various cleanup techniques are available for removing unwanted substances from
the resulting material.
Digestion and Distillation of Inorganic Samples
Prior to analysis of inorganic substances, samples are broken down, or digested,
in solutions of acids to ensure that the analytes are in a chemical form that can
be measured. For some instrument analyses (i.e., graphite furnace atomic
absorption spectrometry; see Appendix B), this process may include the addition
of other chemicals, called matrix modifiers. These modifiers are added to the
digested sample to reduce or eliminate unwanted substances that can interfere with
the analysis. Analysis of other inorganic substances does not require digestion
of samples. For example, cyanide or ammonia can be released from samples by
distillation and absorbed in an appropriate solution for analysis.
Purging of Volatile Organic Samples
Volatile organic compounds are stripped from sample matrices by passing a
nonreactive gas (e.g., nitrogen) through the sample. This process is called
purging. Purged volatile compounds can be introduced to the analytical instru-
ment without further processing.
14 Matrix is a common laboratory term referring to the sample material (e.g., water, sediment,
tissue) in which the chemicals of interest are found (see Glossary).
NORMALIZING SAMPLES

Physical and chemical differences among sediment samples or soil samples may
confound interpretation of data from these kinds of samples. To compensate for these
differences, data from chemical analyses can be made more comparable through a
process known as normalization. Normalization is based on the observed and
theoretical factors known to affect the quantity of contaminants in a given volume of
a sample. There are three main reasons for normalizing sediment or soil data for
interpreting environmental distributions of chemicals:

¦ Most contaminants in sediments are associated with the solid material,
not the interstitial water. Because water content can vary considerably
among samples, wet-weight concentrations are typically poor indicators
of the relative quantity of a chemical.

¦ Fine-grained soils and sediments naturally tend to accumulate more
chemicals than do coarse-grained sediments because the relative
surface area available for absorption of chemicals increases with de-
creasing grain size.

¦ Many trace chemicals are associated and transported with carbon-rich
particles in the environment or can be bound by organic carbon in sedi-
ments. Carbon-rich sediments tend to contain a larger quantity of these
chemicals than do carbon-poor sediments.

Another reason for normalizing concentrations of some organic compounds to organic
carbon content is to better account for the relative amounts of a compound that may
be available in different samples for uptake by organisms (bioavailability). For certain
metals, the varying content of acid-volatile sulfides in different samples may be
similarly important (U.S. EPA 1991).

To take normalizing factors into account, the concentration of a contaminant is
expressed as a ratio of its mass to the total weight of, for example, the organic carbon
content in a sample of soils or sediments. Using these concentrations, data from the
analysis of dissimilar soils or sediments can be compared and interpreted. The
following example illustrates the calculation of an organic-carbon-normalized value from
a wet-weight concentration reported by the laboratory for the organic compound
naphthalene in sediment:

Naphthalene concentration:              50 µg/kg wet sediment (or 50 ppb)
Moisture content of sediment:           60% water (or 40% dry sediment)
Total organic carbon (TOC) content:     2% (dry weight) in sediment

  50 µg naphthalene    1 kg wet sediment     1 kg dry sediment
  ----------------- x ------------------- x -----------------
   kg wet sediment    0.4 kg dry sediment     0.02 kg TOC

  = 6,250 µg naphthalene/kg TOC = 6.25 ppm TOC

Obtaining these measures may require samples to be oven-dried, separated into
different-sized particle fractions, or subjected to additional analytical procedures (for
example, to determine organic carbon content). Different costs are associated with
each of these procedures. For example, while drying of samples is routinely included
in the cost of an analysis, fractioning of particles into multiple size groupings of sands,
silts, and clays may cost $40-$100 per sample, depending on the specific procedure
and laboratory. The cost of determining the organic carbon content may vary from
$30-$80 for similar reasons.
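
The naphthalene example above reduces to two divisions. A minimal sketch
reproducing the calculation:

    # Reproduces the calculation in the box above: convert a wet-weight
    # concentration to an organic-carbon-normalized concentration.
    def toc_normalized(wet_conc_ug_per_kg, dry_fraction, toc_fraction_dry):
        """Concentration expressed per kg of total organic carbon (TOC)."""
        dry_conc = wet_conc_ug_per_kg / dry_fraction  # ug per kg dry sediment
        return dry_conc / toc_fraction_dry            # ug per kg TOC

    value = toc_normalized(50.0, 0.40, 0.02)          # inputs from the example
    print(f"{value:,.0f} ug naphthalene/kg TOC")      # 6,250 ug/kg = 6.25 ppm TOC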
Extraction of Semivolatile Organic Samples
Most semivolatile organic chemicals are separated from the sample matrix by
repeatedly passing organic solvents15 over the sample. The resulting product
is called an extract. Sometimes samples are simultaneously extracted with
organic solvents and digested with potassium hydroxide in methanol or other mild
basic solutions. Such treatments are usually reserved for exhaustive analyses of
tissue matrices, in which the membrane structure of the tissue must be completely
broken down. However, organic solvents are sufficient to extract the fatty
material (in which most organic contaminants are usually found) within these
membranes.
Selection of a particular extraction procedure is based on the expected contami-
nants and the matrices, as well as considerations of project cost and how soon the
data are required from the laboratory. Certain extraction procedures are required
by EPA's Contract Laboratory Program. Others have been identified in the
National Status and Trends Program of the National Oceanic and Atmospheric
Administration, Test Methods for Evaluating Solid Waste (U.S. EPA 1986a),
other published guidelines for sampling and analysis (such as those developed for
the Puget Sound Estuary Program), and in professional journals of analytical
chemistry.
Cleanup of Organic Samples
During cleanup, interfering substances can mask the presence of chemicals of
interest are removed from the sample extracts. Removing the interferences will
usually result in improved sensitivity of analysis, which enables lower detection
limits to be achieved. In some cases, the level of interference may be estimated
using data from previous studies. This information may help determine appropri-
ate cleanup measures. For example, since lipid (fat) content is exceptionally high
in tissue samples taken from oysters that are ready to spawn, gel permeation
chromatography, a process particularly efficient at removing lipid interferents, is
often specified as a cleanup measure.
In some cases, a particular cleanup procedure will not separate all interferents
from the chemicals of interest. However, the likelihood of also removing some
portion of the chemicals of interest during cleanup increases as additional cleanup
procedures are used. This loss, in turn, may require that more sensitive analyses
be conducted to detect the remaining amount of the chemical in the extract. In
some cases all of the chemical of interest will be lost and the analysis will have
15 A wide array of chemical and mechanical procedures, including sonication, Soxhlet® recycling
chambers, separatory funnels, continuous liquid-liquid extraction chambers, and shaker or roller devices,
can be used to expose samples to solvents and extract semivolatile organic compounds.
40
-------
been wasted. Therefore, in specifying cleanup procedures, care must be taken
to understand the potential for losing any chemicals of interest. An environmental
chemist should be consulted to determine which cleanup procedures are needed
and whether they will yield net increases in sensitivity and accuracy for the
chemicals of interest.
Following sample processing, instrument analysis can begin. Dozens of different
analytical methods are currently used by laboratory technicians. A summary of
some of the most widely used methods is presented in Appendix B of this manual.
For the analysis of organic compounds, the principal methods described in
Appendix B are gas chromatography (GC) and high-performance liquid chroma-
tography (HPLC). Atomic absorption spectrometry (AAS) and inductively
coupled plasma-atomic emission spectrometry (ICP) are the principal methods
described for metals.
SUMMARY OF CALIBRATION METHODS AND QUALITY CONTROL SAMPLES
In order to quantify any measurement accurately, analytical instruments require
calibration. In addition, for most environmental investigations, as many as 10 to
30 percent of all samples may be analyzed specifically for purposes of quality
control. In some special cases (for example, when the number of samples is
small and the need to establish the validity of analytical data is large), as many
as 50 percent of all samples are used for this purpose. These samples, called quality control samples, may be used to check the accuracy of the overall analytical system and to evaluate the performance of individual analytical instruments or the technicians that operate them.
This section of the manual (and Appendix B) summarizes the most widely used
calibration methods and the following quality control samples:
¦ Blanks
¦ Matrix spike samples
¦ Surrogate spike compounds
¦ Check standards, including:
Spiked method blanks
Laboratory control samples
Reference materials.
¦ Replicates.
41
-------
Which quality control samples will be used in analyses should be determined
during project planning. The statements of work for EPA's Contract Laboratory
Program (U.S. EPA 1990a,b) specify the types of checks to be used during
sample analysis. Determining the actual numbers of samples and how often they
must be used is also a part of this process. These specifications, called quality
control sample frequencies, are also identified in laboratory statements of work,
such as for EPA's Contract Laboratory Program, as well as the published
guidelines of the Puget Sound Estuary Program. Quality control sample frequen-
cies represent the minimum levels of effort for a project. Increasing the
frequency of quality control samples may be an appropriate measure when the
expected concentrations of chemicals are close to the detection limit, when data
on low chemical concentrations are needed, or when existing data indicate
chemical concentrations in a range where cleanup or other actions may be
required. In such cases, the need for increased precision may justify the cost of
extra quality control samples.
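Because quality control sample frequencies are expressed as one quality control analysis per fixed number of field samples, the minimum quality control burden for a project can be estimated directly during budgeting. A minimal sketch, using hypothetical sample counts:

    import math

    def qc_sample_count(n_field_samples, frequency=20):
        """Minimum number of quality control samples when the statement
        of work requires one QC analysis per 'frequency' field samples."""
        return math.ceil(n_field_samples / frequency)

    print(qc_sample_count(88))       # 5 at the standard 1-per-20 frequency
    print(qc_sample_count(88, 10))   # 9 if the frequency is doubled for
                                     # work near the detection limit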
The relative importance, rationale, and relative frequency of calibration and each kind of quality control sample are discussed in Appendix B. The following priority, rationale, and frequency of use are recommended for each procedure (a worked example of the recovery and precision calculations follows this list):
1. Calibration of analytical instruments is a top priority and is always
required for any project requiring quantitative data (even if only
estimated quantities are necessary). Calibration is essential because
it is the means by which instrument responses are properly translat-
ed into chemical concentrations. Instrument calibration is per-
formed before sample analyses are begun and is continued during
sample analyses at intervals specified in each analytical method.
2. Method blank samples are one of the highest priority checks on
quality control, because they provide an assessment of possible
laboratory contamination (and the means to correct results for such
contamination), and are used to determine the limit of detection.
As a result, method blank analyses are always required; at least
one analysis is usually performed for each group of samples that
are processed by a laboratory. In contrast, the need for other kinds
of blank samples (bottle, transport, or field equipment) is usually
project-specific and depends on the likelihood of contamination
from solvents, reagents, and instruments used in the project; the
matrix being analyzed; or the contaminants of concern.
3. Matrix spike samples are high priority checks on quality control
and should always be analyzed to indicate the bias of analytical
42
-------
measurements due to interfering substances or matrix effects16.
Duplicate matrix spike samples analyzed at a frequency of one
duplicate for every 20 samples can serve as an acceptable means of
indicating both the bias and precision of measurement for a particu-
lar sample. Duplicate matrix spike samples may provide the only
information on precision for contaminants that are rarely detected
in samples.
4. Surrogate spike compounds are high priority checks on quality
control that are used to evaluate analytical recovery of compounds
of interest from individual samples. This kind of check is only
used when the identity of the surrogate compound can be reason-
ably confirmed (e.g., by mass spectrometry). Because a surrogate
compound is chemically similar to the associated compound of
interest and is added to the sample in a known amount, its measured recovery is indicative of that of the compound of interest.
Variations in recovery that can be seen using surrogate spike
compounds with each sample will not necessarily be reflected in
duplicate matrix spike analyses conducted on only a few of the
samples. The reasons for possible differences between surrogate
spike analyses and matrix spike analyses relate to sample heteroge-
neity and how these quality control samples are prepared; the
specific reasons are discussed further in Chapter IV.
5. Check standards should be used whenever available as a medium
priority check on laboratory performance. Check standards include
laboratory control samples, reference materials prepared by an
independent testing facility, and spiked method blanks prepared by
the laboratory. By comparing the results of check standards with
those of sample-specific measurements (e.g., matrix spike samples
and surrogate compound recovery), an overall assessment of
accuracy can be obtained. The laboratory should be contacted
prior to analysis to determine what laboratory control samples can
be used. Catalogues from organizations such as the U.S. National
Institute for Standards and Technology and the National Research
Council of Canada are available that list reference materials for
different water, soil, tissue, and air samples.
16 Both interferences and matrix effects can bias chemical measurements. Interferences are unwanted
chemicals in the sample that have properties similar to those of the chemical of interest. Unless removed
by an appropriate cleanup procedure, the interferences are carried along with the chemical of interest
through the analytical procedure. Matrix effects are physical or chemical interactions between the sample
material and the chemical of interest that can bias chemical measurements in either a negative or positive
direction. Because matrix effects can vary from sample to sample and are often not well understood, they
are a major source of variability in chemical analyses.
43
-------
Reference materials provide a standardized basis for comparison
among laboratories or between different rounds of analysis at one
laboratory. Therefore, reference materials should always be used
when comparison of results with other projects is an intended data
use. At least one analysis of a reference material for every 50
samples is recommended for this purpose. Similarly, spiked
method blanks should be used as acceptable checks on laboratory
performance whenever a new procedure is used or when laborato-
ries with no established track record for a standard or nonstandard
procedure will be performing the analysis.
6. Analytical replicate samples should be included as a medium
priority check on laboratory precision whenever project budgets
permit. Analytical replicate samples better indicate the precision
of measurements on actual samples than do matrix spike duplicates
because the contaminants have been incorporated into the sample
by environmental processes rather than having been spiked in a
laboratory setting. The suggested frequency is one replicate sample
for every 20 samples analyzed. For organic analyses, analysis of matrix spike duplicate samples is sometimes a higher priority than analytical replicate samples if budgets are limited. The reason for this preference is that many organic compounds of interest may not be present in samples unless they are added as spiked com-
pounds. However, for metals analysis, analytical replicate samples
are usually preferred over duplicate matrix spike samples.
As a further check on quality control, analysis of blind replicate
samples (unknown to the laboratory) can provide an assessment of
data independent from the analysis of replicate samples prepared by
the laboratory or of duplicate matrix spikes. Use of this kind of
quality control sample is dependent on the availability of funds
after other, higher priority checks have been specified.
7. Field replicate samples should be included if measuring sampling
variability is a critical component of the study design. Otherwise,
collection of field replicate samples is discretionary and a lower
priority than the other quality control samples. When included, the
suggested frequency is at least one field replicate for every 20
samples analyzed. One of the field replicate samples should also
be split by the laboratory into analytical duplicates so that both
laboratory and laboratory-plus-sampling variability can be deter-
mined on the same sample. By obtaining both measures on the
same sample, the influence of sampling variability can be better
discerned. It is possible that analytical variability can mask
sampling variability at a location.
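As noted in items 3 and 6 above, duplicate matrix spikes and replicates yield a bias estimate (percent recovery) and a precision estimate (often expressed as the relative percent difference of a duplicate pair). The sketch below applies these standard formulas to hypothetical duplicate matrix spike results; all numbers are illustrative:

    def percent_recovery(spiked_result, background, amount_added):
        """Spike recovery: spiked-sample result minus unspiked background,
        as a percentage of the amount added."""
        return 100.0 * (spiked_result - background) / amount_added

    def relative_percent_difference(a, b):
        """Precision of a duplicate pair (duplicate matrix spikes or
        analytical replicates)."""
        return 100.0 * abs(a - b) / ((a + b) / 2.0)

    # Duplicate matrix spikes: 10 ug/kg background, 100 ug/kg spike added
    r1 = percent_recovery(95.0, 10.0, 100.0)    # 85.0 percent
    r2 = percent_recovery(78.0, 10.0, 100.0)    # 68.0 percent
    print(r1, r2, relative_percent_difference(r1, r2))   # RPD = 22.2 percent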
44
-------
SPECIFYING QUALITY CONTROL LIMITS
Prior to performing a chemical analysis, recognized limits on analytical perform-
ance must be established (see the Defining Chemical Analyses According to Data
Use section of Chapter I). These limits are established largely through the
analysis of quality control samples. Many laboratories have established limits that
are applicable to their own measurement systems. These limits should be
evaluated to ensure that they are at least as stringent as general guidelines or that
the reasons for a less stringent limit are acceptable. Also, if a laboratory has
consistently demonstrated better performance than indicated by general guidelines,
limits tied to this better performance should be used to indicate when there may
be a problem at that laboratory. For example, if surrogate recoveries for benzene
in sediment samples have consistently been between 85 and 105 percent, a
recovery of 70 percent indicates an analytical problem that should be investigated
even if the general guideline for acceptable recovery is 50 percent.
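For the benzene illustration above, such a laboratory-specific check can be automated. The following is a minimal sketch (the limit values and function name are illustrative, not requirements of any program):

    def flag_recovery(recovery, lab_low=85.0, lab_high=105.0, guideline_low=50.0):
        """Compare a surrogate recovery (percent) against laboratory-specific
        control limits that are tighter than the general guideline."""
        if recovery < guideline_low:
            return "below general guideline -- qualify or reject per QA plan"
        if not (lab_low <= recovery <= lab_high):
            return "within guideline but outside laboratory limits -- investigate"
        return "acceptable"

    print(flag_recovery(70.0))   # outside the 85-105 percent laboratory limits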
Project managers may find it advantageous to establish different kinds of limits
when working with laboratories. For example, the following two kinds of limits
are used by PSEP (1989a) and are similar to limits used in EPA's Contract
Laboratory Program.
Warning limits are values indicating that data from the analysis of quality control
samples should be qualified (for example, that they represent estimated or
questionable values) before they can be relied upon in a project. These limits
serve to warn the project staff that the analytical system, instrument, or method
may not be performing normally and that data should be qualified as "estimated"
before using the results for technical analysis. Examples of warning limits used
by the Puget Sound Estuary Program are provided in Table 6. Such limits
provide a means of ensuring that reported data are consistently qualified, an
important consideration when combining data in a regional database.
If necessary to meet project goals, project managers may specify warning limits
as more stringent contractual requirements in laboratory statements of work. For
example, Puget Sound Estuary Program guidelines for organic compound analyses
state that the warning limits for the minimum recovery of surrogate spike and
matrix spike compounds are 50 percent of the amount added prior to sample
extraction. Data that do not meet this minimum requirement would normally be
qualified as estimates. However, the project manager could apply more stringent
criteria and decide to reject data that do not meet warning limits, which would
require reanalysis of the samples associated with those quality control samples
that do not meet these limits. These more stringent criteria are termed action
limits.
Action limits are limits placed on the acceptability of data from the analysis of
quality control samples. Exceedance of action limits informs the analyst and the
45
-------
TABLE 6. EXAMPLE WARNING AND ACTION LIMITS FOR
CALIBRATION AND QUALITY CONTROL SAMPLESa

Analysis Type            Recommended Warning Limit           Recommended Action Limit

Ongoing calibration      Project manager decisionb           > ±25 percent of the average
                                                             response measured in the
                                                             initial calibration

Surrogate spikes         < 50 percent recoveryc              Follow EPA Contract
                                                             Laboratory Program guidelines

Method blanks            Exceeds the limit of detection      Exceeds the practical
                                                             quantification limit

Reference materials      95 percent confidence interval,     Project manager decision
                         if certified

Matrix spikes            50-150 percent recovery             Project manager decisiond

Spiked method blanks     50-150 relative percent             Project manager decision
(check standards)        difference

Analytical replicates    35 percent coefficient of           > ±50 percent coefficient of
                         variation                           variation (or a factor of 2
                                                             for duplicates)

Field replicates         Project manager decision            Project manager decision

a Warning and action limits used in the Puget Sound Estuary Program for the analysis of organic compounds (PSEP 1989a).
b See text for specific examples of project manager decisions for warning or action limits.
c Except when using the isotope dilution technique.
d Zero percent spike recovery requires rejection of data.
46
-------
project manager that the analytical system or instrument is performing abnormally
and needs to be corrected. Action limits should be contractually binding on
laboratories, and statements of work should provide the project manager or
designatee with sole discretion in enforcing the limits. Data obtained under these
circumstances should be corrected before they are resubmitted by the laboratory.
Data that exceed action limits are often rejected and excluded from a project
database, although there may be special circumstances that warrant acceptance of
the data as estimated values. The reasons for making such an exception should
always be documented in a quality assurance report for the data (see Chapter VI
and Appendix D).
Unlike warning limits, action limits and appropriate corrective actions (such as
instrument recalibration, elimination of sources of laboratory contamination, or
sample reanalysis) should be clearly identified in the statement of work.
Examples of regional action limits used by the Puget Sound Estuary Program are
also provided in Table 6. In those cases that require a project manager's decision
to determine the appropriate action limit, it is recommended that the associated
warning limit be used as an action limit to produce data that will have broad
applicability (including use in enforcement proceedings). Action limits should be
enforced with discretion because some environmental samples are inherently
difficult to analyze. Recommended actions under different circumstances are
provided in the Assessing Data Quality section of Chapter VI.
47
-------
V. WORKING WITH AN ANALYTICAL
LABORATORY
Data quality requirements and analytical methods need to be clearly and concisely
communicated to the laboratory selected by the project manager. These
specifications are best contained in a written laboratory contract. The main body
of the contract should consist of general terms and conditions common to any
legal contract. A contract addendum, called a statement of work, should also be
drafted and negotiated with the laboratory prior to the start of any analyses. This
statement of work should define all requirements for acceptable analyses, an
important consideration even when working with a familiar laboratory, and all
pertinent information on the price, timing, and necessary documentation of the
analyses. Examples of a fill-in-the-blank statement of work for any analysis and
a detailed statement of work written specifically for organic sample analyses are
provided in Appendix C.
A statement of work can also form the basis from which laboratories can submit
competitive bids for upcoming analyses. However, laboratories can usually
formulate and submit bids based on a verbal explanation of the work that will be
required. Furthermore, the evaluation of competitive bids may only be part of
the laboratory selection process. Other factors such as the laboratory setup
(including quality and capacity of the available analytical equipment), past
experience, or an upfront demonstration of performance may also influence the
project manager's selection. The need to conduct a comprehensive evaluation of
candidate laboratories will vary with the project and the familiarity with available
laboratories. In any case, final details of the analyses that may need to be
negotiated are written into a final statement of work and the contract is signed
after one laboratory has been selected for a particular analysis.
Common factors used in selecting qualified laboratories are described in the next
section, followed by a discussion of the "who, what, when, where, how, and
why" details of a laboratory statement of work.
SELECTING A QUALIFIED LABORATORY
Selecting an analytical laboratory is no different from selecting the services of any
other group of professionals. First a list is drawn up of laboratories that are
capable of performing the required analytical tasks. The list is then narrowed to
a few likely candidates, and the qualifications of these candidates are examined
closely. The candidate laboratories are asked to provide cost estimates for each
48
-------
individual analytical method (and associated quality control samples) identified in
the draft statement of work, and these costs are compared.
Lists of qualified laboratories are often available from state or federal agencies.
The competence of these laboratories to perform certain tasks may be established
by a formal accreditation process. In the Washington State Laboratory Accredita-
tion Program, for example, the Department of Ecology Environmental Investiga-
tions and Laboratory Services Program is responsible for accrediting laboratories
for the purpose of conducting analyses to make water quality determinations other
than for drinking water. These laboratories are required to submit applications
and quality assurance manuals to the Department of Ecology. Laboratories must
then successfully analyze required performance evaluation samples, pass an onsite
audit conducted by departmental staff, and pay an accreditation fee. The
Washington Department of Health is responsible for certifying drinking water
laboratories in Washington under EPA's national drinking water certification
program. In the Pacific Northwest, this program is supervised by EPA Re-
gion 10, Office of Quality Assurance.
On a national level, laboratories are evaluated (but not formally accredited) under
EPA's Contract Laboratory Program. Contract Laboratory Program status is
usually a reliable indicator of competence for analyses that do not require highly
specialized procedures or methods. The current status of a laboratory in this
program can be determined by contacting EPA's Sample Management Office in
Alexandria, Virginia.
Accreditation in Washington is currently limited to those laboratories conducting
tests that pertain to drinking water and other water-related determinations (for
example, effluent testing for permit compliance). Eventually all state and federal
monitoring and enforcement programs in the United States that submit analytical
samples for any medium may be required to use laboratories accredited through
similar programs. Expansion of the accreditation process will depend primarily
on the availability of staff and funding.
A list of laboratories can also be compiled from referrals by other agencies or
individuals conducting similar projects. When gathering information for such a
list, the previous experiences of others may help determine how reputable the past
work of a laboratory has been. Directories maintained by the American Council
of Independent Laboratories and the American Society for Testing and Materials
may also provide valuable leads (e.g., ASTM 1987).
Comparing the relative strengths and weaknesses of individual laboratories
requires the development of criteria to define minimum acceptable performance
standards. The generic guidelines offered below can be helpful when evaluating
potential contract laboratories, particularly when several analytical procedures will
be performed by the same laboratory. For smaller projects (for example, those involving a small number of samples and a single analysis), project managers may
49
-------
choose to evaluate laboratories simply on the recommendation and past experience
of other agencies or organizations. Information on each of the following factors
is compiled through interviews and review of brochures and quality assurance
plans produced by the laboratory. Application of common sense is one of the
best means of reviewing this information; however, the project manager should
request additional explanations from the laboratory, request a demonstration of
ability, or consult a chemist if there are doubts concerning a laboratory's
qualifications.
Personnel and organization provide insight into a laboratory's ability to assure
data quality. Duties and responsibilities of management and staff need to be well-
defined, adequate supervision must be provided, and a system to constantly
inspect and evaluate work products should be in place. A systematic plan for
critical review and self-appraisal of laboratory functions should also be in effect.
Technical competence and experience of all laboratory staff should be demonstrat-
ed. Staff qualifications should be documented, and training should be provided
by the laboratory to encourage staff to attain the highest levels of technical
competence. Staff turnover can affect the ability of a laboratory to perform a
particular analysis. The experience of current staff with projects of similar scope
should be assessed. Upfront analysis of appropriate quality control check samples
is one way of assessing technical competence in the absence of pertinent experi-
ence.
Analytical equipment should be adequate in kind, quantity, and quality to satisfy
analytical requirements. Equipment should be properly housed and maintained
in a ready or standby condition for use in the services offered. Backup instru-
mentation should also be available in case of failure.
Calibration and analytical standards that are adequate for the accuracy require-
ments of the services offered should be available. The laboratory should be able
to explain how these standards are properly stored and replenished so that their
integrity is maintained. Standards should be kept in a separate refrigerator from
samples to prevent cross-contamination and replenished at frequencies that are
often specified in the analytical method. A complete record of the preparation
and storage of standards should be available for inspection.
Commonly accepted test methods and procedures should be used for routine
analyses. These methods should be on hand as standard operating procedures.
On questioning, personnel should be knowledgeable about and familiar with these
procedures. Methods should be supported by control charts or other evidence
of proficiency. For nonstandard or new procedures, the laboratory should
provide a complete description of the analytical steps and testing requirements.
50
-------
Facilities should be adequate with respect to space, environmental control, worker
safety, and maintenance. Good housekeeping practices should be readily
apparent.
Management systems for sample handling and storage that are consistent with the
requirements of the most critical samples analyzed should be in place.
Records should be well maintained, and a records management system should be
in place (including chain-of-custody records, if required). The ease with which
information can be retrieved provides important insight into the adequacy of
records management. All records, including instrument printouts, laboratory
work sheets, analyses of standards, and sample preparation logs, should be
retained by the laboratory.
Test reports should be adequate in content and clarity. If EPA Contract Labora-
tory Program data packages are required, the laboratory should be able to
generate the specific forms required by this program. A system for review and
release of reports should be operational.
A laboratory QA project plan should be in place and should address all of the
requirements mentioned in this section. This plan should be requested and used
as a selection criterion.
A FINAL TEST OF THE LABORATORY

Once the list of laboratories is narrowed to a few likely candidates, candidate laboratories may be requested (or in some cases may offer) to perform an analysis or submit recent results of test analyses of one or more samples. For example, an advance analysis of a reference material is recommended for comparison with independent data from previous tests. This comparison provides the best way to objectively assess the bias of a laboratory's testing procedure. Advance testing should always be seriously considered if a laboratory is not well experienced with an analysis or if there are no recent results of similar analyses that can be offered for inspection.
Cost is the final criterion to be considered in evaluating potential contract
laboratories. Competitive bids should be requested that are based on detailed
verbal or written statements of work for the laboratory. Only in this way can the
most cost-effective laboratory be identified. Cost-effectiveness is determined not
only by price but also by demonstrated ability and service.
When comparing costs, it is important to consider the amount of quality assurance
included in the quoted price for an analysis. For example, a laboratory may or
may not include the cost of duplicates, matrix spikes, and other laboratory-
generated quality control samples in the routine price per sample. The project
51
-------
manager should always ask the laboratory to document any separate charges for
quality control samples.
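To keep quotes comparable on this point, separately billed quality control analyses can be folded into a whole-project cost before laboratories are ranked. A minimal sketch with hypothetical prices:

    import math

    def total_cost(n_samples, unit_price, qc_billed_separately=False,
                   qc_frequency=20, qc_set_price=0.0):
        """Whole-project cost, adding separately billed QC sets (one per
        'qc_frequency' field samples) to the routine per-sample price."""
        cost = n_samples * unit_price
        if qc_billed_separately:
            cost += math.ceil(n_samples / qc_frequency) * qc_set_price
        return cost

    # Lab A: $120/sample with QC included.  Lab B: $100/sample plus a
    # $180 QC set (duplicate and matrix spike) per batch of 20 samples.
    print(total_cost(40, 120))                                               # 4800
    print(total_cost(40, 100, qc_billed_separately=True, qc_set_price=180))  # 4360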
Cost can also be influenced by the degree of computerization used by a labora-
tory. Having data calculated and transferred electronically may increase the
initial cost of analyses, but may more than make up for this initial expense by
saving time and effort during assessment of the data. Cost savings are especially
likely when a laboratory's computer system for filing and retrieving data is
compatible with the computer system of the organization or individual requesting
the analyses. Many laboratories are moving toward making electronic data
transfers more routine, and the ability to provide data in a usable, computerized
format is likely to become a routine consideration when selecting a laboratory to
perform chemical analyses.
THE ADVANTAGE OF COMPUTER-READABLE DATA

Many laboratories are equipped to submit data in computer-readable form. By providing data on diskette, errors that may have been created by laboratory technicians when compiling and transcribing data by hand can be virtually eliminated. The extent to which errors can be reduced depends on the method used by the laboratory to transcribe the data into computer-readable form.

In some cases, computer-readable data are collected using the Laboratory Information Management System (LIMS) or similar software systems. With this system, data are taken directly from the instruments used to conduct the analysis. The data are then recorded and later copied onto computer diskettes. In this way, potential errors resulting from transcription of data are effectively eliminated.

In other cases, data are entered into a computer by hand from laboratory printouts. The data are then copied onto computer diskettes and submitted to the requesting organization along with the data package. Because errors can still occur during transcription of the printout into a computer-readable form, project managers may find it just as effective to request printouts of data, rather than spend additional time cross-checking data on diskettes for accuracy.

Computer-readable data should be included as diskettes in the same shipment as the hard copy data, but packaged and shipped in a diskette mailer to prevent damage. Data should be recorded in ASCII text file format. Data should also adhere to file, record, and field specifications listed in the laboratory's statement of work.

Occasionally a contract laboratory may wish to use spreadsheets or other forms of computer-readable reports not specified in the statement of work. Data qualifier codes must be adequately identified and the completeness of the format for reporting data must be assured before any spreadsheet program is accepted.
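As an illustration of how a computer-readable deliverable might be screened on receipt, the sketch below reads a comma-delimited ASCII results file and flags unrecognized qualifier codes. The record layout and the set of allowed qualifiers are hypothetical; the actual file, record, and field specifications come from the statement of work:

    import csv

    def read_results(path, allowed_qualifiers=frozenset({"", "U", "J", "B"})):
        """Read records of the form: sample ID, analyte, concentration,
        units, qualifier code.  Return the parsed rows plus a list of
        rows whose qualifier codes are not defined in the statement of work."""
        rows, problems = [], []
        with open(path, newline="") as f:
            for sample_id, analyte, conc, units, qual in csv.reader(f):
                if qual not in allowed_qualifiers:
                    problems.append((sample_id, analyte, qual))
                rows.append((sample_id, analyte, float(conc), units, qual))
        return rows, problems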
52
-------
DRAFTING A STATEMENT OF WORK
The statement of work is part of a legally binding agreement between the
organization requesting an analysis and the laboratory performing the actual
analytical tasks. Like any legal document, the statement of work should be
written in clear and concise terms, providing sufficient detail about each required
procedure or method to eliminate any confusion over steps in the analysis. All
available information on the range of concentrations expected and any special
characteristics of the samples to be analyzed should be contained in the statement
of work. This information will allow the laboratory technicians to apply their
expertise obtained through past analyses of samples with similar characteristics.
The statement of work for analyses should include the following specific
information (see Appendix C for an example of a completed statement of work):
¦ A summary of analyses to be performed, including:
A list of all variables to be analyzed for in each sample
or group of samples, as appropriate
The total number of samples provided for analysis and
the associated laboratory quality control samples, the per
analysis price, and the total cost of the analytical service
requested for each sample matrix.
¦ Acceptable procedures for sample delivery and storage, including:
The method of delivery, weekly schedule of deliveries,
and person responsible for notifying the laboratory of any
changes in the schedule
Requirements for physical storage of samples, holding
times (consistent with those specified in the QA project
plan), chain-of-custody, and sample log book procedures.
¦ A list of products to be delivered by the laboratory, specifying the
maximum time that may elapse between the submittal of samples
to the laboratory and the delivery of data reports to the agency,
organization, or industry requesting the analyses. Penalties for late
delivery (and any incentives for early delivery) should be specified,
as should any special requirements for supporting documentation
and electronic data files. A checklist of the laboratory deliverables
for analyses of organic compounds, pesticides, and PCBs is pre-
sented in Table 7. A checklist of laboratory deliverables for
analyses of metals is presented in Table 8.
53
-------
TABLE 7. CHECKLIST OF LABORATORY DELIVERABLES FOR
THE ANALYSIS OF ORGANIC COMPOUNDS

□ A cover letter discussing analytical problems (if any) and referencing or describing the procedures and instrumentation used.

□ Tabulated results, including final dilution volume of sample extracts, sample size, wet-to-dry ratios for solid samples (if requested), and concentrations of compounds of interest (reported in units identified to two significant figures unless otherwise justified). Concentration units should be µg/kg (dry weight) for solids, µg/kg (wet weight) for tissue, and µg/L for water. These results should be checked for accuracy and the report signed by the laboratory manager or designatee.

□ Tabulated instrument detection limits and limits of detection achieved for the samples.

□ Original data quantification reports for each sample.

□ Method blanks associated with each sample, quantifying all compounds of interest identified in these blanks.

□ A calibration data summary reporting the calibration range used. For the analysis of semivolatile organic compounds, this summary should include spectra and quantification reports for decafluorotriphenylphosphine (DFTPP) or an appropriate substitute standard. For volatile organic compounds, the summary should include spectra and quantification reports for bromofluorobenzene (BFB) or an appropriate substitute standard.

□ Recovery assessments and replicate sample summaries. Laboratories should report all surrogate spike recovery data for each sample, and a statement of the range of recoveries should be included in reports using these data.

□ All data qualification codes assigned by the laboratory, their description, and explanations for all departures from the analytical protocols.
54
-------
TABLE 7. (Continued)

Additional Deliverables for Volatile or Semivolatile Organic Compound Analysesa

□ Tentatively identified compounds (if requested) and methods of quantification, along with the three library spectra that best match the spectra of the compound of interest (see Appendix B, Figure B-1 for an example of a library spectrum).

□ Reconstructed ion chromatograms for gas chromatography/mass spectrometry (GC/MS) analyses for each sample.

□ Mass spectra of detected compounds for each sample.

□ Internal standard area summary to show whether internal standard areas were stable.

□ Gel permeation chromatography (GPC) chromatograms (for analyses of semivolatile compounds, if performed), recovery assessments, and replicate sample summaries. Laboratories should report all surrogate spike recovery data for each sample, and a statement of the range of recoveries should be included in reports using these data.

Additional Deliverables for Pesticide and PCB Analysesa

□ Gas chromatography/electron capture detection (GC/ECD) chromatograms for the quantification column and confirmation columns for each sample and for all standards analyzed.

□ GPC chromatograms (if GPC was performed).

□ An evaluation summary for 4,4'-DDT/endrin breakdown.

□ A pesticide standard evaluation to summarize retention time shifts of internal standards or surrogate spike compounds.

a Many of the terms in this table are discussed more completely in Appendix B.
55
-------
TABLE 8. CHECKLIST OF LABORATORY DELIVERABLES FOR
THE ANALYSIS OF METALS

□ A cover letter discussing analytical problems (if any) and referencing or describing the digestion procedures and instrumentation used.

□ Tabulated results for final dilution volumes of sample digestates, sample size, wet-to-dry ratios for solid samples (if requested), and concentrations of metals (reported in units identified to two significant figures unless otherwise justified). Concentration units should be mg/kg (dry weight) for sediment, mg/kg (wet weight) for tissue, and µg/L for water. These results should be checked for accuracy and the report signed by the laboratory manager or designatee.

□ Tabulated instrument detection limits and limits of detection achieved for the samples.

□ Method blanks for each batch of samples.

□ Results for all the quality control checks and calibration control checks conducted by the laboratory.

□ All data qualification codes assigned by the laboratory, their description, and explanations for all departures from the accepted analytical protocols.
56
-------
¦ Methods to be followed for processing and analyzing samples,
including any modifications to standard procedures.
¦ Quality assurance and quality control requirements, including the
data quality objectives specified in the QA project plan and appro-
priate warning and action limits.
¦ Requirements that each laboratory submit a quality assurance
manual for review and approval by the agency, organization, or
industry requesting or funding the analysis. Each manual should
contain a description of the laboratory organization and personnel,
facilities and equipment, analytical methods, and procedures for
sample custody, quality control, and data handling. This require-
ment may not be necessary when the laboratory is large and well
known or, perhaps, when EPA has recently conducted an inspec-
tion.
¦ Progress notices (usually necessary only for large projects).
¦ Circumstances under which the laboratory must notify project
personnel of problems, including, for example, when action limits
or other performance criteria cannot be met; instrument malfunc-
tions are suspected; or holding time limits have expired or will shortly expire.
¦ Notice that scheduled and unannounced laboratory visits by the
project manager or representative may be conducted.
¦ Required storage time for records and samples prior to disposal.
¦ Terms for payments to the laboratory, including a requirement that
the quality of data must be acceptable (pending the outcome of the
quality assurance review) before payment is made.
Including these elements in the statement of work helps to assure that responsibili-
ties, data requirements, and expectations for performance are clear. Be sure to
provide a copy of the statement of work to the individual performing the data
assessment to assist in the evaluation of data returned by the laboratory.
57
-------
A NOTE OF CAUTION ON ASSIGNING LABORATORIES

It is strongly recommended that one laboratory be assigned all samples for each kind of chemical analysis to be conducted. For example, a project may require analysis of 80 samples for volatile organic compounds and PCBs. One laboratory should conduct all analyses for volatile organic compounds, and either the same laboratory or a second laboratory should conduct all of the PCB analyses. This strategy may sometimes create a schedule problem because the capacity of a single laboratory may be insufficient for completing large numbers of a particular analysis quickly. However, the rule of "one analysis - one laboratory" should be followed even if it seems feasible to "split" samples between different laboratories to shorten the total analysis time.

The problems of trying to combine, for example, three sets of PCB results from three laboratories in a project can greatly confound the interpretation of environmental trends. Results generated by different laboratories for different projects can be and often are compared. However, the variability and potential for bias among laboratories is usually much greater than that found within a single laboratory over the time period needed to complete a series of analyses. If a manager absolutely cannot avoid splitting samples, then the careful use of standard reference materials to assess differences in bias becomes essential, as does the use of all other quality control samples discussed in Chapter IV. The potential expense of a rigorous quality assurance program to ensure within-project data comparability usually outweighs the time savings.

In contrast, a project manager should assign each kind of analysis to whichever laboratory is best capable of performing the analyses. The major disadvantage of working, for example, with one laboratory for PCBs and another for arsenic is an increase in coordination costs. However, it is difficult for one laboratory to be the best performer of every kind of analysis that may be required, although many laboratories are strong performers in several areas. The project manager should learn the strengths and weaknesses of each laboratory before committing analytical dollars.
58
-------
VI. EVALUATING DATA FROM THE LABORATORY
Once the laboratory has completed the requested sample analyses, the analysis
results are compiled, printed out, and submitted as a data package. This package
may include computer disks, magnetic tape, or other forms of electronically
stored information. Data packages may range in size from a few pages to several
cartons of documents, depending on the nature and extent of the analyses
performed. The cost of this documentation can vary from no charge (in cases
where only the final results of an analysis are reported) to approximately $200
over the cost of reporting only the final results of an analysis. The laboratory
statement of work should include reporting requirements to ensure that the
necessary information is provided. In this chapter, guidance is provided for
evaluating data packages according to the following steps:
Step 1. Checking data completeness
Step 2. Selecting an appropriate level of data validation
Step 3. Assessing data quality
Step 4. Assigning data qualifiers and taking final actions
Step 5. Wrapping up the project.
As soon as the data package is received from the laboratory, it should be checked
for completeness and data usability and, ideally, dated and duplicated. Dating is
important for establishing the laboratory's adherence to schedules identified in the
statement of work. Duplication assures that a clean reference copy is always kept
on file. Checking each element of the data package for completeness of informa-
tion, precision of analytical methods, and bias of all measurements helps to
determine whether acceptable data from each type of analysis have been supplied
by the laboratory.
Assessing data quality requires knowledge of the sample holding times and
conditions, the types of analyses requested, and the form in which data were to
be delivered by the laboratory. Review of the statement of work is essential to
determine any special conditions or requests that may have been stated at the
onset of the analyses. The initial evaluation of data described in the following
sections can be performed by the project manager or other appropriate staff.
Guidance on when additional assistance from experienced QA reviewers may be
needed is also provided.
59
-------
STEP 1: CHECKING DATA COMPLETENESS
The first step in data assessment is to verify that all information requested in the
statement of work has been provided in the data package. This check can be
performed by comparing the data summaries and other information contained in
the data package against a master checklist of the expected deliverables for the
project. Recommended lists of laboratory deliverables for different types of
analyses are provided in Tables 7 and 8 in Chapter V, Working with an Analytical
Laboratory. This check is a quick and easy way for the project manager to
determine the initial status of all deliverables for a project. The following
example addresses common actions that can be taken to resolve concerns over data completeness (a minimal sketch of the checklist comparison follows the example):
¦ Missing or Incomplete Information—Has the laboratory omitted
part of the information requested in the laboratory statement of
work? For example, a statement of work may have specified a list
of 25 contaminants for analysis, but the laboratory has only
reported data for 20 contaminants. Or the laboratory statement of
work may have specified that matrix spikes and matrix spike
duplicates were to be analyzed for each batch of 20 samples, but
data from the laboratory indicated that in a batch of 22 samples,
only one matrix spike/matrix spike duplicate pair was analyzed.
The following series of actions are recommended:
Contact the laboratory and request all missing informa-
tion as soon as possible after receipt of the data package.
Waiting increases the chance that information may be
misplaced or forgotten and (if holding times have been
exceeded) can sometimes limit options for reanalysis.
If the information does not exist because analyses were
not completed, request that the laboratory provide and
implement a plan to correct the deficiency. This plan
may include submittal of a revised data package and
possible reanalysis of the sample batch. Provisions for
this kind of corrective action can also be identified in the
statement of work.
If the laboratory does not have an acceptable plan of
action, consult a chemist to determine if the existing data
can be acceptably interpreted without the missing infor-
mation or, if the missing data are critical, how the data
set should be qualified (including a determination of
whether the data can be used at all). Decide whether to
withhold payment to the laboratory in accordance with
provisions of the statement of work.
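The checklist comparison described above reduces to a simple set difference. A minimal sketch (the deliverable names are placeholders for the project's actual master checklist, drawn from Tables 7 and 8 and the statement of work):

    MASTER_CHECKLIST = {
        "cover letter", "tabulated results", "detection limits",
        "method blanks", "matrix spike/matrix spike duplicate data",
        "surrogate recovery data", "data qualifier code definitions",
    }

    def completeness_check(delivered):
        """Return the deliverables still missing from the data package."""
        return sorted(MASTER_CHECKLIST - set(delivered))

    received = {"cover letter", "tabulated results", "method blanks",
                "detection limits", "surrogate recovery data"}
    print(completeness_check(received))
    # ['data qualifier code definitions',
    #  'matrix spike/matrix spike duplicate data']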
60
-------
After the data package has been checked against the master list of deliverables,
the completeness of the package should be formally documented. Documentation
can be in the form of a detailed memo to the project file outlining the concerns
with data omissions, analysis problems, or descriptions of questionable data
identified by the laboratory.
STEP 2: SELECTING AN APPROPRIATE LEVEL OF DATA VALIDATION
Data validation, or the process of assessing data quality, can begin after determin-
ing that the data package is complete. Analytical laboratories strive to produce
data that conform to the requested statement of work, and they typically perform
internal checks to assure that the data meet a standard level of quality. However,
data validation is an independent check on laboratory performance and is intended
to assure that the quality of reported data meets the needs identified in the QA project
plan.
The first major part of validation involves the checking of data for any possible
errors resulting from transcription of tabulated results, misidentification, or
miscalculation of data. This part is largely a mechanical process, a form of
proofreading. As in proofreading, the data must be carefully checked, piece by
piece, before it can be stated with confidence that the entire data package is
100-percent free of transcription and calculation errors. However, because a
100-percent check is not always convenient or cost-effective, project managers
may have to determine whether a reduced level of effort in checking is appro-
priate.
The second major part of validation involves comparing the data against estab-
lished criteria for acceptable performance. This comparison can be performed for
all aspects of the analysis, including, for example, how well the analytical
instrument was set up and calibrated for quantitative measurements. In some
cases, an assessment of instrument performance or other detailed checks may not
be required. For example, the comparison may be limited to an assessment of
method blanks and the bias and precision of sample measurements.
The project manager should select an appropriate level of data validation for the
intended data use. Examples of four alternative levels of data validation effort
are summarized in Table 9. These four data validation levels are described
further in the following sections and range from complete, 100-percent review of
the data package to acceptance of the data package without any evaluation.
61
-------
TABLE 9. LEVELS OF DATA VALIDATION
Level 1 100 percent of the data (including data for laboratory quality control samples) are independently validated using the data quality objectives established for the projecta. Calculations and the possibility of transcription errors are checked. Instrument performance and original data for the analytical standardsb used to calibrate the method are evaluated to ensure that the values reported for detection limits and data values are appropriate. The bias and precision of the data are calculated and a summary of corrections and data quality is preparedc.
Level 2 20 percent of the sample data and 100 percent of the laboratory
quality control samples are validated. Except for the lower level
of effort in checking data for samples, the same checks conducted
in Level 1 are performed. If transcription errors or other concerns
(e.g., correct identification of chemicals in the samples) are found
in the initial check on field samples, then data for an additional 10-
20 percent of the samples should be reviewed. If numerous errors
are found, then the entire data package should be reviewed.
Level 3 Only the summary results of the laboratory analyses are evaluated.
The data values are assumed to be correctly reported by the
laboratory. Data quality is assessed by comparing summary data
reported by the laboratory for blanks, bias, precision, and detection
limits with data quality objectives in the QA project plan. No
checks on the calibration of the method are performed, other than
comparing the laboratory's summary of calibrations with limits
specified in the QA project plan.
Level 4 No additional validation of the data is performed. The internal
reviews performed by the laboratory are judged adequate for the
project.
a See Chapter I (Defining Analytical Objectives) and Chapter II (Planning for Quality Assurance) for more information on formulating and implementing data quality objectives.

b See Chapter IV (Choosing Analytical Methods and Quality Control Checks) for more information on these quality control checks.

c Checks that can be easily performed by the project manager are provided in this manual. Step-by-step procedures used by quality assurance specialists to validate data for analyses of organic compounds and metals can be found in EPA's functional guidelines for data review (U.S. EPA 1988c,d). These guidelines were developed for analyses conducted according to the statements of work for EPA's Contract Laboratory Program and are updated periodically. Regional interpretation of these detailed procedures is also contained in Data Validation Guidance Manual for Selected Sediment Variables (PTI 1989b), a draft report released by the Washington Department of Ecology's Sediment Management Unit in June 1989. A simplified version of this guidance is provided in Data Quality Evaluation for Proposed Dredged Material Disposal Projects (PTI 1989a), another report released by the Sediment Management Unit in June 1989.
62
-------
Level 1 Validation
Level 1 is validation of 100 percent of the data, including verifying that all
calibrations, checks on quality control, and intermediate calculations have been
properly performed for all samples. This level of validation is typically required
for projects involving enforcement actions. Level 1 validation may also be
required, for example, when assessing the risks posed by contaminants to public
health at a controversial site, when using new analytical techniques or laborator-
ies, or when previous results have been questioned.
Level 2 Validation
Level 2 is a check of only those data that pertain directly to certain critical
elements of a study or that constitute a representative subsample of the total data
set. For example, in routine monitoring of a well-characterized site, a project
manager may decide to evaluate only the data for quality control check samples
and high and low data values. In performing a reconnaissance of a large area of
potential concern, the project manager may decide to evaluate the data for quality
control check samples produced by the laboratory and a random 20 percent of the
field data. An additional 10-20 percent of the data should be checked if any
errors are discovered in this first batch of figures. In either example, if numer-
ous errors are found, the entire data package should be reviewed in detail.
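The Level 2 subsample-with-escalation scheme can be made reproducible by drawing the initial 20 percent check set and the contingency set up front. A minimal sketch, with an illustrative escalation fraction:

    import random

    def level2_subsample(sample_ids, first_pass=0.20, escalation=0.15, seed=0):
        """Randomly select an initial check set and an escalation set to be
        reviewed only if the first pass uncovers errors."""
        ids = list(sample_ids)
        random.Random(seed).shuffle(ids)
        n1 = max(1, round(first_pass * len(ids)))
        n2 = max(1, round(escalation * len(ids)))
        return ids[:n1], ids[n1:n1 + n2]

    first, contingency = level2_subsample(range(1, 51))
    print(first)         # 10 sample IDs checked first
    print(contingency)   # 8 more, reviewed only if errors are found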
Level 3 Validation
Level 3 is a cursory review of only the summary results. In Level 3, quality
control checks such as precision and accuracy of the data are evaluated, but no
check of the supporting laboratory information is performed to validate the final
data values. This level of effort may be appropriate when the data are not
expected to be used outside of the current project and do not form the basis for
critical decisions on expenditure of funds. In any case, the results of quality-
control samples should be reported with the field data so that others can make
their own estimation of the data quality.17
Level 4 Validation
Level 4 is acceptance of the data package without conducting an independent
review of the data quality. This level may be appropriate for noncritical projects
17 A discussion of limitations that may be placed on historical data that do not have complete
documentation of data quality is contained in the Assessing Historical Results section of Chapter I.
63
-------
when the project manager is already confident that the laboratory results are of
known quality. Confidence may be based on the laboratory's internal quality
assurance program or recent past experience with the same laboratory (and
personnel) analyzing the same kinds of samples without problems. As with
Level 3, all results of the laboratory quality control samples should be reported
with the field data. Both the results and conclusions sections of any technical
report using the data should note that the results were accepted without further
validation and should provide a brief explanation of the reasons why.
Review Options
The project manager should also decide who will perform the evaluations called
for in Levels 1, 2, or 3. The following options should be considered:
¦ Perform a brief assessment as described in the following Step 3 and
rely on specialists to resolve outstanding concerns. This assess-
ment is equivalent to Level 3 (Table 9).
¦ Perform a complete review for Level 1 or 2 using qualified staff
and technical guidelines for quality assurance specialists (see
Footnote c in Table 9).
¦ Send the data package to an outside technical specialist for review,
specifying either Level 1, 2, or 3.
STEP 3: ASSESSING DATA QUALITY
What constitutes unacceptable data after data completeness has been checked
(Step 1) and an appropriate level of data validation has been selected (Step 2)?
By conducting a preliminary evaluation of the data package, the project manager
can make a brief assessment of the data quality. This assessment does not
include a detailed validation of the supporting technical data, which is time-
consuming and requires specialized training (see Footnote c in Table 9).
However, after this preliminary assessment of data quality, the project manager
can determine whether to accept, reject, or conditionally accept the data package
that has been received from the laboratory. Outright acceptance or rejection of
results is possible, based on adherence to or clear exceedance of established
limits. The conditional acceptance of a data package typically includes qualifying
certain results, which may require the assistance of an expert to resolve major
concerns. These three management actions are presented schematically in
Figure 2.
Figure 2. Guidance for data assessment and evaluation of data quality.
[Flowchart: the information source, analytical data and supporting
documentation, is screened against six evaluation criteria: information
complete, calibrations acceptable, blanks acceptable, bias acceptable,
precision acceptable, and detection limits acceptable. Results within limits
lead to the management action "accept data for use"; results marginally
outside limits lead to "accept data with appropriate qualifications," with an
expert consulted as needed; results severely outside limits lead to "reject
data (and consider reanalysis)."]
The evaluation criteria in Figure 2 provide several signs that should alert the
manager to potential problems with data acceptability, which in turn will
influence the selection of the appropriate management action. These signs
include:
¦ Exceedance of Quality Control Limits—Do data for the calibration
of analytical instruments or for blanks, bias, or precision exceed
warning or action limits specified in the laboratory statement of
work? Four examples of common problems follow (a screening sketch
based on them appears after this list):
(Example A) The response measured for the benzoic acid standard
in the second-to-last continuing calibration differs by more than
+50 percent from the average response in the initial calibration.
This difference greatly exceeds the criterion of ±25 percent for
acceptable changes in response, which was specified as an action
limit.
(Example B) The concentration of benzene reported in a method
blank is 25 ppb. This concentration exceeds the 20 ppb limit of
detection that was specified as the warning limit for significant
laboratory contamination by benzene.
(Example C) For another set of samples, one of the associated
duplicate matrix spike analyses for dichlorobenzene shows a very
low recovery of 9 percent. This low recovery violates the action
limit of 25 percent minimum recovery specified in the statement of
work. The other duplicate matrix spike analysis has a marginal
recovery of 28 percent for dichlorobenzene, which falls below the
warning limit of 50 percent recovery.
(Example D) In duplicate analyses, the precision of arsenic
measurements is ±21 percent. This variability slightly exceeds the
warning limit of ±20 percent specified in the statement of work.
To address each of these different problems of limit violations, the
following steps are recommended:
Check the laboratory's cover letter to determine if the
problems were satisfactorily addressed.
If only warning limits were exceeded (see Example B and
Example D), then it is appropriate for the laboratory to
report the results. Nonetheless, all data values associated
with the quality control samples should be qualified as
part of the QA review of the results. Specific actions
associated with warning limits should be specified in the
QA project plan; if in doubt about their applicability,
check with a qualified chemist.
Therefore, in Example B, all benzene values that are less
than 25 ppb and are associated with the contaminated
method blank may be considered as "not detected"
because it is uncertain whether the reported values are real or
are the result of laboratory contamination. In Exam-
ple D, all arsenic values may be considered estimates if
the criterion for acceptable precision were to be strictly
applied. However, no action may be recommended
because the ±21 percent precision only slightly exceeds
the ±20 percent criterion and no other problems were
noted with the arsenic analyses.
If action limits are exceeded without satisfactory expla-
nation, the data set should be considered deficient and a
request for further information should be made. The
laboratory should provide documentation explaining why
the problem occurred and presenting a plan for corrective
action.
In Example A, the +50 percent difference in the calibra-
tion response indicates that the sensitivity of the instru-
ment was substantially less in the continuing calibration
analysis than when the initial calibration analysis was
performed. Therefore, all benzoic acid results may be
rejected as unreliable because the available quality assur-
ance data for instrument calibration are well outside of
acceptable limits and the laboratory failed to redo the
initial calibration.
In Example C, reanalysis of the one duplicate matrix
spike sample for dichlorobenzene should have been per-
formed because of a similar exceedance of an action
limit. However, only one of the duplicate analyses
exceeded the action limit. The duplicate analysis could
be considered a kind of reanalysis. Because neither of
the matrix spike analyses met the warning limit, the
associated dichlorobenzene results for samples may be
considered minimum estimates. However, in some EPA
programs, no actions are taken solely on the basis of a
problem with matrix spike analyses. This course of no
action may be recommended if there are other indepen-
dent quality control data that show no problems (for
example, surrogate spike recoveries for the samples may
be well within limits).
Further actions to address data that exceed specified
limits should be explicitly defined in the laboratory
statement of work. Often data that exceed action limits
are rejected. The project manager could take this action
without further checking. However, an expert should be
consulted to determine if there are special circumstances
that would enable these data to be accepted with appro-
priate qualification.
¦ Exceedance of Multiple Limits—Has more than one kind of quality
control limit been exceeded for a sample? For example, is there
method blank contamination plus low recovery of a matrix spike
plus poor precision? Concern over data quality obviously increases
when several kinds of limits have been exceeded for a particular
chemical, especially if the exceedances are large. In the example
above, even with small exceedances the uncertainty associated with
possible laboratory contamination, evidence of analytical losses of
the chemical, and variable analyses would at least result in all
associated data being qualified as estimated values.
Large exceedances of several kinds of limits often result in rejec-
tion of the data because there is ample evidence that the analyses
were out of control and unreliable. Furthermore, violation of
several limits for the same chemical may easily justify withholding
payment for all or part of the analyses. Under these circum-
stances, the laboratory should correct the deficiencies and reanalyze
the samples (using spare sample material that was collected by the
project manager in anticipation of such a need).
¦ High Detection Limits—Are appropriate detection limits reported
for undetected chemicals in a sample? For example, the QA
project plan and statement of work may have specified a required
detection limit of 20 ppb for phenol, an organic compound. What
if phenol was undetected in all samples analyzed? If the laboratory
reported a detection limit of 200 ppb for phenol in one sample, but
still reported phenol as undetected in most other samples at a
detection limit of 20 ppb, is this a problem? What if the laboratory
instead reported phenol as undetected in all samples at a detection
limit of 200 ppb? To address detection limit problems, the follow-
ing steps are recommended:
If the reported detection limits meet project requirements
for sensitivity in the majority of samples but there is
evidence of unavoidable interferences in the rest of the
samples, then the data set should be considered accept-
able as reported. Detection limits in QA project plans
are typically specified for samples that do not have
significant interferences; a qualified chemist can help
answer questions about interferences and whether they
appear to be unavoidable.
If detection limits reported for a chemical never or
almost never meet project requirements for sensitivity,
check for explanations in the cover letter accompanying
the data package, or contact the laboratory to determine
if there was a systematic problem in conducting the
analyses. A qualified chemist can assist in determining
appropriate corrective actions. Reanalysis may be recom-
mended under the following circumstances: the sample
extract in the original analysis was excessively diluted;
inappropriate analytical procedures resulted in excessive
loss of the chemical; or it should have been feasible to
substantially reduce interferences. Alternatively, revision
of the original objective for sensitivity may be recom-
mended if it is determined to be unrealistic for the parti-
cular chemical. No action may be recommended if the
particular sample matrix submitted for analysis is judged
to warrant the detection limits reported. Finally, will the
data for the sample be used to make critical decisions?
The final decision to pursue corrective action often
depends on the project manager's assessment of the
importance of attaining the original objective for detection
limits.
¦ Data Qualified by the Laboratory— Has the laboratory assigned
qualifier codes to indicate data limitations (see Table 10)? Quali-
fier codes are used to provide additional information concerning the
quality of a reported value and are discussed further in Step 4.
However, data qualification does not automatically imply trouble.
For example, laboratories should qualify all data for any chemical
that is not included in the calibration of the instrument (i.e.,
tentatively identified chemicals) as estimates. Such data are still
usable, but have greater uncertainty associated with them than do
unqualified data. In other cases, the laboratory may have rejected
data, or may have qualified as estimates project data that are to be
used to determine permit compliance, respond to enforcement
proceedings, or support major conclusions in an investigation.
These are more serious concerns. To address potential concerns
with data qualified by the laboratory, the project manager should
assess the importance of the data and consult with the laboratory or
a qualified chemist to determine the significance of the data
qualifier and to decide whether reanalysis of the samples is war-
ranted.
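The screening logic illustrated in Examples A-D reduces to simple numeric
comparisons against the warning and action limits in the statement of work.
The following minimal sketch (in Python) applies those comparisons to the
values used in the examples; the function names are illustrative, and such a
screen supplements, but does not replace, review by a qualified chemist:

    def check_limit(name, value, warning, action):
        """Compare the magnitude of a quality control result against limits."""
        if abs(value) > action:
            return name + ": exceeds action limit; request explanation and corrective action"
        if abs(value) > warning:
            return name + ": exceeds warning limit; qualify associated results"
        return name + ": within limits"

    # Example A: continuing calibration response differs by +50 percent from the
    # initial calibration; the action limit is 25 percent
    print(check_limit("benzoic acid calibration (percent difference)", 50, 25, 25))

    # Example B: benzene in the method blank is 25 ppb; the warning limit is 20 ppb
    print(check_limit("benzene method blank (ppb)", 25, 20, float("inf")))

    # Example C: dichlorobenzene matrix spike recoveries of 9 and 28 percent,
    # against a 25 percent action limit and a 50 percent warning limit (minimums)
    for recovery in (9, 28):
        if recovery < 25:
            print("spike recovery %d%%: below action limit; data set deficient" % recovery)
        elif recovery < 50:
            print("spike recovery %d%%: below warning limit; consider minimum estimates" % recovery)

    # Example D: arsenic duplicate precision of 21 percent; warning limit is 20 percent
    print(check_limit("arsenic duplicate precision (percent)", 21, 20, float("inf")))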
TABLE 10. EXAMPLE DATA QUALIFIER CODES*
EPA Contract Laboratory Program (qualifiers applied during quality assurance review)
U The analyte was not present above the level of the associated value.
The associated numerical value indicates the approximate concen-
tration necessary to detect the analyte in this sample.
J The analyte was positively identified, but the associated numerical
value may not be consistent with the amount actually present in the
field sample. The data should be seriously considered for decision-
making and are usable for many purposes.
R The data are unusable (rejected) for all purposes. The presence or
absence of the analyte has not been verified. Resampling and reanaly-
sis are necessary to confirm or deny the presence of the analyte.
(No data value is reported if the value has been rejected; only the
qualifier R is reported).
UJ The analyte was not present above the level of the associated numeri-
cal value. The associated numerical value may not accurately or
precisely represent the concentration necessary to detect the analyte
in this sample.
D The sample was diluted prior to analysis.
Puget Sound Estuary Program
C Combined with unresolved substances
E Estimate
G Value is greater than minimum shown
K Detected at less than detection limit shown
L Value is less than the maximum shown
M Value is a mean
Q Questionable value
R Rejected value
T Detected below quantification limit shown
U Undetected at the detection limit shown
X Recovery less than 10 percent (for isotope dilution technique)
Z Blank-corrected
* The codes in this table are not all-inclusive. Different programs may use different
codes or variations of the same qualifier codes, even within the same region (e.g., Puget
Sound Ambient Monitoring Program). The choice of qualifier codes depends on data
use; however, codes should always be defined when reporting data.
TO REJECT OR REANALYZE? A REMEDIAL
STRATEGY THAT WORKED
The statement of work required that
the contract laboratory perform an
initial five-point calibration at concen-
trations of 5, 20, 50, 100, and 150
total nanograms. However, when
reviewing the data package, it was
determined that calibration had been
conducted at concentrations of 20,
50, 100, 150, and 200 total nano-
grams. Limits of detection for the
target analytes had already been es-
tablished near 5 total nanograms.
Because the laboratory did not provide
any evidence to support either the
linearity of response between 5 and
20 nanograms or the sensitivity of
analysis at 5 nanograms, any data
reported in this range would have to
be qualified as estimated.
The laboratory was notified as soon
as the discrepancy between the re-
quested and actual calibrations was detected. The
QA coordinator and a consulting
chemist then developed a strategy to
remedy the problem. Because the
initial calibration curve used to analyze
the samples was still valid (based on
results from ongoing calibrations), the
laboratory was requested to perform
a supplemental analysis of a 5-nano-
gram standard. The 5-nanogram
standard was detected and the re-
sponse was also determined to be
linear with the initial calibration curve.
Therefore, sensitivity and linearity
between 5 and 20 nanograms were
demonstrated. No additional analysis
of samples was required, because all
of the data could now be accepted
without qualification.
If additional information is required to resolve concerns identified in the data
assessment, an agreement between the laboratory and the project manager must
be reached. Requesting additional information from the laboratory may mean
amending the original statement of work. Factors such as the added cost of
analysis, adjustments to schedules, and the importance of expected results to the
overall project should be considered in making this decision. The final step in
data evaluation described in the following section is taken only after these
concerns have been resolved.
STEP 4: ASSIGNING DATA QUALIFIERS AND TAKING FINAL ACTIONS
The final step in the data evaluation process is to assign any necessary qualifier
codes to data values using the results of the previous steps. At this stage, data
are examined and compared, and any unusual or unexpected results and depar-
tures from procedures identified in the original statement of work are noted. An
overall assessment of the analytical system based on information provided by the
laboratory, as well as a review of instrument performance, are also conducted as
part of this step.
Data qualifier codes are notations used by laboratories and data reviewers to
briefly describe, or qualify, data and the systems producing data. Qualifiers used
by different organizations may vary both in the definition of a specific code and
in the number of codes that are used. More than one code may be assigned to
a data value to indicate qualification for different reasons. In some database
systems, several qualifying comments may be consolidated (e.g., Storet and the
Puget Sound Ambient Monitoring Program use a consolidated set of data
qualifiers). Separate qualifier codes have been developed to meet the needs of
different laboratories and federal and state agencies and programs. In general,
the more elaborate systems of data qualification can be used to describe the
acceptability of data with greater detail. However, using these complex systems
may prove difficult for inexperienced reviewers or data users. Conversely, while
simplified coding systems are easy to use, they may provide inadequate qualifying
information. Examples of the kinds of qualifier codes used by EPA's Contract
Laboratory Program and the Puget Sound Estuary Program are presented in
Table 10.
Assignment of data qualifiers should be performed carefully and is usually
performed by a QA specialist. Some data qualifiers are assigned by the laborato-
ry. Qualifiers denoting undetected values and values that may be affected by
laboratory contamination are common examples of qualifiers assigned by the
laboratory. These qualifiers may be accepted or modified, or additional qualifiers
may be assigned by independent reviewers of the data after the package has been
released by the laboratory. The following guidelines can be used by the project
manager to make general decisions concerning these qualifiers:
¦ Following the management action guidelines in Figure 2, if the data
are within the limits of quality established at the onset of the
project, they should be used in technical reports without qualifica-
tion.
¦ Data for a quality control check may only slightly exceed the
established action limit. The project manager may then choose to
qualify and accept the data for samples associated with this quality
control check or reject the data. This course is best taken after
consulting with the laboratory or a chemist. For example, assume
that the quality control check for precision exceeds a project action
limit of 50 percent by an additional 5 percent. The project mana-
ger may decide to accept the data as estimated values rather than
reject them, because this increase in variability may be acceptable
for noncritical data. However, in making this decision, the project
manager should always consider whether one or several of the
different kinds of quality control samples exceeded limits. For
example, assume that quality control data for precision are slightly
outside of action limits but also that corresponding data for calibra-
tions and accuracy are outside of their action limits. In this case,
it may be necessary to reject the data and consider reanalysis of the
samples (even if the limits for all three kinds of quality control
samples were only slightly exceeded).
¦ When qualifying data, it may also be appropriate to define the
qualifier as either a minimum or maximum estimate. For example,
the recovery of several surrogate compounds in the samples may
be slightly below a project action limit of 50 percent, and similar
low recovery may also be observed in the matrix spike sample. In
this case, it may be appropriate to qualify the data values as
minimum estimates. In other words, there is redundant evidence
that the actual values are likely higher than the reported values.
Alternatively, the project may absolutely require data that are
accurate to within the established action limits. In this case, the
data would be rejected and reanalyses may be required.
¦ Finally, data for one or more quality control samples may fall
substantially outside of the action limits. The project manager may
be able to conclude without expert assistance that the data should
be rejected. A decision should then be made on whether the data
are sufficiently critical to warrant reanalysis of the samples. If so,
the financial burden associated with this reanalysis will depend on
the terms of the laboratory statement of work.
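Consistent with the guidelines above, Figure 2, and the codes in Table 10, the
decision logic can be summarized in a brief sketch (in Python; the argument
names and the thresholds are illustrative assumptions rather than prescriptions
of this guide):

    def choose_action(within_limits, severely_outside, kinds_of_limits_exceeded):
        """Map quality control outcomes to an example management action.

        Qualifier codes follow the EPA Contract Laboratory Program conventions
        in Table 10 (J = estimated, R = rejected); thresholds are illustrative.
        """
        if within_limits:
            return None, "use data without qualification"
        if severely_outside or kinds_of_limits_exceeded >= 3:
            return "R", "reject data and consider reanalysis"
        return "J", "accept data as estimates after consulting the laboratory or a chemist"

    # A precision check slightly outside its action limit, no other problems:
    qualifier, action = choose_action(False, False, 1)
    print(qualifier, action)   # J accept data as estimates ...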
STEP 5: WRAPPING UP THE PROJECT
Assuring data quality doesn't end with the data review. A report summarizing
the QA review of the data package should be prepared, samples should be
properly stored or disposed of, and laboratory data should be archived in a
storage file or database. Technical interpretation of the data begins after the QA
review has been completed. Once data interpretation is complete, the results of
the project should be carefully examined to determine how closely the original
project goals and objectives were met.
Summarizing the Results of the Quality Assurance Review
Reports that document the results of a QA review of a data package should
summarize all conclusions concerning data acceptability and should note signifi-
cant quality assurance problems that were found. These reports may be prepared
by the project manager if a brief evaluation was conducted, or by QA specialists
if a detailed review was requested by the project manager. QA reviews are useful
in providing data users with a written record on data concerns and a documented
rationale for why certain data were accepted as estimates or were rejected. An
example of a detailed QA review for a metals data package is provided in
Appendix D. The following specific items should be addressed in a QA report:
¦ Summary of overall data quality, including description of data that
were qualified
¦ Description of sample collection and shipping
¦ Description of analytical methods, including determination of
detection limits
¦ Description of data reporting, including any corrections made for
transcription or other reporting errors
¦ Description of data completeness relative to objectives for com-
pleteness stated in the QA project plan
¦ Description of initial and ongoing calibration results
¦ Description of precision relative to QA project plan objectives
¦ Description of bias relative to QA project plan objectives, including
results of standard reference material analyses, matrix spikes,
surrogate recoveries, and any check standards
¦ Description of blank contamination for chemical analyses.
QA reviews are usually included as appendices to technical project reports. In
any case, the QA review becomes part of the documented project file, which also
includes the original data package and any computer files used in data compilation
and analysis.
Storage and Disposal of Samples
In the laboratory statement of work, the laboratory should be instructed to retain
all remaining sample material (under temperature and light conditions appropriate
for the samples) at least until after the quality assurance review has been com-
pleted. In addition, sample extracts or digestates should be appropriately stored
until disposal is approved by the project manager. With proper notice, most
laboratories are willing to provide storage for a reasonable time period (usually
on the order of weeks) following analysis. However, because of limited space
at the laboratory, the project manager may need to make arrangements for long-
term storage at another facility.
Unless there is reason to believe that the samples are hazardous to human health
or the environment, no special handling is required during disposal of many
environmental samples. However, because of potential liabilities associated with
waste generation and disposal, the project manager is responsible for determining
appropriate procedures for disposal of all samples delivered to the laboratory and
for ensuring that proper disposal occurs. Contaminated samples collected as part
of an investigation of an uncontrolled hazardous waste site may require special
handling or disposal under state or federal laws and regulations. In such cases,
the client requesting the investigation, not the laboratory, is considered the waste
generator (see 40 CFR 260-268, including specific discussion of waste generators
in 40 CFR 262).
REFERENCES
ASTM. Undated. A proposed guide for sediment collection, storage, character-
ization, and manipulation. Draft Report. Available from G. Allen Burton, Dept.
of Biological Sciences, Wright State University, Dayton, OH.
ASTM. 1987. ASTM directory for testing laboratories. American Society for
Testing and Materials, Philadelphia, PA.
Malins, D.C., M.M. Krahn, D.W. Brown, L.D. Rhodes, M.S. Myers, B.B.
McCain, and S.L. Chan. 1985. Toxic chemicals in marine sediment and biota
from Mukilteo, Washington: relationships with hepatic neoplasms and other
hepatic lesions in English sole (Parophrys vetulus). J. Natl. Cancer Inst. 74:487-
494.
Plumb, R.H., Jr. 1981. Procedure for handling and chemical analysis of
sediment and water samples. Technical Report EPA/CE-81-1. U.S. Environ-
mental Protection Agency and U.S. Army Corps of Engineers, Waterways
Experiment Station, Vicksburg, MS.
PSEP. 1986. Recommended protocols for measuring selected environmental
variables in Puget Sound. Prepared for the U.S. Environmental Protection
Agency, Region 10, Office of Puget Sound. Tetra Tech, Inc., Bellevue, WA.
PSEP. 1989a. Recommended guidelines for measuring organic compounds in
Puget Sound sediment and tissue samples. Prepared for the U.S. Environmental
Protection Agency, Region 10, Office of Puget Sound. PTI Environmental
Services, Bellevue, WA.
PSEP. 1989b. Recommended protocols for measuring metals in Puget Sound
water, sediment, and tissue samples. Prepared for the U.S. Environmental
Protection Agency, Region 10, Office of Puget Sound. PTI Environmental
Services, Bellevue, WA.
PSEP. 1991. Pollutants of concern in Puget Sound. Prepared for U.S.
Environmental Protection Agency Region 10, Office of Coastal Waters. PTI
Environmental Services, Bellevue, WA.
PTI. 1989a. Data quality evaluation for proposed dredged material disposal
projects. Prepared for Washington Department of Ecology, Sediment Manage-
ment Unit. PTI Environmental Services, Bellevue, WA.
PTI. 1989b. Data validation guidance manual for selected sediment variables.
Draft Report. Prepared for Washington Department of Ecology, Sediment
Management Unit. PTI Environmental Services, Bellevue, WA.
U.S. EPA. 1979. Methods for chemical analysis of water and wastes. U.S.
Environmental Protection Agency, Environmental Monitoring and Support
Laboratory, Cincinnati, OH.
U.S. EPA. 1981. Procedures for handling and chemical analysis of sediment
and water samples. U.S. Environmental Protection Agency, Great Lakes
Laboratory.
U.S. EPA. 1983. Technical assistance document for sampling and analysis of
toxic organic compounds in ambient air. U.S. Environmental Protection Agency,
Office of Research and Development, Washington, DC.
U.S. EPA. 1984. Sediment sampling quality assurance user's guide. NTIS: PB-
85-233-542. U.S. Environmental Protection Agency, Environmental Monitoring
Support Laboratory, Las Vegas, NV.
U.S. EPA. 1985a. Cooperative agreement on the monitoring of contaminants
in Great Lakes sport fish for human health purposes. U.S. Environmental
Protection Agency, Region 5, Chicago, IL.
U.S. EPA. 1985b. Methods manual for bottom sediment sample collection.
EPA-905/4-85-004. U.S. Environmental Protection Agency, Great Lakes
National Program Office, Chicago, IL.
U.S. EPA. 1985c. Practical guide to groundwater sampling. EPA-600/2-85-
104. U.S. Environmental Protection Agency, Environmental Research Labora-
tory, Ada, OK.
U.S. EPA. 1986a. Test methods for evaluating solid waste (SW-846): physical/
chemical methods. U.S. Environmental Protection Agency, Office of Solid
Waste, Washington, DC.
U.S. EPA. 1986b. Field manual for grid sampling of PCB spill sites to verify
cleanups. EPA-560/5-86-017. U.S. Environmental Protection Agency, Office
of Toxic Substances, Washington, DC.
U.S. EPA. 1987a. A compendium of Superfund field operations methods.
EPA/540/P-87/001. OSWER Directive 9355.0-14. U.S. Environmental
Protection Agency, Office of Emergency and Remedial Response, Washington,
DC.
U.S. EPA. 1987b. Data quality objectives for remedial response activities:
development process. EPA-540/G-87-003. OSWER Directive 9335.0-7B. U.S.
Environmental Protection Agency, Office of Emergency and Remedial Response
and Office of Waste Programs Enforcement, Washington, DC.
U.S. EPA. 1987c. Handbook: ground water. U.S. Environmental Protection
Agency, Center for Environmental Research Information, Cincinnati, OH.
U.S. EPA. 1987d. An overview of sediment quality in the United States. U.S.
Environmental Protection Agency, Office of Water Regulations and Standards,
Washington, DC.
U.S. EPA. 1988a. Guidance for conducting remedial investigations and
feasibility studies under CERCLA. Interim Final Report. EPA-540/G-89-00.
U.S. Environmental Protection Agency, Office of Emergency and Remedial
Response, Washington, DC.
U.S. EPA. 1988b. Guidance on remedial actions for contaminated groundwater
at Superfund sites. Interim Final Report. OSWER Directive 9283.1-2. U.S.
Environmental Protection Agency, Office of Emergency and Remedial Response,
Washington, DC.
U.S. EPA. 1988c. Laboratory data validation: functional guidelines for evaluat-
ing inorganics analyses. U.S. Environmental Protection Agency, Office of
Emergency and Remedial Response, Washington, DC.
U.S. EPA. 1988d. Laboratory data validation: functional guidelines for evaluat-
ing organics analyses. U.S. Environmental Protection Agency, Office of
Emergency and Remedial Response, Washington, DC.
U.S. EPA. 1988e. Procedures for dispersion modeling and air monitoring for
Superfund air pathway analysis. U.S. Environmental Protection Agency,
Research Triangle Park, NC.
U.S. EPA. 1988f. Statistical methods for evaluating ground water from
hazardous waste facilities. U.S. Environmental Protection Agency, Washington,
DC.
U.S. EPA. 1988g. Statistical methods for evaluating the attainment of Superfund
cleanup standards. Volume I: soils and solid media. Draft Report. U.S.
Environmental Protection Agency, Office of Policy, Planning, and Evaluation,
Washington, DC.
U.S. EPA. 1988h. User's guide to the contract laboratory program. U.S.
Environmental Protection Agency, Office of Emergency and Remedial Response,
Washington, DC.
U.S. EPA. 1989a. Assessing human health risks from chemically contaminated
fish and shellfish: a guidance manual. EPA-503/8-89-002. U.S. Environmental
Protection Agency, Office of Marine and Estuarine Protection, Washington, DC.
U.S. EPA. 1989b. Groundwater sampling for metals analyses. EPA-540/4-89-
001. U.S. Environmental Protection Agency, Office of Solid Waste and
Emergency Response, Washington, DC.
U.S. EPA. 1989c. Preparing perfect project plans: a pocket guide for the
preparation of quality assurance project plans. U.S. Environmental Protection
Agency, Risk Reduction Engineering Laboratory, Cincinnati, OH.
U.S. EPA. 1989d. Risk assessment guidance for Superfund. Volume I: human
health evaluation manual. Interim Final Report. OSWER Directive 9285.7-01a.
U.S. Environmental Protection Agency, Office of Emergency and Remedial
Response, Washington, DC.
U.S. EPA. 1989e. Soil sampling quality assurance guide. Review Draft Report.
U.S. Environmental Protection Agency, Environmental Monitoring Support
Laboratory, Las Vegas, NV.
U.S. EPA. 1990a. U.S. EPA Contract Laboratory Program statement of work
for inorganics analyses, multi-media, multi-concentration. Document #ILM01.0.
U.S. Environmental Protection Agency, Washington, DC.
U.S. EPA. 1990b. U.S. EPA Contract Laboratory Program statement of work
for organic analyses, multi-media, multi-concentration. Document #OLM01.0.
U.S. Environmental Protection Agency, Washington, DC.
U.S. EPA. 1991. Draft analytical method for determination of acid volatile
sulfide in sediment. U.S. Environmental Protection Agency, Office of Science
and Technology, Washington, DC.
U.S. FDA. 1977. Pesticide analytical manual. Volume I. U.S. Food and Drug
Administration, Washington, DC.
U.S. FDA. 1986. Pesticides and industrial chemicals in domestic foods. U.S.
Food and Drug Administration, Washington, DC.
GLOSSARY OF TERMS
See
Page
14 Action Limit A value for data from the analysis of quality control
checks indicating that a system or a method is not per-
forming normally and that an appropriate corrective
action must be taken. When action limits are exceeded,
analyses should be halted; samples analyzed since the
last quality control sample may need reanalysis.
B-l Analyte The specific component measured in a chemical analy-
sis.
Bioaccumulation The accumulation of certain chemicals in the tissues of
an organism. For example, certain chemicals in food
eaten by a fish tend to accumulate in its liver or other
tissues.
41 Blanks Quality control samples that are processed with the
samples but contain only reagents. They are used to
obtain the response of an analysis in the absence of a
sample, including assessment of contamination from
sources external to the sample.
13 Calibration The systematic determination of the relationship of the
response of the measurement system to the concentra-
tion of the analyte of interest. Instrument calibration
performed before any samples are analyzed is called the
initial calibration. Subsequent checks on the instrument
calibration performed throughout the analyses of sam-
ples are called continuing calibration. See the isotope
dilution technique for an example of calibration for an
entire analytical procedure from sample preparation to
instrument analysis.
40 Chromatography The process of selectively separating a mixture into its
component compounds. The compounds are measured
and presented graphically in the form of a chromato-
gram and digitally as a quantification report.
20 Cleanup The process of removing certain components from sample
extracts, performed to improve instrument sensitivity.

15 Data Package The results of chemical analyses completed by the
laboratory, compiled, printed out, and presented to the agency or individual
requesting the analyses. Included in the data package may be computer disks,
magnetic tape, or other forms of electronically stored data.

72 Data Qualifier Codes Notations used by laboratories and data reviewers
to briefly describe, or qualify, data reported by the laboratory.

13 Data Quality Objectives Qualitative and quantitative statements
formulated at the start of any study to establish the quality of data required
from the sampling and analysis procedures.

B-8 Detector A device used in conjunction with an analytical instrument to
determine the components of a sample.

38 Digestion A process used prior to analysis that breaks down samples
using acids (or bases). The end product is called a digestate. Other
chemicals, called matrix modifiers, may be added to improve the final
digestate.

38 Extraction A chemical or mechanical procedure to remove semivolatile
organic compounds from a sample matrix. The end product of extraction is
called an extract.

40 Interferent Unwanted elements or compounds in a sample that
collectively cause unacceptable levels of bias in the results of a measurement
or insensitive measurements.

B-8 Ion An atom or group of atoms that carries a positive or negative
electric charge as a result of having lost or gained one or more electrons.

Isotope Dilution Technique As applied to the analysis of organic
compounds, a technique that uses a large number of stable, isotopically
labeled compounds spiked in each sample before extraction to calibrate the
method for biases in extract preparation and analysis.
Limit of Detection The lowest amount of a contaminant that can be reliably
detected based on the variability of either the blank response of a method or
that of a low-level standard.

Linearity During calibration of an analytical instrument, the degree to which
incremental concentrations of an analyte produce constant increments of
response.

Matrix The sample material (e.g., water, sediment, tissue) in which the
chemicals of interest are found. Matrix refers to the physical structure of a
sample and how chemicals are bound within this structure. At a gross level,
tissue is one kind of sample matrix and soil is another. At a finer level, a
sediment sample of silty sand containing large amounts of calcium carbonate
from the shells of aquatic organisms represents a different sample matrix than
a sediment sample of clayey silt containing a large amount of organic carbon
from decaying vegetation.

Matrix Spike Samples Quality control check samples created by adding
known amounts of chemicals of interest to actual samples, usually prior to
extraction or digestion. Analysis of matrix spikes and matrix spike duplicates
will provide an indication of bias due to matrix effects and an estimation of
the precision of the results.

Metals A group of naturally occurring elements. Certain metals (such as
mercury, lead, nickel, zinc, and cadmium) can be of environmental concern
when they are released to the environment in unnaturally high amounts.

Organic Compounds Carbon-based substances commonly produced by
animals or plants.

Quality Assurance and Quality Control A system of procedures, checks,
audits, and corrective actions to ensure that all research design and
performance, environmental monitoring and sampling, and other technical and
reporting activities are of the highest achievable quality.

Quality Assurance Project Plan A detailed document specifying guidelines
and procedures to assure data quality during data gathering, analysis, and
reporting.
13 Quality Control Checks Blanks, replicates, and other samples used to
assess the overall analytical system and to evaluate the performances of
individual analytical instruments or the technicians that operate them.

11 Quantification Limit The lowest level at which a contaminant may be
accurately measured and reported without qualification as an estimated
quantity.

41 Reference Materials Materials or substances with well-characterized
properties that are useful for assessing the accuracy of an analysis and
comparing analytical performances among laboratories. Certified reference
materials (CRMs) are samples containing precise concentrations of chemicals,
accurately determined by a variety of technically valid procedures. Standard
reference materials (SRMs) are CRMs issued by the National Institute of
Standards and Technology.

41 Replicates One of several identical samples. When two separate samples
are taken from the same field station, or when one sample is split into two
separate samples and analyzed, these samples are called duplicates. When
three identical samples are analyzed, these samples are called triplicates.

B-1 Response Factor The ratio of the response measured by the detector of
an analytical instrument to the amount (mass) of a chemical.

20 Semivolatile Organic Compound An organic compound with moderate
vapor pressure that can be extracted from samples using organic solvents and
analyzed by gas chromatography.

38 Spectrometry An analytical technique used to identify and measure
concentrations of metals in a sample. Two basic forms of spectrometry
commonly used are atomic absorption spectrometry (AA) and inductively
coupled plasma-atomic emission spectrometry (ICP).
41 Spiked Method Blanks Method blanks to which known amounts of
surrogate compounds and analytes have been spiked. Such samples are useful
to verify acceptable method performance prior to and during routine analysis
of samples containing organic compounds. Also known as check standards in
some methods; independently prepared standards used to check for bias and
to estimate the precision of measurements.

14 Statement of Work A contract addendum used as a legally binding
agreement between the individual or organization requesting an analysis and
the laboratory performing the actual tasks.

41 Surrogate Spike Compounds Compounds with characteristics similar to
those of compounds of interest that are added to a sample prior to extraction.
They are used to estimate the recovery of organic compounds in a sample.

18 Target Analyte List The 24 inorganic contaminants compiled and
presented in the statement of work for inorganics analysis under EPA's
Contract Laboratory Program. EPA's Target Compound List contains an
additional 125 contaminants, presented in the statement of work for organics
analysis under the Contract Laboratory Program.

16 Volatile Organic Compound An organic compound with a high vapor
pressure that tends to evaporate readily from a sample.

13 Warning Limit A value indicating that data from the analysis of quality
control checks are subject to qualification before they can be used in a
project. When two or more sequential quality control results fall outside of
the warning limits, a systematic problem is indicated.
APPENDIX A
EPA Priority Pollutants and
Additional Hazardous
Substance List Compounds
CHEMICAL STRUCTURES AND MOLECULAR WEIGHTS OF U.S. EPA
PRIORITY POLLUTANT AND ADDITIONAL HAZARDOUS SUBSTANCE LIST COMPOUNDS

[The printed appendix pairs each compound with its chemical structure
diagram. The structure diagrams could not be reproduced here, so only the
EPA priority pollutant number (or HSL designation), compound name, and
molecular weight (mw) are listed.]

PHENOLS
EPA #   Compound                 mw
65      phenol                   94
HSL     2-methylphenol           108
HSL     4-methylphenol           108
34      2,4-dimethylphenol       122

SUBSTITUTED PHENOLS
24      2-chlorophenol               128
31      2,4-dichlorophenol           163
22      4-chloro-3-methylphenol      143
21      2,4,6-trichlorophenol        198
HSL     2,4,5-trichlorophenol        198
64      pentachlorophenol            266
57      2-nitrophenol                139
59      2,4-dinitrophenol            184

LOW MOLECULAR WEIGHT AROMATICS
55      naphthalene          128
77      acenaphthylene       152
1       acenaphthene         154
80      fluorene             166
81      phenanthrene         178
78      anthracene           178

EPA # - EPA priority pollutant number defined for toxic pollutants in 40 CFR 401.15 that are a
subset of the hazardous substances listed in Appendix VIII of 40 CFR 261.
HSL - Hazardous Substance List compound.
mw - molecular weight of an organic compound.
HIGH MOLECULAR WEIGHT PAH
EPA #   Compound                     mw
39      fluoranthene                 202
84      pyrene                       202
72      benzo(a)anthracene           228
76      chrysene                     228
74      benzo(b)fluoranthene         252
75      benzo(k)fluoranthene         252
73      benzo(a)pyrene               252
83      indeno(1,2,3-c,d)pyrene      276
82      dibenzo(a,h)anthracene       278
79      benzo(g,h,i)perylene         276

CHLORINATED AROMATIC HYDROCARBONS
26      1,3-dichlorobenzene          147
27      1,4-dichlorobenzene          147
25      1,2-dichlorobenzene          147
8       1,2,4-trichlorobenzene       181
20      2-chloronaphthalene          163
9       hexachlorobenzene            285
CHLORINATED ALIPHATIC HYDROCARBONS
EPA #   Compound                        mw
12      hexachloroethane                237
xx      trichlorobutadiene isomers      158
xx      tetrachlorobutadiene isomers    192
xx      pentachlorobutadiene isomers    226
52      hexachlorobutadiene             261
53      hexachlorocyclopentadiene       273

HALOGENATED ETHERS
18      bis(2-chloroethyl) ether        143
42      bis(2-chloroisopropyl) ether    171
43      bis(2-chloroethoxy)methane      173
40      4-chlorophenyl phenyl ether     204
41      4-bromophenyl phenyl ether      249

PHTHALATES
71      dimethyl phthalate              194
70      diethyl phthalate               222
68      di-n-butyl phthalate            278
67      butyl benzyl phthalate          312
66      bis(2-ethylhexyl) phthalate     391
69      di-n-octyl phthalate            391
MISCELLANEOUS OXYGENATED COMPOUNDS
EPA #   Compound                                 mw
54      isophorone                               138
HSL     benzyl alcohol                           108
HSL     benzoic acid                             122
129     2,3,7,8-tetrachlorodibenzo-p-dioxin      322
HSL     dibenzofuran                             168

ORGANONITROGEN COMPOUNDS
HSL     aniline                         93
56      nitrobenzene                    123
63      N-nitroso-di-n-propylamine      130
HSL     4-chloroaniline                 128
HSL     2-nitroaniline                  138
HSL     3-nitroaniline                  138
HSL     4-nitroaniline                  138
36      2,6-dinitrotoluene              182
35      2,4-dinitrotoluene              182
62      N-nitrosodiphenylamine          198
5       benzidine                       184
28      3,3'-dichlorobenzidine          253
PESTICIDES AND PCBs
EPA #   Compound                mw
93      p,p'-DDE                318
94      p,p'-DDD                320
92      p,p'-DDT                354
89      aldrin                  365
90      dieldrin                381
91      chlordane               410
95      alpha-endosulfan        407
96      beta-endosulfan         407
97      endosulfan sulfate      423
98      endrin                  381
99      endrin aldehyde         381
100     heptachlor              373
101     heptachlor epoxide      389
102     alpha-HCH               290
103     beta-HCH                290
105     delta-HCH               290
104     gamma-HCH               290
113     toxaphene               (waxy components; approximate formula)
106     Aroclor 1242            (representative formula)
110     Aroclor 1248            (representative formula)
107     Aroclor 1254            (representative formula)
111     Aroclor 1260            (representative formula)

[The printed page also distinguishes the HCH isomers by the ring orientation
(axial vs. equatorial) of their chlorine substituents: beta-HCH has 6
equatorial Cl, delta-HCH has 5 equatorial Cl, and gamma-HCH has 3
equatorial and 3 axial Cl.]
VOLATILE HALOGENATED ALKANES
EPA #   Compound                       mw
45      chloromethane                  50.5
46      bromomethane                   95
16      chloroethane                   64.5
44      methylene chloride             85
13      1,1-dichloroethane             99
23      chloroform                     119
10      1,2-dichloroethane             99
11      1,1,1-trichloroethane          133
6       carbon tetrachloride           154
48      bromodichloromethane           164
32      1,2-dichloropropane            113
51      chlorodibromomethane           208
14      1,1,2-trichloroethane          133
47      bromoform                      253
15      1,1,2,2-tetrachloroethane      168

VOLATILE HALOGENATED ALKENES
88      vinyl chloride                          62.5
29      1,1-dichloroethene                      97
30      trans-1,2-dichloroethene                97
33      cis- and trans-1,3-dichloropropene      111
87      trichloroethene                         131
85      tetrachloroethene                       166
VOLATILE AROMATIC HYDROCARBONS
EPA #   Compound             mw
4       benzene              78
86      toluene              92
38      ethylbenzene         106
HSL     styrene              104
HSL     total xylenes        106

VOLATILE CHLORINATED AROMATIC HYDROCARBONS
7       chlorobenzene        112

VOLATILE UNSATURATED CARBONYL COMPOUNDS
2       acrolein             56
3       acrylonitrile        53

VOLATILE ETHERS
19      2-chloroethylvinyl ether      106

VOLATILE KETONES
HSL     acetone                       58
HSL     2-butanone                    72
HSL     2-hexanone                    100
HSL     4-methyl-2-pentanone          100

MISCELLANEOUS VOLATILE COMPOUNDS
HSL     carbon disulfide              76
HSL     vinyl acetate                 86
APPENDIX B
Description of Calibration,
Quality Control Samples, and
Widely Used Analytical Methods
DESCRIPTION OF CALIBRATION, QUALITY CONTROL SAMPLES,
AND WIDELY USED ANALYTICAL METHODS
INTRODUCTION
The relative importance, rationale, and recommended frequency of calibration and each of
the quality control samples are discussed in the following sections. A summary of the major
considerations in applying these procedures is provided in the main text (see Chapter IV).
The concepts of calibration and quality control samples apply to dozens of analytical
methods that are currently used by laboratory technicians. Appropriate methods for particular
types of analyses are chosen based on the list of chemicals for analysis and the required detection
limits. Some of the widely used analytical methods are described below, along with technical
considerations that need to be taken into account when choosing individual methods.
CALIBRATION
Calibration of analytical instruments is a critical element of quality control, since the
procedures used for calibration will determine both the accuracy and precision of analytical
results. Gas chromatography/mass spectrometry, or any other analytical technique, measures
the magnitude of an unknown concentration of an analyte relative to a known concentration of
the analyte or a similar analyte in a standard. Such relative measurements are meaningless
unless the responsiveness of the analytical instrument can be determined over a range of analyte
concentrations. Through calibration, the response of the analytical instrument across a range
of analyte concentrations can be determined. The relationship between response and con-
centration is generally expressed as an analytical curve. For the analysis of organic compounds
in samples, response factors (RF) for analytes relative to standards at various concentrations may
be established from this analytical curve. The degree to which incremental concentrations of
an analyte produce constant increments of response is called linearity.
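These relationships can be illustrated numerically. The following minimal
sketch (in Python; the five-point calibration concentrations and responses are
invented for illustration) computes response factors and uses the percent
relative standard deviation of the response factors as one common measure of
linearity; acceptance criteria are method-specific:

    import statistics

    # Hypothetical five-point initial calibration: amount (ng) vs. instrument response
    amounts = [5, 20, 50, 100, 150]
    responses = [1020, 4150, 10100, 20300, 30200]

    # Response factor (RF) = response / amount for each calibration standard
    rfs = [resp / amt for amt, resp in zip(amounts, responses)]
    mean_rf = statistics.mean(rfs)

    # Percent relative standard deviation of the RFs; small values indicate that
    # equal increments of concentration produce nearly equal increments of
    # response (i.e., a linear calibration)
    rsd = 100 * statistics.stdev(rfs) / mean_rf
    print("mean RF = %.1f, %%RSD = %.1f" % (mean_rf, rsd))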
Guidelines for instrument calibration must be included in the statement of work for the
laboratory performing the analysis. Examples of these guidelines are given in Methods for
Chemical Analysis of Water and Wastes (U.S. EPA 1979). Project managers should assure that
the statement of work addresses the following points:
¦ Instruments should be calibrated at the beginning of the project before any
samples are analyzed, after each major disruption in analytical procedures, and
whenever action limits are exceeded for certain samples. This type of calibration
is called the initial calibration of the instrument. Through initial calibration, an
analytical curve based on the absorbance, emission intensity, or other measured
characteristics of known standards can be established. Data from subsequent
analyses are considered valid as long as they fall within the linear range of this
curve.
¦ In some analytical programs the accuracy of the initial calibration is verified and
documented for every analyte by the analysis of EPA quality control solutions
immediately following the initial calibration. If immediate verification is not
required, then the verification may be conducted after several samples have been
analyzed. When a certified solution of an analyte is not available from EPA or
any other source, analyses should be conducted on an independent standard at a
concentration other than that used for calibration, but within the calibration range.
When measurements for the certified components exceed the action limits, the
analysis must be terminated, the problem corrected, the instrument recalibrated,
and the recalibration verified.
¦ The validity of the original calibration curve must be confirmed throughout the
analyses of samples. This process is called continuing calibration. However,
unless required by a specific method, the continuing calibration results should not
be used to quantify sample results (use the average response from the initial
calibration instead). For gas chromatography/mass spectrometry analyses of
samples containing organic compounds, calibration should be checked at the
beginning of each work shift, at least once every 12 hours (or every 10 to 12
analyses, whichever is more frequent), and after the last sample analysis of each
work shift. For gas chromatography/electron capture detection analyses,
calibration should be checked at the beginning of each shift, every 6 hours (or
every 6 samples, whichever is less frequent), and after the last sample analysis
of each shift.
¦ For analyses with inductively coupled argon plasma emission spectrometry and
atomic absorption spectrometry, all work should be performed using continuing
calibration. A procedure for conducting these calibrations is outlined in EPA's
Contract Laboratory Program statement of work for inorganic chemicals
(U.S. EPA 1990a). Frequency of continuing calibration of these instruments is
10 percent of the samples or every 2 hours during an analysis run, whichever is
more frequent.
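A continuing calibration is commonly judged by the percent difference
between its response factor and the mean response factor from the initial
calibration. The following minimal sketch (in Python; the values and the
25 percent action limit are invented for illustration) shows the comparison,
with the initial-calibration mean retained for quantifying sample results as
noted above:

    def percent_difference(cc_rf, initial_mean_rf):
        """Percent difference of a continuing calibration RF from the initial mean RF."""
        return 100.0 * (cc_rf - initial_mean_rf) / initial_mean_rf

    initial_mean_rf = 202.0   # mean response factor from the initial calibration
    cc_rf = 188.0             # response factor measured in a continuing calibration

    pd = percent_difference(cc_rf, initial_mean_rf)
    if abs(pd) > 25.0:        # example action limit from a statement of work
        print("percent difference %.1f: halt analyses, correct, and recalibrate" % pd)
    else:
        print("percent difference %.1f: continuing calibration acceptable" % pd)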
QUALITY CONTROL SAMPLES
Blanks
Blanks are quality control samples that are processed with the samples but contain only
reagents. They are used to obtain the response of an analysis in the absence of a sample,
including assessment of contamination from sources external to the sample. Contamination can
arise from sources such as the reagents themselves, containers, and equipment used for
sampling, sample storage, and analysis. The types of analytical blanks used to identify each of
these potential sources of contamination are described below:
¦ Method blanks (also called preparation blanks or reagent blanks) are used to
identify any contamination that may have been contributed by laboratories during
sample preparation. A method blank should be required for each batch of
samples prepared for analysis, except in the case of volatile organic analyses
(VOAs), which should be analyzed at least once every 12 hours. Because
method blanks are usually included in the cost of sample analysis, they should not
place an additional cost burden on a project.
¦ Bottle blanks are used to determine whether sample containers are sources of
contamination. One bottle blank should be prepared for each lot of sample
containers. Large increases in the contaminant level for the bottle blank com-
pared to the method blank indicates a potential container problem. Laboratories
usually provide clean containers for performing bottle blank analyses at no
additional cost. For most sampling efforts, precleaned containers from a chemical
supply company can be obtained at reasonable cost. This may eliminate the need
to have bottle blanks analyzed.
¦ Transport blanks (also called trip blanks) are used to detect contamination arising
during sample shipping, handling, and storage. These blanks are taken from
clean containers filled with deionized water, transported to the field, and stored
and shipped with the samples. One transport blank should be included with each
shipping container. A contaminant level for the transport blank that greatly
exceeds the contaminant level of the method blank indicates a potential field
handling, container, or storage problem. Transport blanks are important only for
projects involving analysis of volatile organic compounds, which may migrate
from one container to another.
¦ Field equipment blanks (also called decontamination checks) are used to detect
contamination arising from field sampling equipment. These blanks should be
used at least once for sampling of each medium during each sampling effort.
Matrix Spikes
Matrix spike samples are used to provide an indication of the bias due to matrix effects and
an estimation of the precision of results. They can also provide indications of how tightly an
analyte is bound to its matrix, such as soil or tissue. Matrix spike samples are created by adding
known amounts of chemicals of interest to actual samples, prior to extraction and usually prior
to digestion. The addition of these chemicals is commonly called spiking. The matrix spike is
analyzed using the same analytical procedure used for samples. The results are then compared
with the results from the analysis of a replicate, unspiked sample. In this way the effect of the
particular sample matrix on the recovery of chemicals of concern can be evaluated. By spiking
and analyzing the sample after digestion, an analyst can determine whether spike analysis results
have been affected by matrix binding or by sample preparation procedures. This post-digestion
spiking is only used for metals analyses.
Matrix spike samples should include a wide range of chemical types. For example, a
matrix spike sample for analysis of semivolatile organic compounds may include spiking with
three neutral compounds, two organic acid compounds, and two organic base compounds.
Ideally, samples should be spiked either at approximately 5 times the expected chemical
concentration in a sample or at 5 times the quantification limit, whichever is higher. Spiking
at this concentration reduces the possibility for any increase in random error during the matrix
spike analysis and eliminates any masking of interferences at representative chemical concentra-
tions.
One matrix spike sample and one matrix spike duplicate sample should be analyzed for
every set of 20 or fewer samples or with each sample preparation lot. If 20 or more samples
are submitted, one matrix spike duplicate pair should be run for each set of 20 samples.
Analysis of matrix spikes and matrix spike duplicates is often performed to assess the precision
and bias of one set of results.
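The bias and precision estimates described above are conventionally reported
as percent recovery and relative percent difference (RPD). The following
minimal sketch (in Python; the concentrations are invented for illustration)
shows the calculations for a matrix spike/matrix spike duplicate pair:

    def percent_recovery(spiked_result, unspiked_result, amount_added):
        """Matrix spike recovery: (spiked - unspiked) / amount added, as a percent."""
        return 100.0 * (spiked_result - unspiked_result) / amount_added

    def relative_percent_difference(a, b):
        """RPD between matrix spike and matrix spike duplicate results."""
        return 100.0 * abs(a - b) / ((a + b) / 2.0)

    # Hypothetical results (ppb): unspiked sample, matrix spike, duplicate, amount added
    sample, ms, msd, added = 12.0, 55.0, 61.0, 50.0

    rec_ms = percent_recovery(ms, sample, added)    # bias estimate from the spike
    rec_msd = percent_recovery(msd, sample, added)
    rpd = relative_percent_difference(ms, msd)      # precision estimate from the pair
    print("recoveries: %.0f%% and %.0f%%, RPD = %.0f%%" % (rec_ms, rec_msd, rpd))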
Surrogate Spikes
Surrogate spike compounds can be used to estimate the recovery of organic compounds in
a sample. Surrogates are compounds with characteristics similar to those of compounds of
interest that are added to a sample before it undergoes the process of extraction. Surrogates
should be compounds that are not expected to be present in the samples, but they should have
characteristics similar to the compounds of concern. Compounds labeled with stable isotopes
(that is, where normal carbon or hydrogen atoms in the molecule have been replaced with
isotopes of carbon or hydrogen) are commonly used as surrogates. However, all surrogates need
not be isotopically labeled. They need only be compounds that are physically and chemically
similar to the chemicals of interest. For example, dibromooctafluorobiphenyl is used by some
laboratories as a surrogate for PCBs, although this compound is not identical in structure to a
PCB.
Because surrogate compounds are the only means of checking method performance on a
sample by sample basis, they should be used whenever possible. A minimum of five surrogate
spikes (three neutral and two acid compounds) should be added to each sample when analyzing
for semivolatile organic compounds. These surrogate spikes should cover a wide range of
compound classes. At least three surrogate compounds should be used for the analysis of
volatile organic compounds. At least one surrogate compound should be used in each extracted
sample as a check on recovery of pesticides. A separate surrogate compound should be used
in each extracted sample to check the recovery of PCB mixtures.
Check Standards
Check standards contain known amounts of analyte and are analyzed along with the
samples. Check standard results are used to indicate bias due to sample preparation and/or
calibration and to control precision.
Laboratory Control Samples
Laboratory control samples are check standards used to assess the accuracy of the analytical
procedures for metals. Like reference materials, these samples can be acquired from EPA.
Often they are routinely analyzed by the laboratory at no extra cost.
Spiked Method Blanks
In certain organic methods, surrogate spikes are added to the check standards; these quality
control samples are called spiked method blanks. The different compounds and their amounts
are specified in EPA's guidelines for the Contract Laboratory Program (U.S. EPA 1990a,b) and
other regional guidelines. Such analyses are useful to verify acceptable method performance
prior to and during routine analysis of samples containing organic compounds. Spiked method
blanks do not take into account sample matrix effects, but can be used to identify basic problems
in procedural steps. Spiked method blanks can also be used to provide minimum recovery data
when no suitable reference material is available or when sample size is insufficient for matrix
spikes. A spiked method blank should be analyzed whenever a method is used for the first time
in a project and each time that a method is modified. In these instances, analysis of the spiked
method blank should take place before analysis of any samples.
Reference Materials
Reference materials are check standards with well-characterized chemical compositions,
which are useful for assessing the accuracy of an analysis and comparing analytical performances
among laboratories. These materials should be of the same matrix (for example, water,
sediment, or fish tissue) as the samples collected. They should contain, if possible, the major
target compounds at concentrations near those in the collected samples.
OBTAINING CRMs AND SRMs
Reference materials can be obtained through the following sources:
¦ U.S. Environmental Protection Agency
Environmental Monitoring Systems Laboratory
QA Research Division
26 W. Martin Luther King Way
Cincinnati, Ohio 45268
(513) 569-7325
¦ 1990 Standard Reference Material Catalog
National Institute of Standards and Technology
Office of Standard Reference Materials
Building 202, Room 204
Gaithersburg, Maryland 20899
(301) 975-6776
¦ Certified reference materials (CRMs) are samples containing precise
concentrations of chemicals, accurately determined by a variety of technically valid
procedures. These samples are accompanied by, or traceable to, a certificate or
other documentation issued by a certifying body. CRMs can be obtained from
agencies such as the U.S. Geological Survey, National Research Council of
Canada, Canada Centre for Mineral and Energy Technology, and the EPA. A
standard reference material (SRM) is a CRM issued by the National Institute of
Standards and Technology.
¦ Other reference materials, while not certified, are also in wide use. For example,
a regional reference material (RRM) called Sequim-1 has been developed for use
by the EPA, National Oceanic and Atmospheric Administration, and other
agencies and laboratories working with marine sediments in Puget Sound. This
RRM is a fresh-frozen sediment homogenate collected from Sequim Bay, a study
site in Washington's Strait of Juan de Fuca. The homogenate is spiked with
selected organic acid and neutral compounds at low concentrations. Although not
certified, this RRM has been useful as a consistent base of comparison when
evaluating the effectiveness of extraction and analytical procedures used in several
Puget Sound studies.
In Puget Sound, analysis of one reference material sample is recommended for each 50
samples. When suitable reference materials are unavailable, a spiked blank can be used as a
substitute performance evaluation sample. Data from such a substitution should be accepted with
qualification, because the laboratory may not be able to accurately analyze certain sample
matrices.
Replicates
Replicates are two or more samples that are analyzed to provide an estimate of the overall
precision of sampling or analytical procedures. When two separate samples are taken from the
same field station, or when one sample is split into two separate samples and analyzed, these
replicate samples are specifically called duplicates. Duplicates are usually sufficient when using
an analytical procedure that is well proven in the laboratory. Analyzing three replicate samples
(called triplicates) yields more meaningful statistical measures of variability than analyzing
duplicate samples. However, statistically combining the variance of duplicate sample results
across several sets of duplicates is also an effective way of evaluating variability.
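As an illustration of that combination, the sketch below (Python; a hypothetical example, not a prescribed procedure) pools the variance contributed by several duplicate pairs into a single relative standard deviation. Each pair contributes one degree of freedom, with pair variance (x1 - x2)^2 / 2.

    import math

    def pooled_rsd_from_duplicates(pairs):
        # Pooled relative standard deviation from several duplicate pairs.
        variances = [(x1 - x2) ** 2 / 2.0 for x1, x2 in pairs]
        pooled_sd = math.sqrt(sum(variances) / len(pairs))
        grand_mean = sum(x1 + x2 for x1, x2 in pairs) / (2 * len(pairs))
        return 100.0 * pooled_sd / grand_mean

    # Three hypothetical duplicate pairs, all in the same units:
    print(pooled_rsd_from_duplicates([(10.2, 9.8), (5.1, 4.7), (20.5, 19.9)]))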
Replicate samples are commonly used for the following purposes:
¦ Analytical (or laboratory) replicates measure the precision of sample analyses.
To prepare analytical replicates, the sample is homogenized by the laboratory and
divided into two subsamples. The subsamples are then independently analyzed.
If five or fewer samples are submitted for analysis, a minimum of one analytical
replicate is recommended, the exact number to be determined by the project
manager. If more than five but less than 20 samples are submitted, at least one
analytical replicate should be analyzed. A general rule is one analytical replicate
for every batch of up to 20 samples analyzed together (e.g., U.S. EPA 1990b).
¦ Field replicates measure sampling variability. These samples are collected at the
same time and location as other samples and are submitted for analysis along with
the other samples. Field replicates should be coordinated with analysis of
laboratory replicates so that both sampling variability and analytical variability can
be measured for the same station. The project manager or coordinator usually
determines the frequency with which field replicates are collected and sent to the
laboratory. If funds are limited, a single laboratory replicate to measure
analytical variability is preferred over a field replicate.
¦ Blind replicates are samples submitted to the laboratory without the laboratory's
prior knowledge. Data from these blind replicates can be used to detect potential
laboratory bias when compared to data from the analysis of analytical replicates.
In this manner, blind replicates can serve as laboratory quality control samples.
However, the results for these samples are subject to errors introduced by the
process of splitting the sample and by preservation, transportation, and storage
procedures as well as analytical errors. Analysis of one set of blind replicates
should be performed whenever 20 or more samples are submitted. At least one
triplicate set is recommended for analysis of more than 20 samples.
COMMON ANALYTICAL METHODS
Gas Chromatography
Gas chromatography is a technique used to separate a complex mixture of organic materials
(for example, an extract of oil or smoke, which may contain hundreds, even thousands, of
compounds) into its components. To do this, the sample extract is injected into a heated
chamber, in which the mixture of compounds is concentrated at the head of a separating
column. The mixture is then carried through the column by an inert gas (called the mobile
phase). As the column is heated, the analytes pass through absorbent materials (called the
stationary phase). Different analytes move at different rates and appear one after another, along
with any interfering substances for a particular analyte, at the effluent end of the column. Here
they are measured by a detector. The detector sends information as an electronic signal to an
integrator, chart recorder, or computer. The signals are then interpreted and presented
graphically in the form of a chromatogram and digitally as a quantification report.
Using the chromatogram and the digital information contained in the quantification report,
many analytes contained in the sample can be accurately identified and quantified. Several
different gas chromatograph/detector combinations are commonly used for the analysis of
volatile and semivolatile organic compounds, which include pesticides and polychlorinated
biphenyls (PCBs). Three of these combinations are described in the following sections.
Gas Chromatography/Mass Spectrometry
Gas chromatography/mass spectrometry (GC/MS) enables positive identification of a
compound that has eluted from a gas chromatographic column. In the GC/MS chamber,
separated compounds are bombarded by electrons and broken into characteristic fragments called
ions. The mass of the charged ions (i.e., their molecular weight) can be sensed by a detector
that accumulates data on ionization current over a wide range of masses. The more ions of a
particular mass, the greater the ionization current that is recorded for that mass. At any one
time, the relative intensity of this current over all the different masses recorded for a particular
compound gives rise to its mass spectrum (Figure B-1). The pattern of fragmentation ions in
a mass spectrum is used to distinguish one compound from another. In addition, the intensity
of the current recorded for one characteristic ion over time gives rise to its mass chromatogram,
which is used to quantify the concentration of the analyte as it elutes from the gas
chromatograph. This characteristic ion is called the quantification ion. The mass chromatograms for all
ions detected can be superimposed into a reconstructed ion chromatogram (RIC), also called a
total ion chromatogram. The RIC is a graphic display of the total ionization current resulting
from all mass fragments for all compounds detected from the start to the finish of the analysis.
The RIC can be compared to the chromatograms produced by other detectors and provides an
indication of the relative composition of components in the sample mixture analyzed by GC/MS.
The mass spectrometer is a selective detector that allows for the positive identification of many
compounds. Other kinds of detectors may be more sensitive in detecting PCBs and other
chlorinated compounds.
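The relationship between mass chromatograms and the RIC can be shown in a few lines of code. The sketch below (Python; an illustration only, with invented scan data) represents each scan as a mapping from m/z to ion current; the RIC is the per-scan sum over all masses, and a mass chromatogram tracks one quantification ion.

    def reconstructed_ion_chromatogram(scans):
        # Total ion current per scan, summed over all m/z values.
        return [sum(spectrum.values()) for spectrum in scans]

    def mass_chromatogram(scans, quantification_ion):
        # Ion current of one characteristic m/z, tracked across scans.
        return [spectrum.get(quantification_ion, 0.0) for spectrum in scans]

    # Invented scans; m/z 228 is the quantification ion for benz(a)anthracene.
    scans = [{228: 150.0, 114: 40.0}, {228: 900.0, 114: 220.0}, {228: 180.0}]
    print(reconstructed_ion_chromatogram(scans))   # [190.0, 1120.0, 180.0]
    print(mass_chromatogram(scans, 228))           # [150.0, 900.0, 180.0]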
[Figure B-1. Example mass spectrum for benz(a)anthracene (quantification mass = 228) identified in a sample sediment extract (upper) and the authentic spectrum stored in a computerized GC/MS library (lower).]
Gas Chromatography/Electron Capture Detection
Gas chromatography/electron capture detection (GC/ECD) is useful for detecting analytes
such as pesticides, PCBs, and other similarly structured chemical compounds. The ECD
measures the total concentration of a chemical in a sample, but it cannot distinguish one
individual chemical from others. Verification of individual chemicals is accomplished by
comparing the order in which the chemicals appear (called the elution order) and the time that
passed before they appeared (called the retention time) with the elution orders and retention
times of certain analytical standards. The identity of a chemical is verified when the elution
orders and retention times match on two columns of different stationary phases. This verifica-
tion technique, called dual dissimilar column confirmation, is useful because two chemicals that
may have the same elution orders and retention times on one column will have different
characteristics on the second column.
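A minimal sketch of that confirmation logic follows (Python; the 0.05-minute tolerance window is an invented value an analyst would set, not a published limit). A compound is confirmed only when its retention time matches the standard on both dissimilar columns.

    def confirmed_on_both_columns(rt_sample, rt_standard, window=0.05):
        # rt_sample and rt_standard are (column 1, column 2) retention
        # times in minutes; window is the allowed difference per column.
        return all(abs(s - ref) <= window
                   for s, ref in zip(rt_sample, rt_standard))

    # A match on column 1 alone does not confirm the identification:
    print(confirmed_on_both_columns((12.31, 17.88), (12.30, 17.90)))  # True
    print(confirmed_on_both_columns((12.31, 16.52), (12.30, 17.90)))  # False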
Gas Chromatography/Flame Ionization Detection
Gas chromatography/flame ionization detection (GC/FID) can be used to detect organic
compounds that can be converted to ions during exposure to flame. This kind of detector is
especially sensitive to molecules that contain carbon and hydrogen, just as the GC/ECD is
especially sensitive to molecules containing chlorine. Since the flame ionization detector also
cannot distinguish between individual chemicals, dual dissimilar column confirmation must be
also performed for each sample analyzed. Related detectors that use flame for analyzing organic
samples include the nitrogen flame ionization detector (NFID), which is especially sensitive to
nitrogen- and phosphorus-containing molecules, and the flame photometric detector (FPD),
which is especially sensitive to organophosphorus pesticides and other compounds containing
sulfur.
PACKED VS. CAPILLARY COLUMNS
Different kinds of separating columns will yield different results. Packed columns
have been used routinely in the past for the analysis of PCBs, pesticides, and volatile
organic compounds. Packed columns produce chromatograms of fairly low resolution,
although the results may be reproducible (i.e., precise). However, a large quantity of the
sample extract can be analyzed without overloading the instrument. More exacting analysis
is afforded by either megabore capillary or fused silica capillary columns. Pesticides and
PCBs can now be routinely analyzed using megabore columns. Analysis of volatile organic
compounds can be conducted on capillary columns. However, since the entire sample
purge is used for volatile analyses, a packed column with high loading capacity may still
be preferred if high resolution is not essential. If project results are dependent on detailed
recognition of contaminant mixtures (as is the case with PCBs and toxaphene), laboratories
equipped with capillary columns should be selected to perform analytical tasks.
High Pressure Liquid Chromatography
Like gas chromatography, high pressure liquid chromatography (HPLC) is a technique used
to separate a complex mixture into its component compounds. The compounds are carried as
a liquid through solid absorbent phases and are sensed at the effluent end of the column by a
specialized detector sensitive to, for example, ultraviolet, fluorescence, or infrared signals. This
technique (described in EPA's laboratory manual Test Methods for Evaluating Solid Waste) is
useful for analyzing polycyclic aromatic hydrocarbon (PAH) compounds in samples because
many interferents on other instruments do not emit ultraviolet or fluorescent spectra, increasing
the sensitivity of the ultraviolet/fluorescence detector to many PAH compounds. However, some
compounds of interest also do not emit these characteristic spectra. It is for this reason that
EPA's Contract Laboratory Program statement of work for organic analysis recommends GC/MS
over HPLC using ultraviolet/fluorescence detectors. However, HPLC can be useful as a way
to screen samples for PAH contamination. Because it removes some interferents and separates
the sample into components that can be individually collected and analyzed, HPLC can also be
used as a powerful cleanup technique.
Atomic Absorption Spectrometry
Two basic methods of spectrometry are commonly used to identify and measure
concentrations of metals in a sample. Using the first method, atomic absorption spectrometry, the
digested sample is vaporized then exposed to a light source emitting a spectrum characteristic
of the target analyte. A portion of the light is absorbed by the analyte in the sample. The
remaining light is measured by a photoelectric detector and assigned a numerical value. Since
the intensity of light absorbed by the sample is proportional to the quantity of the target analyte
present in the light's path, this value represents the concentration of a metal in the sample.
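Because absorbance is taken to be proportional to concentration, a sample concentration can be read from a calibration line fitted to standards. The sketch below (Python) is a generic least-squares illustration with invented standard concentrations and absorbances, not values from any method cited here.

    def fit_line(concentrations, absorbances):
        # Least-squares slope and intercept for the calibration line.
        n = len(concentrations)
        mx = sum(concentrations) / n
        my = sum(absorbances) / n
        sxy = sum((x - mx) * (y - my)
                  for x, y in zip(concentrations, absorbances))
        sxx = sum((x - mx) ** 2 for x in concentrations)
        slope = sxy / sxx
        return slope, my - slope * mx

    slope, intercept = fit_line([0.0, 10.0, 20.0, 40.0],
                                [0.002, 0.105, 0.210, 0.415])
    sample_absorbance = 0.160
    print((sample_absorbance - intercept) / slope)   # concentration, ug/L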
Several different forms of atomic absorption are frequently used:
¦ Graphite furnace atomic absorption (GFAA) spectrometry determinations are
completed as single element analyses. With this technique, sample digestates are
vaporized in an electrically heated graphite furnace. The furnace allows for
gradual heating of the digestates in several stages, allowing an experienced analyst
to remove unwanted matrix components and select the optimum final temperature
for the metal being analyzed. The major advantage of this technique is that it
affords extremely low detection limits, particularly essential in the analysis of
arsenic, cadmium, selenium, or lead. Samples must be relatively clean for GFAA
to produce usable data.
¦ Hydride generation atomic absorption (HGAA) spectrometry uses a chemical
reaction to separate arsenic or selenium selectively from a sample digestate. This
technique removes these two elements from the sample matrix, minimizing
interferences and improving instrument sensitivity.
¦ Cold vapor atomic absorption (CVAA) spectrometry uses a chemical reaction to
release mercury from the digestate as a vapor, which is then analyzed by atomic
absorption. This method should be used whenever analysis of mercury in samples
is required.
¦ Flame atomic absorption (FLAA) spectrometry determinations are normally
completed as single element analyses, following exposure of the vaporized
samples to either a nitrous oxide/acetylene or air/acetylene flame. Data produced
using this technique are relatively free of interferents; however, instrument
sensitivity is not as great as with other forms of atomic absorption.
Inductively Coupled Argon Plasma Emission Spectrometry
The second widely used and cost-effective form of spectrometry is inductively coupled
argon plasma emission spectrometry (ICP). Using ICP, the digested sample is first turned into
an aerosol, then subjected to extremely high temperatures within the instrument. The high
temperature ionizes the atoms, which produce ionic emission spectra uniquely characteristic of
specific metals. The wavelengths of these spectra can then be used to identify one or many
different metals in the sample, while the intensity of light can be used to determine metals
concentrations.
The primary advantage of ICP is that it allows simultaneous or rapid sequential
determination of many different metals, reducing the time and costs of individual metals analyses. The
primary disadvantage of ICP, however, is its lower degree of sensitivity. The detection limit
associated with ICP analysis is often higher than the detection limit that can be obtained through
the use of a graphite furnace or several other forms of atomic absorption spectrometry.
Although all ICP instruments use high-resolution optics and background corrections to minimize
interferences, analysis for traces of metals in the presence of a large excess of a single metal can
be difficult. Spectrometric data are reliable only if the analyte concentrations in the digestate
are 5-10 times greater than the instrument detection limit. When concentrations are lower than
this value for ICP analysis (as is often the case, for example, with samples containing arsenic
or lead), then GFAA should be used. A relatively new method of detection is the use of
combined inductively coupled plasma-mass spectrometry (ICP/MS), which not only allows for
simultaneous determination of many different metals, but also can achieve detection limits
comparable to those of graphite furnace techniques.
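The 5-10 times rule above translates directly into a screening check. The following sketch (Python; the IDL value and the factor of 5, the lenient end of the stated range, are illustrative assumptions) flags analytes whose expected digestate concentrations are too close to the ICP detection limit.

    def reliable_by_icp(expected_concentration, icp_idl, factor=5.0):
        # Spectrometric data are treated as reliable only when the
        # digestate concentration is at least 'factor' times the IDL.
        return expected_concentration >= factor * icp_idl

    # Hypothetical lead result (ug/L) against a hypothetical ICP IDL:
    if reliable_by_icp(expected_concentration=30.0, icp_idl=40.0):
        print("ICP result acceptable")
    else:
        print("Use GFAA or ICP/MS for this analyte")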
APPENDIX C
Example Statements of Work
for the Laboratory
PREFACE
This appendix contains the following two kinds of statements of work as examples for
project managers:
1. An example completed statement of work for the procurement of organic compound
analyses in soil and tissue samples (see pp. C-1 through C-6).
2. A generic statement of work for any kind of chemical analysis in any kind of
sample matrix (see pp. C-7 through C-11).
The first example shows how an actual statement of work for a specific set of analyses might
appear. The second example is prepared in the same format as the first example, but specifics
of the required analyses have been replaced with "fill-in-the-blank" instructions.
EXAMPLE STATEMENT OF WORK
LABORATORY ANALYTICAL SERVICES, INCORPORATED
The following tasks shall be performed by Laboratory Analytical Services, Inc. (LASI) as
extensions to work identified as part of Contract No. 00-00-0000 between Consultant, Inc. (CONS)
and the U.S. Environmental Protection Agency. Where this statement of work (SOW) differs from
the referenced methods, this SOW shall take precedence. The QA Coordinator for this project is
Mary Doe (555/555-1111). The LASI contact is John Smith (555/555-2222).
SUMMARY OF ANALYSES
LASI shall perform quantitative analyses for the organic compounds listed in Table 1. The
analyses shall be in accordance with quality assurance/quality control (QA/QC) requirements
specified in the QA/QC project plan for the Waste Site survey for the samples identified in the
following sections.
Analysis of Soil Samples
The following list comprises the maximum number of soil samples that will be analyzed by
LASI:
Analysis                          Number of      Price Per        Total
                                  Analyses       Analysis         Price

Semivolatile compounds(a)
  Sediments                           40            $470         $18,800
  Matrix spike                         2             470             940
  MS duplicate                         2             470             940
  Reference material                   1             470             470

Pesticides/PCBs
  Sediments                           40            $170          $6,800
  Matrix spike                         2             170             340
  MS duplicate                         2             170             340

TOC
  Sediments                           40             $40          $1,600
  Duplicates                           2              40              80

Analytical Total - Soil                                          $30,310

(a) All semivolatile compound analyses shall include a GC/MS scan for tentatively
identified compounds as described in the Analytical Procedures section of the QA
project plan for this survey.
Analysis of Tissue Samples
The following list comprises the maximum number of tissue samples that will be analyzed by
LASI:
Analysis                          Number of      Price Per        Total
                                  Analyses       Analysis         Price

Semivolatile compounds
  Fish tissue
  (including lipid content)           35            $550         $19,250
  Matrix spikes                        2             550           1,100
  MS duplicates                        2             550           1,100
  Reference material                   1             550             550

Pesticides/PCBs
  Fish tissues                        35            $170          $5,950
  Matrix spikes                        2             170             340
  MS duplicates                        2             170             340
  Reference material                   1             170             170

Analytical Total - Tissues                                       $28,800
Additional Supplies and Services

  50 one-liter wide-mouth jars prepared for organic analyses          $250
  All sample containers required for collection of sediments,
    plus additional jars for archiving twenty sediment samples    no charge
  Preparation of four floppy diskettes for analytical data(a)         $100

  Additional Total                                                    $350

(a) Floppy diskettes containing all analytical data in EPA/CLP format shall be delivered by
LASI at the conclusion of the project. The diskettes will include, at a minimum, all
necessary files for review of the organic compound analyses.
Price Summary
The services associated with analytical tasks shall be provided at the specified unit rates, where
appropriate, at a price not to exceed $59,460.
SAMPLE DELIVERY AND STORAGE
Sampling will take approximately 21 days to complete and will begin approximately 10 August
1990. Frozen samples will be provided to LASI by CONS at the end of the sampling period.
Delivery will be made by car. Samples will be provided to LASI no earlier than 14 August 1990.
This schedule is contingent on sampling conditions and is subject to change. CONS will keep LASI
apprised of changes in the sample delivery schedule.
Sediment samples received by LASI will consist of homogenized sediment in jars and bottles.
Fish samples will consist of frozen whole fish bodies (minus head and internal organs). LASI shall
prepare muscle tissue homogenates from these fish samples. All samples shall be stored frozen as
specified in the QA project plan and analyzed as soon as possible after receipt. If LASI exceeds
the holding time requirements in the QA project plan (7 days for extraction of refrigerated water
samples; 14 days for frozen sediment and tissue samples), penalties will be imposed of up to 50
percent of the analysis costs plus any additional costs incurred by CONS, LASI, or other parties as
a result of this violation, including total costs of resampling efforts. Unused portions of samples
shall be retained until CONS provides EPA's instructions for final disposition. All samples shall
be maintained under strict chain of custody at all times, including documentation of any transfers
among facilities.
TURNAROUND TIME
Complete deliverable packages shall be due to CONS 30 days from the first business day
following receipt of the last sample for each matrix type. Data received by CONS after this time
period shall be subject to a penalty (2 percent per day reduction in the unit rate) for each day that
the data are late, to a maximum of 50 percent of the total price.
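To illustrate how this clause operates (an editorial sketch in Python, not contract language), the penalized unit rate falls 2 percent per day late and bottoms out at half the original rate:

    def penalized_unit_rate(unit_rate, days_late):
        # 2 percent reduction per day late, capped at 50 percent.
        reduction = min(0.02 * max(days_late, 0), 0.50)
        return unit_rate * (1.0 - reduction)

    print(penalized_unit_rate(470.0, 10))   # 376.0 (20 percent reduction)
    print(penalized_unit_rate(470.0, 40))   # 235.0 (capped at 50 percent)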
DELIVERABLES
Deliverable requirements are summarized in Attachment 1.
METHODS
Analyses shall be performed in accordance with analytical procedures specified in the QA
project plan, which conform to methods in the EPA Contract Laboratory (CLP) Statement of Work
(U.S. EPA 1988) as modified by the PSEP guidelines (PSEP 1989). Specific modifications are
discussed in the following sections.
Semivolatile Organic Compounds
The analysis of semivolatile compounds, including acid/base/neutral (ABN) extractable
compounds, PCBs, and pesticides shall follow modified EPA/CLP protocols that are consistent
with the low detection limits recommended by PSEP (1989). Recovery shall be monitored using CLP
surrogate compounds, but no recovery corrections will be applied to the data. Limits of detection
for semivolatile compounds shall be 10-50 µg/kg dry weight (DW) for samples without substantial
interferences. In order to attain these lower detection limits, modifications to CLP are necessary.
These include the use of a larger sample size (approximately 100 grams), a smaller extract volume
for GC/MS (0.5 mL), gel permeation chromatography (i.e., EPA Method 3640) as necessary, and
elemental sulfur cleanup (i.e., EPA Method 3660) to reduce interferences. Care will be taken to
ensure that any mechanical losses during optional GPC cleanup are minimized because recovery
corrections will not be applied to these data.
Ultrasonic extraction will be carried out as described by the CLP procedure, which is
equivalent to EPA Method 3550. The ABN fraction shall be analyzed by GC/MS. For samples
appearing oily or otherwise highly contaminated, LASI may extract <100 g to preclude excessive
extract dilution. However, if such samples prove to be far less contaminated than anticipated (i.e.,
most compounds are undetected at detection limits above 50 µg/kg dry weight), then LASI shall
assume the cost of reextracting the samples using 100 g.
The pesticide and PCB extract shall be processed using alumina column chromatography (i.e.,
EPA Method 3610) followed by GC/ECD (i.e., EPA Method 8080). Quantification and
confirmation of pesticides and PCBs shall be by megabore column. In addition to the standard CLP
surrogate (i.e., dibutylchlorendate), 4,4'-dibromooctafluorobiphenyl (DBOFB) shall be added to
monitor recovery on a sample-by-sample basis. Alternative PCB surrogates may be substituted with
approval of the QA Coordinator. PCBs and pesticides shall be confirmed by GC/MS (at no
additional charge) when detected at sufficient concentration. The ABN extract will be used for
these confirmation analyses.
Tissues
The same general approach will be used for tissues as for sediments, although the optional
GPC cleanup step shall be required for all tissue samples. The extract volume shall be reduced
(e.g., 0.5 mL for GC/MS) and the injection volume will be increased (e.g., 2 µL for GC/MS) to
the extent possible to enhance detection limits. Detection limits for tissue samples are expected to
be approximately 20-100 ppb (wet weight), which conform to PSEP guidelines.
In addition to analysis of chemicals in tissues (Table 1), lipid content will be determined
gravimetrically by weighing an aliquot of the tissue extract prior to GPC cleanup. If an
electrobalance is available, a 1/10,000 aliquot is recommended; if a less sensitive balance must be
used, no more than a 1/40 aliquot will be analyzed.
QA/QC REQUIREMENTS
LASI will designate the samples to be analyzed as matrix spikes and matrix spike duplicates,
ensuring that sufficient material is available for analysis. Analytical variability will be assessed by
evaluating these QC samples. LASI shall inform CONS of these designations prior to analysis.
Matrix Spikes and Matrix Spike Duplicates
The PSEP warning limit for semivolatile compounds and pesticides/PCBs spiked only with
EPA/CLP surrogate compounds is 50 percent recovery, and it will be enforced as an action limit for this
project. CONS shall be notified prior to data delivery if these limits are not met; at that time the
laboratory will recommend alternative corrective action based on best professional judgment
concerning potential matrix effects. Exception to these limits shall be made at the sole discretion
of CONS after consulting with LASI on a case-by-case basis. Reanalyses required because
laboratory error produced recoveries outside the action limits are mandatory but not billable.
CONS will assess the need for other reanalyses or, in accordance with PSEP guidelines, will
qualify such data as estimates on a consistent basis for all analytes and samples.
The PSEP action limit for replicate analyses is 50 percent coefficient of variation between
measurements, or a factor of 2 for duplicates. This action limit shall be applied to the matrix spike
and matrix spike duplicate analyses. If this action limit is exceeded for more than two chemicals,
an additional replicate analysis of the (spiked) sample is required unless otherwise specified by
CONS. This additional replicate analysis is billable only by permission of the CONS Project
Manager.
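The following sketch (Python; an editorial illustration of the limits just stated, not part of the SOW) applies the factor-of-2 test to duplicates and the 50 percent coefficient-of-variation test to larger replicate sets:

    import statistics

    def exceeds_replicate_action_limit(results):
        # Duplicates: results must agree within a factor of 2.
        if len(results) == 2:
            lo, hi = sorted(results)
            return hi > 2.0 * lo if lo > 0 else True
        # Larger sets: coefficient of variation must not exceed 50 percent.
        cv = 100.0 * statistics.stdev(results) / statistics.mean(results)
        return cv > 50.0

    print(exceeds_replicate_action_limit([100.0, 150.0]))   # False
    print(exceeds_replicate_action_limit([100.0, 230.0]))   # True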
Initial and Ongoing Calibration
All calibrations and action limits are as described by PSEP guidelines (1989), which generally
conform to EPA CLP requirements. Initial calibration shall be within 30 percent coefficient of
variation for the relative response factors. Ongoing calibrations shall be within 25 percent of the
initial calibration. Compounds for which these limits apply are specified in Table 9 of the PSEP
guidelines (PSEP 1989). An additional standard shall be analyzed near the detection limit (e.g., a
sample concentration equivalent to approximately 1-5 ng on-column for many compounds on
GC/MS) to provide evidence of the ability to report estimated quantities in the low concentration
range. The use of this standard in the calibration curve is not recommended because of random
error associated at concentrations near the detection limit. A three-point PCB calibration with
Aroclor 1254 or 1260 shall be required (in addition to that for pesticides) for confirmation of
linearity. The number of calibration points is consistent with EPA CLP and PSEP guidance.
Blanks
If PSEP action limits for blanks are exceeded, analyses must be halted and the source of
contamination eliminated or reduced. The QA Coordinator must be notified of the situation
immediately. Corrective action, which may include reanalysis of affected samples at no cost to
CONS, will be determined in consultation with the QA Coordinator (see Responsibilities section).
PROGRESS REPORTS AND AUDITS
A verbal progress report to the QA Coordinator shall be required every two weeks for the
duration of the project. CONS may conduct onsite audits of the facility during the analysis period
to assess implementation of QA/QC requirements. LASI shall maintain records to support an audit
of the technical quality of all analyses and shall disclose all such records to CONS upon request.
RESPONSIBILITIES
Payment shall be on a per-sample-completed basis as specified in the Summary of Analyses
section. Invoices shall be processed following verification of receipt by CONS of acceptable data
that are approved by the QA Coordinator. Analytical results that exceed action limits specified in
this Statement of Work shall be verified by the laboratory prior to delivery of data. Corrective
actions will be discussed with the QA Coordinator. In special circumstances, exceptions for
individual analyses that do not meet these criteria may be made at the sole discretion of the
QA Coordinator. All such exceptions shall be documented in writing.
ATTACHMENT 1. DELIVERABLES
The following deliverables will be provided by LASI for all analyses to enable independent
QA/QC review of the data package.
I. CASE NARRATIVE
The case narrative must contain the contract number; a summary of any QC, sample, shipment,
and analytical problems; and documentation of all problems encountered and their resolution.
Be as specific and detailed as necessary.
II. QC DOCUMENTATION
Documentation of QC information will include the following items:
A. Results for all of the QC checks initiated by the laboratory
B. Calibration and preparation blanks
C. Matrix spike and matrix spike duplicate results
D. Summary of any anomalies in instrument performance or unusual instrumental
adjustments.
III. SAMPLE DATA
Data packets for samples will include the following information:
A. Samples should be arranged in packets with the packaging list and signed chain-of-
custody forms, then the summary data sheet, followed by the raw data
B. Tabulated results shall be in units as specified for each matrix, validated and signed in
original by the laboratory manager.
IV. STANDARDS DATA
Data packets for standards will include the following information:
A. Tabulated instrument detection limits and limits of detection for the total analytical
procedure (as defined in the QA project plan)
B. Initial and ongoing calibration results.
V. FLOPPY DISKETTE (CLP FORMAT) OF ALL DATA
STATEMENT OF WORK
INSERT LABORATORY NAME
The following tasks shall be performed by Laboratory Name (LAB) as extensions to work
identified as part of Contract No. 00-00-0000 between Consultant or Agency (CONS) and the U.S.
Environmental Protection Agency. Where this statement of work (SOW) differs from the refer-
enced methods, this SOW shall take precedence. The QA Coordinator for this project is QA NAME
(Phone Number). The LAB contact is Contact Name (Phone Number).
SUMMARY OF ANALYSES
LAB shall perform quantitative analyses for the Chemical Classes listed in Table 1. The
analyses shall be in accordance with quality assurance/quality control (QA/QC) requirements
specified in the QA/QC project plan for the Survey Name for the samples identified in the
following sections.
Analysis of Matrix-1 Samples
The following list comprises the maximum number of Matrix-1 samples that will be analyzed
by LAB:
Analysis                          Number of      Price Per        Total
                                  Analyses       Analysis         Price

Analysis I
  Matrix-1                            xx            $xxx         $xx,xxx
  QC Sample I                          x             xxx             xxx
  QC Sample II                         x             xxx             xxx
  QC Sample III                        x             xxx             xxx
  QC Sample IV                         x             xxx             xxx

Analysis II
  Matrix-1                            xx            $xxx         $xx,xxx
  QC Sample I                          x             xxx             xxx
  QC Sample II                         x             xxx             xxx
  QC Sample III                        x             xxx             xxx
  QC Sample IV                         x             xxx             xxx

Analysis III
  Matrix-1                            xx            $xxx         $xx,xxx
  QC Sample I                          x             xxx             xxx
  QC Sample II                         x             xxx             xxx
  QC Sample III                        x             xxx             xxx
  QC Sample IV                         x             xxx             xxx

Analytical Total - Matrix-1                                      $xx,xxx
Analysis of Matrix-2 Samples
The following list comprises the maximum number of Matrix-2 samples that will be
analyzed by LAB:
Analysis                          Number of      Price Per        Total
                                  Analyses       Analysis         Price

Analysis I
  Matrix-2                            xx            $xxx         $xx,xxx
  QC Sample I                          x             xxx             xxx
  QC Sample II                         x             xxx             xxx
  QC Sample III                        x             xxx             xxx
  QC Sample IV                         x             xxx             xxx

Analysis II
  Matrix-2                            xx            $xxx         $xx,xxx
  QC Sample I                          x             xxx             xxx
  QC Sample II                         x             xxx             xxx
  QC Sample III                        x             xxx             xxx
  QC Sample IV                         x             xxx             xxx

Analysis III
  Matrix-2                            xx            $xxx         $xx,xxx
  QC Sample I                          x             xxx             xxx
  QC Sample II                         x             xxx             xxx
  QC Sample III                        x             xxx             xxx
  QC Sample IV                         x             xxx             xxx

Analytical Total - Matrix-2                                      $xx,xxx
Additional Supplies and Services
Provide a line-itemized list of additional services required xxx
Additional Total $xxx
Price Summary
The services associated with analytical tasks shall be provided at the specified unit rates, where
appropriate, at a price not to exceed $xx,xxx.
SAMPLE DELIVERY AND STORAGE
Sampling will take approximately xxx days to complete and will begin approximately Date of
Survey. Samples will be provided to LAB by Sampling Firm at the end of the sampling period.
Delivery will be made by Delivery Method. Samples will be provided to LAB no earlier than
Earliest Date. This schedule is contingent on sampling conditions and is subject to change. CONS
will keep LAB apprised of changes in the sample delivery schedule.
Insert description of sample type and condition on delivery (e.g., fresh, frozen). All samples
shall be stored as specified in the QA project plan and analyzed as soon as possible after receipt.
If LAB exceeds the holding time requirements in the QA project plan (specify for each matrix
type), penalties will be imposed of up to 50 percent of the analysis costs plus any additional costs
incurred by CONS, LAB, or other parties as a result of this violation, including total costs of
resampling efforts. Unused portions of samples shall be retained until CONS provides EPA's
instructions for final disposition. All samples shall be maintained under strict chain of custody at
all times, including documentation of any transfers among facilities.
TURNAROUND TIME
Complete deliverable packages shall be due to CONS xx days from the first business day
following receipt of the last sample for each matrix type. Data received by CONS after this time
period shall be subject to a penalty (2 percent per day reduction in the unit rate) for each day that
the data are late, to a maximum of 50 percent of the total price.
DELIVERABLES
Deliverable requirements are summarized in Attachment 1.
METHODS
Insert a description of the analyses to be performed, or reference a suitable description in the
QA project plan. Specifically identify any modifications to standard procedures in this statement
of work.
QA/QC REQUIREMENTS
Identify how known QC samples will be chosen by the laboratory. LAB shall inform CONS
of these designations prior to analysis.
List requirements for each kind of QC check, including calibrations, matrix spike samples,
reference materials, replicate samples, and blanks. Specify all action limits and the corrective
actions that will be imposed.
Exception to action limits shall be made at the sole discretion of CONS after consulting with
LAB on a case-by-case basis. Reanalyses required because laboratory error produced
recoveries outside the action limits are mandatory but not billable. CONS will assess the need
for other reanalyses or, in accordance with available guidelines, will qualify such data as
estimates on a consistent basis for all analytes and samples.
PROGRESS REPORTS AND AUDITS
A verbal progress report to the QA Coordinator shall be required every Reporting Time for
the duration of the project. CONS may conduct onsite audits of the facility during the analysis
period to assess implementation of QA/QC requirements. LAB shall maintain records to support
an audit of the technical quality of all analyses and shall disclose all such records to CONS upon
request.
RESPONSIBILITIES
Payment shall be on a per-sample-completed basis as specified in the Summary of Analyses
section. Invoices shall be processed following verification of receipt by CONS of acceptable data
that are approved by the QA Coordinator. Analytical results that exceed action limits specified in
this Statement of Work shall be verified by the laboratory prior to delivery of data. Corrective
actions will be discussed with the QA Coordinator. In special circumstances, exceptions for
individual analyses that do not meet these criteria may be made at the sole discretion of the
QA Coordinator. All such exceptions shall be documented in writing.
ATTACHMENT 1. DELIVERABLES
The following deliverables will be provided by LAB for all analyses to enable independent
QA/QC review of the data package.
I. CASE NARRATIVE
The case narrative must contain the contract number; a summary of any QC, sample, shipment,
and analytical problems; and documentation of all problems encountered and their resolution.
Be as specific and detailed as necessary.
II. QC DOCUMENTATION
Documentation of QC information will include the following items:
A. Results for all of the QC checks initiated by the laboratory
B. Calibration and preparation blanks
C. Matrix spike and matrix spike duplicate results
D. Summary of any anomalies in instrument performance or unusual instrumental
adjustments.
III. SAMPLE DATA
Data packets for samples will include the following information:
A. Samples should be arranged in packets with the packaging list and signed chain-of-
custody forms, then the summary data sheet, followed by the raw data
B. Tabulated results shall be in units as specified for each matrix, validated and signed in
original by the laboratory manager.
IV. STANDARDS DATA
Data packets for standards will include the following information:
A. Tabulated instrument detection limits and limits of detection for the total analytical
procedure (as defined in the QA project plan)
B. Initial and ongoing calibration results.
V. INSERT ADDITIONAL MATERIALS REQUIRED
APPENDIX D
Example Quality Assurance
Report
PREFACE
The following example of a detailed QA review for a metals data package
demonstrates the kind of information provided by QA specialists. The sections
of this example report address each of the components of a QA review discussed
in Chapter VI of the guidance manual (Evaluating Data from the Laboratory, see p. 74).
INTRODUCTION
This report documents the results of a quality assurance review of analytical
data for metals in water samples from Project X. This quality assurance report
is provided in support of the quality assurance project plan for this project.
All laboratory analyses were performed by Analysis Laboratory in City,
State. All samples were analyzed in accordance with the U.S. Environmental
Protection Agency (EPA) Contract Laboratory Program Statement of Work for
Inorganic Analyses, Multi-media, Multi-concentration. Data validation was
performed according to EPA's Laboratory Data Validation Functional Guidelines
for Evaluating Inorganics Analyses.
The quality assurance review included examination and validation of the
following laboratory data:
¦ Sample digestion and extraction logs
¦ All instrument printouts, except for mercury (the instrument
printout was not available from the laboratory)
¦ Instrument calibration and calibration verification procedures and
results
¦ Sample holding times and custody records
¦ Manual data transcriptions and computer algorithms.
Data qualifiers were assigned as necessary during this review. Following the
validation procedures, data quality was assessed with respect to accuracy, precision,
and completeness. All qualifier codes used in this report are defined in
Table 1.
TABLE 1. DATA QUALIFIER CODES
Qualifiers Applied During Quality Assurance Review
U The analyte was not present above the level of the associated value. The associated numerical
value indicates the approximate concentration necessary to detect the analyte in this sample.
J The analyte was positively identified, but the associated numerical value may not be consistent
with the amount actually present in the field sample. The data should be seriously considered for
decision-making and are usable for many purposes.
UJ The analyte was not present above the level of the associated numerical value. The associated
numerical value may not accurately or precisely represent the concentration necessary to detect
the analyte in this sample.
R The data are unusable for all purposes. The presence or absence of the analyte has not been
verified. Resampling and reanalysis are necessary to confirm or deny the presence of the analyte.
Qualifiers Applied During Laboratory Validation(a)
E The reported value is estimated because of the presence of interference. This qualifier is
commonly used when the serial dilution result for analyses by ICP does not meet control limits.
M Duplicate injection precision was not met.
N Predigestion matrix recovery was not within control limits.
S The reported value was determined by the method of standard additions (MSA). The associated
value is as reliable as unqualified results.
W The postdigestion spike recovery for GFAA(b) analysis was not within control limits (85-115
percent), and the sample absorbance was less than 50 percent of the spike absorbance.
* Duplicate analysis was not within control limits.
+ The reported value was determined by MSA. The correlation coefficient for MSA is <0.995.
(a) Adapted from U.S. EPA (1987).
(b) Graphite furnace atomic absorption spectrometry.
QUALITY ASSURANCE REVIEW
OVERALL CASE ASSESSMENT
All data for metals in the five water samples are acceptable for the uses
specified in the quality assurance project plan as qualified in this review except
the matrix spike result for silver, which was rejected. Data for all samples
analyzed for cadmium, calcium, lead, mercury, silver, and zinc are acceptable as
estimates. Data qualified as J (estimated) are acceptable, but a greater degree of
uncertainty is associated with these values than with unqualified data.
The matrix spike result for silver was rejected because the postdigestion
spike recovery (58 percent) was well below the EPA Contract Laboratory
Program (CLP) control limit (85-115 percent recovery). Analysis of the sample
by the method of standard additions (MSA) is required in this case, but was not
performed.
Calcium values received J qualifiers because the CLP control limit (U.S.
EPA 1987) was exceeded slightly for the serial dilution sample analyzed by
inductively coupled plasma-atomic emission spectrometry (ICP). Reported results
may be underestimated by approximately 10 percent.
Cadmium and lead results received J qualifiers because CLP control limits
for matrix spike recoveries and for duplicate analyses were exceeded. In
addition, the result for lead in Sample 2 was restated as undetected (U) at the
reported concentration because the associated digestion blank was contaminated.
Cadmium and lead data should be considered order-of-magnitude estimates.
Mercury results were qualified J because the matrix spike recovery was
below the CLP control limit. These results may be 100 to 200 percent higher
than reported.
A J qualifier was applied to silver results because recovery of silver was
poor for the laboratory control sample (LCS). Silver results may be approximately
100 percent higher than reported. Additional individual results were qualified
J because the correlation coefficient for the results determined by the method of
standard additions did not meet the CLP control limit of 0.995 (Table 2).
The overall data quality achieved by the laboratory for analyses completed
by ICP (Table 3) is typical for metals analyses in water samples. The overall
data quality for analyses by graphite furnace atomic absorption (GFAA) is typical
for arsenic, chromium, and silver. Data quality for cadmium, lead, and mercury
TABLE 2. RESULTS FOR SAMPLES ANALYZED BY THE METHOD
OF STANDARD ADDITIONS

Sample                        Result      Correlation      Assigned
Number      Analyte           (µg/L)      Coefficient      Qualifiers
                                          for MSA(a)

1           Arsenic            40.0          0.999           S(b)
            Lead               32.8          1.000           S
            Silver             11.1          0.996           S
            Chromium           69.2          1.000           S

2           Silver             28.0          0.993           +J(c)

3           Lead               40.3          0.991           +J
            Silver             40.1          0.994           +J

4           Arsenic            47.6          0.974           +J
            Chromium           67.1          0.983           +J
            Lead               33.0          0.999           S

5           Arsenic             9.9          0.986           +J
            Cadmium            34.6          0.998           S
            Chromium           22.7          0.999           S
            Lead               35.6          0.996           S

(a) The method of standard additions.
(b) The analyte concentration was determined by MSA. The correlation coefficient
is ≥0.995.
(c) The analyte concentration was determined by MSA. The correlation coefficient
is <0.995. The result is an estimate.
TABLE 3. ANALYTICAL METHODS AND INSTRUMENT
DETECTION LIMITS

                                          Instrument
                                          Detection Limit
Analyte         Method of Analysis        (µg/L)

Aluminum        ICP(a)                      55
Arsenic         GFAA(b)                      5
Cadmium         GFAA                         5
Calcium         ICP                         28
Chromium        GFAA                        10
Copper          ICP                         11
Iron            ICP                          9.6
Lead            GFAA                         5
Magnesium       ICP                        140
Manganese       ICP                          1.8
Mercury         CVAA(c)                      0.2
Nickel          ICP                         18
Silver          GFAA                         5
Zinc            ICP                          4

(a) Inductively coupled plasma-atomic emission spectrometry.
(b) Graphite furnace atomic absorption spectrometry.
(c) Cold vapor atomic absorption spectrometry.
is less than may be expected for these analytes in similar samples. Data quality
may have been affected by unstable instrument performance.
COMPLETENESS
A complete data package was submitted by the laboratory for five water
samples, one matrix duplicate and one matrix spike, and one laboratory control
sample and one method blank for each digestion batch. A list of analytes is
included in Table 3. During the quality assurance review, 33 results were
qualified J as discussed above. Data completeness for metals was 100 percent of
total requested analytes.
HOLDING TIMES
Holding times required by EPA CLP protocols were met for all metals
analyses.
ANALYTICAL METHODS
All sample digestion and analysis procedures, instrument calibration
procedures, and quality control checks conformed to EPA CLP requirements except as
noted below.
Sample Preparation and Analysis
Water samples were digested according to requirements specified for the
CLP (U.S. EPA 1987). Sample digestates were analyzed by ICP, GFAA, and
cold vapor atomic absorption spectrometry (CVAA), as indicated in Table 3.
Multiple digestions were prepared for Samples 1, 2, and the duplicate and the
spike of Sample 2, because unacceptably high levels of lead were present in the
second preparation blank and because volumes of digestate were initially
insufficient for all analyses. A preparation blank and a laboratory control sample were
digested and analyzed with each batch. Only lead and arsenic results were
obtained from the second and third digestion batches. Results for all applicable
quality control samples, except the method blank for lead for the third digestion
group, were provided on the appropriate CLP forms by the laboratory or were
added during the quality assurance review.
Instrument Calibration
Instrument calibration was completed according to EPA CLP procedures
(U.S. EPA 1987). Four calibration standards and one blank were used for all
analyses by GFAA. The correlation coefficient of a least-squares linear
regression met the CLP control limit of ≥0.995 (U.S. EPA 1988) in all cases except
one. The correlation coefficient was 0.993 for the initial calibration for analysis
of cadmium in Samples 3 and 5. Consequently, the cadmium results for these
samples were qualified J.
The ICP was calibrated according to manufacturer instructions, using one
standard and one blank. A low-level standard was used to verify accuracy of the
calibration curve at low analyte concentrations for all metals except mercury and
aluminum.
Initial (ICV) and continuing (CCV) calibration check standards and initial
(ICB) and continuing (CCB) calibration blanks were analyzed immediately after
instrument calibration, after every 10 samples or more frequently, and at the
conclusion of each analytical run, with the following exception: no CCV/CCB
pair was analyzed at the conclusion of the ICP run. However, only interference
check samples were analyzed after the final CCV/CCB pair, and data quality was
not affected. Results for all CCVs fell within 90-110 percent of the expected
value (80-120 percent for mercury), as required by EPA CLP. Instrument
calibration remained within control limits for all samples throughout each sample
run and for all other analytes.
Instrument-Specific Quality Control Procedures
ICP—A serial dilution sample is required by EPA CLP protocols to check
for matrix interference in samples analyzed by ICP. All samples analyzed by ICP
were diluted to one fifth of their initial concentration to bring manganese
concentrations within the linear range of the ICP. The laboratory chose to report the
results of diluted Sample 3 on CLP Form 9, ICP Serial Dilutions. A further
serial dilution was required by CLP protocols to obtain a diluted result for
manganese, but was not performed. Results of the serial dilution for iron,
magnesium, nickel, and zinc were within the CLP control limit of 10 percent
difference from the undiluted result. The results for aluminum and copper were
not applicable because the undiluted concentration of these metals was not
sufficiently high. The result for calcium (11 percent difference) exceeded control
limits, with the diluted result (corrected for dilution) exceeding the undiluted
result. All calcium data were qualified E by the laboratory and J during the
quality assurance review. Reported calcium results may have a small negative
bias of approximately 10 percent due to matrix interference.
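The serial dilution check itself is simple arithmetic, shown in the sketch below (Python; the concentrations are invented to reproduce an 11 percent difference like the calcium case above):

    def serial_dilution_percent_difference(undiluted, diluted,
                                           dilution_factor=5.0):
        # Compare the dilution-corrected result with the undiluted result.
        corrected = diluted * dilution_factor
        return 100.0 * abs(corrected - undiluted) / undiluted

    # Hypothetical calcium results, ug/L; more than 10 percent fails.
    print(serial_dilution_percent_difference(50000.0, 11100.0))   # 11.0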
Interference check samples (ICSs) were analyzed at the beginning and end
of the ICP sample run to check for interference by other metals. Results met
CLP control limits in all cases. In order to extend the linear range of the ICP to
accommodate the high analyte concentrations present in the ICSs, a second
calibration curve was obtained for some of the ICS analytes using higher
standards than were used for the sample analyses. The analytical wavelength and
all instrument parameters remained the same. Calibration was verified at the
higher calibration curve as well. Data relating to the higher calibration curve
were labelled "secondary lines" in the raw data.
GFAA—Quality control procedures for GFAA analyses included duplicate
injection of all samples and analysis of a postdigestion analytical spike with each
sample. Results of duplicate injections were spot-checked at a frequency of
approximately 10 percent. All examined duplicate injection results agreed within
20 percent coefficient of variation, as required by CLP protocols.
Recoveries of the analytical spike for numerous samples and analytes did not
meet CLP control limits of 85 to 115 percent. In most cases, these data were
qualified W (analytical spike recovery did not meet control limits and sample
absorbance is less than 50 percent of spike absorbance) by the laboratory, or
MSA was used to analyze the samples as required by CLP protocols. Sample
results obtained by MSA were qualified S by the laboratory if the correlation
coefficient obtained with the MSA results was ≥0.995. Results qualified S are
reliable and are not considered to be estimates. Sample results obtained by MSA
with correlation coefficients <0.995 were qualified + by the laboratory and J
during the quality assurance review. These results are estimates.
A systematic calculation error was made by the laboratory for all sample
results obtained by MSA. The error consisted of the misassignment of axes to
the sample concentration values and to the instrument response values, resulting
in an incorrect value for the slope of the instrument response per added
concentration and consequently for the analyte concentration in the sample. Results
obtained with a poor correlation coefficient showed the largest errors. All results
were corrected during the quality assurance review (Table 2).
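To make the MSA computation and the axis convention concrete, the sketch below (Python; the additions and absorbances are invented) fits response (y) against added concentration (x) and takes the sample concentration as intercept/slope, reporting the correlation coefficient used for the 0.995 control limit:

    import statistics

    def method_of_standard_additions(added, responses):
        # x = concentration added; y = instrument response.
        mx, my = statistics.mean(added), statistics.mean(responses)
        sxy = sum((x - mx) * (y - my) for x, y in zip(added, responses))
        sxx = sum((x - mx) ** 2 for x in added)
        syy = sum((y - my) ** 2 for y in responses)
        slope = sxy / sxx
        intercept = my - slope * mx
        r = sxy / (sxx * syy) ** 0.5
        return intercept / slope, r     # sample concentration, r

    conc, r = method_of_standard_additions([0.0, 10.0, 20.0, 30.0],
                                           [0.120, 0.155, 0.192, 0.226])
    print(round(conc, 1), round(r, 4))  # about 33.8 ug/L, r above 0.995

Reversing the axes, as the laboratory did, inverts the slope and yields a wrong concentration, which is why the misassignment propagated into every MSA result.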
Several errors were made by the laboratory in following the CLP sample
analysis sequence for analyses by GFAA. The analytical spike recoveries of
silver and lead in the first method blank (122 percent and 119 percent recovery,
respectively) exceeded CLP control limits (85-115 percent). According to U.S.
EPA (1987), the problems should have been corrected and acceptable results
should have been generated for the method blank prior to sample analysis. A
qualifier (E) was applied to the silver result for Sample 5 (the only result not
obtained by MSA) by the laboratory because of the high analytical spike recovery
from the blank, but was removed during the quality assurance review because
data qualification is not automatically warranted in this case. All sample results
for lead from the first digestion group were obtained by MSA and were not
qualified by the laboratory or during the quality assurance review.
The matrix spike samples for lead and silver should have been analyzed by
MSA because the analytical spike recoveries were low (74 and 58 percent
recovery, respectively) for these analytes. The initial sample and duplicate (Sample 2)
for silver were analyzed by MSA. The spike results for silver and lead are
estimates.
The analytical spike recovery for lead in Sample 3 was 34 percent. This
sample should have been diluted and reanalyzed (U.S. EPA 1987), but MSA was
performed instead. Samples 2 (duplicate), 5, and 6 were analyzed by MSA
for arsenic and had correlation coefficients below the control limit. These
samples should have been reanalyzed, but were not. The correlation coefficient
for arsenic by MSA in Sample 2 (duplicate) was 0.909, well below the control
limit of 0.995, and the curve generated by the standard additions was exponential
in appearance. This result (45.5 µg/L) was rejected during the quality assurance
review because of the poor correlation coefficient, and the initial result (26.2
µg/L) was accepted as an estimate.
Detection Limits
All reported instrument detection limits (IDLs) were below or equal to the
CLP contract-required detection limits (CRDLs) (Table 3). The IDL for lead by
GFAA was omitted from CLP Form 11, but was subsequently provided by the
laboratory. The IDLs reported for GFAA analytes were estimated by laboratory
personnel based on their experience with the instrument and were not determined
statistically as required by CLP protocols (U.S. EPA 1987). Data were not
qualified for this omission. Based on the quality assurance review of the
original laboratory data, the laboratory's estimated detection limits appeared
to be high; statistically determined detection limits may therefore be lower
than the reported IDLs in many cases.
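For context, a statistically determined detection limit is conventionally taken
as three times the standard deviation of replicate measurements of a low-level
standard, averaged over several runs. The sketch below (Python, with made-up
values) illustrates only the arithmetic, not the full CLP replication schedule:

    from statistics import mean, stdev

    # Three runs of seven replicate measurements of a low-level standard (µg/L).
    runs = [
        [1.1, 0.9, 1.3, 1.0, 0.8, 1.2, 1.1],
        [1.0, 1.2, 0.9, 1.1, 1.3, 1.0, 0.9],
        [0.8, 1.1, 1.0, 1.2, 0.9, 1.1, 1.0],
    ]
    idl = 3 * mean(stdev(run) for run in runs)  # IDL = 3 x average std. deviation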
ACCURACY
The laboratory performed one LCS analysis, using a commercially available
standard prepared specifically for CLP analyses, and one predigestion matrix
spike analysis (Sample 1 for mercury, and Sample 2 for all other analytes).
Recovery of all analytes except silver from the LCS ranged from 84 to 112
percent. Silver recovery was 52 percent (Table 4). CLP control limits for metals
in the LCS are 80-120 percent recovery except for silver, which has no contrac-
tual control limit (U.S. EPA 1987). All results for silver were qualified J during
the quality assurance review because of the poor LCS recovery (U.S. EPA 1988).
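The recovery arithmetic behind Table 4 and the control-limit test are mechanical;
a minimal sketch follows (Python; the measured and true values are illustrative,
chosen only to reproduce the 52 percent silver recovery):

    def percent_recovery(measured: float, true_value: float) -> float:
        # Percent recovery = (measured value / true value) x 100 (Table 4, note a).
        return measured / true_value * 100.0

    recovery = percent_recovery(5.2, 10.0)    # 52 percent (illustrative values)
    in_control = 80.0 <= recovery <= 120.0    # False: qualify associated results J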
Predigestion matrix spike recovery was within control limits (75-125 percent;
U.S. EPA 1987) for all metals except cadmium, lead, mercury, and silver
(Table 5). Recoveries for cadmium and lead (194 and 261 percent recovery,
respectively) were greater than the control limit, and all sample results greater
than the IDL were qualified J during the quality assurance review (U.S. EPA
1988). The only cadmium result not qualified was that for Sample 2, in which
cadmium was not detected. The spike results for both lead and cadmium are
questionable because the matrix
TABLE 4. PERCENT RECOVERY FOR METALS IN LABORATORY CONTROL SAMPLE

Analyte       Percent Recovery(a)
Aluminum      98
Arsenic       105
Cadmium       112
Calcium       99
Chromium      109
Copper        101
Iron          99
Lead          98
Magnesium     99, 84, 93
Manganese     100
Mercury       111
Nickel        97
Silver        52
Zinc          98

a Percent recovery = (measured value / true value) x 100.
TABLE 5. MATRIX SPIKE RECOVERY FOR METALS IN SAMPLE 2

Analyte      Sample Result   Spike Added   Percent
             (µg/L)          (µg/L)        Recovery(a)
Aluminum     310             2,000         97
Arsenic      25              40            89
Cadmium      5 U(b)          5             194
Calcium      --              --            NR(c)
Chromium     69              10            NA(d)
Copper       27              250           103
Iron         7,090           1,000         77
Lead         29              20            261
Magnesium    --              --            NR
Manganese    6,560           500           76
Mercury(e)   0.2 U           1.0           40
Nickel       180             500           106
Silver       28 R(f)         10            --
Zinc         180             500           95

a Percent recovery = (measured value / true value) x 100.
b U = the analyte was not detected at the indicated concentration.
c A matrix spike was not required for this analyte (U.S. EPA 1987).
d The result is not applicable because the sample concentration is greater
  than 4 times the spike concentration.
e Sample 1 was spiked for mercury only.
f The spiked sample result was rejected; the result is not meaningful.
duplicate results for Sample 2 exceeded control limits, so a reliable sample
concentration is not available. The spike sample result for lead is also question-
able because the sample should have been analyzed by MSA, but was not. In
addition, at least one method blank for lead was contaminated (as discussed in the
Blanks section); nonsystematic lead contamination may also have contributed to
the poor replicability of the duplicates and the high spike recovery for lead. All
data were qualified as estimated despite the uncertainty in the matrix spike results
because the magnitude of the control limit exceedance was large for both
analytes.
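For a matrix spike, the recovered amount is conventionally computed under CLP
as the spiked-sample result minus the unspiked-sample result, expressed as a
percentage of the amount added. A minimal sketch (Python; the spiked-sample
value is illustrative, back-calculated to reproduce the 261 percent lead
recovery in Table 5):

    def matrix_spike_recovery(spiked_result: float, sample_result: float,
                              spike_added: float) -> float:
        # Recovered spike = spiked-sample result minus unspiked-sample result,
        # expressed against the amount of spike added.
        return (spiked_result - sample_result) / spike_added * 100.0

    # Lead in Table 5: sample 29 µg/L, spike added 20 µg/L.
    matrix_spike_recovery(81.2, 29.0, 20.0)  # 261 percent; outside 75-125, so
                                             # associated results are qualified J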
All mercury data were qualified J during the quality assurance review
because predigestion spike recoveries (40 and 39 percent, respectively) were
much lower than control limits. Recovery for a postdigestion mercury spike
analyzed for Sample 1 was 38 percent, similar to the predigestion spike result.
This indicates that a matrix interference at the spectrophotometer was probably
responsible for poor recovery. Reported results for mercury may be lower than
the actual sample concentrations.
The matrix spike result reported for silver was lower than the result reported
for the unspiked sample. The analytical spike recovery for the matrix spike
sample was low (58 percent), so the matrix spike sample should have been
analyzed by MSA, but was not. The original and duplicate Sample 2 were
both analyzed by MSA. The matrix spike result for silver was rejected during
the quality assurance review. The matrix spike result for chromium was not
applicable because the sample concentration exceeded 4 times the spike concentra-
tion. The magnitude of the precision error (the control limit is <20 relative
percent difference [RPD]) may be significant with respect to the spike concentra-
tion in this situation, and spike recovery results cannot be clearly interpreted.
Assessment of analytical accuracy was based on the LCS for both silver and
chromium.
PRECISION
Duplicate subsamples of Sample 2 for all metals and Sample 1 for mercury
only were analyzed by the laboratory. Results are summarized in Table 6. All
results except cadmium and lead were within the control limit of 25 RPD (for
sample results > 5 times the CRDL) or ± the CRDL (for results <. 5 times the
CRDL) specified by the EPA. A qualifier (*) was applied to all cadmium and
lead values by the laboratory or during the quality assurance review to indicate
EPA CLP duplicate control limit exceedance, and all cadmium and lead values
were qualified J during the quality assurance review.
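The two-branch duplicate control limit applied above can be expressed compactly.
A minimal sketch (Python; the CRDL of 3 µg/L for lead is assumed here for
illustration):

    def rpd(sample: float, duplicate: float) -> float:
        # Relative percent difference (Table 6, note b).
        return abs(sample - duplicate) / ((sample + duplicate) / 2.0) * 100.0

    def duplicate_in_control(sample: float, duplicate: float, crdl: float) -> bool:
        # For results above 5 x CRDL, the 25 RPD limit applies; otherwise the
        # absolute difference between replicates must be within the CRDL.
        if sample >= 5 * crdl and duplicate >= 5 * crdl:
            return rpd(sample, duplicate) <= 25.0
        return abs(sample - duplicate) <= crdl

    # The lead pair from Table 6, assuming a CRDL of 3 µg/L:
    duplicate_in_control(29.0, 47.0, 3.0)  # False: flag (*) and qualify J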
The arsenic result for Sample 2 (duplicate), as obtained by MSA and
reported by the laboratory, was rejected during the quality assurance review, but
the result obtained initially by direct comparison to the instrument calibration
curve was accepted as estimated (details in the Calibration section). The latter
TABLE 6. DUPLICATE ANALYSIS RESULTS FOR METALS IN SAMPLE 2

Analyte      Sample Result   Duplicate Result   Control     Relative Percent
             (µg/L)          (µg/L)             Limit(a)    Difference(b)
Aluminum     310             308                200         --
Arsenic      25              26                 10          --
Cadmium      5 U(c)          17                 5*(d)       --
Calcium      184,000         180,000            --          2
Chromium     69              78                 --          --
Copper       27              29                 15          --
Iron         7,100           6,700              --          8
Lead         29              47                 --          46*
Magnesium    200,000         190,000            --          3
Manganese    6,600           6,400              --          2
Mercury(e)   0.2 U           0.2 U              0.2         --
Nickel       180             190                40          --
Silver       28              31                 10          --
Zinc         180             190                --          3

a For results less than 5 times the CRDL, the difference between replicate
  sample results must be ≤ the CRDL.
b RPD = |sample - duplicate| / [(sample + duplicate)/2] x 100.
c U = the analyte was not detected at the indicated concentration.
d Results followed by an asterisk (*) exceed CLP control limits.
e Sample 1 was analyzed in duplicate for mercury only.
value was well within control limits, and the former value exceeded the control
limit by less than 1 µg/L. The data qualifier (#) applied by the laboratory to the
arsenic value for Sample 2 was removed during the quality assurance review. No
arsenic data were qualified J.
BLANKS
A method blank and several calibration blanks were analyzed with the
samples for each metal. No contaminant was found in any method blank with
one exception: lead was present (6.1 µg/L) in the method blank prepared with
the second digestion batch. Results for Sample 2 and the duplicate and spike
samples for Sample 2 were reported from this digestion batch. Sample 2 was
qualified U (undetected at the reported concentration) during the quality assurance
review because the sample result (29.4 µg/L) was < 5 times the concentration
in the method blank (U.S. EPA 1988). According to the laboratory worksheets
for lead, the method blank prepared with the third digestion batch also contained
lead (105 µg/L); however, data corresponding to this result were absent from the
instrument printout, and the result was not entered onto the appropriate CLP
form. The entry on the worksheet was apparently a transcription error, and no
result is available for this method blank. The result reported for Sample 1 was
obtained from this digestion batch, and was qualified J during the quality
assurance review.
Several results for CCBs exceeded the detection limit for calcium, manga-
nese, and zinc. However, all associated sample results exceeded 5 times the
concentration of the respective analyte found in any CCB, and were therefore not
significant with respect to the expected analytical variability of sample results.
No sample results were qualified as a result of detected analyte concentrations in
associated CCBs.
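The 5x blank rule applied in this section is mechanical enough to script. A
minimal sketch (Python) of the qualification logic described above:

    def blank_qualifier(result: float, blank: float) -> str:
        # Qualify a detected result U when it is less than 5 times the
        # concentration found in an associated blank (U.S. EPA 1988).
        if blank > 0 and result < 5 * blank:
            return "U"  # report as undetected at the stated concentration
        return ""       # no blank-related qualifier

    # Lead in Sample 2 (29.4 µg/L) against the contaminated method blank
    # (6.1 µg/L): 29.4 < 5 x 6.1 = 30.5, so the result is qualified U.
    blank_qualifier(29.4, 6.1)  # returns "U"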
REFERENCES
U.S. EPA. 1987. U.S. EPA Contract Laboratory Program statement of work
for inorganics analysis, multi-media, multi-concentration. SOW No. 788. U.S.
Environmental Protection Agency, Washington, DC.
U.S. EPA. 1988. U.S. EPA Contract Laboratory Program statement of work
for organics analysis, multi-media, multi-concentration. U.S. Environmental
Protection Agency, Washington, DC.
APPENDIX E
Example Forms
CHAIN-OF-CUSTODY RECORD

Project: _____ COC Form Number: _____
Sampler (Signature): _____

[Rows 1-19, with columns: Sample Tag, Sample No., Date, Time, Matrix, Containers]

SAMPLE CUSTODY RECORDS
Every sample transfer must be signed by both parties involved.

Relinquished by _____ Received by _____ Date _____ Time _____

Figure E-1. Example chain-of-custody form.
SAMPLE ANALYSIS REQUEST

For: _____
Attention: _____
Project: _____ SAR Number: _____ Date: _____
Contact: _____

[Rows 1-20, with columns: Lab Code, Sample Tag, Sample No., Matrix, Conc.,
Preserv., Date, Analyses (Digestion)]

NOTES
Matrix: _____
Concentration: _____
Preservative: _____
Digestion: _____

Figure E-2. Example sample analysis request form.
GROUNDWATER SAMPLE COLLECTION FORM

Collector: _____ Sample Number: _____
Project Name: _____ Station Identification: _____
Sample Location and Depth: _____ Project Number: _____
Date Collected (day/month/year): _____ Time (24-hr): _____
Weather: _____
Sample Collected with: [ ] Bailer [ ] Pump [ ] Split Barrel [ ] Other _____
Made of: [ ] Stainless Steel [ ] PVC [ ] Teflon [ ] Other _____

PURGE DATA
Depth to Water (Top of PVC): Begin _____ End _____
One Casing Volume (gal): _____ Gallons Purged: _____
(Remove 3 well volumes or until dry)
Description (color, texture, density, moisture, turbidity, etc.): _____

FIELD PARAMETERS
[Replicates 1-4, with columns: pH, Conductivity (µmho/cm), Temperature]
Meter Used in Measurement: _____
Sample Composited Over Time, Distance: _____
Duplicated Sample Number(s): _____
Well Condition: _____
(1. Well caps are secure and operational; 2. Visible damage to well)
Comments: _____

Signature: _____ Date: _____

Figure E-3. Example groundwater sample collection form.
SOIL SAMPLING FIELD DATA FORM

Station Identification: _____
Location: _____
Date: _____ Time: _____
Weather Conditions: _____
Sampling Personnel (Signatures): _____
Project Name: _____
Project Number: _____
Photograph Roll Number: _____

Soil Samples
[Rows 1-15, with columns: Sample I.D., Tag Number, Date/Time, Station,
Sub. Replicate, Depth Interval, Soil Type/Layer, Analysis Code]

Profile/Site Sketch

Figure E-4. Example field data form for soil sampling.
SURFACE WATER SAMPLING FIELD DATA FORM

Station Identification: _____
Location: _____
Date: _____ Time: _____
Weather Conditions: _____
Sampling Personnel (Signatures): _____
Project Name: _____
Project Number: _____
Photograph Roll Number: _____

Field Measurements
Gage Height: _____ Substrate Level on Gage: _____ pH: _____
Dissolved Oxygen (ppm): _____ Water Temperature (°C): _____
SC @ Field Temperature (micromhos/cm): _____ Calculated Streamflow (cfs): _____
Comments: _____

Water Quality Samples
[Rows 1-12, with columns: Sample I.D., Tag Number, Date/Time, Station,
Sample No., Replicate, Filtered, Preservative, Analysis]

Site Sketch

Figure E-5. Example field data form for surface water sampling.
STORM DRAIN SAMPLING

Date: _____ Time: _____
Station: _____
Location: _____
Meter Readings: O2 _____ Comb. Gas _____ HNu/OVA _____ H2S _____
Person Sampling: _____
Sample Number: _____
Water: Depth _____ Flow _____
Sediment: Type _____ Depth _____ Color _____ Odor _____
Comments: _____

Sketch of Manhole Sampling Location

Recorder: _____

Figure E-6. Example storm drain sampling form.