United States
Environmental Protection
Agency
Office of Solid Waste and
Emergency Response
(5102G)
EPA 542-R-01-013
October 2001
www.epa.gov
www.clu-in.org
EPA Current Perspectives in Site Remediation and Monitoring
APPLYING THE CONCEPT OF EFFECTIVE DATA TO
ENVIRONMENTAL ANALYSES FOR CONTAMINATED SITES
D. M. Crumbling1
Executive Summary
Analytical chemistry methods can be classified as "definitive methods" or "screening methods."
Environmental decision-makers and practitioners frequently make a simplifying assumption that
definitive analytical methods generate "definitive data," while screening methods generate
"screening data." The pervasiveness of this incorrect assumption inhibits the development and
application of more cost-effective strategies for environmental sampling and analysis at
contaminated sites. Adopting the concept of "effective data" could promote the cost-saving
advantages of modern measurement and monitoring options in the context of contaminated site
cleanup, while ensuring the reliability of site cleanup decisions. This concept embodies the
principle that the information value of data (i.e., data quality) depends heavily upon the
interaction between sampling design, analytical design, and the intended use of the data.
Consideration of site-specific conditions, sample support, quality control, and data documentation
assures the scientific defensibility of effective data. When the interplay of these factors is
understood, screening methods can play important roles in generating data that are effective for
making defensible project decisions, while simultaneously improving the cost-effectiveness and
efficiency of site restoration activities.
Introduction
This issue paper provides detailed discussion
to supplement the article, Managing Uncer-
tainty in Environmental Decisions: Applying
the Concept of Effective Data at Contamina-
ted Sites Could Reduce Costs and Improve
Cleanups, that appeared in the October 1,
2001, issue of Environmental Science &
Technology (1).
This paper assumes that the regulatory action
levels or threshold values used to establish
acceptable/unacceptable levels of contamina-
tion have been developed in a defensible
manner. Although evaluating the validity of
the action level is a very important component
of scientifically defensible decisions about
whether a site poses unacceptable risk, the
topic itself is beyond the scope of this paper.
Likewise, while the selection and implemen-
tation of specific cleanup activities are impor-
tant, the topic of remedial technologies is also
beyond this discussion. This paper addresses
issues that revolve around the generation and
use of contaminant data as produced by
analytical chemistry methods. Uses for this
data include determining the "nature and
extent of site contamination," short- or long-
term monitoring of remedial effectiveness,
and demonstrating regulatory compliance.
Contaminant data are used to decide whether
remedial actions are required for a site, and if
so, to guide the selection and design of remedial
activities. The reliability of contaminant data (as well as
other types of data) may be critical to the success and
cost-effectiveness of remediation or monitoring
activities.
1 EPA, Technology Innovation Office
Overview
It is important that regulators provide direction on how
compliance with an action level is to be demonstrated.
For example, is the action level intended to represent an
average concentration that should not be exceeded over
some exposure unit, or does it represent some other
statistical benchmark? A clear mechanistic understan-
ding of what an action level represents is needed in
order to design a scientifically valid sampling and
analysis plan. Without this understanding, project
decision goals will remain vague, resulting in confusion
and wasted effort. An important task of project-specific
systematic planning is to establish how regulatory action
levels will be applied to a particular site or project.
Obviously, the regulatory agency must "participate" in
the up-front planning process for efficient design and
implementation of a project plan to be possible. This
"participation" may range from written guidance that
presents clear, unambiguous interpretation of the regula-
tory benchmark to regulatory staff representation on a
project-specific planning team. Modernization of site
characterization and cleanup activities requires that all
parties bring the industry and regulatory experience
gained over the past 25-30 years to the table when
planning today's projects.
When gathering contaminant data, regulators, lawyers,
and project managers dealing with contaminated sites
have often insisted upon using "approved" analytical
methods. There is a very common perception that
prescriptively mandating what methods may be used
(and how they may be used) can assure defensibility and
data quality. In other words, it is assumed that if
"approved" methods are used, the data can be trusted. If
the methods used are not considered to be regulator-
approved, the data may be considered suspect solely on
those grounds and for that reason be rejected by the
regulator. A commonly expressed concern is that data
produced by not-approved methods will not be legally
defensible. This concern is unwarranted when courts
operate from the basic common-sense principle that if
data are scientifically defensible, they should be legally
defensible. Federal standards (and at least some state
standards) do operate from this principle, and for those
courts, the admissibility of evidence does not require
adherence to methods approved by EPA or any other
standard-setting organizations (2). A more thorough
discussion of the regulatory and technology issues
surrounding the question of "EPA-approved" methods
within the context of the waste programs is found in
another paper (3).
This issue paper will argue that rigidity in the applica-
tion of analytical methods to environmental samples can
undermine the very data quality and defensibility that
regulators seek to secure. Furthermore, this paper will
argue that using more modern and innovative methods
for sampling and analysis holds the promise of greatly
improving the cost-effectiveness of scientifically defens-
ible environmental decision-making. But taking advan-
tage of these modern new tools will require that
regulatory-driven conceptualizations of "data quality" be
placed on a more scientifically defensible footing. In
addition, realizing the benefits of analytical flexibility
requires that practitioners take responsibility for
instituting the multidisciplinary teaming and training
needed to select and use analytical tools properly.
Transitioning to a more modern site restoration
paradigm is facilitated when judgments about "data
quality" are more closely linked to the project decisions
actually driving the data collection efforts (i.e., the
data's intended use), rather than tied solely to the
analytical procedures used (which is the current
paradigm). In other words, the data assessment question
should not be, "Was an approved method used and
followed exactly as written?" Rather, the primary
questions should be, "Are the data effective for making
the specified decisions, and are both the sampling and
analytical documentation accompanying the data
sufficient to establish that they are?" Answering this
question is the foundation of scientific defensibility. We
suggest that the terms "effective data" and "decision
quality data" could intuitively reinforce scientific defen-
sibility within environmental cleanup programs if the
terms become part of the environmental lexicon, paving
the way for more modern and more cost-effective work
strategies. More cost-effective investigations and
cleanups mean that more sites may be evaluated and
brought to resolution for the same resource investment.
Terminology—Methods vs. Data
First, a distinction between methods and data is
required. Although analytical methods are indeed used
to generate data, the analytical method is one of the last
links in a very long chain of events that forms the
foundation of scientific data. Nonetheless, decision-
makers at all levels of policy and practice assume that
"definitive data of high quality" is automatically
produced when traditional laboratories use definitive
analytical methods (4). It is further assumed that any
decisions based on those data will be legally defensible
as long as the laboratory strictly adhered to the
"approved" method and to whatever quality assurance/
quality control (QA/QC) is assumed specified for that
method (irrespective of the QC's relevance to the data's
intended use). On the other hand, on-site analytical
methods are usually categorized as "field screening
methods" [despite the fact that some field methods are
based on definitive method technologies (5)].
"Screening analytical methods" are assumed to produce
screening quality data that are considered inferior and
not legally defensible. It is also assumed that adequate
QA is not possible when screening methods are used,
and particularly when analysis is performed in the field.
Whatever utility these assumptions may have had in the
past, the evolution of analytical technologies and of our
experience in using them shows these generalizations to
be false. Data produced by screening methods can be of
known and documented quality; adequate quality control
can be used in conjunction with data generated in the
field. But to do so, common traps that compromise data
quality and drive up costs must be avoided. Current
engineering practice must be challenged to integrate
analytical chemistry expertise into project planning
when data collection efforts are designed. This challenge
is especially critical to the use of on-site measurement
technologies that are based on screening methodologies
where potential analytical uncertainties must be
balanced according to the data's intended use. As will be
discussed in more detail later in this paper, there is no
"bright line" that distinguishes screening analytical
methods from definitive analytical methods. Rather
there are gradations where screening methods tend to
have more uncertainty in analyte identification and
quantification than methods that are considered to be
definitive. Yet definitive methods are far from foolproof.
Even methods such as ICP-AES and ICP-MS are not
free from interferences that can compromise data quality
(6).
When data quality issues are treated as if they were
solely dependent on method requirements and
independent of data use, a myopic focus on managing
analytical error can actually trigger major decision
errors (7). Environmental decisions are especially
susceptible to error in site cleanup situations because the
major source of decision uncertainty (as much as 90% or
more by some estimates) is due to sampling variability
as a direct consequence of the heterogeneity of
environmental matrices (8-10). Figure 1 illustrates the
paradox that highly accurate and QA-documented (i.e.,
"high quality") data points may actually form a poor
quality data set that produces misleading conclusions
and erroneous project decisions. Figure 1 depicts two
different sampling and analysis scenarios for a cartooned
site containing two hot spots (locations with signifi-
cantly higher contamination concentrations than the
surrounding area). Analyzing samples using a highly
accurate method is very expensive, so the hard truth is
that budget constraints frequently limit the number of
samples used to determine the presence and degree of
contamination.
[FIGURE 1. Data Quality vs. Information Value. Fewer "higher quality" data points lead to lower information value of the data set; many "lower quality" data points lead to higher information value of the data set. Goal: a defensible site decision that reflects the "true" site condition.]
Even if the data points themselves are "perfect," an
inaccurate assessment is likely when a few samples
cannot accurately locate or represent site contamination
(i.e., the samples are not "representative" of the site in
the context of the intended decisions about the site). A
much more accurate picture of the site is gained when
many samples are analyzed, even if the analytical
method itself is somewhat less accurate.
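To make this trade-off concrete, the short simulation below (in Python) compares the two sampling intensities. All of the numbers (the site's heterogeneity, the sample counts, and the analytical error rates) are hypothetical values chosen purely for illustration, not data from any actual project.

import random
import statistics

random.seed(1)

# Hypothetical site: 1,000 possible sampling locations; most are near background,
# while a few "hot spot" locations carry most of the contaminant mass.
site = [random.lognormvariate(1.0, 0.5) for _ in range(980)] + \
       [random.lognormvariate(4.5, 0.3) for _ in range(20)]
true_mean = statistics.mean(site)

def survey(n_samples, analytical_cv):
    """Root-mean-square error of the estimated site mean over repeated surveys,
    given n randomly located samples each measured with the stated relative error."""
    estimates = []
    for _ in range(2000):                       # repeat the survey many times
        locations = random.sample(site, n_samples)
        measured = [c * random.gauss(1.0, analytical_cv) for c in locations]
        estimates.append(statistics.mean(measured))
    return statistics.mean((e - true_mean) ** 2 for e in estimates) ** 0.5

print("true site mean: %6.2f" % true_mean)
print("RMSE with  5 'perfect' samples (1%% analytical error):  %6.2f" % survey(5, 0.01))
print("RMSE with 40 'noisy' samples (25%% analytical error):   %6.2f" % survey(40, 0.25))

Under these assumed conditions the larger, noisier data set estimates the true site condition far more reliably, because sampling variability, not analytical error, dominates the overall uncertainty.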
Thus, we must move beyond the pervasive assumption
that the use of definitive methods will automatically
produce definitive data. "Data" (as measurements
produced on samples for the purpose of supporting
decisions) are the end product of a long chain of
activities. Selecting representative samples is one of the
early activities that critically impacts the usefulness and
reliability of data for decision-making purposes, but
there are many other steps that contribute to the overall
"quality" or validity of environmental measurements.
Errors and problems can occur in the processes of
collecting a sample; preserving it; transporting it to the
laboratory; taking a subsample for actual analysis;
extracting analytes from the sample matrix; introducing
the analytes into an instrument; coping with any
chemical and physical interferences that may arise;
ensuring that analytical quality control criteria are met;
and finally documenting and reporting the results to the
data user. A problem in any single step can compromise
or invalidate the accuracy of a data point generated from
that sample, no matter how much attention is given to
other steps. Therefore, project planning must carefully
consider each step in the data generation chain in light
of the project goals and the nature of the site conditions
(11-14). Yet, discussions about "methods" (especially
references to SW-846 methods) almost always focus
exclusively on instrumental determinative methods,
completely ignoring the critical importance of methods
for sample collection, sample preparation, and extract
cleanup (15). If quality assurance deliberations
concentrate on strengthening only one or two links
in isolation from the rest of the data quality chain (for
example, ensuring stringent laboratory QA/QC practices
for the determinative method), the weaker links [for
example, unevaluated variability in sampling and
subsampling (16)] can still severely compromise the
ability of data to support valid environmental decisions.
This is one of the reasons why, despite heroic efforts to
control variability in the delivery of laboratory services,
data quality problems continue to plague environmental
programs (17).
Terminology—Effective Data
Public stakeholders do not care whether Method A or B
was used to generate data at hazardous waste sites. They
do care whether correct decisions are being made that
protect their well-being. The key to defensible environ-
mental decision-making is to openly acknowledge all
underlying assumptions and to manage, to the degree
feasible, all sources of uncertainty that can significantly
impact the correctness of a decision. Often a "weight of
evidence" approach is needed because no single piece of
information can provide definitive evidence given the
complexities present in environmental systems.
Although it makes regulators' and practitioners' jobs
much more difficult, the inescapable truth is that relying
on one-size-fits-all approaches to gathering environ-
mental data ultimately compromises the validity (or
"quality") of environmental decisions. This is true
whenever a wide variety of sampling and analytical
conditions and a broad range of decisions are encoun-
tered in site restoration programs. A multitude of
complex and interacting variables cannot be accommo-
dated by preparing a simple checklist of prescriptive
sampling or analytical methodologies, or by substituting
laboratory certification for systematic project planning
(7). Flexibility (both in the choice of analytical method
and in the specific operations of a method) and the
professional expertise to apply it are vital if site data are
to be generated reliably and economically.
Terminology that explicitly or implicitly judges data
quality according to the definitive vs. screening nature
of the determinative method is misleading, because (as
argued above) the nature of the determinative method is
inadequate to assess whether the data themselves are
useful and reliable for their intended purpose. Some
environmental practitioners report that they use the term
"definitive data" to apply to any data that is of known
quality and demonstrated useful and reliable for their
intended purpose, even if generated by a screening
method. This is a legitimate application of the term
"definitive data." But it must be recognized that use of
the term in this manner runs counter to the way it has
been traditionally used in the environmental industry,
and such usage could create additional confusion in an
already ambiguous and conflicted environmental
lexicon.
To foster clarity, this paper suggests that different
terminology be introduced, proposing "effective data" as
a term that describes "data of known
quality that can be logically shown to be effective for
making defensible project decisions because both
sampling and analytical uncertainties have been
managed to the degree necessary to meet clearly defined
project goals." The term "decision quality data" carries
the same intuitive meaning, and is viewed as an
equivalent term.
There are a number of implications of this definition that
should be noted:
1) In contrast to evaluation protocols that judge
"data quality" solely on adherence to analytical
protocols, judging data to be of decision quality
(alternatively, to be effective for decision-making)
must explicitly include an evaluation of sample
representativeness as a fundamental data
quality indicator (18,19). It is a matter of common
sense that if the samples cannot be shown to be
representative of the site conditions in the context of
the decision to be made, evaluation of the measure-
ment quality for the analysis on those samples is
meaningless. If evidence for representativeness is
not presented, the data cannot be characterized as
effective for project decision-making. Demonstra-
ting appropriate analytical quality is only part of the
picture.
2) Data cannot be characterized as effective for
decision-making (alternatively, data cannot be
characterized as being of decision quality) unless
the decision(s) that the data are to support has
(have) been clearly articulated, along with an
expression of the amount of uncertainty that can be
tolerated in the decision. Thus, a systematic
planning process based on the scientific method,
such as EPA's Data Quality Objectives (DQO)
process, is vital (20). All pertinent project planning
and reporting documents using the term "effective
data" should contain clear statements that concisely
describe:
• what the primary project decisions are;
• what level of confidence is desired in those
decisions;
• what sources of uncertainty could lead to
making an incorrect decision;
• what strategies are used to manage each
source of uncertainty so that making an
incorrect decision is avoided;
• what assumptions are used when knowledge
gaps are encountered that are not feasible or
practical to fill with more definitive informa-
tion; and
• how these assumptions impact the decision
outcome.
A summary sheet concisely listing these items
would have supporting discussion contained within
the body of the planning or reporting document.
3) What are the types of "decisions" for which the term
"effective data" should be used? Introduction of the
term "effective data" seeks to address issues
associated with the generation and use of analytical
chemistry data for characterizing chemical contam-
ination or demonstrating regulatory compliance with
limits placed on the amount of contaminants present
in environmental media. Therefore, the "decisions"
to which the term "effective data" alludes are those
that address questions about contamination: Is
contamination present, and if it is, is it at a high
enough concentration to exceed regulatory levels or
to pose a threat to receptors? These might be called
"primary project decisions" or some other term that
denotes that these are decisions that must be made
in order to resolve the status of a potentially
contaminated site (or portion thereof). There are
many other decisions that must be made during the
course of site activities, but unless they directly
involve decisions determining the presence/absence
of contamination, the distinction of "effective data"
probably is not necessary. Overuse of the term
would be undesirable since it would undermine the
meaning and impact of the term.
4) Managing uncertainty in environmental decision-
making will often involve the collection and
interpretation of environmental data, but this is not
an absolute. Careful planning may indicate that the
cost of a reliable sampling and analysis program is as
expensive as, or more expensive than, simply assuming a
worst-case scenario and acting accordingly to manage the
assumed risks. One case study showed that, although
using immunoassay methods to guide cleanup of a small
contaminated site saved the responsible party at least
50% over projected costs, relying solely on a traditional
site characterization scenario to delineate contaminated
hot spots would have cost as much as assuming the entire
soil mass needed to be incinerated without attempting
characterization (21).
5) A data set that might not be effective for making a
certain decision when considered alone may become
part of an effective data set when considered in
conjunction with other relevant information, such as
another data set that contains supporting or comple-
mentary information. An example of this is when
the cost of a definitive analytical method may
prohibit the sampling density needed to manage
sampling uncertainty, whereas existing screening
analytical methods cannot supply all the analytical
quality needed. Intelligent sampling and analysis
design may be able to select an inexpensive
screening method to manage sampling uncertainties,
while judicious confirmatory analysis of selected
samples by the definitive method manages for
residual analytical uncertainties in the data set
produced by the screening method. In this way, the
two data sets collaborate to produce data effective
for supporting the final decision(s) (21).
6) There is a key phrase in the definition of effective
data that must not be overlooked: data must be of
"known quality." This means that analytical quality
assurance and quality control (QA/QC) protocols
must be selected, implemented, and interpreted so
that the analytical sensitivity, bias, precision, and
the effect of potential interferences can be deter-
mined and reported. To achieve this at the project
level, a demonstration of method applicability
(wherein site-specific samples are evaluated for
matrix-specific impacts on method performance)
may be required to identify the sampling and
analytical uncertainties requiring management, and
permit the proper selection of the QA/QC para-
meters and acceptance criteria to be used during
project implementation (22). Estimating the contri-
bution to data variability due to matrix heterogeneity
and subsampling may also be important to establish-
ing that data are of known quality (19). No matter
whether a method is considered to be a screening
method or a definitive method, QA/QC procedures
are required to produce data of known and
documented quality from any analytical chemistry
method.
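As one illustration of the "known quality" requirement, the sketch below (Python) computes bias and precision estimates from a set of matrix-spike recoveries of the kind that might be collected during a demonstration of method applicability. The recovery values and the acceptance window are invented placeholders, not regulatory criteria.

import statistics

# Hypothetical fraction-recovered results from matrix spikes analyzed during a
# demonstration of method applicability (values are made up for illustration).
recoveries = [0.92, 0.88, 1.05, 0.97, 0.83, 0.90, 1.01]

mean_recovery = statistics.mean(recoveries)                  # estimate of bias on this matrix
rsd = statistics.stdev(recoveries) / mean_recovery * 100.0   # precision as relative standard deviation

print(f"mean recovery (bias estimate): {mean_recovery:.2f}")
print(f"relative standard deviation (precision): {rsd:.1f}%")

# Project planning, not this script, sets the acceptance window; the limits below
# are placeholders showing how documented performance would be compared against
# project-specific criteria.
if 0.70 <= mean_recovery <= 1.30 and rsd <= 25.0:
    print("performance documented and within the assumed project criteria")
else:
    print("flag for the project chemist: assumed criteria not met on this matrix")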
Terminology—Screening Data
Data that cannot be shown to be of decision quality may
still provide some useful information. As such they
might be characterized as "screening data." Screening
quality data do not provide enough information to
satisfactorily answer the question being asked with the
desired degree of certainty. For example, consider data
resulting from pesticide analysis by gas chroma-
tography/mass spectrometry (GC/MS—considered to be
a definitive determinative method) performed in the
presence of high levels of hydrocarbon interferences
without benefit of an appropriate extract cleanup method
(assume that representative sampling was documented).
Such interferences often raise the reporting limits of
individual pesticide analytes (14). If the reporting limits
are higher than the project's decision levels, non-detect
results are not sufficient to declare the samples "clean."
Although a potentially definitive technique (GC/MS)
was used, in this situation it provided screening data
because other aspects of the analytical chain were
insufficient to achieve the needed analytical data quality.
The information provided by the compromised data is
somewhat useful (there is indication that pesticide
contamination is not present above the reporting limit
for those samples), but that information does not meet
the project manager's need to make the primary project
decisions at the levels of concern. However, the
screening data can also provide information that can
guide the project chemist to modify any subsequent
analyses (e.g., select an appropriate cleanup method) to
address the analytical problems so that effective data can
be generated.
Terminology—Sample Support
Because sample representativeness is a critical first link
in the data quality chain, it is useful to discuss "sample
support," a term not yet commonly used within the
environmental industry, although it has been around for
some years (4). Sample representativeness can be
divided into broad components of sample selection and
sample handling. Sample selection must consider the
"location" of samples (i.e., where or when the specimen
is collected). The heterogeneity of most environmental
matrices demands that sample selection be carefully
considered so that the number, type, location, and timing
of sample collection will be representative of spatial and
temporal variability in relation to the study objectives.
On the other hand, sample support impacts both sample
selection and sample handling. The term "sample
support" refers to the physical dimensions of the sample,
which are determined by the interplay among a number
of factors. Sample support is critical to sample
representativeness. Evaluating sample support includes
considering the size, shape, volume, and orientation of
the specimen and of the components that comprise it,
and the ability of the sampling tool (such as a coring
device, spatula, or liquid sampler) to collect a represen-
tative specimen from the statistical population about
which decisions are to be made (19,20).
Even when analysis occurs in situ, the concept of sample
support is very important to evaluate what "sample" the
sensor or detector actually "sees." Understanding the
sample support governs the comparability of in situ
results to results obtained by the analysis of discrete
samples, which, in turn, determines the ability to use in
situ results to guide project decisions. In all sampling
and analysis scenarios, sample support greatly influen-
ces the legitimate interpretation of analytical results. Yet
under the current paradigm, analysts charged with the
task of assessing the quality or usability of analytical
data packages may not understand what the project goals
are or what was done in the field well enough to
evaluate whether the samples (and thus the analytical
data package) were indeed representative and thus
usable for their intended purpose (7,23).
Whether samples are tree cores, fish tissue, soil borings,
or industrial wastewater, the concept of sample support
is critical to generating reliable environmental data. To
illustrate, consider a hypothetical project where environ-
mental decisions will hinge on ascertaining whether
recent atmospheric deposition has contributed to lead
contamination in surface soils. Contrast two possible
sampling and analysis designs. In Design 1, an approp-
riately calibrated field portable X-ray fluorescence
(XRF) instrument is operated in an in situ "point-and-
shoot" fashion where each "shot" measures the total
concentration of lead over a 2 cm² area to a depth of 1 to
2 mm, and a high sampling density across the site's
surface soil is easily feasible. In Design 2, a small
number of samples are collected because of the expense
of sending samples to the laboratory for definitive
analysis using atomic absorption spectroscopy (AAS).
Design 2 samples are collected using a 4 inch diameter
coring device that takes a 4 inch deep vertical core. The
whole core is deposited in the sample container for
transport to the laboratory. Once there, the lab
technician thoroughly homogenizes the entire sample
before a 1 gram portion is taken to undergo acid
digestion. The digestate is then analyzed for lead content
by the AAS instrumentation. Which data generation
approach would be expected to be more representative
of the true site condition in relation to the stated project
decisions?
The XRF approach of Design 1 yields more represen-
tative data for two reasons. First, and most critically, the
sample support (a thin surface layer of soil) analyzed by
the XRF is more representative of soil contaminated
through atmospheric deposition than a 4 inch deep core
that is homogenized before analysis. Second, the higher
number of samples possible with the XRF for a similar
analytical budget permits a more thorough
characterization of variability due to heterogeneity,
improving confidence that anomalous readings (either
high or low) will not distort interpretation of the results.
If isolated high readings are found and if it is important
to the project goals, the extent of hot spot areas could be
quickly delineated during the same field mobilization if
XRF were being used.
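A rough back-of-the-envelope calculation (sketched below in Python, with invented concentrations) shows why the first reason matters: homogenizing a deep core dilutes a thin deposition layer, regardless of how accurate the subsequent analysis is.

# Illustrative arithmetic only; the concentrations are assumed for the example.
surface_ppm = 500.0   # hypothetical lead level in the top 2 mm deposition layer
subsoil_ppm = 20.0    # hypothetical background lead below that layer
layer_mm    = 2.0     # depth "seen" by an in situ surface XRF shot
core_mm     = 101.6   # depth of a homogenized 4 inch core

# Depth-weighted average reported for the homogenized core (uniform density assumed)
core_ppm = (surface_ppm * layer_mm + subsoil_ppm * (core_mm - layer_mm)) / core_mm

print(f"surface-layer result (XRF sample support): {surface_ppm:.0f} ppm")
print(f"homogenized-core result (lab sample support): {core_ppm:.0f} ppm")
# Roughly 29 ppm: the deposition signal is diluted about 17-fold by the larger
# sample support, even though both measurements may be analytically "correct."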
As part of a carefully designed XRF quality control
program tailored to the needs of the project, a small
number of split samples might be sent to an off-site
laboratory to establish method comparability between
the XRF data set and more traditional lead analysis
results or to evaluate potential method interferences.
Note that if samples are sent for off-site analysis, the
sample support for the two sets of analyses must be the
same or else there will be poor agreement. Typically,
project managers assume that the field method is at fault
if there is not close agreement with fixed laboratory
"confirmatory samples." In actuality, both methods may
be accurately reporting results for the samples presented
to them. Differences in sample support (the physical
nature of the sample, such as particle size) or matrix
heterogeneity (a failure to achieve sufficient sample
homogenization prior to splitting the sample) often
accounts for differences between split sample results.
Significant dissimilarities are also possible in the actual
constituents being measured by each method, even
though each method is working perfectly as designed.
For example, the XRF measures total lead in the 2 cm²
surface area it "sees," while the AAS method quantitates
only the lead solubilized under the particular digestion
conditions used (24).
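One simple way such a split-sample comparison might be summarized is sketched below (Python). The paired results and the acceptance limit are hypothetical; a real project would set its comparability criteria during systematic planning.

# Hypothetical paired results (field XRF vs. fixed-lab AAS) for split samples, in ppm.
pairs = [(110, 95), (42, 50), (310, 280), (18, 22), (75, 81), (560, 505)]

def rpd(a, b):
    """Relative percent difference between two results for the same split sample."""
    return abs(a - b) / ((a + b) / 2.0) * 100.0

rpds = [rpd(field, lab) for field, lab in pairs]
print("per-pair RPD (%):", ", ".join(f"{value:.0f}" for value in rpds))

LIMIT = 30.0  # placeholder acceptance limit; a real value comes from project planning
failures = [pair for pair, value in zip(pairs, rpds) if value > LIMIT]
# Pairs exceeding the limit would first be examined for sample-support and
# homogenization differences before the field method itself is blamed.
print("pairs needing investigation:", failures)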
The Data Quality Conundrum—Finding a Better
Way
Despite the fact that analytical rigidity in many
environmental programs is counterproductive, prescri-
bing how analytical methods are selected and applied
has nearly universal appeal among regulators seeking
simplicity and predictability in regulatory programs.
This is a commendable goal, but past attempts at
"standardizing" sampling and analysis procedures
created a false sense of security (7). The scientific need
for project-specific sampling and analysis designs
cannot be neglected in favor of convenient uniformity
without jeopardizing the reliability of the environmental
data and their ability to support sound decisions.
As illustrated in Figure 1, a one-size-fits-all quest for ill-
defined "high quality data" easily adds to program costs
without commensurate benefits. The effectiveness of
subsequent remedial actions is put at risk when project
managers respond to high per-sample costs by
decreasing the number of samples, an action that
increases the likelihood of faulty conclusions (23). In
contrast, EPA policies explicitly require proj ect-specific
data collection designs to be matched to the nature of the
samples and to the intended use of the data (25). EPA's
SW-846 methods manual (used in the waste programs)
warns that its procedures are "meant to be... coupled
with the realization that the problems encountered in
sampling and analysis situations require a certain
amount of flexibility... [that] puts an extra burden on the
user, [but] is unavoidable because of the variety of
sampling and analytical conditions found with hazardous
waste" (15).
The way out of the data quality dilemma is to focus on
the bottom line, which is ensuring the overall quality of
the decision driving the data collection effort. Because
uncertainty in environmental decisions is dominated by
sampling variability, increasing the sampling density
increases the certainty that decisions will be correct (as
long as the data generated on those samples is of known
quality commensurate with the decision). Recent advan-
ces in electronics, photonics, and biological reporter
systems have supported the development of innovative
characterization technologies that economically facili-
tate higher sampling densities. Better management of
sampling uncertainty and increased statistical power (the
ability to find a statistical difference when one actually
exists) is possible when more samples are collected.
Public interests would be well served by integrating
these technologies into routine practice because better
overall decision certainty is achieved through a
combination of lower per-sample analytical costs and
(most importantly) the ability of innovative measure-
ment technologies to support smarter and faster work
strategies by providing real-time analytical results.
Smarter work strategies have been articulated by various
authors and practitioners over the years, and they go by
names such as expedited site characterization, dynamic
work plans, rapid adaptive site characterization, adaptive
sampling and analysis plans, and similar terms (26-29).
The concept common to all is using real-time data
results to guide real-time project decision-making and
integrate characterization efforts with cleanup activities
to the greatest extent feasible. Project managers that
successfully use this strategy demonstrate significant
cost-savings, dramatically shortened timeframes, and
increased confidence in the protectiveness of project
decisions. Successful implementation of a dynamic work
plan approach requires considerable investment in
funding and effort to perform thorough, up-front,
systematic planning with a core team possessing the full
range of technical skills relevant to project goals. Work
plans are designed to be dynamic, so that subsequent site
activities can rapidly adapt as new information is gained.
Flexibility in the work plans is guided by transparent,
regulator-approved, decision logic that is focused on
achieving clearly defined project goals. Highly
experienced staff must be present in the field to generate
and interpret data, communicate with regulators, and
implement the decision logic (28,30). Yet, the invest-
ment in planning and qualified technical specialists is
returned handsomely because lifetime project costs are
as much as 50% lower than under traditional scenarios
and project decisions are more reliable, often with
statistically quantifiable certainty. Also, client and
stakeholder satisfaction is much higher when work is
done quickly and correctly the first time (21,29,31).
There are a number of options by which real-time results
may be produced. Paying for 24-hour turnaround from
a traditional laboratory is an option that may be
logistically and economically feasible under some
circumstances. Under other circumstances, establishing
on-site laboratory facilities in vans or trailers may be a
viable option. Field-portable or in situ instrumentation
is increasingly an option of choice as technology
development extends the capabilities of these technolo-
gies into a growing number of project situations.
Selection of analytical platform should be made only
after careful systematic planning has considered the pros
and cons of each option in the context of the project's
decision goals, contaminants of concern, site logistics,
budget, contractor capabilities, etc.
Field-portable technologies used to generate on-site
measurements encompass a growing number of both
definitive and screening methodologies. However, since
some of these technologies do not fit the "approved
method" paradigm, regulatory acceptance has lagged,
although there are signs that this is changing. Regulators
should be cautious when field analytical technologies
are proposed, ensuring that the use of these technologies
has been carefully considered on a project-specific level.
The regulator would want to feel confident that
analytical and sampling uncertainties are managed and
balanced to meet the desired decision certainty, as
described in a project-specific quality assurance plan. It
is to be expected that there would be a learning curve.
The generation of data of known and documented
quality using on-site measurement technologies requires
that analytical chemistry expertise be part of the project
planning process from the start. The selection of an
appropriate field technology and the design of a field
QA/QC protocol that will demonstrate that all relevant
analytical uncertainties are managed to the degree
needed to assure scientific defensibility requires a
merger of project management expertise, statistical and
geostatistical sampling design knowledge, and analytical
chemistry sophistication. To achieve this multidisciplin-
ary skill mix, the consulting engineering community
might partner with analytical service providers, statis-
ticians, and other disciplines. A shift to such extensive
partnering will no doubt be new to many consulting
firms and regulatory agencies.
Regulators can play an important role to foster this
transition if their oversight shifts from controlling
analytical methods to managing the overall uncertainty
in project decisions. This can be done by ensuring that
project planning documents 1) clearly explain what a
project's goals really are, and what decisions have to be
made in order to achieve those goals; 2) identify the
major factors expected to contribute to uncertainty in the
project decisions; and 3) ensure there is a technically
valid strategy in place to manage each of those factors.
As the project proceeds, quality assurance staff could
assure the overall quality of project decisions by
evaluating whether the relative contributions to overall
uncertainty from the various components of sampling
and analytical error have indeed been considered
(18,20,25). Planning documents must clearly distinguish
between uncertainties that operate at the analytical level
(i.e., analytical quality or performance at the laboratory
level that is not affected by sample-specific constraints),
at the data level (i.e., evaluation of data quality that
includes sample-specific analytical performance and
consideration about sample representativeness), and at
the project level (i.e., expressions of decision confi-
dence).
Data Set Information Value vs. Data Point Quality in
the Use of Screening Methods
As discussed previously, accurate partitioning of
sampling and analytical errors reveals that the ability of
many environmental data sets to provide reliable
information has less to do with analytical data quality
than with sampling density. The advantage of many
screening methods is that they are less expensive than
most definitive methods (so more data points can be
obtained) and they can be operated to provide real-time
results. Contrary to popular belief, screening methods
can be selected and operated in ways that produce data
sets that contain meaningful information at a high degree
of confidence, but appropriate analytical chemistry
expertise and sufficient quality control mechanisms are
required to do so. The cost advantages of using
screening methods are not sacrificed by this investment
in analytical proficiency.
The contrast between the information value of data sets
and the quality of individual data points was illustrated
in Figure 1. Assume that the data points of Scenario B
were generated using a screening method with a higher
detection limit, less precision, more interferences, and a
tendency to be biased high compared to the more
expensive definitive method depicted in Scenario A.
However, the screening method is less expensive, and it
can be used on-site to generate results within hours of
sample collection. If the goal of the project is to detect
the presence of hot spots above a certain contaminant
concentration and greater than a given size, not only can
the analytical method in Scenario B produce a more
accurate representation of the site's contamination
(producing site decisions that are more protective and
defensible), the real-time results can be used to discover,
delineate, and remove hot spots in a single field
mobilization. This dynamic approach can save
considerable time and money over a hot spot delineation
approach phased over months or years while waiting for
each round of laboratory results to be returned and
interpreted. Instead, the screening method could produce
data of known quality that are effective for site
characterization and cleanup as long as project planning
establishes that the following conditions are met:
• The quantitation limit of the screening method is
well below the decision level(s) used to define
unacceptable concentrations of the targeted
contaminant(s).
• The analytical variability is minor compared to the
precision needed to support hot spot detection and
delineation.
• Adequate QC procedures (which may include, but
by no means are limited to, split sample analysis by
traditional laboratory methods) are used to monitor
the amount of bias in the field results, and to control
for any impact from interferences. Data quality is
judged acceptable as long as the amount of bias or
the effect of interferences is documented, and can be
shown to not cause unacceptable errors in project
decision-making.
Depending on the nature of the contaminants and the
field methods used, confirmation that a site is "clean"
for purposes of regulatory closure often will require the
analyte-specific, highly quantitative results that only
definitive laboratory methods can provide. But if prior
site characterization using the field method was
thorough, the representativeness of these expensive
samples will be assured, even if they are relatively few
in number. There will be no surprise "hits" or unexpec-
ted analytical problems with the closure samples,
allowing statistical estimation of confidence in the
closure decision to be determined cost-effectively (21).
Screening Methods Can Produce Data of Known
Quality
Results from screening methods are often viewed with
suspicion. This view is justified if the QA/QC needed to
establish validity of the data has not been performed, or
if critical uncertainties in the analytical method have not
been managed. Screening methods may be described as
analytical methods for which significant uncertainties
exist in the method's ability to positively identify
individual compounds (within a class or of a certain
type) and/or to quantitate analyte concentrations. For
example, an immunoassay method for DDT will produce
a response not just for the two DDT isomers, but also for
degradation products of DDT and possibly other
compounds with similar chemical structures. Most
immunoassay kits for environmental applications are
also designed to have a significant positive bias to mini-
mize the chance of false negative results. Obviously, it
would be foolish to expect that a result from a "DDT"
immunoassay kit would be directly comparable to a
DDT result from a definitive method (21).
Although similar kinds of uncertainties exist for
definitive methods as well, the magnitude of these
uncertainties is expected to be much less for definitive
methods than for screening methods. Yet, data users
should be aware that the same definitive method that
produces excellent recovery and precision for some
analytes on the list of potential target analytes may well
produce poor recovery and precision for other analytes
on the list. That is because optimizing the operating
conditions of an analytical technique for certain analytes
necessarily degrades performance for other analytes
that have different chemical properties. This short-
coming is particularly true for generalized methods that
have very long and diverse target analyte lists, such as
SW-846 Methods 8260 (GC/MS for volatile organic
compounds, VOCs) and 8270 (GC/MS for semi-volatile
organic compounds, SVOCs). Even if only analytical
quality is assessed, the data for some analytes from a
sample might be considered "definitive" while other
analyte results for the same sample and analytical run
should be considered "screening." This is not a fault of
environmental laboratories; this is the consequence of
demanding that a diverse list of analytes be reported
from a single instrument run to cut analytical costs. This
is an acceptable approach as long as it is quite clear to
the data user that some results should be considered
"screening" (i.e., highly uncertain) despite the fact that
they were generated from a definitive method.
The phrase "data of known quality" means that data
quality characteristics such as representativeness, degree
of bias and precision, selectivity, detection/quantitation
limits, and impact by interferences are documented. The
goal of project planning is to match an analytical method
and its ability to produce data of a certain quality with
the needs of the project. This principle is true whether
definitive or screening techniques are being considered.
With an appropriate project-specific QA/QC protocol,
estimates for a screening method's quantitation limit,
bias, and any other data quality indicators relevant to
project decision-making can be determined. Together
with evidence of sample representativeness, these para-
meters establish data of known quality. When the actual
data quality generated through the use of project-specific
QC samples is compared against the data quality needed
for making defensible project decisions, and the actual
data quality is found inadequate for decision-making
purposes, the data may still serve useful purposes as
screening quality data. Screening data may be defined as
data that provide some information (such as indications
of matrix variability or analytical interferences) useful
to furthering understanding of contaminant distribution
or behavior. But the data contain too much uncertainty
to be used for making solid project decisions that can
bring the site to final resolution. Deliberately producing
screening data (using either a screening or definitive
technique) can be a highly cost-effective strategy, as
long as the difference between decision quality data and
screening data (and how they individually will be used
in the context of the project) remains clear.
Data that are of unknown quality (because of inadequate
QC or sampling density) may possibly serve as
screening data if interpreted very carefully and conser-
vatively. But the production of data of unknown quality
is an undesirable situation that generally means there
was a breakdown in the project planning process. Data
of unknown quality cannot be used to make project
decisions (i.e., data of unknown quality cannot be
treated as decision quality data) since, by definition,
critical analytical uncertainties were not controlled and
the possibility that the data may cause a decision error is
too great.
Under the traditional paradigm, it may be weeks or
months before project personnel get data packages
returned from a laboratory and discover whether the
actual data are of decision quality, screening
quality, or unknown quality. Current procurement
practices mean that laboratories are seldom aware of a
project's actual data needs, and laboratories are seldom
authorized to explore method modifications to improve
data quality when sample interferences compromise
analytical performance. By the time a project manager
realizes that the data are inadequate, they are faced with
a difficult decision. They must choose either to signifi-
cantly delay subsequent site work and incur additional
costs while samples are recollected or reanalyzed, or to
significantly weaken the defensibility of their decisions
by "taking their best guess" based on the data available.
On-site analysis offers substantial advantages in this area,
as long as adequate systematic planning has clearly
defined the data requirements. On-site measurement
methods can easily be operated with a project-specific
QA/QC protocol tailored specifically to meet the
project's data quality requirements. During project
implementation, real-time results provide immediate
feedback to project personnel about actual data quality
as it is being generated. Contingency plans (which are an
integral feature of dynamic strategies) are activated if
analytical or matrix problems are encountered, minimi-
zing wasted analyses and delays. Analytical results that
seem out of line can be immediately investigated to rule
out clerical errors and other blunders, or to reevaluate
the representativeness of the current sampling design
(32).
When using a screening method to generate decision
quality data, the key is to openly acknowledge the
strengths and limitations of the method. In principle, this
is true whether using a definitive method or a screening
method, but there is more opportunity for error when
selecting a screening method, which is why it is
especially important that the person making the selection
have the appropriate analytical chemistry experience.
Method selection requires that the project chemist:
1) Demonstrate that the uncertainty in the data
produced by the selected method will be insignifi-
cant in relation to the nature of the decision. For
example, uncertainty about whether the actual result
is 10 ppm vs. 20 ppm may be unimportant if the
decision hinges only on whether the contaminant
concentration is greater or less than 50 ppm
(compare Figure 2);
2) Use various strategies to cost-effectively control
potential analytical uncertainties, such as evaluating
historical information to assess what contaminants
likely may or may not be present, performing a
demonstration of applicability (i.e., an analytical
pilot study) to verify anticipated site-specific perfor-
mance (15), and tailoring a confirmation testing
protocol to establish project-specific method
comparability and detect any interferences; and
3) Use less specific methods to a project's advantage.
For example, a method that detects a wide range of
chlorinated organic compounds could be used to
assess site locations for a large number of such
contaminants simultaneously. Negative results at an
appropriate level of quantitation could economically
rule out the presence of contaminants such as
polychlorinated biphenyls (PCBs), organochlorine
pesticides, chlorobenzenes, chlorophenols, etc.
Expensive traditional analyses could be reserved for
selected samples with positive results higher than a
concentration of potential concern that is kit- or
project-specific; and serve to unequivocally identify
the contaminant(s) and their actual concentration(s).
A prudent project chemist can use information about
interferences and contaminant concentrations provided
by screening method results to improve the quality and
cost-effectiveness of any follow-up definitive analyses.
Further, by collaborating with statistical expertise, the
planning team can use screening methods in conjunction
with limited definitive analysis to produce highly cost-
effective data sets based on statistically rigorous samp-
ling designs such as ranked set sampling and adaptive
cluster sampling (33).
When data are of known quality, it is possible to
designate which data results are effective for making
decisions, and which data results would not be effective.
For example, when a screening method is used, results
that fall well above or well below an action level are
often effective for making decisions about "clean"
versus "dirty" areas. However, when the uncertainty in
the method's results overlaps the action level, results
that fall within the range of overlap might not be
effective for making decisions about that action level.
Because further analysis would be required to make a
defensible decision, data within that range of uncertainty
would constitute screening data. Of course, those results
are still highly valuable since they guide sample selec-
tion for more expensive analysis. In this way, the value
of confirmation testing dollars is maximized since
samples are selected for confirmatory analysis with a
specific goal in mind, that of filling the data gaps most
relevant to decreasing decision uncertainty and estab-
lishing regulatory compliance. Figure 2 illustrates how
the data ranges that would comprise effective data in the
context of a hypothetical project might be clearly
specified in the project's sampling and analysis plan,
along with the action that will be taken when data fall
into a range that is not effective for making the project
decision.
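The decision logic behind such a plan can be stated very compactly. The sketch below (Python) encodes the hypothetical thresholds used in the Figure 2 example (a 50 ppm action level, with the kit shown effective below 45 ppm and above 65 ppm); it is illustrative only and not a prescribed procedure.

ACTION_LEVEL    = 50.0  # ppm, hypothetical PCB action level from the Figure 2 example
LOWER_EFFECTIVE = 45.0  # below this, the kit result alone supports "below action level"
UPPER_EFFECTIVE = 65.0  # above this, the kit result alone supports "above action level"

def classify_ia_result(kit_ppm):
    """Return the use to which one immunoassay (IA) kit result can be put."""
    if kit_ppm < LOWER_EFFECTIVE:
        return "effective data: decide 'below action level'"
    if kit_ppm > UPPER_EFFECTIVE:
        return "effective data: decide 'above action level'"
    # Within the range of overlap around the action level, the kit result is only
    # screening data; route the sample for confirmatory definitive analysis.
    return "screening data: send for confirmatory analysis"

for result in (12.0, 48.0, 300.0):
    print(f"{result:6.1f} ppm -> {classify_ia_result(result)}")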
Benefits of More Descriptive Terminology
Adopting the concept of "effective data" would
reinforce a more productive conceptual framework for
data generation within the context of site restoration.
The foundation of that framework is an appreciation for
the importance of a systematic planning process (such as
EPA's DQO process) (20,34). Both terms, "effective
data" and "decision quality data," equivalently serve to
support systematic planning by encouraging critical
thinking about the anticipated role of data. Both terms
intuitively demand that these questions be addressed:
• What is it that the data are to be effective for? In
other words, what is the intended use of the data?
What are the decisions to be supported? And how
"good" should those decisions be (i.e., what level of
decision confidence or certainty is desired)?
• When planning to generate effective data, what are
the strengths and limitations of the proposed
methods (costs, labor requirements, quantitation
limits, precision, expected rates for false positive
and false negative analytical results, bias, turn-
around time, complexity of the procedure, equip-
ment reliability, etc.)?
• What are the site-specific considerations that could
adversely impact analytical performance (e.g.,
physical and chemical matrix effects, and operating
conditions for onsite analysis like temperature and
humidity), and how will those things be controlled?
• What are the site-specific considerations that will
influence representative sampling (e.g., contaminant
variability in time and space, and the physical
makeup of the matrix)?
• What are the site-specific considerations that will
govern what statistical measure(s) should be
determined (e.g., the mean concentration across
some exposure unit vs. an estimate of some
maximum value)?
Conclusion
When data needs are clearly articulated, and where a
number of modern sampling and analytical options exist,
it is possible to optimize data collection so that the
information produced is sufficiently accurate for its
intended purpose, yet at a much lower cost than
previously thought possible. A judicious blending of
screening and definitive methods, used both in the
traditional laboratory setting and in the field, contributes
to generating both effective and screening data sets that
each play valuable roles in defensible, yet highly cost-
effective decision-making, as long as the distinctions
between them are understood by decision-makers.
[FIGURE 2. Effective Data Range Illustration. A project plans to classify drums of PCB waste as less than or greater than a 50 ppm action level (AL). An immunoassay (IA) kit is demonstrated to be effective for such decision-making if the kit result is < 45 ppm or > 65 ppm. Between 45 and 65 ppm, IA kit accuracy does not achieve the level needed to meet the decision confidence goal set in the project DQOs. Therefore, samples with kit results in the 45-65 ppm range will be tested by another method that can provide the needed accuracy. The diagram shows the PCB concentration by IA (ppm) axis, with effective data regions below 45 ppm and above 65 ppm and an "additional testing required" region between them.]
Emerging site characterization and monitoring tools
promise to bring down the costs of environmental
restoration and long-term monitoring, but only if regula-
tors and practitioners incorporate them appropriately
into modern, efficient work strategies, such as dynamic
work plans.
Terminology that reinforces systematic planning and
acknowledges the capabilities of new tools to assist
environmental decision-making could cultivate a more
productive attitude about data quality issues where
assessment of "data quality" is solidly anchored in the
data's intended use. Language is the instrument of
thought, and unfortunately, terms like "definitive data"
or "high quality data" have become ingrained with
misconceptions arising from the idea that prescriptive
method requirements can somehow guarantee that data
point quality will equate to decision quality. A culture
has emerged that rigidly scrutinizes data points, while
the very foundations of information quality and scien-
tific defensibility are neglected. The authors propose
adoption of the equivalent terms, "effective data" and
"decision quality data," as the foundation of a frame-
work that can refocus the environmental community and
data quality assessment on ensuring better decisions
through more defensible, protective, cost-effective, and
innovation-friendly approaches to environmental
decision-making.
Glossary
field-based measurement technologies: Equivalent to "on-site analytical methods" and a host of similar terms used to denote that the instrumentation and methods used to perform real-time analyses are in close proximity to the actual location of sample collection. Implementation ranges from hand-held instruments used outdoors to full-scale mobile laboratories.

field screening: This term is highly ambiguous and misleading. Its use is discouraged unless additional descriptors are provided to clarify whether the speaker intends to refer to screening methods (i.e., non-specific, interference-prone, imprecise analytical techniques), screening data (i.e., some useful information is provided, but not enough to defensibly support project decisions), or screening decisions (i.e., decisions that are not fully defensible because of insufficient evidence).

decision quality: Ideally, the degree to which an actual decision coincides with the decision that would have been made if complete and fully accurate information (i.e., the true state) were known (or knowable). Because the "true state" might not be known at the time of decision-making (i.e., it may not be feasible to know for certain whether the decision was correct in an absolute sense), decision quality is commensurate with the degree of confidence in the correctness of a decision. That confidence is a function of the extent to which information is weighed fairly while acknowledging the assumptions, conditions, and uncertainties that could impact the correctness of the decision. Hence, decision quality is also related to its ability to be defended in a reasonable, honest, and logical discussion of issues.

data quality: Although usage of this term has tended to be vague, the EPA has recently defined data quality as "the totality of features and characteristics of data that bear on its ability to meet the stated or implied needs and expectations of the customer" (i.e., data user) (35). In the same vein, recent EPA guidance states that "...data quality, as a concept, is meaningful only when it relates to the intended use of the data. Data quality does not exist in a vacuum; one must know in what context a data set is to be used in order to establish a relevant yardstick for judging whether or not the data set is adequate" (36). Since analytical data are generated from samples, pre-analytical considerations (such as sample representativeness and sample integrity) are crucial when determining whether the data are of sufficient quality to meet the user's need to make correct decisions.

analytical quality: An expression of the bias, precision, and other characteristics of the measurement process that reflect the ability of the analytical method to produce results that represent the true concentration of the target analyte in the sample that was presented to the analytical process. Pre-analytical considerations are not a factor in determining analytical quality.

defensible: Derived logically with all underlying assumptions and uncertainties openly acknowledged. To the degree feasible, uncertainties are controlled or documented so that the impact on the likelihood of decision errors is understood. Conclusions are thus able to withstand reasonable challenge.

definitive analytical method: As the term is used in the environmental field, an analytical method for which the degree of uncertainty in the identification and quantification of target analytes is documented, normally using ideal or well-characterized matrices. The specificity associated with an analytical measurement and the potential for influences from interferences can be identified. Example: GC-MS. Definitive methods are not free of uncertainties, but the degree of uncertainty is less than that considered to be characteristic of screening methods.

screening analytical method: Analytical methods for which higher levels of uncertainty are expected in the data produced because the method is limited in its ability to quantify the presence of specific analytes. The resulting data are expected to have higher quantitation limits, be more biased, be less precise, be less selective, and/or be more susceptible to interferences than data produced by definitive methods. Example: immunoassay kit for DDT.

data point: Analytical results for a single sample, specimen, or target analyte.

data set: Analytical results for a group of samples that are expected to be representative of the characteristic(s) of the environmental matrix under investigation.

definitive data: Although a legitimate term in science, this term is not recommended in the environmental field because the current convention for using the term has focused solely on analytical quality, and sampling uncertainty (or total measurement uncertainty) has not been addressed in practice. The term has thus not been conducive to ensuring decision quality. Current usage of the term in the environmental field seems to stem from selective reading of an EPA definition that says, in part: "Definitive data are generated using rigorous analytical methods... are analyte-specific, with confirmation of analyte identity and concentration" (4).

screening data: Data (points or set) that may provide some useful information, but that information by itself may not be sufficient to support project decision-making because the amount of uncertainty (due to sampling, analytical, or other considerations) is greater than what is tolerable. When data that would be considered screening quality (if considered in isolation) are combined with other information or additional data that manages the relevant uncertainties, the combined data/information package becomes effective for decision-making (see collaborative data sets).

effective data: Data (points or set) of known quality that can be logically shown to be effective for making scientifically defensible primary project decisions without requiring additional data or information to back them up, because both the sampling and analytical uncertainties in the data have been controlled to the degree necessary to meet clearly defined decision goals. Equivalent to "decision quality" data.

decision quality data: Term is equivalent to "effective" data.

data of known quality: Data for which the contributions to its uncertainty from both sampling and analytical variability can be estimated (either qualitatively or quantitatively) with respect to the intended use of the data, and the documentation to that effect is verifiable and recognized by the scientific community as defensible.

collaborative data sets: Data sets that might not be effective for making project decisions when considered alone, but combined together they manage all relevant uncertainties to the degree necessary to support defensible decision-making. This may sometimes be considered a type of "weight of evidence" approach.

ancillary data: Project data used to manage project activities other than those directly engaged in supporting primary project decisions. Examples of ancillary data include health & safety monitoring data, meteorological data, stratigraphic data, etc.

sample support: The size, shape (length, width, and height dimensions), and orientation of a sample in relation to the parent matrix or contaminant population it is desired to emulate.

sample representativeness: An expression of the degree to which a sample can be used to estimate the characteristics of a population under investigation with respect to the decision to be made.

analytical representativeness: An expression of the degree to which a sample analysis represents the characteristics of a population under investigation with respect to the decision to be made.

primary project decision: For projects involving the cleanup and closeout of contaminated sites, these are the decisions that drive resolution of the project. Generally these decisions are based on demonstrating the presence/absence of pollutants above/below certain thresholds. Therefore, contaminant data generated on environmental matrices by analytical chemistry methods usually drive primary project decisions.
References
(1) Crumbling, D.M., C. Groenjes, B. Lesnik, K. Lynch, J. Shockley, J. van Ee, R.A. Howe, L.H. Keith, and J.
McKenna. 2001. Managing Uncertainty in Environmental Decisions: Applying the Concept of Effective Data at
Contaminated Sites Could Reduce Costs and Improve Cleanups. Environmental Science & Technology 35:9, pp.
404A-409A.
(2) Simmons, B.P. Using Field Methods - Experiences And Lessons: Defensibility Of Field Data. California
Environmental Protection Agency, Department of Toxic Substances Control. Article available at http://clu-in.org/download/char/legalpap.pdf
(3) Crumbling, D.M. 2001. Current Perspectives in Site Remediation and Monitoring: The Relationship Between SW-846, PBMS, and Innovative Analytical Technologies. EPA 542-R-01-015. August. Available at http://cluin.org/tiopersp/
(4) U.S. Environmental Protection Agency (USEPA). 1993. Data Quality Objectives Process for Superfund, Interim
Final Guidance. EPA 540-R-93-071. September. (See pages 41 and 43.)
(5) U.S. Environmental Protection Agency (USEPA). 2001. Innovations in Site Characterization Technology
Evaluation: Real-time VOC Analysis Using a Field Portable GC/MS. EPA 542-R-01-011. August. Document
available for download from http://cluin.org/char1_edu.cfm#site_char.
(6) Smith, R.-K. 2001. Interpretation of Inorganic Data. Genium Publishing Corporation, Canada. http://www.genium.com
(7) Francoeur, T.L. 1997. Quality Control: The Great Myth. In the Proceedings of Field Analytical Methods for
Hazardous Wastes and Toxic Chemicals, a specialty conference sponsored by the Air & Waste Management
Association, January 29-31, 1997. Las Vegas, NV, pp. 651-657.
(8) Homsher, M.T.; F. Haeberer; P. J. Marsden; R.K. Mitchum; D. Neptune; and J. Warren. 1991. Performance Based
Criteria, A Panel Discussion. Environmental Lab, October/November.
(9) Jenkins, T.F., C.L. Grant, G.S. Brar, P.G. Thorne, T.A. Ranney, and P.W. Schumacher. 1996. Assessment of
sampling error associated with collection and analysis of soil samples at explosives-contaminated sites. Special
Report 96-15. Army Corps of Engineers/Cold Regions Research and Engineering Laboratory. National Technical
Information Service, Springfield, VA. Report available from http://www.crrel.usace.army.mil/techpub/CRREL_Reports/reports/SR96_15.pdf
(10) Jenkins, T.F., M.E. Walsh, P.G. Thorne, S. Thiboutot, G. Ampleman, T.A. Ranney, and C.L. Grant. 1997.
Assessment of sampling error associated with collection and analysis of soil samples at a firing range contaminated
with HMX; Special Report 97-22. U.S. Army Corps of Engineers/Cold Regions Research and Engineering
Laboratory, National Technical Information Service, Springfield, VA. Report available from http://www.crrel.usace.army.mil/techpub/CRREL_Reports/reports/SR97_22.pdf
(11) Fairless, B.J. and D.I. Bates. 1989. Estimating the quality of environmental data. Pollution Engineering
March: 108-111.
(12) Korte, N. 1999. A Guide for the Technical Evaluation of Environmental Data. Technomic Publishing Company,
Inc. Lancaster, PA. http://www.techpub.com/
(13) Barcelona, M.J. 1988. Overview of the sampling process (Chapter 2) in Principles of Environmental Sampling,
2nd ed. L.H. Keith, Ed. American Chemical Society. Washington, DC. 1996.
(14) Smith, R.-K. 2000. Interpretation of Organic Data. Genium Publishing Corporation, Canada. http://www.genium.com
(15) U.S. Environmental Protection Agency (USEPA). Test Methods for Evaluating Solid Waste, Physical/Chemical
Methods (SW-846). Available from the Office of Solid Waste Methods Team website: http://www.epa.gov/SW-846/index.htm. See Preface and Chapter 2, including the November 2000 update of Chapter 2 in Draft Update IVB.
(16) Ramsey, C.A. and J. Suggs. 2001. Improving Laboratory Performance Through Scientific Subsampling
Techniques. Environmental Testing & Analysis, March/April. Article available at http://cluin.org/char1_edu.cfm#stat_samp
(17) U.S. Environmental Protection Agency (USEPA). 1998. Office of Inspector General Audit Report: EPA Had
Not Effectively Implemented Its Superfund Quality Assurance Program. E1SKF7-08-0011-8100240. September 30.
Documents available at http://www.epa.gov/oigearth/audit/list998/8100240.pdf
(18) U.S. Environmental Protection Agency (USEPA). 1998. EPA Guidance for Quality Assurance Project Plans
(QA/G-5). EPA 600/R-98/018. February. Available from http://www.epa.gov/quality/qs-docs/g5-final.pdf
(19) U.S. Environmental Protection Agency (USEPA). 1989. Soil Sampling Quality Assurance User's Guide (2nd
Edition). EPA/600/8-89/046. Available through the Clu-In website at http://cluin.org/chartext_edu.htm#stats
(20) U.S. Environmental Protection Agency (USEPA). 2000. Guidance for the Data Quality Objectives Process for
Hazardous Waste Sites (G-4HW). EPA/600/R-00/007. Washington, DC. January 2000. http://www.epa.gov/quality1/qs-docs/g4hw-final.pdf
(21) U.S. Environmental Protection Agency (USEPA). 2000. Innovations in Site Characterization Case Study: Site
Cleanup of the Wenatchee Tree Fruit Test Plot Site Using a Dynamic Work Plan. EPA-542-R-00-009. August.
Available from the Clu-In website: http://cluin.org/download/char/treefruit/wtfrec.pdf
(22) Crumbling, D.M. 2001. Current Perspectives in Site Remediation and Monitoring: Clarifying DQO Terminology
Usage to Support Modernization of Site Cleanup Practice. EPA 542-R-01-014. August. Available at http://cluin.org/
tiopersp/
(23) Popek, E.P. 1997. "Investigation versus Remediation: Perception and Reality" in Proceedings of WTQA '97, the
13th Annual Waste Testing and Quality Assurance Symposium, pp. 183-188. Paper available at http://cluin.org/
products/dataquality/
(24) Shefsky, S. 1997. Comparing Field Portable X-Ray Fluorescence (XRF) to Laboratory Analysis of Heavy Metals
in Soil. Paper available at http://www.niton.com/shef02.html
(25) U.S. Environmental Protection Agency (USEPA). EPA Quality Manual for Environmental Programs (5360A1). May 2000. Available from the EPA Quality System website: http://www.epa.gov/quality1/qs_docs/5360.pdf
(26) Burton, J.C. 1993. Expedited Site Characterization: A Rapid, Cost-Effective Process for Preremedial Site
Characterization, Superfund XIV, Vol. II, Hazardous Materials Research and Control Institute, Greenbelt, MD, pp.
809-826.
(27) American Society for Testing and Materials (ASTM). 1998. D 6235-98a Standard Practice for Expedited Site
Characterization of Vadose Zone and Ground Water Contamination at Hazardous Waste Contaminated Sites.
Conshohocken, PA.
(28) Robbat, A. 1997. A Guideline for Dynamic Workplans and Field Analytics: The Keys to Cost-Effective Site
Characterization and Cleanup, sponsored by the President's Environmental Technology Initiative, through the U.S.
Environmental Protection Agency, Washington, DC. http://cluin.org/download/char/dynwkpln.pdf
(29) U.S. Department of Energy (DOE). 2001. Adaptive Sampling and Analysis Programs (ASAPs). DOE/EM-0592.
August. Available from DOE's Office of Environmental Management/Office of Science and Technology/
Characterization, Monitoring, and Sensor Technology Crosscutting Program and Subsurface Contaminants Focus
Area website: http://apps.em.doe.gov/ost/pubs/itsrs/itsr2946.pdf
(30) Crumbling, D.M. 2001. Current Perspectives in Site Remediation and Monitoring: Using the Triad Approach
to Improve the Cost-Effectiveness of Hazardous Waste Site Cleanups. EPA 542-R-01-016. August. Available
at http://cluin.org/tiopersp/
(31) U.S. Department of Energy. 1998. Innovative Technology Summary Report: Expedited Site Characterization.
DOE/EM-0420; Tech ID: 77. December. Available at http://ost.em.doe.gov/pubs/itsrs/itsr77.pdf
(32) Crume, C. The Business of Making a Lab Field-Portable: Getting the Big Picture on an Emerging Market.
Environmental Testing & Analysis. November/December 2000. pp. 28-37. Available on-line at: http://cluin.org/char1_edu.cfm#usin_fiel
(33) For additional information concerning statistical sampling designs, refer to EPA's Cleanup Information website
at http://cluin.org/chartext_edu.htm#stats
(34) U.S. Environmental Protection Agency (USEPA). 1999. Review of the Agency-Wide Quality System. Letter
Report of the Science Advisory Board. EPA-SAB-EEC-LTR-99-002. February 25. Report accessible at
http://www.epa.gov/science1/eec19902.pdf
(35) U.S. Environmental Protection Agency (USEPA). 2000. Office of Environmental Information Management
System for Quality. http://www.epa.gov/oei/quality.htm
(36) U.S. Environmental Protection Agency (USEPA). 2000. Guidance for Data Quality Assessment: Practical
Methods for Data Analysis (QA/G-9, QA00 Update). EPA 600/R-96/084. July. http://www.epa.gov/quality/qs-docs/g9-final.pdf