Deana M. Crumbling
Jennifer Griffith
Dan M. Powell
REMEDIATION Spring 2003
Improving Decision Quality: Making the
Case for Adopting Next-Generation Site
Characterization Practices
Better site characterization is critical for cheaper, faster, and more effective cleanup. This fact is
especially true as cleanup decisions increasingly include site redevelopment and reuse consider-
ations. However, established attitudes about what constitutes "data quality" create many barriers
to exciting new tools capable of achieving better characterization, slowing their dissemination
into the mainstream. Traditional approaches to environmental "data quality" rest on simplifying
assumptions that are rarely acknowledged by the environmental community. Data quality assess-
ments focus on the quality of the analysis, while seldom asking what impact matrix heterogeneity
has had on analytical results. Assessments of data quality typically assume that chemical contam-
inants are distributed nearly homogeneously throughout environmental matrices and that con-
taminant-matrix interactions are well behaved during analysis. Yet, these assumptions seldom
hold true for real-world matrices and contaminants at scales relevant to accurate risk assessment
and efficient remedial design. For the site cleanup industry to continue technical advancement,
over-simplified paradigms must give way to next-generation models that are built on current sci-
entific understanding. If reuse programs such as Brownfields are to thrive, the scientific defensi-
bility of individual projects must be maintained at the same time as characterization and cleanup
costs are lowered. The U.S. Environmental Protection Agency (EPA) offers the Triad Approach as
an alternative paradigm to foster highly defensible, yet extremely cost-effective reuse decisions.
© 2003 Wiley Periodicals, Inc.*
INTRODUCTION
A number of exciting developments are emerging in the field of environmental cleanup
and revitalization. This period of exploration and transition provides tremendous oppor-
tunities for evolving the practices and expectations of this dynamic arena. The accumula-
tion of institutional experience is transforming the field as we understand more fully
what works and what does not work for real-world site investigation and cleanup.
Continued investment of public and private dollars into basic and applied research adds
to a growing understanding of the mechanisms governing contaminant release, fate,
transport, transformation, and interaction with biotic systems, and how these mecha-
nisms are influenced by human-engineered interventions. Our ability to understand and
predict contaminant behavior improves as new sampling and analytical tools are devel-
oped that permit better spatial resolution of contaminant distributions. Better resolu-
tion, in turn, supports better site characterizations that support more defensible
decisions about whether site-specific contamination may pose risks, and if so, what risk
reduction strategies can give the most "bang for the buck." Yet, important barriers exist
that hinder widespread use of these tools.
Published online in Wiley InterScience (www.interscience.wiley.com). DOI: 10.1002/rem.10066
This article is a U.S. Government work and as such, is in the public domain in the United States of America.
© 2003 Wiley Periodicals, Inc.
It is U.S. Environmental Protection Agency (EPA) policy to make decisions based
on sound science. If sampling for contamination and its effects is poorly done, the
accuracy of the resulting conceptual site model and all subsequent decisions based on it
(especially risk estimation and remedy design) may be faulty. To support the scientific
defensibility of decisions involving contaminated sites, EPA has articulated the Triad
Approach (Crumbling, 2001a). The Triad is a conceptual and strategic framework
that explicitly recognizes the scientific and technical complexities of site characteri-
zation, risk estimation, and treatment design. In particular, the Triad Approach
acknowledges that environmental media are fundamentally heterogeneous on a vari-
ety of scales, a fact that complicates sampling design, analytical method performance,
and toxicity estimations.
The challenge of adopting the Triad into routine practice is significant. The nor-
mal growing pains experienced by any science-based field of endeavor are magnified
for the environmental cleanup industry. Relationships among federal, state, and local
regulators and the regulated community are complex. These relationships intersect
with an array of other players including consulting engineering firms, property
investors, academia, citizen stakeholders, technology vendors, and non-governmental
organizations. The diverse interests and motivations of these stakeholders influence
the site investigation and cleanup process. Modifying familiar regulatory, procure-
ment, business, and technical practices to accommodate evolving scientific knowl-
edge and technology requires intensive coordination and outreach. For the entire
field to embrace improved strategies and technologies, all parties must "move"
together: regulators must be open to innovations in order for vendors to risk mar-
keting them; consulting firms must offer innovative services to their clients while the
clients, as educated consumers, must encourage consulting firms to "think outside of
the box." Everyone must be willing to break out of traditional comfort zones to
apply the benefits of scientific progress to regulation and field practices. Second-gen-
eration tools and strategies will produce better data quality and decision defensibility
than the first-generation procedures inherited from the 1970s and 1980s that con-
tinue to be used today.
HETEROGENEITY AS A FUNDAMENTAL DRIVER FOR CHANGE
For the vast majority of site investigations, contaminant data are generated by taking
a relatively few small-volume samples from an environmental or waste matrix. The
per-sample costs for trace level analysis are high because satisfactory analytical per-
formance requires sophisticated instrumentation along with experienced and well-
educated operators. Therefore, there is strong financial motivation to minimize the
number of samples. Compounding the potential for a non-representative data set is
the fact that, especially for solid samples, an even smaller volume of the sample
(i.e., a subsample) is analyzed to generate the result. Consequently, the volume of
matrix actually analyzed is tiny compared with the volume of parent matrix to
which the sample results will be extrapolated, increasing the risk of highly variable
results and skewed data sets (Gilbert & Doctor, 1985). If contaminants of interest
occurred at nearly constant concentrations throughout the parent matrix, then
drawing conclusions about the parent matrix from a few small samples would be
fairly straightforward. Unfortunately, real environmental matrices are seldom
homogeneous.
Good science involves making observations (such as generating data results) and drawing
conclusions from those observations to make decisions. Key to good science is not
extrapolating beyond the evidence. Field studies show that matrix heterogeneity severely
limits the confidence with which analytical results can be justifiably extrapolated beyond
the tiny sample analyzed. Environmental heterogeneity is a consequence of the release
mechanism(s) and the partitioning behavior of the analyte in conjunction with the trans-
port and transformation mechanisms offered by the environment. For example, Exhibit
1 illustrates data from extensive studies of explosive residues performed by the Cold
Regions Research and Engineering Laboratory (CRREL) of the U.S. Army Corps of
Engineers (Jenkins et al., 1996). As you can see, there is tremendous contaminant con-
centration variability over a very small area. The site soils depicted by this particular data
set were contaminated due to activities that spilled solid explosive material from decom-
missioned munitions. To characterize the short-range variability of trinitrotoluene (TNT,
in units of ppm) in surface soil, grab samples were collected in a seven-location wheel
configuration. The diameter of the wheel was 4 feet; therefore each sample within the
wheel is only about 2 feet from its nearest neighbors. Individual grab samples were col-
lected from a 0- to 6-inch depth using a 2-inch diameter stainless steel auger. Each of
the seven soil cores was thoroughly homogenized in separate containers.
[Homogenization was critical because the particulate nature of matrix constituents cre-
ate ample opportunities for subsampling procedures to exacerbate data variability
(Gerlach et al., 2002; Ramsey & Suggs, 2001).] Subsamples from each homogenized
sample were analyzed by both an on-site analytical method (EnSys Colorimetric Test Kit;
EPA SW-846 Method 8515) and by a traditional laboratory method (EPA SW-846
HPLC Method 8330).

[Exhibit 1. Variability of TNT in soil across a 4-foot diameter circle (units = ppm) (from Jenkins et al., 1996). The six legible on-site/laboratory result pairs are 331/286, 500/416, 1,280/1,220, 24,400/27,700, 39,800/41,400, and 27,800/42,800; only about 5 percent of the variability is due to all other sources, including the difference between the two completely different analytical methods.]
Exhibit 1 shows that there is general agreement of the on-site colorimetric results
with the off-site HPLC results, in marked contrast to the differences among the seven
sampling locations. Using common grab sampling scenarios, any one of these samples
could have been collected and assumed to represent the TNT concentration for an area
much larger than the 2-inch diameter core. But that standard assumption would be erro-
neous. A single grab sample from location #1 would lead to very different conclusions
about the degree of TNT contamination than a sample from location #3 only 2 feet
away. When the variability in this data set of 14 results is partitioned, at least 95 percent
of the total variability (as statistical variance) in this data set is due simply to sample
position (that is, a consequence of matrix heterogeneity), whereas using two very differ-
ent analytical methods contributes no more than 5 percent of the variation. Stated
another way, matrix heterogeneity over a distance of only 4 feet caused over 19 times
more variability (i.e., uncertainty) in data results than did the choice of analytical
method. Improving the performance of the analytical methods would not reduce the
uncertainty in the data set, since nearly all of the variability is caused by true sample-to-
sample variation.
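To make the variance partitioning concrete, the short Python sketch below re-creates the calculation from the six legible Exhibit 1 pairs using a standard one-way analysis-of-variance decomposition. This is an illustrative approximation, not the CRREL computation itself (which used all seven locations), but it reproduces the same conclusion: the between-location component (here about 97 percent) dwarfs the between-method component (about 3 percent).

    # Partition the Exhibit 1 TNT results (ppm) into between-location and
    # between-method variance components. Values are the six legible
    # on-site/laboratory pairs; an illustrative sketch, not CRREL's analysis.
    onsite = [331, 500, 1280, 24400, 39800, 27800]
    lab = [286, 416, 1220, 27700, 41400, 42800]

    pairs = list(zip(onsite, lab))
    grand_mean = sum(onsite + lab) / (2 * len(pairs))

    # Between-location sum of squares: spread of the per-location means.
    loc_means = [(a + b) / 2 for a, b in pairs]
    ss_location = sum(2 * (m - grand_mean) ** 2 for m in loc_means)

    # Within-location sum of squares: disagreement between the two
    # analytical methods at the same sampling point.
    ss_method = sum((a - m) ** 2 + (b - m) ** 2
                    for (a, b), m in zip(pairs, loc_means))

    total = ss_location + ss_method
    print(f"sample position:   {100 * ss_location / total:5.1f}% of variance")
    print(f"analytical method: {100 * ss_method / total:5.1f}% of variance")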
Conclusions from this CRREL study on TNT residues found "enormous short-range
heterogeneity and sampling error [that] overwhelmed analytical error." The investigators
recommended that the quality of site characterization data from explosives-contaminated
sites could be improved by "reducing sampling error ... using composite sampling, in-field
sample homogenization, and on-site analysis [as] an efficient method of producing data
that are accurate and precise, and also representative of the area" (Jenkins et al., 1996).
Similar extremes of heterogeneity have been found for other explosive residues. "Spatial
heterogeneity of HMX concentrations was large on both short- and mid-range scales and
this factor dominated the overall uncertainty associated with site characterization.
Relatively minor uncertainties were due to analytical error" (Jenkins et al., 1997).
Heterogeneity of soil pollutants is not limited to explosives contamination. The
ratio of sampling to analytical variability was estimated for a data set from a Brownfields
redevelopment project (a former scrap-yard site). The data set was composed of 291
analyses for a suite of metal contaminants and 570 analyses for a suite of polycyclic aro-
matic hydrocarbon compounds (PAHs). Samples were collected in an approximate grid
design from an area roughly 800 × 300 feet to a depth of 12 feet (a volume of about
110,000 cubic yards of site materials). Despite the large number of samples, the com-
bined volume of field samples represents something on the order of one-millionth of the
site volume. If the volume of the subsamples actually analyzed is compared to the vol-
ume of original matrix, the fraction is more on the order of one-billionth. No wonder
then that the contribution of sampling variability to overall data uncertainty ranged up
to 99.9999 percent or more for those analytes that tended to show hotspot patterns
attributable to discrete spills (lead and PAH analytes). Arsenic, on the other hand,
demonstrated a low natural background that tended to even out its variability, making it
one of the "better"-behaved analytes with only 90 percent of the data uncertainty con-
tributed by sampling considerations.
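The order-of-magnitude arithmetic behind these fractions is easy to check. In the sketch below, the site dimensions and analysis counts come from the text, while the per-sample (~200 cm3) and per-subsample (~5 cm3) volumes are assumed typical values and each analysis is treated as a distinct physical sample; even these generous assumptions leave the analyzed volume a vanishingly small fraction of the site.

    # Back-of-the-envelope volume fractions for the scrap-yard example.
    # Site dimensions and analysis counts are from the text; sample and
    # subsample volumes are assumed for illustration.
    CM3_PER_CUBIC_YARD = 764_555

    site_yd3 = (800 * 300 * 12) / 27          # ~107,000 cubic yards
    site_cm3 = site_yd3 * CM3_PER_CUBIC_YARD

    n_analyses = 291 + 570                    # metals + PAH analyses
    field_cm3 = n_analyses * 200              # ~200 cm3 per field sample
    analyzed_cm3 = n_analyses * 5             # ~5 cm3 per analyzed subsample

    print(f"site volume:     {site_yd3:,.0f} cubic yards")
    print(f"field samples:   1 part in {site_cm3 / field_cm3:,.0f}")
    print(f"analyzed volume: 1 part in {site_cm3 / analyzed_cm3:,.0f}")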
Nor is environmental heterogeneity confined to vadose soils. As technologies such
as direct push-deployed detection systems and passive diffusion samplers are applied to
subsurface and aquifer characterization, marked vertical heterogeneities in contaminant
concentrations are being found (Vroblesky, 2001). This high degree of heterogeneity
complicates interpretation of groundwater data since the proportion of mixing between
water from more contaminated horizons with water from less contaminated horizons is
an uncontrolled and unconsidered variable for the vast majority of monitoring well sam-
pling plans. Yet this variable determines the contaminant concentrations in the samples
received by the laboratory. On the other hand, careful delineation of vertical hetero-
geneity supports vastly improved sampling designs, as well as better targeting of remedi-
ation strategies to save time and money (Tillman & Sohl, 2001).
REPRESENTATIVENESS TOO COSTLY UNDER THE TRADITIONAL
MODEL
Actually, the impact of heterogeneity on data uncertainty has been known for some time.
By the 1980s, investigators were discovering that matrix heterogeneity compromised
their ability to draw reliable conclusions from analytical data. Consequently, EPA issued
guidance documents with suggestions for focusing efforts on understanding and managing
sampling uncertainties for commonly heterogeneous matrices such as soils. These docu-
ments introduced the concept of "sample support" (the physical volume, orientation, and
particulate makeup of samples) as critical to sampling design and data interpretation
(USEPA, 1989, 1990). EPA also published the deliberations of an expert panel convened
in 1991 to explore the ramifications of environmental variability. The panel noted that
studies were showing that 70 to 90 percent of data variability was caused by "natural," in-
place variability, with only 10 to 30 percent of variability being contributed by the rest of
the data generation process (such as the sample collection procedures, sample handling,
laboratory handling and cleanup, laboratory analyses, data handling, data reporting, and
data interpretation). The import was summarized by a panel member: "So when you
think about where you might get the best advantage by spending another dollar, in the lab
versus spending the same increment of money to get a better handle on that natural pop-
ulation variability, it is obvious that you get a much bigger bang for your buck by deter-
mining natural population variability. That is where the real bulk of the error is typically
found" (Homsher et al., 1991).
The panel's recommendation that environmental workers focus more on understand-
ing sampling variability had little impact on regulatory paradigms and field practices at
the time. Although representativeness was considered important to data quality [it is the
"R" in the "PARCC parameters" of Precision, Accuracy/Bias, Representativeness,
Comparability, and Completeness (USEPA, 1998)], careful management of sampling rep-
resentativeness was inconceivable from a cost standpoint when standard laboratory analy-
ses were the only game in town. It was much easier to oversee "data quality" if that
concept was defined in terms of analytical methods and laboratory performance. The
problem with defining data quality in that way is that analytical data are generated from
environmental samples, and even perfect analytical quality is no guarantee that sample col-
lection will produce data that are representative of the decisions the data ultimately are
used to make. The more heterogeneous the matrix, the more likely it is that a small data
set will be skewed by the "luck of the draw."
THE CONSEQUENCES OF IGNORING HETEROGENEITY AND
REPRESENTATIVENESS
Decisions about risk are usually based on an estimate of average concentrations across an
exposure unit. In contrast, decisions about risk reduction strategies are usually based on
discriminating between zones with higher concentrations requiring treatment or
removal and lower concentrations that may not require remedial attention. If data are
not representative of the decision being made (averages in one case, extremes in the
other case), the data will lead to flawed decisions. The first-generation data quality
model that considers data quality only in terms of analytical method performance
ignores sampling uncertainties and the impacts of matrix heterogeneity. Given what we
know now, this cannot be considered "sound science." It is imperative that environmental
programs update their data quality model to reflect current scientific understanding.
Fortunately, technological tools now exist to cost-effectively implement a sounder data
quality model (USEPA, 2002). Another encouraging trend is greater emphasis on geo-
statistical analysis. Whereas classical statistical models may be more appropriate for eval-
uating risk, geostatistics is better suited for modeling spatial patterning that may drive
more cost-effective remedial designs.
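For readers unfamiliar with the geostatistical toolkit, its basic device for quantifying spatial patterning is the empirical semivariogram: half the average squared difference between pairs of measurements as a function of the distance separating them. A minimal sketch follows, using hypothetical one-dimensional transect data (all coordinates and values invented for illustration).

    # Minimal empirical semivariogram sketch for hypothetical transect data.
    data = [(0, 12.0), (2, 14.0), (4, 30.0), (6, 95.0),
            (8, 88.0), (10, 41.0), (12, 18.0), (14, 11.0)]  # (feet, ppm)

    def semivariance(points, lag, tol=1.0):
        """Half the mean squared difference for pairs ~lag feet apart."""
        sq = [(v1 - v2) ** 2
              for i, (x1, v1) in enumerate(points)
              for x2, v2 in points[i + 1:]
              if abs((x2 - x1) - lag) <= tol]
        return sum(sq) / (2 * len(sq))

    # Rising gamma with lag indicates spatial structure a classical
    # (location-blind) summary statistic would miss.
    for lag in (2, 4, 6, 8):
        print(f"lag {lag} ft: gamma = {semivariance(data, lag):.1f}")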
An even more compelling reason to update the first-generation data quality model
is to reduce the financial and liability risks created when non-representative data lead to
erroneous decisions. Attempts to save resources in the short run by skimping on the data
collection program subsequently waste far more resources (including client and stake-
holder good will) when erroneous decisions are discovered. For example, a quality
assurance manager for a major consulting firm observed that "[r]eductions in the com-
prehensiveness of the field investigation, based on budgetary considerations, schedule-
driven approval of incomplete plans, superficial or protocol-oriented reviews by
technically unqualified agency personnel, all come back to haunt the stakeholders at
remediation time...remedial action case histories have, in fact, proved [that] the percep-
tion of site conditions based upon site investigation does not reflect reality. Use of site
investigation data invariably leads to underestimating or overestimating the extent of
contamination, sometimes on an alarming scale. In either case, ramifications may be sub-
stantial with respect to remediation budgets and public perception of the environmental
industry" (Popek, 1997).
THE TRIAD APPROACH: A FRAMEWORK TO MOVE BEYOND FIRST-
GENERATION DATA PARADIGMS
So how can the environmental community move toward next-generation paradigms that
are scientifically sound and protective, yet lower project costs so that more sites can be
investigated and returned to productive reuse? What vision should guide the environ-
mental cleanup field as it matures its fundamental assumptions, its regulatory expecta-
tions, its scientific and technology sophistication, and its implementation strategies? EPA
has articulated the Triad Approach to fill this role. As illustrated in Exhibit 2, the Triad
refers to three primary elements: (1) detailed and specific systematic project plan-
ning that begins by clearly defining desired project outcomes (e.g., potential goals for
site reuse), and exploring the uncertainties (i.e., unknowns) that stand in the way of
achieving those outcomes; (2) dynamic work planning strategies that can drasti-
cally save time and money over the project lifetime; and (3) real-time data generation
and interpretation to support the real-time decision making of the dynamic work plan,
while cost-effectively managing sampling uncertainty and data representativeness. The
Triad Approach allows the investigator to adjust field activities to address specific condi-
tions, increasing site-specific information in an efficient and inexpensive manner and
improving decision-making confidence.

[Exhibit 2. All three elements of the Triad Approach are necessary to decrease investigation and cleanup costs, but systematic planning is the critical foundation for successful projects. The diagram ("A Systems Approach Framework: The Triad Approach") shows the three elements: systematic project planning, dynamic work plan strategy, and real-time measurement technologies.]
The Triad Approach is a continuation and synthesis of efforts begun in the 1980s
by Jacqueline Burton and colleagues to make site investigation and cleanup more
effective and less costly (US DOE, 1998). Over the years, like-minded innovators
from the governmental, academic, and private sectors have contributed to the theoret-
ical and practical considerations that the Triad Approach embraces (Tetra Tech EM
Inc., 1997).
The Triad Approach revolves around identifying, understanding, and managing the
uncertainty in site decisions. When scientific data are used to provide input to the deci-
sion-making process, the uncertainty in that data needs to be managed to a degree com-
mensurate with the desired decision confidence. Because most data uncertainty stems
from sampling variability, the Triad Approach maximizes the use of new technologies to
cost-effectively increase sampling density so that contaminant distributions and spatial
heterogeneity can be explicitly characterized at the scale of project decisions. Dynamic
work plans are used to simultaneously cut project costs while improving decision qual-
ity: data gaps are identified and resolved in the field so that the number of remobiliza-
tions back to the site can be minimized (Robbat, 1997). Better site characterization
(decreased uncertainty in the conceptual site model) is possible because plumes can be
chased and spatial patterns delineated in real time by daily adapting the sampling design
according to newly acquired information (US DOE, 2002).
SYSTEMATIC PLANNING FOCUSES ON MANAGING UNCERTAINTY
TO REACH PROJECT GOALS
Detailed project-specific systematic planning is the most critical element of the Triad. Without it,
applications of the other two elements can become mired in confusion and quickly lose focus and
direction. A dynamic work plan strategy might save as much as 50 percent of overall pro-
ject costs, but only if there has been sufficient investment in up-front planning. In a cost-
conscious world, a significant barrier to the Triad is that most budget and staffing
structures are not designed to support this level of intense planning. Revisions to these
structures would permit a greater investment of resources before the fieldwork phase
begins in order to save time and resources later.
Successful projects tend to actively engage all stakeholders in transparent,
consensus-driven project planning, implementation, and decision making. A concep-
tual site model (CSM) that goes beyond the typical exposure modeling for risk assess-
ment is an essential planning and communication tool. The CSM is any graphical or
descriptive device used to organize what already is known about the site and what
needs to be known in order to reach the project's ultimate goals. Articulation of the
CSM helps the project team identify areas of decision uncertainty and determine what
additional information must be obtained in order to support decision making. Once
the data gaps contributing to decision uncertainty are identified, site characterization
activities are planned to provide the required information. By working from the top-
down, a clear articulation of decision uncertainty allows identification of data gaps.
This saves money during data generation because only data that is valuable and neces-
sary to decision making is collected. Development of the CSM should involve multi-
disciplinary expertise to clarify project goals, predict technical and legal issues likely
to arise, identify risk assessment data needs, and anticipate cleanup alternatives. All
these considerations should drive data gathering (sampling and analysis) strategies. The
various factors to be considered during systematic planning for remediation projects
are covered by the U.S. Army Corps of Engineers in their Engineering Manual for
Technical Project Planning (USACE, 1998).
Systematic planning for the dynamic work plan often involves developing a series of
"if-then" statements or a decision tree that can be implemented in real time as data fills
knowledge gaps. If certain unknowns cannot be cost-effectively filled, then the decision-
making process can be constructed to accommodate that limitation. Sometimes it is
more cost-effective to manage decision uncertainty by erring on the side of caution and
taking the most protective action rather than paying to generate the data needed to
prove whether the action is absolutely necessary. In this way, decisions can be made
despite some unresolved uncertainties, yet the decisions still remain protective of
human health and the environment.
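A sketch of what one branch of such an "if-then" decision tree might look like in code form follows; the gray zone straddling the action level, the thresholds, and the example results are all hypothetical placeholders of the kind that would be set during systematic planning.

    # Minimal sketch of a dynamic work plan decision rule. The 25-100 ppm
    # gray zone straddles a hypothetical 50 ppm action level; results
    # inside it trigger lab confirmation rather than an immediate call.
    GRAY_ZONE = (25.0, 100.0)

    def next_step(field_result_ppm: float) -> str:
        """Return the real-time decision for one field measurement."""
        low, high = GRAY_ZONE
        if field_result_ppm < low:
            return "no further action; move to the next grid node"
        if field_result_ppm > high:
            return "flag for removal; tighten the grid to delineate the boundary"
        return "collect a split sample for fixed-lab confirmation"

    for result in (4.0, 62.0, 410.0):
        print(f"{result:6.1f} ppm -> {next_step(result)}")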
The real-time decision making of dynamic work plans requires that the data used to
implement the decision tree be turned around in an equivalent "real-time" time frame.
Successful projects show that the selection of analytical methods is best mixed and
matched according to the site-specific data needs. Depending on the nature of the pro-
ject, real-time measurements have been provided by field-portable methods (such as
hand-held instruments and kits of varying technical complexity), by mobile laboratories
(which may run kits, standard laboratory instrumentation, or both), by rapid turnaround
analysis at a traditional fixed laboratory, or a combination of all three. Contracting to
operate screening analytical methods in traditional laboratories can be more cost-effec-
tive than trying to operate kits under certain circumstances, such as difficult field condi-
tions. Variables that may compromise kit performance in the field, such as ambient
temperature, dust, humidity, reagent storage, water and power supplies can be much
better controlled in a mobile or fixed laboratory setting. If a fixed laboratory happens to
be located nearby, rapid turnaround to support a dynamic work plan may be just as fea-
sible as doing the analyses in the field.
ANCHORING DATA QUALITY IN DECISION MAKING
Systematic planning is critical for managing the analytical uncertainty inherent to all
environmental data. This is particularly important for measurements produced in the
field. Yet current practices present many opportunities for errors. As noted previously,
field methods provide important advantages: real-time results can support a dynamic
work plan, and increased numbers of samples can support management of sampling
uncertainty. However, many practitioners associate common field methods with "field
screening," and the first-generation data quality model assumes that "screening quality
data" is of little value. Updating our current data quality model means updating this
thinking. If "data quality" is assessed according to the ability to support confident deci-
sions, then "screening quality data" should be defined as those data that provide some
useful information, but not enough information to alone support decision making at the
desired level of certainty (Crumbling, 2001b). Since the term "data quality" must
include the concept of "sampling quality," as well as the more familiar analytical quality,
uncertainty either about the sample representativeness OR about the analytical quality
(or both) with respect to the intended decision could render data as screening quality.
Therefore, perfectly accurate analytical methods will produce screening quality data if
the representativeness of the sample is not known. On the other hand, screening analyti-
cal methods may produce data effective for making decisions if both the sampling repre-
sentativeness and the analytical quality are known to be sufficient to meet
decision-making needs.
Even if the analytical quality produced by a field method is not sufficient to support
final decisions about risk assessment or regulatory compliance, the field method may
serve a critical role in managing the sampling uncertainty of fixed laboratory data. This
is, in fact, what many practitioners intend to convey when they talk about "field screen-
ing." However, because of pervasive confusion about the relationships between analytical
method performance, data quality, and the decision-making process, many regulators
have reacted with distrust to the term "field screening" and its associated technologies.
To move the environmental field past this hurdle, it would be wise to change several
common practices.
Recommended Change: Stop using the term "field screening." As currently
used, this term is ambiguous and misleading. It echoes and perpetuates the myths
of the first-generation data quality model (Crumbling, 2002). The truth is that not all
field methods are screening analytical methods (although it is true that many are).
Additionally, the location where data are generated (i.e., in a fixed laboratory or in the
field) should not be assumed to dictate the quality of the data. As should be clear by
now, data generated in a traditional laboratory setting should be considered screening
quality if the sampling representativeness, a key aspect of data quality, is unknown.
Instead of "field screening," more neutral and accurate terms, such as "field analytical,"
"on-site measurement," "real-time analysis," and similar terms, should be used to avoid
implying that data quality is tarnished simply by virtue of being produced in the field.
Recommended Change: Use internally consistent and clearly defined
terms. When discussing data quality concepts, the Triad Approach makes careful distinc-
tions that explicitly link data quality assessment to the data's ability to help manage deci-
sion uncertainty.
1. Data of known quality: These are data for which each step of the sampling, sub-
sampling, and analytical procedures and performance is documented. This allows a data
user to establish estimates for the sampling representativeness, for the analytical bias,
precision, and reporting limits, and for the possible impact of interferences on the mea-
surement process. This allows the data user to decide whether or not the uncertainty in
the data is excessive with respect to the desired data use.
2. Data of unknown quality: Data are of unknown quality when critical steps in the
data generation process are improperly performed or have not been documented, creat-
ing irresolvable uncertainty about whether the data results are credible. Here are some
examples of data of unknown quality: (A) Improper soil collection procedures, or a lack
of documentation about what procedures were used, create significant uncertainty about
whether there has been significant loss of volatile compounds from the sample prior to
analysis, possibly invalidating the results for decision-making purposes. (B) Inadequate
analytical quality control, or a lack of documentation, creates uncertainty about whether
the analytical instrument was calibrated correctly, so it is unknown whether results are
reliable. (C) Failure to consider the performance limitations of an analytical method in
the context of the project creates uncertainty about whether interferences likely present
in the samples caused the data to be biased. Interpreting such data without acknowledging
the possible impact of interferences could lead to decision errors beyond what can be
tolerated.
3. Decision quality data (or "effective data"): This term is defined as "data of known
quality that are demonstrated to be effective for making the specified decision because
both the sampling and analytical uncertainties have been managed to the degree neces-
sary to meet clearly defined and specified decision confidence goals." The term "effective
data" is a short way to say "data that are effective for decision making." Note the strong
intuitive linkage of data quality to decisions. Any claims of "decision quality data" or
"effective data" in a project-specific context are meaningless unless it is very clear what
decisions the data will be (or, are being) used to support.
4. Screening quality data: Under the Triad paradigm, this term describes data of
known quality that may contribute some useful information to the decision-making pro-
cess, but by itself, the data set is inadequate (i.e., too uncertain or incomplete) to sup-
port confident decision making. It is possible that a screening quality data set can be
used carefully and collaboratively in conjunction with other data or information that
manages the residual uncertainties (see below).
5. Collaborative data: These are separate data sets that are used together to manage
different aspects of data uncertainty. For example, because of the high cost of traditional
laboratory analyses, it is generally cost-prohibitive to take enough samples for traditional
analysis to develop a good understanding of contaminant distributions and patterning.
Because of this problem, when used on their own, traditional "definitive" environmental
methods may produce results that have excellent analytical quality on tiny samples, but
huge uncertainty remains about whether those results can be extrapolated to larger por-
tions of the matrix (i.e., the sampling representativeness). On the other hand, less
expensive methods frequently used in the field are often based on screening analytical
methodology, but they can provide the higher sampling densities needed to manage sam-
pling uncertainty. Yet the analytical quality may be insufficient for purposes of risk
assessment or for demonstrating regulatory clean closure. The solution to this dilemma
is to use both method types collaboratively according to their individual strengths. This
concept is illustrated in Exhibits 3 and 4.

[Exhibit 3. It is extremely difficult to cost-effectively manage all relevant sampling and analytical aspects of data uncertainties using just one analytical method. (DL = detection/quantitation limits.) The diagram ("Uncertainty Produces Screening Data") contrasts costly definitive analytical methods, whose low DLs and analyte specificity manage analytical uncertainty, with cheaper screening analytical methods, whose high spatial density manages sampling uncertainty; either type alone yields screening data, pairing definitive analytical quality with screening sampling quality or definitive sampling quality with screening analytical quality.]

[Exhibit 4. Collaborative data sets are designed to complement each other so that all sources of data uncertainty important to decision defensibility are managed. (DL = detection/quantitation limits.) The diagram ("Mix and Match Methods to Manage All Uncertainties") combines cheap screening analytical methods (high spatial density, managing sampling uncertainty) with costly definitive analytical methods (low DLs and analyte specificity, managing analytical uncertainty): collaborative data sets = decision quality data.]
Be aware that mathematically merging different data sets created by different methods
may not always be possible or advisable. Very often, collaborative data sets may not be sta-
tistically comparable to each other for a number of reasons. For example, they may be
based on different analytical principles and so respond to target analytes differently, or a
different sample preparation method in the analytical chain may cause different analytical
bias and precision. However, this does not detract from their usefulness. One data set can
be used to make decisions about one aspect of uncertainty (such as contaminant distribu-
tions), whereas another data set is used to manage other contributions to overall data
uncertainty. For example, the data produced by a screening method might be used to
delineate different contaminant populations (this is termed "stratification" in statistical ter-
minology) that are then considered separately for purposes of subsequent data gathering or
remedial decision making. After sampling representativeness has been established using the
cheaper method, a follow-up sampling scheme can be designed for sending proportionally
fewer samples for more expensive analysis to the extent needed to manage the analytical
uncertainty remaining in the data set produced by the screening method. An example of
how this can be done is presented in the "Tree Fruit Site" case study (USEPA, 2000).
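A minimal sketch of this division of labor follows, with hypothetical screening results and thresholds: the dense, cheap screening data set stratifies the site, and the small number of definitive analyses is allocated where decision errors would concentrate.

    # Sketch: stratify on dense screening data, then allocate a few
    # definitive (fixed-lab) analyses across strata. All locations,
    # results, thresholds, and quotas are hypothetical.
    import random

    random.seed(7)
    ACTION_LEVEL = 50.0  # hypothetical action level, ppm

    # Dense, cheap screening results (lognormal, like many contaminant
    # distributions).
    screening = {f"loc-{i:02d}": random.lognormvariate(3.0, 1.5)
                 for i in range(40)}

    strata = {"clean": [], "borderline": [], "hot": []}
    for loc, ppm in screening.items():
        if ppm < 0.5 * ACTION_LEVEL:
            strata["clean"].append(loc)
        elif ppm <= 2.0 * ACTION_LEVEL:
            strata["borderline"].append(loc)
        else:
            strata["hot"].append(loc)

    # Send proportionally more borderline locations for definitive
    # analysis, since decision errors concentrate near the action level.
    lab_quota = {"clean": 2, "borderline": 5, "hot": 2}
    lab_picks = {name: sorted(random.sample(locs, min(lab_quota[name], len(locs))))
                 for name, locs in strata.items()}
    for name, picks in lab_picks.items():
        print(f"{name:10s} ({len(strata[name]):2d} locations) -> lab: {picks}")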
Recommended Change: Data must be of known quality. When any analytical
method is used with inadequate quality control (QC), the data produced is of unknown
quality, and may be justifiably rejected by regulatory agencies. There are enough anecdotal
reports to suggest that data generated in the field under current practice is often of unknown
quality. No doubt this has contributed to regulator distrust of field data. No data user
should have to accept data values on faith. There is no excuse for neglecting to do QC on
field methods, even if field data are "only" used to manage sampling uncertainty. No matter
what the data use, the data must be of known and documented quality to permit correct
interpretation with respect to the decision. Relying on simple checklists or blanket rules
to assemble a QC program usually fails to serve the goal of scientific defensibility. The
solution is reliance on a multidisciplinary technical team that includes appropriate analyti-
cal chemistry expertise to support project planning and implementation.
Even seemingly simple field kits are based on not-so-simple analytical principles,
with ample potential for instrument failure and analytical interferences. Although tech-
nician-level staff may be appropriate to operate simpler field kits, this should be done
under the supervision of a chemist knowledgeable about field analysis and the limitations
of the particular kit being used. Selection of the proper kit, modification of extraction
or analysis procedures, interpretation of the results, selection of appropriate QC mea-
sures, maintenance of equipment, and trouble-shooting of the problems that inevitably
arise: all require instrumental analytical chemistry knowledge and experience. All too
often, those implementing cleanup activities avoid collaborating with in-house or con-
tracted chemists during project planning or data collection. This is a mistake, and it
hurts the quality and cost-effectiveness of all environmental chemistry data, not just data
produced in the field. The first-generation data quality paradigm expects the lab to mag-
ically produce "definitive data" in spite of having no information about potential project-
specific interferences and how the data will be used. As attractive as it might be, this
expectation is simply unreasonable (Crumbling, 2002).
For the Triad Approach to become mainstream, patterns derived from the first-gener-
ation data quality model must be broken. Under the Triad Approach, summarizing project-
specific analytical performance based on an evaluation of QC data is a critical aspect of
assessing analytical uncertainty. The plan for how data of known quality will be produced
at the project level needs to be documented (often in a project-specific quality assurance
plan usually called a "QAPP"). Evaluation of QC data is an integral part of data interpreta-
tion, and it should be documented in project reports (Lesnik & Crumbling, 2001).
A dangerous, yet very common, assumption is that the reliability of field data can be
established solely through splitting a certain percentage of randomly selected samples (a
frequently heard ratio is 1 in 10) and sending them to a traditional laboratory. It is true
that careful homogenization and splitting of specially selected samples can help establish
the comparability of the field data with the more rigorous analytical techniques. Split
samples are often a very important component of QC for field data. However, splits are
too often done without considering either the impact of sample heterogeneity on split
results or the specific uncertainty(ies) that the splits should man-
age. To produce scientifically defensible data, the number of samples to be split and the
rationale for their selection should be guided by the need to manage uncertainty, not by
rules of thumb. For example, if field data are to be used to make decisions about
whether or not soils exceed a regulatory threshold, then if at all possible, most of the
split samples should be taken around that decision point to build a data set that demon-
strates that field decisions around that action level are correct.
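A short sketch of this selection logic, with hypothetical field results and action level: rank candidate splits by how close each field result falls to the decision point, rather than pulling a fixed percentage at random.

    # Sketch: choose split samples to manage decision uncertainty instead
    # of applying a 1-in-10 rule of thumb. All values are hypothetical.
    ACTION_LEVEL = 50.0                  # hypothetical, ppm
    GRAY_ZONE = (25.0, 100.0)            # band where decisions are risky

    field_results = {"S01": 3.2, "S02": 41.0, "S03": 55.0, "S04": 610.0,
                     "S05": 72.0, "S06": 18.0, "S07": 96.0, "S08": 7.5}

    # Prefer splits whose field result lies in the gray zone, nearest the
    # action level first, so the split data set tests the decision rule
    # where it is most likely to fail.
    candidates = sorted(
        (loc for loc, ppm in field_results.items()
         if GRAY_ZONE[0] <= ppm <= GRAY_ZONE[1]),
        key=lambda loc: abs(field_results[loc] - ACTION_LEVEL))

    print("send for lab confirmation first:", candidates)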
Another factor to recognize is that when the laboratory confirmation results do not
exactly match the field analysis results, it does not automatically mean that the field analy-
sis is worthless. Because of method bias or imprecision, it may be necessary to derive pro-
ject- and kit-specific thresholds to make field decisions on the basis of kit results. In other
words, a result of 10 ppm from the kit may actually correspond to a result of 5 ppm by
more traditional methods. This is a common consideration for using certain methods, such
as immunoassays, that are deliberately calibrated in the factory to be biased high. In addi-
tion, kits that respond to a suite of related compounds cannot be expected to numerically
match results provided by more selective laboratory methods. Users of these methods
must accommodate these considerations. For example, before the proposed analytical
design is finalized, "pilot studies," also called "demonstrations of method applicability," are
used by successful practitioners to predict the performance of field methods, the need for
method modifications, the appropriate analytical QC procedures to use in the field, and
the comparability of the data to regulatory actions (USEPA, 2000).
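One hedged sketch of how such a project- and kit-specific field action level might be derived from paired pilot-study results follows. The paired values, the simple linear regression form, and the 5 ppm regulatory level are all invented for illustration, chosen so the result mirrors the 10 ppm kit / 5 ppm lab example above.

    # Sketch: derive a kit-specific field action level from paired
    # "demonstration of method applicability" data. The pairs and the
    # 5 ppm lab-basis threshold are hypothetical; the kit is assumed
    # to read biased high.
    pairs = [(2.1, 1.0), (4.8, 2.2), (9.5, 4.6), (21.0, 10.5),
             (44.0, 20.8), (98.0, 51.0), (205.0, 99.0)]  # (kit ppm, lab ppm)

    n = len(pairs)
    mean_kit = sum(k for k, _ in pairs) / n
    mean_lab = sum(l for _, l in pairs) / n

    # Ordinary least squares: lab ~= slope * kit + intercept.
    slope = (sum((k - mean_kit) * (l - mean_lab) for k, l in pairs)
             / sum((k - mean_kit) ** 2 for k, _ in pairs))
    intercept = mean_lab - slope * mean_kit

    LAB_ACTION_LEVEL = 5.0   # hypothetical regulatory threshold (lab basis)
    kit_action_level = (LAB_ACTION_LEVEL - intercept) / slope

    print(f"lab ~= {slope:.2f} * kit + {intercept:.2f}")
    print(f"enforce ~{kit_action_level:.0f} ppm on the kit to match "
          f"{LAB_ACTION_LEVEL:.0f} ppm by the lab method")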
MOVING TOWARD "ALLIED ENVIRONMENTAL PROFESSIONALS"
With the advent of more specialized technologies and skills, there are striking parallels
between the evolution of medical practice and the evolution of environmental cleanup
practice. In the old days country doctors did everything themselves. They examined the
patient, ran the few tests available at the time, made the diagnosis, selected the treat-
ment, and administered it. This was reasonable when the options for all these activities
(i.e., the list of known diagnoses, the number of testing procedures, and the potential
treatments) were quite limited. Obviously, that situation changed dramatically as
medicine was transformed by science and technology. Every organ system now merits
its own medical specialty. Powerful, complex diagnostic imaging and laboratory testing
techniques have proliferated, as have options for pharmaceutical or surgical treatment.
We no longer expect one physician to know or do everything. Nor would we have
much confidence in a technician or doctor who reads a "cookbook" to operate testing
equipment or perform surgery. Since the body of knowledge and skills is too great for
any single person to master, good patient care requires that various disciplines share
responsibilities through specialization. Allied health professionals in nursing, the labora-
tory, "X-ray," the pharmacy, physical therapy, etc., all function as a multidisciplinary
team that collectively possesses up-to-date knowledge for each discipline and applies it
for modern patient care, diagnostics, and treatment. Likewise, for environmental health
and economics to benefit from a rapidly growing body of knowledge, open partnerships
between multidisciplinary experts representing the "allied environmental professions" of
engineering, geology, hydrology, soil science, analytical chemistry, statistics, law, con-
tracting, community relations, etc., will be necessary to interface technical knowledge
and skills with routine business and field practices.
A CLASSIC CASE STUDY ILLUSTRATING THE TRIAD APPROACH
Within this article we have referred to the "Tree Fruit Site" case study. This project was
run by the U.S. Army Corps of Engineers (USACE) to clean up pesticide contamination
at the Wenatchee Tree Fruit Test Plot Site using a dynamic work plan strategy (USEPA,
2000). The project provides concrete examples for several of the concepts discussed in
the paper. The full case study, along with actual work plans used by the USACE, are
downloadable from the Clean-Up Information Web site (http://cluin.org) of EPA's
Technology Innovation Office. Briefly, two immunoassay kits (one for each of two major
pesticide groups, the cyclodiene and the DDT families) were used to locate and delin-
eate soil contamination for precision removal and segregation into three categories:
clean soil (i.e., meeting regulatory requirements) that was ultimately reused as backfill;
lesser contaminated soil that was destined for final disposal in a landfill; and highly con-
taminated soil that required incineration for final disposal. In total, site characterization
and cleanup involved analyzing 230 site samples by immunoassay, with 29 samples
selected for splitting for fixed laboratory analysis. In addition to the thorough quality
control performed in the field on the immunoassay kits (3-point calibration curves with
continuing calibration verification, reagent blanks, matrix duplicates, and commercial
performance evaluation (PE) samples used as laboratory control samples), data from the
29 splits were used to aid interpretation of the immunoassay results and to derive and
adjust the action levels used to make decisions in the field on the basis of kit results.
After all contamination was identified and removed with "surgical" precision, 33 soil
samples were collected for fixed laboratory analysis for the 33 constituents of concern.
This clean closure data set demonstrated that the cleanup, guided by the field methods
and performed using a dynamic work plan, achieved full compliance with all state regu-
latory requirements for all 33 target analytes to better than 95 percent statistical confidence.
The entire project cost ($589,000, including disposal of wastes, backfilling and seeding
of the site, and all contractor and oversight costs) was less than half the cost projected
($1.2 million) using more traditional investigation and cleanup strategies. Although this
was a very small site [85 ft X 33 ft X 6 ft (depth)], the extreme heterogeneity caused by
highly toxic pesticide spills required a high sampling density to control sampling errors
while making cleanup as cost-effective as possible.
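The case study does not state how the closure confidence was computed, but a common nonparametric argument illustrates the idea: if all n randomly located closure samples comply, the chance of observing that when a fraction p of the sampled units still exceeds the standard is (1 - p)^n, which bounds p at the chosen confidence level. A sketch under that assumption:

    # Illustrative nonparametric bound for a clean-closure data set; this
    # is one common argument, not necessarily the one used at the site.
    n = 33          # closure samples, all compliant
    alpha = 0.05    # 1 - 95% confidence

    p_upper = 1 - alpha ** (1 / n)
    print(f"with {n} compliant samples, 95% confidence that no more than "
          f"{100 * p_upper:.1f}% of sampled units exceed the standard")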
THE TRIAD APPROACH SUPPORTS THE LAND REVITALIZATION
ORIENTATION OF CLEANUP PROGRAMS
The Triad Approach reflects the recognition, based on experience, of the need for a
second-generation model for sampling and analysis. Cleanup programs as a whole are
also experiencing an evolving orientation. These programs are increasingly widening
their focus to include not just cleanup, but also the ultimate revitalization or reuse of
sites. Building on the popularity of redevelopment-based initiatives such as land recy-
cling and voluntary cleanup efforts at the state level, and the Brownfields effort at the
national level, Superfund cleanups, corrective actions under the Resource
Conservation and Recovery Act (RCRA), and Underground Storage Tanks (UST) pro-
grams are striving towards land revitalization. For some time, remediation decisions at
closing military bases and other federal facilities have considered the ultimate disposi-
tion and reuse of contaminated property. Land Revitalization, Superfund Site
Recycling, RCRA Brownfields, and UST Fields are all recent additions to the EPA
waste program lexicon that reflect this trend. These programs promote cleanups that
not only meet stringent health and eco-based cleanup requirements, but also benefit
communities by including reasonable future use considerations as part of the remedy
decision-making process.
This broader view of cleanup presents an excellent opportunity to advance the
changing data quality paradigm advocated under the Triad. The Triad provides a technical
framework to help realize land revitalization objectives. A cleanup strategy tailored to
the end-use of a property should discourage the use of one-size-fits-all approaches to
site investigation and instead look at the specific contamination issues in relation to site
plans. A robust planning process not only ensures successful redevelopment, it can also
ensure that the technical work at the site is done as efficiently as possible. Thus, the Triad
can create a technical bridge between regulatory and cleanup requirements and commu-
nity needs, reuse plans, and resources for a property.
The redevelopment perspective for contaminated sites also creates a market disci-
pline for site cleanup. As the real estate and financial communities evaluate the viability
of private site development, the comparison of costs of acquisition and site preparation
vs. ultimate worth and revenues of the property creates an incentive to minimize both
the financial costs of cleanup as well as the time frames for cleanup. The maxim, "time is
money" is especially poignant in the redevelopment setting as is the idea of "striking
while the iron is hot." A narrow emphasis on cutting costs and time could encourage less
than adequate site characterization. Fortunately, this is counterbalanced by the sensitivity
of the development and financial communities to future liability. The ideas of unfore-
seen, costly, and time-consuming problems during the cleanup and construction process
and of future regulatory or tort action after development can have a chilling effect on
the market for Brownfields. With its underpinnings of cost-effectively addressing sam-
pling uncertainty and sample representativeness, the Triad can help alleviate the con-
cerns of "missing something" or inadequately remediating a site so that goals of
cost-minimization and protectiveness can both be achieved.
The Triad offers the potential time and cost savings essential to increasing the num-
ber of properties that are viable candidates for redevelopment. Even at sites with mini-
mal private development appeal, reducing cleanup costs will allow public entities to
address more sites with their finite resources. A long-range view on advancing and
demonstrating successful reuse should increase the market of sites to be cleaned up.
Environmental practitioners should not view Phase I and Phase II investigations as end
products of the Brownfields market or as off-the-shelf commodities. Instead, these tasks
should be viewed as stepping-stones to more efficient and seamless site-specific cleanup
strategies. The systematic planning process helps ensure that the characterization strat-
egy effectively addresses all aspects of uncertainty in site decisions, thereby increasing
the level of comfort with the chosen cleanup strategy. Increasing decision comfort, cut-
ting costs, and reducing time frames are all advantages of the Triad Approach that sup-
port land revitalization.
EFFORTS TO PROMOTE THE TRIAD APPROACH IN THE NORTHEAST
The waste site cleanup program directors in the seven Northeast states (Connecticut,
Maine, Massachusetts, New Hampshire, New York, Rhode Island, and Vermont) collec-
tively determined that improving the quality of the site characterizations performed
was their number one priority for working together as a region. The main state concerns
center around two areas: inadequate data collection to support conclusions about the
nature and extent of contamination; and submitted characterization reports that do
not clearly explain what was done and why, and lack maps and other visual aids to
present site data.
The Northeast states support the use of the Triad Approach as a way to address
these concerns and would like to see increased use of field-based analytics where appro-
priate. Therefore, to raise awareness of state and federal concerns about the quality of
the site characterizations performed, to promote the Triad Approach, and to begin the
process of change, the Northeast Waste Management Officials' Association
(NEWMOA), EPA Region 1, and EPA's Technology Innovation Office (TIO) sponsored
two one-day conferences in the Northeast, on June 4, 2002, in Manchester, New
Hampshire, and again on June 6, 2002, in Farmington, Connecticut. Together, over 300
people, including local, state, and federal regulatory staff, consultants, and facility repre-
sentatives from the eight Northeast states, attended the conferences. Each conference
also included a vendor showcase where attendees could learn more about innovative
sampling and analytical equipment, as well as data management software and companies
that perform field-based analytical work on a subcontracting basis.
Several issues arose during the interactive portions of the conferences and in a sur-
vey that was distributed to participants. One area of concern was the lack of accepted
guidelines on how to construct and then implement a dynamic work plan. The consult-
ing community also cited the lack of regulatory agency acceptance of dynamic work
plans and field sampling methods as a major barrier to their greater use. Regulatory
agencies are mainly concerned that consulting firms might not have the capability to
place well-trained staff in the field to operate the field-based analytical equipment prop-
erly, and to make on-site decisions. Participants also cited uncertain characterization
costs, the added burden of data interpretation, and concerns about data defensibility as
barriers. In the next phase of the project, NEWMOA plans to address these issues, both
real and perceived.
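One of those concerns, the lack of accepted guidelines for constructing dynamic work plans, can be made more concrete with a small illustration. The Python sketch below shows the kind of if-then decision logic a dynamic work plan documents in advance: a real-time field result, bracketed by the field method's expected error, determines the next on-site action. The action level, error figure, readings, and function name are all hypothetical, and the logic is a simplified teaching sketch rather than an accepted protocol or EPA guidance.

```python
# Hypothetical sketch of dynamic work plan decision logic; the action level,
# error band, and readings are invented for illustration only.

ACTION_LEVEL = 50.0       # e.g., ppm lead by field XRF (hypothetical)
FIELD_REL_ERROR = 0.30    # assumed +/-30% relative error of the field method

def next_step(field_result_ppm):
    """Decide the next on-site action from one real-time field measurement."""
    low = field_result_ppm * (1 - FIELD_REL_ERROR)
    high = field_result_ppm * (1 + FIELD_REL_ERROR)
    if high < ACTION_LEVEL:
        return "clean: step out to the next planned location"
    if low > ACTION_LEVEL:
        return "contaminated: tighten the grid to delineate the hot spot"
    # Result too close to the action level for a confident field call:
    return "ambiguous: send a split sample for fixed-laboratory confirmation"

for reading in (12.0, 44.0, 180.0):   # hypothetical on-site readings
    print(f"{reading:6.1f} ppm -> {next_step(reading)}")
```

The value of writing such logic down during systematic planning is that the field team can act on results the same day, while regulators can review and approve the decision rules before mobilization.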
Participant suggestions for efforts to reduce barriers to the Triad Approach mainly
center on the development of guidance documents:
• develop guidance documents on generating dynamic work plans, including exam-
ples of actual work plans
• develop guidance on statistically based sampling for regulatory decision making
(a simple sketch of one such decision rule follows this list)
• develop guidance/protocols on the use of various field-based analytical methods
• provide more training, especially for regulators
• provide certification courses for users of field-based analytical equipment
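As a modest illustration of the statistically based sampling guidance participants requested, the sketch below implements one common form of decision rule: comparing a one-sided 95 percent upper confidence limit (UCL) on the mean concentration against a cleanup action level. The data values and action level are hypothetical, the sketch assumes SciPy is available for the Student's t quantile, and real guidance would also address distributional assumptions and sample support.

```python
# Hypothetical sketch of a 95% UCL decision rule; sample data and the
# action level are invented for illustration only.
from math import sqrt
from statistics import mean, stdev

from scipy.stats import t  # Student's t distribution

def ucl95(samples):
    """One-sided 95% upper confidence limit on the mean (Student's t)."""
    n = len(samples)
    return mean(samples) + t.ppf(0.95, n - 1) * stdev(samples) / sqrt(n)

pcb_ppm = [0.8, 1.4, 0.6, 2.1, 1.1, 0.9, 1.7, 0.5, 1.2]  # hypothetical results
ACTION_LEVEL = 2.0  # ppm (hypothetical)

if ucl95(pcb_ppm) < ACTION_LEVEL:
    print("Decision unit passes: 95% confident the mean is below the action level")
else:
    print("Cannot demonstrate attainment: collect more data or remediate")
```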
Both consultants and state regulatory agencies look toward EPA for leadership on
these issues. As the following sections show, EPA has addressed or is addressing many of
these suggestions; however, many practitioners seem unaware of these efforts. For more information
about the conferences, including copies of the presentations and the results of the par-
ticipant survey, and to learn about other NEWMOA waste site cleanup-related activi-
ties, please visit www.newmoa.org/cleanup.
OTHER STATE REGULATORY EFFORTS
In addition to NEWMOA's activities, the Interstate Technology and Regulatory Council
(ITRC) is promoting the Triad Approach. The ITRC is a state-led coalition of personnel
from the regulatory and technology programs working together with industry and stake-
holders to achieve regulatory acceptance of environmental technologies. ITRC currently
consists of more than 40 states, the District of Columbia, multiple federal partners,
industry participants, and other stakeholders, all working together to break down
barriers, reduce compliance costs, make it easier to use new technologies, and maximize
resources. An ITRC workgroup is developing a guidance document to help educate state
regulators about the importance of the regulator role for supporting the systematic
planning needed to implement the Triad Approach. In time, the ITRC plans to provide
training courses to help both regulators and environmental consultants adopt the Triad
Approach (ITRC, 2002).
AN ONLINE RESOURCE TO SUPPORT TRIAD ADOPTION
Through the collaborative efforts of the Innovative Technology Advocate Program of the
U.S. Army Corps of Engineers, the Environmental Assessment Division of Argonne
National Laboratory, and EPA's Technology Innovation Office, an online resource titled
"Handbook of Technical Best Practices for Implementing the Triad Approach" is being
developed. The "Triad Handbook" will provide a structure for synthesizing and dissemi-
nating technical knowledge and information resources that support contaminated site
investigation and cleanup. The Handbook is not itself an EPA guidance, although EPA
guidances will be linked into the Handbook as references.
The virtual platform of the Handbook should permit it to fill a number of roles.
First and foremost, the Handbook will be a forum to share technical information across
geographic barriers to encourage rapid dissemination of successful field practices and
strategies for site characterization. The Handbook will serve as an information clearing-
house to direct users to useful technical resources organized according to the typical
technical project lifecycle, and if possible, link them electronically for instant access. In a
sense, the Handbook will serve as an annotated "library" for technical staff seeking quick
access to existing guidance documents and publicly available information relevant to site
cleanup policies, procedures, and technical/scientific developments. As it fills these
roles, the Handbook will emphasize the role of systematic planning to consider sources
of analytical and sampling uncertainties in site cleanup decisions.
This is an ambitious undertaking, and the completeness of the Handbook's refer-
ences will depend on the participation of the site cleanup community to include as many rele-
vant documents and useful Web site links as possible. An important feature of the
Handbook will be hyperlinked case studies that more specifically illustrate general prin-
ciples in project-specific applications. Over time, the Handbook will evolve and change,
just as the science and technology underpinning site characterization continues to evolve
and change. Public access to the Handbook is expected in the latter half of 2003 and will
be announced through TechDirect (http://www.cluin.org/newsletters/). Once
released, a link to the Handbook will be available through http://clu-in.org.
SOURCES OF ADDITIONAL INFORMATION
Case study reports, published papers, and articles (e.g., Crumbling et al., 2001) concerning
the Triad Approach, dynamic work plans, sample handling and statistics, and a host of
related topics are available through the Clu-In Web site (http://clu-in.org) under the
"Site Characterization" and "About the Technology Innovation Office/TIO Perspectives"
menus. Live and prerecorded Internet-based seminars, including seminars dedicated to
the Triad Approach, are available through the Clu-In "Studio." The Clu-In Web site is con-
tinually updated as additional support services become available.
The Field Analytical Technologies Encyclopedia (FATE) can be accessed through the
Clu-In Web site or directly at its URL (http://fate.clu-in.org/). Offered freely to the
public through a partnership between EPA's Technology Innovation Office and the
Innovative Technology Advocate Program of the U.S. Army Corps of Engineers, FATE is
an on-going effort to provide overview information about analytical chemistry, geo-
physics, and other technologies used in the field to characterize contaminated soil and
ground water, monitor the progress of remedial efforts, and, in some cases, support
confirmation sampling and analysis for site closeout.
Finally, to build an understanding of the requirements, benefits, and limitations of
the practical application of the Triad in a reuse setting, EPA, the U.S. Army Corps of
Engineers, and Argonne National Laboratory are assisting several communities in
implementing the Triad in a Brownfields reuse setting. By providing direct, in-kind tech-
nical support through the Brownfields Technology Support Center, or BTSC
(http://brownfieldstsc.org), EPA and its partners hope to improve acceptance of
streamlined approaches by all decision makers involved in Brownfields cleanup. An early
lesson learned from this work is the importance of building the required flexibility and
capacity into the procurement process to allow the proper application of the Triad
Approach. The emphasis on the Triad should start with the request for proposals from
potential service providers and carry through to the planning and direction required to
implement the Triad process. EPA will make these lessons available through case study
write-ups on the BTSC Internet site and through Triad training targeting Brownfields
localities, contractors, and regulators.
SUMMARY
The hazardous waste cleanup arena is changing as a result of 20 to 30 years of practi-
tioner and regulatory program experience, greater scientific sophistication, better
options for treatment, and the electronic technology revolution. EPA articulated the
Triad Approach to create a conceptual framework that could organize innovative tech-
nologies and "smarter" work strategies around the theme of basing environmental regu-
latory decisions on good science to the extent possible, including acknowledging the
impact of uncertainties on decision making.
Although there has been progress toward greater regulatory and practitioner
acceptance of these tools and strategies, significant institutional barriers remain.
The environmental cleanup community will be challenged to evolve its assump-
tions and paradigms, as well as its mechanisms for regulatory oversight and
procuring services. Altering staffing and procurement structures to recognize the
value of an "allied environmental professionals" approach to project planning and
implementation will be a gradual process. But in time, assembling multidisciplinary
teams should become easier for both regulatory agencies and contracting firms.
Communicating concepts fundamental to managing data uncertainties will continue
to be difficult as long as the data quality paradigm begins and ends with the fallacies
that environmental data quality is solely a function of the analytical method and that
fixed laboratory analyses always produce the best data quality. However, much of the
science needed to improve both sampling and analytical quality is falling into place
(although there is still much to be learned). A wealth of "lessons learned" and
remarkable new technologies are helping to transform how the cleanup community
does its job.
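A back-of-the-envelope calculation shows why that paradigm misleads. When independent sampling and analytical errors combine in quadrature, and matrix heterogeneity dominates, even a dramatic improvement in analytical precision barely changes the total uncertainty of a measurement. The percentages in the sketch below are hypothetical, chosen only to make the arithmetic visible.

```python
# Hypothetical illustration: total measurement uncertainty when independent
# sampling and analytical errors add in quadrature.
from math import sqrt

sigma_sampling = 40.0     # % RSD from matrix heterogeneity (hypothetical)
sigma_analytical = 10.0   # % RSD of a good fixed-lab method (hypothetical)

total = sqrt(sigma_sampling**2 + sigma_analytical**2)
total_better_lab = sqrt(sigma_sampling**2 + (sigma_analytical / 2) ** 2)

print(f"total uncertainty:            {total:.1f}% RSD")            # about 41.2%
print(f"after halving analytical RSD: {total_better_lab:.1f}% RSD")  # about 40.3%
```

Halving the analytical error improves the total by less than one percentage point in this example; only a better sampling design can attack the dominant term.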
NOTES
1. At a Northeast Waste Management Officials' Association (NEWMOA) meeting held on October 21,
1999. Note that New Jersey subsequently joined NEWMOA and participated in more recent activities.
2. NEWMOA is a nonprofit, nonpartisan, interstate association whose membership is composed of the haz-
ardous waste, solid waste, waste site cleanup, and pollution prevention program directors for the environ-
mental agencies in the eight Northeast states (CT, ME, MA, NH, NJ, NY, RI, and VT). NEWMOA's mission is
to help states articulate, promote, and implement economically sound regional programs for the enhance-
ment of the environment. NEWMOA is funded by state membership dues and by contracts and EPA grants.
REFERENCES
Crumbling, D.M. (2001a, August). Current perspectives in site remediation and monitoring: Using the Triad
Approach to improve the cost-effectiveness of hazardous waste site cleanups. EPA 542-R-01-016.
Available: http://cluin.org/tiopersp/
Crumbling, D.M. (2001b, August). Applying the concept of effective data to environmental analyses for
contaminated sites. EPA 542-R-01-013. Available: http://cluin.org/tiopersp/
Crumbling, D.M. (2002, April). Getting to the bottom line: Decision quality vs. data quality. Proceedings of
EPA's 21st Annual National Conference on Managing Quality Systems. Available:
http://www.epa.gov/quality/qs_docs/21qa_papers.pdf
Crumbling, D.M., Groenjes, C., Lesnik, B., Lynch, K., Shockley, J., van Ee, J., et al. (2001). Managing uncer-
tainty in environmental decisions: Applying the concept of effective data at contaminated sites could
reduce costs and improve cleanups. Environmental Science & Technology, 35(9), 404A-409A. Available:
http://cluin.org/download/char/oct01est.pdf
Gerlach, R.W., Dobb, D.E., Raab, G.A., & Nocerino, J.M. (2002). Gy sampling theory in environmental stud-
ies. 1. Assessing soil splitting protocols. Journal of Chemometrics, 16, 321-328. Available:
http://cluin.org/download/char/gerlach_sampling_article.pdf
Gilbert, R.O., & Doctor, P.G. (1985). Determining the number and size of soil aliquots for assessing particu-
late contaminant concentrations. Journal of Environmental Quality, 14(2), 286-292.
Homsher, M.T., Haeberer, R., Marsden, P.J., Mitchum, R.K., Neptune, D., & Warren, J. (1991, Oct./Nov.).
Performance based criteria, a panel discussion. Environmental Lab. Available:
http://cluin.org/download/char/dataquality/perf_based.pdf
Interstate Technology and Regulatory Council (ITRC). (2002). See Web site at http://www.itrcweb.org
Jenkins, T.F., Grant, C.L., Brar, G.S., Thorne, P.G., Ranney, T. A., & Schumacher, P.W. (1996). Assessment of
sampling error associated with collection and analysis of soil samples at explosives-contaminated sites.
U.S. Army Corps of Engineers Cold Regions Research & Engineering Laboratory Special Report 96-15.
Available: http://www.crrel.usace.army.mil/techpub/CRREL_Reports/reports/SR96_15.pdf
Jenkins, T.F., Walsh, M.E., Thorne, P.G., Thiboutot, S., Ampleman, G., Ranney, T.A., et al. (1997). U.S. Army
Corps of Engineers Cold Regions Research & Engineering Laboratory Special Report
97-22. Available: http://www.crrel.usace.army.mil/techpub/CRREL_Reports/reports/SR97_22.pdf
Lesnik, B., & Crumbling, D. M. (2001, Jan./Feb.). Guidelines for preparing SAPs using systematic planning
and PBMS. Environmental Testing & Analysis, 10(1), 26-40. Electronic reprint available:
http://cluin.org/download/char/etasaparticle.pdf
© 2003 Wiley Periodicals, Inc.
-------
Improving Decision Quality: Making the Case for Adopting Next-Generation Site Characterization Practices
Popek, E.P. (1997). Investigation versus remediation: Perception and reality. Proceedings of WTQA'97-the
13th Annual Waste Testing and Quality Assurance Symposium, pp. 183-188. Available:
http://cluin.org/products/dataquality/
Ramsey, C.A., & Suggs, J. (2001, March/April). Improving laboratory performance through scientific sub-
sampling techniques. Environmental Testing & Analysis. Available:
http://cluin.org/download/stats/etasubsamplingarticle.pdf
Robbat, A. (1997). A guideline for dynamic workplans and field analytics: The keys to cost-effective site
characterization and cleanup, sponsored by the President's Environmental Technology Initiative, through
the U.S. Environmental Protection Agency, Washington, DC. Available:
http://cluin.org/download/char/dynwkpln.pdf
Tetra Tech EM Inc. (1997). Summary of recent improvements in methods for the study of contaminated and
potentially contaminated sites (White paper prepared for EPA under contract No. 68-W5-0055).
Available: http://www.brownfieldstsc.org/publications_index.htm
Tillman, N., & Sohl, J. (2001, August). Subsurface profiling systems: The use of effective data for making
defensible project decisions. Proceedings of the 17th Annual Waste Testing and Quality Assurance
Symposium. Available: http://www.columbiadata.com/mip/resources.cfm
U.S. Army Corps of Engineers (USACE). (1998, Aug.). Environmental quality: Technical project planning
(TPP) process (Engineering Manual 200-1-2), Washington, DC. Available:
http://www.usace.army.mil/publications/eng-manuals/em200-1-2/toc.htm
U.S. Department of Energy (US DOE). (1998, Dec.). Expedited site characterization. Innovative Technology
Summary Report, OST Reference #77. Office of Environmental Management, U.S. Department of
Energy. Available: http://apps.em.doe.gov/ost/itsrall.html. See also:
http://www.etd.ameslab.gov/etd/technologies/projects/esc/
U.S. Department of Energy (US DOE). (2002). Environmental Assessment Division (EAD) Adaptive Sampling
and Analysis Program (ASAP). Available: http://www.ead.anl.gov/project/dsp_topicdetail.cfm?topicid=23
U.S. Environmental Protection Agency (USEPA). (1989). Soil sampling quality assurance user's guide, 2nd
ed. EPA/600/8-89/046. Available: http://cluin.org/download/char/soilsamp.pdf
U.S. Environmental Protection Agency (USEPA). (1990). A rationale for the assessment of errors in the sam-
pling of soils. EPA/600/8-89/046. Available: http://cluin.org/download/stats/rationale.pdf
U.S. Environmental Protection Agency (USEPA). (1998). EPA guidance on quality assurance project plans (G-
5), EPA/600/R-98/018. Washington, DC. Available: http://www.epa.gov/quality/qs-docs/g5-final.pdf
U.S. Environmental Protection Agency (USEPA). (2000, Aug.). Innovations in site characterization case study:
Site cleanup of the Wenatchee Tree Fruit Test plot site using a dynamic work plan. EPA 542-R-00-009.
Washington, DC. Available: http://cluin.org/char1_edu.cfm#site_char
U.S. Environmental Protection Agency (USEPA). (2002). On-line field analytical technologies encyclopedia
(FATE). Available: http://fate.clu-in.org/
Vroblesky, D.A. (2001). User's guide for polyethylene-based passive diffusion bag samplers to obtain
volatile organic compound concentrations in wells. U.S. Geological Survey Water-Resources Investigations
Report 01-4060. Available: http://sc.water.usgs.gov/publications/difsamplers.html
Deana M. Crumbling has worked in the hazardous waste site cleanup arena over the past 10 years, and
in U.S. EPA's Technology Innovation Office since 1997. She is an analytical chemist with clinical, industrial,
and research experience. She holds a B.S. in Biochemistry, a B.A. in Psychology, and an M.S. in
Environmental Science.
Jennifer Griffith is a project manager at the Northeast Waste Management Officials' Association (NEW-
MOA). Ms. Griffith is a registered professional engineer with a B.S. in Environmental Engineering from the
University of Vermont, and an M.S. in Civil Engineering and an M.S. in Technology and Policy from the
Massachusetts Institute of Technology.
Dan Powell has been with the U.S. EPA's Technology Innovation Office since 1990. He manages the
Brownfields Technology Support Center to promote the use of innovative investigation and clean-up technolo-
gies for site redevelopment. He holds a Master of Public Administration and a B.A. in Political Science and
Urban Studies.