            Quality Assurance, 9:179-190, 2001/2002
            1052-9411/02 $12.00 + .00
            DOI: 10.1080/10529410290116991
                    IN SEARCH OF REPRESENTATIVENESS: EVOLVING
                      THE ENVIRONMENTAL DATA QUALITY MODEL

                                      Deana M. Crumbling
                         U.S. EPA Technology Innovation Office, Washington, DC, USA
                    Environmental regulatory policy states a goal of "sound science." The practice
                    of good science is founded on the systematic identification and management of
                    uncertainties; i.e., knowledge gaps that compromise our ability to make accu-
                    rate predictions. Predicting the consequences of decisions about risk and risk
                    reduction at contaminated sites requires an  accurate model of the nature
                    and extent of site contamination, which in turn requires measuring contami-
                    nant concentrations in complex environmental matrices. Perfecting analytical
                    tests to perform those measurements has consumed tremendous regulatory at-
                    tention for the past 20-30 years. Yet, despite great improvements in environ-
                    mental analytical capability, complaints about inadequate data quality still
                    abound. This paper argues that the first-generation data quality model that
                    equated environmental data quality with analytical quality was a useful start-
                    ing point, but it is insufficient because it is blind to the repercussions of multi-
                    faceted issues collectively termed "representativeness." To achieve policy goals
                    of "sound science" in environmental restoration projects, the environmental
                    data quality model must be updated to recognize and manage the uncertainties
                    involved in generating representative data from heterogeneous environmental
                    matrices.

                 Received 16 August 2002; accepted 1 October 2002.
                 This article is not subject to U.S. copyright law.
                 Address correspondence to Deana M. Crumbling, U.S. EPA Technology Innovation Office,
             1200 Pennsylvania Avenue, NW, Washington, DC 20460. E-mail: crumbling.deana@epa.gov
             INTRODUCTION
             Investigating and restoring contaminated sites face conflicting goals:
              Site decisions are supposed to be protective and based on sound science,
             yet project costs are expected to be low. Conflict arises since gathering
             environmental data to support these  kinds of decisions is generally
             expensive  because  measuring  trace chemicals  in  complex,  hetero-
             geneous matrices can be extremely difficult. Developing the technolo-
             gies and expertise for trace contaminant analyses challenged analytical
              chemistry to create the new discipline of environmental analysis, with
             new techniques and new equipment. A natural outcome was intense
legal and regulatory attention on the reliability of chemical analysis.
Meanwhile, the high per-sample cost of analysis naturally drove cost-
conscious project managers to sharply limit the numbers of samples.
Unfortunately, the heterogeneity of most environmental matrices rai-
ses fundamental uncertainties about the ability to extrapolate analy-
tical results from a few  small-volume samples to the much larger
volume of matrix being investigated. Cost and practical considerations
have blunted awareness within the environmental community of the
fact that sample representativeness is the foundation of data quality.
Now that analytical methodologies are more  advanced, sampling is
generally recognized as the largest single source of uncertainty in en-
vironmental data. But for many years, there were few ways to escape
the quandary of how to ensure  data  representativeness on behalf of
good science and correct  environmental decisions while at the same
time containing  project costs.
    Fortunately,  that situation has  changed. Ongoing technology ad-
vancements  in  rapid  soil and groundwater  sampling tools, field-
portable analytical instrumentation, and  decision-support software
present both opportunity  and challenge. It  is now possible to  manage
the critical sampling and decision  uncertainties that  stem from the
heterogeneity of waste-related  matrices.  In  addition, cost-effective
generation of data in "real-time" (often, but not always, involving field
analytical methods) permits a work-flow strategy commonly known as
"dynamic work plans," which employs real-time decision-making in the
field by experienced staff following pre-approved decision trees. When
thoroughly  planned and  properly  implemented, real-time decision-
making saves 30-50% of project costs because fewer remobilization
cycles  (to fill data gaps) are required, and expensive equipment and
labor (such as backhoes, drill rigs, and their operators) are more effi-
ciently utilized.  Dynamic  work plans also produce more thorough and
accurate site characterizations  because immediate feedback allows
data gaps and unexpected discoveries to be rapidly resolved. The re-
sulting complete and accurate conceptual site  models enable decision-
makers to design successful and cost-effective treatment systems and
redevelopment options.
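    To make the dynamic work plan logic concrete, the following sketch
(in Python) illustrates the kind of pre-approved decision tree a field
team might follow. It is a minimal illustration under stated assump-
tions: the action level, the step-out distance, and the function names
are hypothetical, not part of any EPA protocol.

    # Minimal sketch of a pre-approved decision tree for real-time field
    # decision-making. All names and thresholds are hypothetical.

    ACTION_LEVEL = 400.0   # assumed cleanup threshold, mg/kg (illustrative)
    STEP_OUT_M = 5.0       # assumed step-out distance for delineation, meters

    def next_action(field_result_mg_kg: float, qc_ok: bool) -> str:
        """Return the next field action given a real-time analytical result."""
        if not qc_ok:
            # QC failure: confirm the measurement before acting on it.
            return "reanalyze sample / send split to fixed laboratory"
        if field_result_mg_kg >= ACTION_LEVEL:
            # Contamination confirmed: delineate extent while crews are on site.
            return f"step out {STEP_OUT_M} m and collect delineation samples"
        if field_result_mg_kg >= 0.5 * ACTION_LEVEL:
            # Near the action level: measurement uncertainty could flip the
            # decision, so confirm before declaring the area clean.
            return "collect co-located confirmation sample"
        return "area passes; move to next grid node"

    if __name__ == "__main__":
        for result in (650.0, 250.0, 30.0):
            print(result, "->", next_action(result, qc_ok=True))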
    The obvious  benefits of these new technologies and dynamic work
plan strategies  are gradually increasing their acceptance  by regula-
tors and practitioners.  Yet many institutional barriers remain that
challenge the environmental cleanup community  to evolve their as-
sumptions  and paradigms, as well  as their mechanisms for contract-
ing and regulatory oversight. For  example, field  methods are often
dismissed ...

expertise needed to design sampling and analytical plans capable of
generating data of known and documented quality that is explicitly
matched to the intended project decision.  Communicating concepts
that are fundamental to  managing data uncertainty is difficult be-
cause the historical data quality paradigm begins and ends with the
assumption that environmental data quality is a function of the
analytical  method. This  paper discusses  evolution of the environ-
mental data quality  model by  evaluating the relationship between
data quality and decision quality,  and  by distinguishing analytical
quality from data quality. A "next-generation" data quality model can
create the framework needed for explicitly managing both data and
decision uncertainties using new strategies to produce greater deci-
sion confidence ("better") while simultaneously shortening project
lifetimes ("faster") and cutting overall project costs ("cheaper") more
than ever before possible (Refs. 1-3).

"QUALITY" AS A POLICY GOAL
Exhortations for "sound science" and "better quality data" within the
context of regulatory environmental decision-making are increasingly
popular. Is the current data quality model sufficient to achieve sound
science? Is "data quality" really the key issue, or is there something
more fundamental at stake? Although this paper focuses primarily on
contaminated site cleanup, many of these issues are broadly applicable
to other areas of environmental management.
    Since 1979, U.S. Environmental Protection Agency (EPA) policy has
required an Agency-wide quality system, with the goal of providing
"environmental data of adequate quality and usability for their in-
tended purpose of supporting Agency decisions" (Ref. 4). Yet the linkage
between data quality and data usability for decision-making is easily
lost from programmatic and project planning and implementation.
 "Data quality" is too often viewed as some independent standard es-
 tablished by outside arbiters independent of how the data will actually
 be used. Project managers tend to follow a checklist of "approved"
 analytical methods as the primary means of achieving "data quality."
 Yet, striving for "high quality data" under the current model has proven
 to be an expensive and sometimes counterproductive exercise.
    In contrast to checklist approaches to "data quality," sound science
 in regulatory and project decision-making is achieved by acknow-
 ledging and  managing  decision uncertainty. Correspondingly,  accep-
 table  data  quality  is  achieved by  managing all aspects of data
 uncertainty to the degree needed to support the decisions for which the
 data are intended. Managing uncertainty, either of decisions or of data,


requires careful planning using relevant expertise and technical skills.
Calls for "sound  science" and "better data quality" are meaningless
without a simultaneous commitment to include scientifically qualified
staff when planning science-based programs and  projects. Environ-
mental programs exist because there is work that must be done at the
project level. Policy-makers who desire to see sound science in environ-
mental decisions  need to provide a coherent vision  that will steer the
development of program infrastructure that focuses on managing de-
cision quality at the project level.
   It is a mistake to assume that scientific data are (or can be) the only
basis for regulatory decision-making. Science may  be  able to provide
information about the nature and likelihood of consequences stemming
from  an action, but the decision to pursue or reject that action (i.e.,
accept or reject the risk of consequences) based on scientific information
is within the province of values, not science.  Even the choice of how
much uncertainty  is tolerable in statistical  hypothesis testing lies
in the realm  of values. Thus,  it  is appropriate that many  non-
scientific considerations feed into a regulatory decision-making process.
This  does not  invalidate  a foundation of "sound science" as long as
the various roles of science and values are  differentiated, and any
underlying  assumptions and other uncertainties  in  both data and
decision-making  are openly  declared with  an understanding of how
decision-making could be affected if the assumptions were erroneous.
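    The role of values in setting statistical error tolerances can be made
concrete with a small numerical sketch. In the Python fragment below,
the same invented data set supports "the site attains the standard" at a
10% or 5% error tolerance but not at 1%. The one-sided test framing
loosely follows EPA's data quality assessment guidance (Ref. 10), but
the concentrations, threshold, and alpha levels are illustrative assump-
tions only.

    # Sketch: how much decision error to tolerate (alpha) is a value
    # judgment; the same data pass at one alpha and fail at another.
    import math
    from statistics import mean, stdev
    from scipy.stats import t

    THRESHOLD = 50.0  # hypothetical cleanup standard, mg/kg
    samples = [38.0, 55.0, 41.0, 47.0, 52.0, 44.0, 39.0, 49.0]  # invented

    n = len(samples)
    # One-sample t test of H0 "true mean >= THRESHOLD" (site not clean);
    # H0 is rejected in favor of "clean" only when the statistic is low.
    t_stat = (mean(samples) - THRESHOLD) / (stdev(samples) / math.sqrt(n))
    p_value = t.cdf(t_stat, df=n - 1)   # about 0.04 for these numbers

    for alpha in (0.10, 0.05, 0.01):
        verdict = "declare clean" if p_value < alpha else "cannot declare clean"
        print(f"alpha={alpha:4.2f}: p={p_value:.3f} -> {verdict}")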


DECISION QUALITY AS DEFENSIBILITY
The term "decision  quality" implies that decisions are defensible (in the
broadest scientific and legal  sense). Ideally, decision quality would be
equivalent to the correctness of a decision, but in the environmental
field, decision correctness is often unknown (and perhaps unknowable)
at the time of decision-making. When knowledge is limited,  decision
quality hinges on whether the decision can be  defended against rea-
sonable challenge in whatever venue it is contested, be it scientific,
legal, or otherwise. Scientific defensibility requires that conclusions
drawn from scientific data  do not extrapolate beyond the available
evidence. If scientific evidence is insufficient or conflicting and cannot
be resolved in the allotted time frame, decision defensibility will have to
rest on other  considerations, such as economic concerns or political
sensitivities. No matter what considerations are actually used to arrive
at a decision, decision quality (i.e., defensibility) implies there is honest
and open acknowledgment  and accountability for the full range of
decision inputs and associated uncertainties impacting the decision-
making process.


   Managing  scientific  defensibility is extremely difficult when the
science behind a new initiative is immature. This was undeniably the
situation when Superfund and other site cleanup programs were cre-
ated in the 1980s. In a classic chicken-and-egg  dilemma, fledgling
waste programs were asked to create site investigation and cleanup
procedures despite the fact that the scientific and technical foundations
for those procedures barely existed.  At the same time, programs were
called upon to legally defend their cleanup decisions. To develop the
needed scientific theory, practice, and tools for measuring and miti-
gating contamination and its effects, the government began to pour
funding into research to understand the complex relationships among
environmental, chemical,  and health phenomena.  Despite the im-
maturity of the science, policy-makers and  the public  expected that
cleanup activities would begin and proceed immediately. Few anti-
cipated the daunting technical complexities that would be encountered
by cleanup programs as  they leapt into this unknown sphere of science
and engineering.

FIRST-GENERATION STEPPING-STONES THAT BECAME
STUMBLING BLOCKS
When immediate action  is desired, but knowledge and expertise are not
yet sufficient to plot the smartest plan of attack, a reasonable tactic is
to initially create  a consistent, process-driven strategy based on the
best available information so everyone can "sing from the same sheet of
music" while experience anjd knowledge  are being accumulated. Cer-
tainly this made sense for the emerging cleanup programs. To be con-
sistent with sound science, however, such a process-driven approach
should be openly acknowledged by all participants as the first appro-
ximation that it is, with the understanding that one-size-fits-all over-
simplifications will be discarded in  favor of more scientifically sound
information as it becomes available. Although science may be comfor-
table viewing first approximations as short-lived stepping-stones sub-
ject to continual improvement and revision, this view is less welcome
when economic and litigious forces intersect with broader societal goals
in a regulatory crucible. This is one of the fundamental conflicts faced
by policy makers seeking "sound science" as a basis for regulation.
Furthermore, as individual cleanup programs proliferate at the state
and local levels, first approximations become more and more solidified
in bureaucratic processes that naturally prefer predictability and con-
sistency. First approximations take on the aura  of "received truth."
Disseminating and integrating new information  and procedures be-
comes difficult. The net result is that the regulatory and procedural


infrastructures that  support project  implementation  have trouble
keeping up with maturing science.
   A prime example of this kind of lag is the prevailing concept of "data
quality"  as applied to  environmental analytical  chemistry data.  A
universal assumption of the current model is that analytical quality is
equivalent to data quality. Since definitive analytical methods offer the
potential to produce very high analytical quality (it is debatable whe-
ther the achieved analytical quality is as good as assumed when rote
environmental methods are used indiscriminately for certain analytes
and complex matrices),  conventional wisdom has it that any data pro-
duced by screening analytical methods are automatically inferior and
suspect.  Therefore, technologies such as in  situ or field analytical
methods risk rejection simply because they do not fit the ancestral data
quality model. The point of this paper is that it is this data quality
model  that is inferior and suspect, since it was  developed as  a first
approximation based on incomplete knowledge of environmental sys-
tems and limited technology capability. At the root of the current data
quality model are several assumptions about environmental chemical
analysis:

1. "Data quality" is determined by the accuracy and documentation of
   the chemical analysis procedure (traditionally performed in a labo-
   ratory).
2. The accuracy of analyses on  environmental samples can be ensured
   by consistently performing all analyses according to strictly pres-
   criptive regulator-approved methods.
3. Analytical uncertainty (i.e., the degree to which the accuracy of the
   analytical results are in question) can be managed according to a
   checklist regimen of quality  control procedures that rely largely on
   ideal  matrices such as reagent water or clean sand  to  establish
   method performance.
4. Laboratory quality assurance is equivalent to, and substitutable for,
   project quality assurance.
5. With "cook book" analytical procedures for the laboratory, and a list
   of approved analytical methods in hand during project planning, the
   need for environmental analytical chemistry expertise can be mini-
   mized in the environmental  laboratory and eliminated from project
   planning.

Decision-makers accepted these assumptions when establishing site
investigation and cleanup procedures and programs, even though scien-
tists warned of  their  questionable  validity (Refs. 5, 6). This over-
simplified "analytical quality equals data quality" model supported the


imperative to "define the nature and extent of contamination," itself a
first approximation of a regulatory-based sampling and analysis
strategy for hazardous waste sites.  It was hoped that "defining the
nature and extent" would produce information (in the form of data) that
would tell the project manager what to do with the site. Naturally, it
was impossible in the early days to predict the kind of cleanup and land
reuse decisions that would be faced later on, so each  site  had to be a
"study" with ill-defined ana shifting project goals. There was no choice
but to collect data with the jhope that it would be appropriate to making
site decisions once it became clear (1) what those site decisions would
be, and (2) how defensible those decisions would have to be to gain the
buy-in of regulators or stakeholders. This unfocused approach can work
as long as there are sufficient resources (time, money,  and stakeholder
forbearance) to repeatedly return to the site to fill each newly dis-
covered data gap as piecemeal identification of individual site decisions
(and their attendant  uncertainties) progresses  on the way to site
closure.  There is no doubt that that strategy was  the  best available
at that time.
   But  fortunately, advancing knowledge, technology, and  20-plus
years of experience means that this process can be replaced by
something better. It is possible now to anticipate project goals (or at
least a short-list of desirable site outcomes) at the start of the project.
Regulatory agencies  provide  residential and industrial thresholds
derived from estimations of human health risks and other impacts to
the environment as targets  for decision-making.  Vast institutional
knowledge exists for  most site types, their  contaminants' release
patterns, and exposure scenarios. To be sure, we have only scratched
the surface in our understanding of contaminant behavior, risk, and
cleanup options, but we no longer need to function as if we must start
from scratch for every project. In fact,  as program budgets shrink and
rapid reuse of sites is desired (e.g., in "Brownfields" programs),  the
traditional approach is no longer viable due to its cost and inefficiency.
"Defining the nature and  extent"  without first identifying project
goals amounts to groping around in  the dark. It carries a serious
danger that decision uncertainties will not be identified in a timely
manner, and that data generation designs  will be inadequate to de-
fend the decisions being made. If there are not sufficient funds to
continue data collection until decision uncertainties are managed,
there is a strong  incentive  to downplay or  ignore decision uncer-
tainties. This in turn  increases the chance that decision  errors could
pose unacceptable risks to receptors, or will waste resources through
ineffective remedial actions (Refs. 7, 8). This is the antithesis of sound
science.


EVOLVING A SECOND-GENERATION DATA QUALITY MODEL
To set the stage for an updated data quality model, we must clarify the
term  "data quality." According to EPA's Office of Environmental In-
formation, data quality is "the totality of features and characteristics of
data that bear on its ability to meet the stated or implied needs and
expectations of the  user/customer" (Ref. 9). What data users "need,"
ultimately, is  to make the correct decisions. Therefore, data quality
cannot be viewed according to some arbitrary standard, but must be
judged according to its ability to supply information that is repre-
sentative of the particular decision that the data user intends to make.
Said in a different way, anything that compromises data  representa-
tiveness  compromises data quality, and  data quality should not be as-
sessed except  in relation  to  the intended decision (Ref. 10). The
assumptions of the  current data generation model and routine appli-
cation of this model to environmental decision-making for site cleanup
are inadequate to ensure that data are representative of the site deci-
sions being made. The root cause of data non-representativeness is the
fact that environmental data are generated from environmental sam-
ples (i.e., specimens) taken from highly variable and complex parent
matrices (such as soils, waste piles, sludges, sediments, groundwater,
surface water, waste waters, soil gas, fugitive airborne emissions, etc.).
This fact has several repercussions:

1. The concept of representativeness demands that  the  scale (spatial,
   temporal, chemical  speciation, bioavailability, etc.) of the suppor-
   ting data be the  same (within tolerable uncertainty bounds) as the
   scale needed to make the intended decisions (does unacceptable risk
   exist or not; how much contamination to remove or treat; what treat-
   ment system to select; what environmental matrix to monitor; what
   analytes to monitor for; where and how to sample; etc.).  In contami-
   nated site projects, the true state (such as the concentrations of con-
   taminants across space or time or the properties  of the  matrix that
   control contaminant fate and transport) can easily vary markedly
   over smaller (inches to feet to yards) or larger (feet to yards to miles)
   scales that depend heavily on one's perspective. Decisions about risk
   and treatment design also vary over a range of scales. High variabil-
   ity at one scale may be inconsequential if viewed over a different
   scale.  Discrete contamination patterns (such as "hotspots") may be
   apparent at some scales, but not at others. Since it is not resource-
   feasible to characterize the "true state" of all relevant properties of
   the site at  all possible scales, there  must be a rationale to decide
   which scale(s) is(are) important. The purpose of project planning is


   to develop an understanding of the scale over which decision-making
   (e.g., risk decisions, remedy selection, remedy design) will occur,
   identify what uncertainties need to be resolved in order for defensi-
   ble decision-making to occur, and then design a data generation
   scheme that will provide the corresponding information to manage
   those uncertainties. That is how sound science is practiced. Without
   first defining the decision, selecting the scale over which to "define
   nature and extent" becomes guesswork (a numerical illustration of
   scale effects appears after this list).
         2. The concept of  representativeness can  be coarsely broken  into
           sample representativeness and analytical representativeness, both
           of which are critical to managing data uncertainties:
              • Sample representativeness includes procedures related to
                specimen selection, collection (i.e., extraction from the parent
                matrix), preservation, and subsampling (although subsampling
                is often included with "analytical" since it typically takes place
                in the lab). All are crucial to data quality, but the representa-
                tiveness of specimens is difficult to ensure without sufficient
                sampling density to understand the scale and characteristics of
                matrix heterogeneities. Even perfectly accurate analysis is no
                guarantee of good data quality if the sample is not representa-
                tive of the properties of concern to the decision-maker. Since
                many environmental matrices are highly heterogeneous on
                many different scales that affect contaminant concentration and
                behavior in analytical and biotic systems, most of the uncer-
                tainty in most of today's site data stems from the sampling side,
                although inaccurate analyses certainly can (and do) occur.
              • Analytical representativeness involves selecting an analytical
                method that produces test results that are representative of the
                decision. Causes of analytical non-representativeness include
                selecting the wrong method or erroneously interpreting method
                results (such as selecting a method that reports total DDT-
                related isomers when a regulatory decision based on 4,4'-DDT is
                required). Analytical representativeness is compromised when
                matrix interferences degrade method performance to the point
                where erroneous decisions would be made if the data were not
                recognized as suspect. If interferences are found, sound science
                demands that method modification or an alternate method be
                used to compensate. However, not infrequently regulatory pro-
                grams inhibit the use of alternative methods that could improve
                method performance. Evaluating analytical performance on
                ideal matrices (reagent water and clean sand) provides little
                reassurance that equivalent performance is being achieved on
                project-specific samples. Well-behaved matrices provide valuable


      information about analytical quality, but data users  cannot
      automatically assume that their performance is representative
      of analytical  quality for the real-world matrices under in-
      vestigation.
3. The wide range of decisions, contaminants, matrices,  and interfe-
   rences encountered in site cleanup  programs and the pace of tech-
   nology development make it impossible for prescriptive analytical
   requirements to accommodate the multitude of complex and inter-
   acting variables that determine  method performance. Regulatory
   flexibility for the selection and operation of analytical methods is
   not only vital to  ensuring  representative results, but also fosters
   acceptance of highly cost-effective,  second-generation  technologies
   and strategies.
4. The scientific and technical complexities of site cleanup require that
   appropriate scientific  expertise  be involved in up-front  project
   planning (to identify  decision goals  and design  data collection
   strategies), in design implementation, and in data interpretation.
   Without appropriate expertise,  identification and management of
   relevant  heterogeneities  and  uncertainties does not  occur,  data
   quality is frequently mismatched to data use, and sound science is
   not achieved.
5. Arbitrary regulatory  requirements for "data  quality" should be
   avoided since this short circuits the  planning  process needed to
   achieve sound science. Regulations should focus on requirements
   for performance that demonstrate explicit management of decision
   uncertainty.
6. Conceivably there will be circumstances where it is more cost-
   effective to manage the uncertainty involved in ensuring a protec-
   tive outcome by simply choosing the most protective action without
   generating data. Generating the data needed to manage decision un-
   certainty may cost more  than simply taking action.  Although there
   may still be uncertainty  about whether the decision to take protec-
   tive action is correct in an absolute  sense, the ultimate goals of the
   decision-making process  will have been achieved.
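    The scale issue raised in item 1 can be illustrated numerically. The
sketch below simulates a heterogeneous contaminant field and shows
that variability that appears extreme among small discrete samples
largely averages out at the scale of a larger decision unit. The lognormal
model, grid size, and block sizes are assumptions chosen only for
illustration.

    # Illustration (under assumed parameters) of how apparent variability
    # depends on the scale of observation: points vs. decision-unit means.
    import numpy as np

    rng = np.random.default_rng(0)

    # A 100 x 100 grid of point-scale concentrations (mg/kg); a lognormal
    # distribution is a common rough model for contaminant heterogeneity.
    points = rng.lognormal(mean=3.0, sigma=1.2, size=(100, 100))

    def block_means(grid: np.ndarray, block: int) -> np.ndarray:
        """Average the grid over non-overlapping block x block units."""
        n = grid.shape[0] // block
        trimmed = grid[: n * block, : n * block]
        return trimmed.reshape(n, block, n, block).mean(axis=(1, 3))

    for block in (1, 10, 50):  # point scale up to 50 x 50 decision units
        vals = block_means(points, block)
        cv = vals.std() / vals.mean()   # relative variability at this scale
        print(f"block={block:>2}: mean={vals.mean():7.1f}  CV={cv:5.2f}")

    # The mean is stable across scales, but relative variability shrinks
    # sharply as the averaging scale grows: "hotspots" obvious at the point
    # scale may be inconsequential at the exposure-unit scale.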

In contrast to the assumptions that underlie the current  data quality
model, a second-generation data quality model for the environmental
field will explicitly recognize that:

    • Data quality is an emergent property arising from the inter-
      action between the attributes of the analytical data (such as its
      bias,  precision, detection and  quantitation limits, and other
      characteristics that together contribute  to  data uncertainty)

      and the intended  use of the data (which is to assist managing
       decision uncertainty).
     • Data uncertainty is comprised of both sampling and analytical
       uncertainties.
     • Analytical uncertainty in a test result arises from both the
      analytical uncertainty  of the measurement method itself and
      from interaction between the sample matrix and the analytical
      process. The analytical uncertainty arising from the  method
      itself is only a fraction (and often a negligibly small fraction) of the
      overall data uncertainty. The impact of sample matrix on ana-
      lytical uncertainty varies to a greater or lesser degree depending
      on how well the analytical methodologies have been matched to
      the characteristics of the particular sample matrix and to  the
      data needs.  Complex environmental matrices are notorious for
      interferences that degrade analytical reliability. Current quality
      assurance practices may not detect when interferences  are
      causing problems  if there is not a high index of suspicion on the
      part of the analyst and the data  user.
     • Sampling uncertainty accounts for the majority (and in some
       situations, nearly all) of the data uncertainty. This uncertainty
       can be managed by increasing the sampling density and/or by
       targeting sample collection designs to yield the most valuable
       information (i.e., gather more data where decisions are more
       uncertain, such as boundaries between "clean" and "dirty" areas,
       and less data where decisions are more certain, such as
       obviously "clean" or obviously "dirty" areas). Sample re-
       presentativeness requires that all aspects of sampling design be
       matched to the scale of decision-making.
     • Procedures to estimate and report data uncertainties (e.g., un-
       certainty intervals) to the data user need to be developed for the
       environmental field (a minimal sketch of one such estimate fol-
       lows this list).
     • Investment in properly educated and experienced technical staff
       is a necessary and cost-effective means to achieve data quality
       and good science where numerous complex and interacting
       variables must be evaluated and balanced.
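    As an example of the uncertainty-interval reporting called for above,
the sketch below computes a one-sided 95% upper confidence limit
(UCL) on a mean concentration using the Student's-t formulation
described in EPA's data quality assessment guidance (Ref. 10). The
sample results are invented, and the doubling of sampling density is
contrived, purely to show how the interval tightens as more samples
are collected.

    # Minimal sketch: report data uncertainty as a one-sided 95% upper
    # confidence limit (UCL) on the mean. Sample values are invented.
    import math
    from statistics import mean, stdev
    from scipy.stats import t

    def ucl95(samples: list[float]) -> float:
        """One-sided 95% Student's-t UCL on the arithmetic mean."""
        n = len(samples)
        return mean(samples) + t.ppf(0.95, df=n - 1) * stdev(samples) / math.sqrt(n)

    low_density = [12.0, 85.0, 40.0, 7.0, 150.0]                 # 5 samples
    high_density = low_density + [22.0, 64.0, 31.0, 95.0, 18.0]  # 10 samples

    for label, data in (("n=5 ", low_density), ("n=10", high_density)):
        print(f"{label}: mean={mean(data):6.1f}  95% UCL={ucl95(data):6.1f}")

    # More samples pull the UCL toward the mean: sampling density directly
    # controls the width of the decision-relevant uncertainty interval.
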
SUMMARY
Years of experience with investigating and  cleaning  contaminated
sites have made it clear that data quality cannot be managed  in-
dependent of the overarching goal of decision uncertainty management.
Pursuing  arbitrary notions of "data  quality"  becomes  an elusive,
aimless, disconnected resource sink that fails to achieve sound science.



Data quality (management of data uncertainty) and decision  quality
(defensible management of decision uncertainty) are distinctly differ-
ent endeavors, both of which are critical to the pursuit of sound science.
Yet their roles are easily confounded in the regulatory arena. Isolated
attempts to address data quality issues that fail to recognize and ad-
dress fundamental conflicts  between outdated models  and contem-
porary scientific knowledge only  perpetuate problems and stakeholder
dissatisfaction. Pursuing policies  based on sound science will challenge
regulatory agencies to modernize first-generation environmental mod-
els and  regulatory strategies to  accommodate the ever-evolving pro-
gressive nature of science itself.


REFERENCES

 1. Crumbling, D. M. 2001. Current Perspectives in Site Remediation and Monitoring: Using the
    Triad Approach to Improve the Cost-Effectiveness of Hazardous Waste Site Cleanups. EPA
    542-R-01-016. August. Issue paper available at http://cluin.org/download/char/triad2.pdf
 2. Crumbling, D. M. 2001. Applying the Concept of Effective Data to Environmental Analyses
    for Contaminated Sites. EPA 542-R-01-013. August. Issue paper available at
    http://cluin.org/download/char/effective_data.pdf
 3. Crumbling, D. M., C. Groenjes, B. Lesnik, K. Lynch, J. Shockley, J. van Ee, R. A. Howe,
    L. H. Keith, and J. McKenna. 2001. Managing Uncertainty in Environmental Decisions:
    Applying the Concept of Effective Data at Contaminated Sites Could Reduce Costs and
    Improve Cleanups. Environmental Science & Technology 35: 404A-409A. Article reprint
    available at http://cluin.org/download/char/oct01est.pdf
 4. U.S. Environmental Protection Agency (USEPA). 2000. EPA Order 5360.1 A2, Policy and
    Program Requirements for the Mandatory Agency-Wide Quality System. Washington, DC.
    Available at http://www.epa.gov/quality/qs-docs/5360-l.pdf
 5. Fairless, B. J., and D. I. Bates. 1989. Estimating the quality of environmental data.
    Pollution Engineering, March: 108-111.
 6. Homsher, M. T., F. Haeberer, P. J. Marsden, R. K. Mitchum, D. Neptune, and J. Warren.
    1991. Performance Based Criteria, A Panel Discussion. Environmental Lab, October/
    November. Article available at http://cluin.org/download/char/dataquality/perfbased.pdf
 7. Francoeur, T. L. 1997. Quality Control: The Great Myth. Proceedings of the Field Analytical
    Methods for Hazardous Wastes and Toxic Chemicals Conference, January 29-31, 1997, Las
    Vegas, NV, Air & Waste Management Association, Pittsburgh, PA. Article available at
    http://cluin.org/download/char/dataquality/qc_greatmyth.pdf
 8. Popek, E. P. 1997. Investigation versus Remediation: Perception and Reality. In Proceedings
    of WTQA'97, the 13th Annual Waste Testing and Quality Assurance Symposium, pp. 183-188.
    Paper available at http://cluin.org/download/char/dataquality/Epopek.pdf
 9. U.S. Environmental Protection Agency (USEPA). 2000. OEI Quality System 2000: Office of
    Environmental Information Management System for Quality. Available at
    http://www.epa.gov/oei/quality.htm
10. U.S. Environmental Protection Agency (USEPA). 2000. Guidance for Data Quality
    Assessment: Practical Methods for Data Analysis (QA/G-9 QA00 Update). EPA 600/R-96/084,
    July. Available at http://www.epa.gov/quality/qs-docs/g9-final.pdf
