United States Environmental Protection Agency
Office of Solid Waste and Emergency Response
Office of Emergency and Remedial Response (5203G)
Publication 9200.2-16FS
February 1993

EPA        Quality Assurance for Superfund Environmental
           Data Collection Activities

           Quick Reference Fact Sheet
Many Superfund decisions  (both Fund-financed and  enforcement)  require  the collection  and evaluation  of site-specific
environmental data.  Major activities associated with acquiring these data include planning, sample collection and analysis, and
data quality assessment. EPA policy requires the development and implementation of quality assurance (QA) programs to ensure
that these activities generate  data of known quality.  The overall goal of a QA  program is to measure and minimize systematic
sources of error and to monitor conditions of sampling, storage and transport.

The Office of Emergency and Remedial Response (OERR) developed this fact sheet to promote a common understanding of
Superfund QA requirements  for site-specific environmental data collection activities.  The fact sheet focuses on the preparation
and implementation of sampling and analysis plans (SAPs). Requirements for  planning and design, sampling, analysis, quality
control (QC),  and data assessment are discussed.  The process described is consistent with the integrated site assessment and
accelerated response objectives of the Superfund Accelerated Cleanup Model (SACM). Conforming to these requirements will
help ensure that site managers generate data of known quality.
                 INTRODUCTION

This fact sheet provides Superfund program participants with an
overview of Superfund QA requirements for data collection
activities.  The information is pertinent to all Superfund  site
managers, including remedial project managers  (RPMs),  site
assessment managers (SAMs),  and on-scene  coordinators
(OSCs).  The information also applies to Agency contractors,
states,  and potentially responsible parties (PRPs) and their
contractors.

The fact sheet addresses three primary areas: (1) the mandatory
QA requirements specified in Agency policy documents; (2) QA
management for Superfund; and (3) the process for developing
SAPs for Superfund activities.  The relationship between these
primary areas is depicted in Exhibit 1.  References are
identified after each primary section to provide additional
information on discussion topics.  These reference materials
contain guidance on the appropriate quality control (QC)
considerations site managers should include as part of the QA
program.

              AGENCY QA POLICY

   EPA   Order   5360.1  establishes  mandatory   QA
   requirements for Agency environmental data collection
   activities. The National Oil and Hazardous Substances
   Pollution Contingency Plan (NCP) mandates specific
   Superfund QA requirements.
EPA Order 5360.1 and the NCP collectively define Superfund
QA policy for environmental data collection. Both documents
emphasize two requirements.  The first is  that Superfund
environmental data must be of known quality.  The quality of
data is  known when all  components  associated with  its
derivation are thoroughly documented and the documentation
has been reviewed by a qualified reviewer.  Second, QA plans
are required to attain the first objective.  These may be based
on generic or site-specific  procedures depending on  project
requirements.  This section  summarizes the QA requirements
contained in each document.

EPA Order 5360.1, entitled Policy and Program Requirements
to Implement the Mandatory Quality Assurance Program,
describes two major EPA requirements related to
environmental  data  collection activities.   The  first  is
participation  by all  EPA  organizations in  a  central QA
program.   The goal  of the QA program is to ensure the
generation of data of known  quality.  Basic Agency QA
implementation requirements are summarized in Exhibit 2.
The second major requirement is  the development  of QA
project plans for all  environmental  data collection activities.
These plans specify data quality goals acceptable to data users,
and they assign responsibility for achieving these goals.

The NCP establishes the specific requirements used in the
Superfund program to comply with EPA's overall QA policy.
The NCP requires site  managers  to develop SAPs  for the
following Superfund hazardous substance response activities:

•  Remedial site inspections
•  Removal site evaluations
•  Remedial investigation/feasibility studies

  EXHIBIT 1 - SUPERFUND QA OVERVIEW
    [Flow diagram: Agency QA Policy (EPA Order 5360.1, NCP) ->
    QA Management (OERR, Regional, and Contractor QA Programs) ->
    Site-Specific QA (Sampling & Analysis Plan (SAP) development:
    FSP and QAPP), proceeding from Project Objectives through
    Sampling Design, Sampling Execution, and Sample Analysis to
    Assessment of Data Quality.]

These  plans  document  the  process for  obtaining  data  of
sufficient quality and quantity to satisfy data users' needs. The
NCP further states that the SAP shall include a field sampling
plan (FSP)  and a QA project plan (QAPP). The FSP defines
the number of samples, sample type (matrix), sampling location,
and required analyses.  The QAPP describes  the  policy,
organization, functional activities, and data quality objectives
(DQOs) that site managers need to establish and document prior
to performing any  site-specific work.  The SAP is  a single
document with two  separable components -  the FSP and QAPP
- allowing for separate  submissions consistent with Regional
guidance.

   REFERENCE BOX 1
      Environmental Protection Agency (EPA). 1984.  EPA
         Order 5360.1 - Policy and Program Requirements to
         Implement  the   Mandatory   Quality  Assurance
         Program. Office of Research and Development.

      Environmental Protection Agency (EPA). 1988. National
         Oil and Hazardous Substances Pollution Contingency
         Plan (NCP). 40 CFR 300.
           QA MANAGEMENT FOR
           SUPERFUND ACTIVITIES

  OERR, Regional Offices, and contractors participate in
  Superfund QA management activities.

To conform to the requirements specified in Order 5360.1 and
the NCP, Superfund follows a well-defined management
structure operated by the Office of Solid Waste and Emergency
Response (OSWER).  Within OSWER, OERR establishes and
oversees QA procedures performed in support of Superfund
data collection activities.  Regions perform most data collection
activities and implement the associated QA program.  Regions
achieve QA goals by using qualified personnel and well-defined
procedures (including the development of DQOs) and by
performing, or requiring the performance of, precise data
collection and accurate interpretation of data results.

OERR Quality Assurance Program

The OERR QA program applies to all Superfund site-specific
data collection activities.  This program has been developed to
establish national  consistency in  the implementation of  the
Superfund QA program.  Agency and Superfund policy is set
forth in the OERR Quality Management Plan to provide site
managers  with information  on  program  requirements  for
generating data of known quality.

EXHIBIT 2 - EPA ORDER 5360.1 BASIC
QA REQUIREMENTS
       Preparation and annual update of a Quality Management
       Plan

       Development of QA project plans for all major contracts
       involving environmental measurements

       Implementation  of  QA  procedures  for all contracts
       involving environmental data  collection  activities  as
       specified in applicable Agency regulations, subcontracts,
       and subagreements

       Conduct of audits on a scheduled basis

       Development and adoption of technical guidelines  for
       assessing data quality

       Establishment of achievable  data  quality limits  for
       methods cited in regulations

       Implementation of a corrective action program

       Provisions for appropriate training as required for all
       levels of QA management
Regional Quality Assurance Programs

Regional Administrators are responsible for implementing EPA
Order 5360.1 and for tailoring  the  OERR QA program to
Region-specific  operations.   Regional Quality Management
Plans   contain  Region-specific   policies,  procedures,  and
organizational structures necessary for generating data of known
quality.

Contractor Quality Assurance Programs

Each Superfund contractor performing data collection activities
must also establish a QA program to generate data of known
quality  and  to  meet  other Agency policies.    Specific
requirements for contractor QA  programs are defined in the
OERR and Regional Quality Management Plans.

The contractor QA program  must be documented through a
Quality  Management  Plan  that  describes the corporate  QA
policies  and general requirements  for all environmental data
collection activities.  In addition, the contractor must develop
project-specific  QA  plans and SAPs that  are presented for
review and  approval  as delineated in each Region's Quality
Management Plan.

Superfund Quality Assurance Program Assessment

EPA  Headquarters and the Regions  continually monitor the
effectiveness of the Superfund QA program  through the use of
management and technical systems reviews.  EPA Regional and
Headquarters staff review the performance of each contractor to
ensure conformance to technical and contractual requirements.
The frequency of these reviews will be determined by contract
requirements  or  as  specified  in  the  Region's  Quality
Management Plan.

Exhibit 3 presents a brief description of the systems reviews.
These reviews assist OERR and Regional QA staff in assessing
the implementation  and adequacy  of Superfund QA  at  the
program and project management levels.  Project reviews, a
type of management systems review, evaluate the integral
components associated with data collection activities.  The
results of these reviews help site managers and other data
users verify the quality of sampling and analytical
operations.

EXHIBIT 3 - SYSTEMS REVIEWS
   Management Systems Reviews assess the effectiveness of
   the  implementation of the approved QA  program.  These
   reviews consider linkages across organizational lines and can
   be used to discern areas requiring improved guidance.

   Technical Systems Reviews assess project QC activities and
   environmental  data  collection  systems.    Areas  typically
   examined during this review include: sampling/measurement
   systems; equipment/facility  maintenance records; and control
   charts.

   Audits  of Data Quality address whether or not sufficient
   information exists for data  sets  to  support data quality
   assessment. This type of audit may also be used to determine
   if the organization collecting or using the data performed a data
   quality assessment.

   Performance Evaluation  Reviews evaluate the laboratory
   and/or  field  analytical  personnel's performance and the
   instrumentation or analytical systems used.
Superfund Evidentiary Concerns

The National  Enforcement  Investigations Center  (NEIC)  is
responsible for providing a  range of technical, investigative,
and litigation expertise for the Agency's enforcement cases.
NEIC  is  granted  statutory  authority under  CERCLA for
inspecting,  record-keeping,  and  compiling  confidential
information.  Applicable NEIC evidentiary  requirements for
site-specific field activities  must be included in the project
SAP.

NEIC  has  prepared  guidance  pertaining  to  evidentiary
requirements  for  Superfund in  the NEIC  Policies  and
Procedures Manual.   Examples of evidentiary requirements
include:
   Sample identification
   Chain-of-custody procedures
   Sample receiving procedures
   Sample tracking procedures
   Document control procedures
   Standard operating procedures

Additional information on evidentiary policies for the Contract
Laboratory Program (CLP) can be found  in Exhibit F of the
current  CLP Statements of Work.   Exhibit  F describes  the
chain-of-custody,  document  control,  and  related  standard
operating procedures for the CLP.  Individual laboratories are
expected to incorporate Agency evidentiary  requirements in
their own standing procedures.
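
The evidentiary requirements above lend themselves to simple software
checks.  The following sketch shows one minimal way a chain-of-custody
record could be represented and verified; the field names and the
continuity check are illustrative assumptions, not NEIC or CLP
requirements.

    # Illustrative sketch only: a minimal chain-of-custody record and a
    # continuity check. Field names and the rule are assumptions for
    # illustration, not NEIC or CLP requirements.
    from dataclasses import dataclass

    @dataclass
    class CustodyTransfer:
        sample_id: str
        relinquished_by: str
        received_by: str
        timestamp: str  # e.g., "1993-02-01 09:30"

    def custody_unbroken(transfers):
        """True if each receiver is the next relinquisher (no custody gaps)."""
        return all(a.received_by == b.relinquished_by
                   for a, b in zip(transfers, transfers[1:]))

    log = [
        CustodyTransfer("MW-01", "sampler", "courier", "1993-02-01 09:30"),
        CustodyTransfer("MW-01", "courier", "lab custodian", "1993-02-01 16:05"),
    ]
    print("custody unbroken:", custody_unbroken(log))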

REFERENCE BOX 2
   Environmental Protection Agency (EPA). 1985. Interim Policy
       and Guidance for  Management Systems Audits  of
       National  Program  Offices.  Quality  Assurance
       Management Staff (QAMS).

   Environmental Protection Agency  (EPA). 1987. Guidelines
       and Specifications  for  Preparing  Quality  Assurance
       Program Plans (QAPPs) and Quality Assurance Annual
       Report  and Workplans  (QAARWs)  for EPA  National
       Program Offices  and the Office  of Research and
       Development  (ORD). Quality  Assurance Management
        Staff. EPA QA/G-2.

   Environmental Protection Agency (EPA). 1992.   Quality
       Assurance  Management  Plan  for  the  Office  of
       Emergency and Remedial Response.

    Environmental Protection Agency (EPA). 1980. Guidelines
        and Specifications for Preparing Quality Assurance
        Program Plans. QAMS-004/80. QAMS is currently
        developing an update to this guidance entitled Guidance
        for Preparing, Reviewing, and Implementing Quality
        Assurance Management Plans, EPA QA/G-2.
      SAMPLING AND  ANALYSIS PLAN
                  DEVELOPMENT

    Sampling  and   analysis  plans  are   site-specific
    documents that contain sampling objectives, strategies,
    and the appropriate QA procedures necessary to meet
    project objectives. SAPs should incorporate or build
    upon   generic   plans   and  standard   operating
    procedures,  when  available.    Major  activities
    associated with  the development of the plans are
    presented in Exhibit 4.

The effective and efficient development and implementation of
SAPs is essential to obtaining data of sufficient quantity and
quality to support program decisions.  As defined in the NCP,
SAPs consist of  two integral components,  the FSP and  the
QAPP.  Exhibit 5 presents the  minimum requirements for each
component.  When preparing  SAPs, care should be taken to
streamline the process and avoid  duplication between the two
components.  Also, SAPs should incorporate or reference
generic plans and Regional standard operating procedures, as
appropriate.
The SAP should describe each project objective  in  detail.
Usually this is done by describing the specific decisions to be
made with the data and involving the decision maker from the
beginning. The plan should describe how each data value will
be used to make a decision.  It should include a description of
the monitoring network, the location of each place samples will
be collected, the sampling frequency, the types of analyses for
each sample, the target precision at the concentration of
interest, and the rationale for the design.  All factors that will
influence the eventual decision should, to the extent practical,
be  evaluated and specified at the  beginning  of the process.
The plan should balance the need for an appropriate level of
QA (commensurate with project needs)  with timeliness and
cost considerations.   Finally, the  plan  should include  the
organization's functional activities  and the names of all key
people. The remaining sections of this document discuss the
SAP  development process, from  definition  of the project
objectives to generation and evaluation of the environmental
data.

Project Objectives

    Project and data quality objectives must be developed
    to assist  in assuring the generation of useable data.

The first stage in developing the SAP is to determine overall
project objectives and DQOs.  Project objectives define the
type and  extent of investigations  required at a given site.
DQOs specify the level of uncertainty that site managers are
willing to  accept  in   results  or  decisions derived  from
environmental data.  Site managers should develop project
objectives and DQOs  in  accordance with  data  useability
requirements for project activities. For example, the technical
requirements for scoring  a site using the Hazard Ranking
System (HRS) may be less stringent than those required for a
risk assessment.

Because DQOs are developed before the data are collected, this
process can improve the  efficiency of data gathering by
defining the number and  type of samples and  level of QA.
Since  these  factors are determined based on project  need,
DQOs assist in streamlining the process and ensuring cost
effectiveness.
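
As a minimal illustration of how a DQO can drive the number of
samples, the sketch below applies the classical normal-theory formula
n = (z*s/d)^2 for estimating a mean within a tolerable error d.  The
inputs are hypothetical, and any real design should be confirmed by a
statistician.

    # Illustrative sketch: classical sample-size formula n = (z*s/d)^2 for
    # estimating a mean within +/- d at ~95% confidence. All inputs are
    # hypothetical; real designs should be confirmed by a statistician.
    import math

    def samples_needed(std_dev, tolerable_error, z=1.96):
        """Samples required to estimate a mean within +/- tolerable_error."""
        return math.ceil((z * std_dev / tolerable_error) ** 2)

    # e.g., s = 20 mg/kg from historical data, tolerable error d = 10 mg/kg
    print(samples_needed(std_dev=20.0, tolerable_error=10.0))  # prints 16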

Exhibit 6 illustrates the  DQO process  as it is defined  in
Guidance for Data Useability in Site Assessment. Additional
references on the DQO process can be found in Reference Box
3.

Once these  objectives  have been  defined, the  site manager
must   identify  the  procedures  required to  achieve  these
objectives and the acceptable degree of uncertainty.  Chemists,
geologists,  biologists,  ecologists,  risk  assessors,   computer
modelers, statisticians, QA staff, and Regional Counsel should
be  invited to participate in this process, as appropriate.

EXHIBIT 4 - SAMPLING AND ANALYSIS PLAN DEVELOPMENT
   [Flow diagram: major activities in sampling and analysis plan
   development, culminating in assessment of data quality.]

Sampling Design

    Effective sampling designs are dependent upon project
    and data quality objectives.  It is important to avoid
    collecting more samples than required to support
    project decisions.

During  sampling  design,  site  managers  develop  project
objectives into  specific  operational parameters.   The design
identifies the number, type, and location of samples.  Effective
sampling designs result in the generation of data that meet the
project objectives and DQOs. The sampling design should also
generate data that are representative of the conditions at the site
within resource limitations.

Examples  of the types  of site-specific  factors that should be
considered when  designing a  sampling  plan include:  site
accessibility, climate, potential hazards, media of concern,  and
site heterogeneity. Information that can  be used to support the
design often includes site maps, geological information, disposal
records, and historical data. Standard  Operating Procedures
(SOPs)  for the  most common sampling techniques and field
procedures should be used consistent with Regional guidance.
Sampling designs may be statistical, judgmental, or a
combination of both.  Statistical sampling designs entail
selecting sampling locations using a probability-based scheme.
Judgmental sampling designs focus the sampling locations
specifically in the area of concern.  HRS scoring is an example
of a case in which high bias is acceptable; therefore, the use of
non-statistical or judgmental sampling is appropriate.  To
determine which design is appropriate, the site manager should
weigh practical trade-offs between response time, analytical
costs, number of samples, sampling costs, and level of
uncertainty.  A combination of statistical and judgmental
sampling can often be used to maximize available resources,
but a statistician should be consulted.
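
The distinction between statistical and judgmental designs can be made
concrete with a small sketch.  The example below draws sampling
locations at random from a grid laid over the area of concern, one
simple probability-based scheme; the grid size and sample count are
placeholders.

    # Illustrative sketch (not an Agency tool): a simple probability-based
    # design -- simple random sampling of cells from a grid laid over the
    # area of concern. Grid dimensions and sample count are placeholders.
    import random

    def simple_random_locations(n_rows, n_cols, n_samples, seed=0):
        """Pick n_samples grid cells uniformly at random, without replacement."""
        rng = random.Random(seed)
        cells = [(r, c) for r in range(n_rows) for c in range(n_cols)]
        return rng.sample(cells, n_samples)

    # Example: 8 locations from a 10 x 10 grid over the site.
    print(simple_random_locations(10, 10, 8))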

Because site conditions may change, sampling designs should
be flexible enough to allow for modifications during sampling
execution.   However, deviations from the original design
should be approved in advance by the site manager.

Field analyses can also  be an important component of the
overall  sampling design.  These analyses can  be used  to
provide threshold indications of contamination and may be
helpful  in  revising  and  refining   the  sampling  strategy.
Analytical  field methods  also  can be  useful  in  directing
sampling into areas of greatest contamination or "hot spots."

EXHIBIT 5  -  SAMPLING  AND  ANALYSIS  PLAN
COMPONENTS
   Field Sampling Plan: Specifies field activities
   necessary to obtain environmental data and contains
   the following elements:
      Site background
      Sampling objectives and rationale
      Sampling matrix/location/frequency
      Sample identification/documentation
      Sampling equipment/procedures/decontamination/
      disposal
      Sample handling/packaging/analysis
   Quality Assurance Project Plan:  Describes the
   policy, organization, and DQOs for decision-making
   purposes.  Key elements include:
      Project description
      Project organization/responsibilities
      QA objectives for measurement
      Sampling procedures and QC
      Sample custody
      Calibration procedures
      Analytical procedures with detection
      limits/quantitation limits
      Data reduction/validation and reporting
      Internal quality control
      Performance and systems reviews
      Preventive maintenance
      Data assessment procedures
      Corrective actions
      QA reports
Finally, field analytical methods should be used to accelerate
the site assessment process and reduce costs when their use is
consistent with site conditions (e.g., contaminants, media) and
the DQOs established for the site.

Sampling Execution

    Representative samples are collected through the use
    of well-defined sampling practices.

Sampling execution involves the collection and documentation
of samples  identified by  site managers during  the  sampling
design phase.  The  goal of sampling execution  is to collect
samples  representative  of  site  conditions  to fulfill  project
requirements and DQOs.

In order to collect representative samples, the number, location,
sampling  methods,   equipment,  and  containers  must  be
appropriate  for  the  sample matrix and the  contaminants of
concern.  Collection procedures  should  not alter the matrix
sampled. In addition, samples should be preserved in a manner
that minimizes potential chemical and biological degradation.

Site managers are responsible for identifying background and
QC samples during the sampling design stage.  Background
samples are collected in conjunction with environmental
samples and are evaluated to establish baseline values for the
contaminants of concern.  These samples are collected at or
near the hazardous waste site in areas not influenced by site
contamination.  Background samples should be collected at
approximately the same time and under the same conditions as
the test samples, and they should be collected for each matrix.
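
As a minimal sketch of how background results might be used, the
example below flags site results above the maximum observed background
value.  The screening rule, analyte, and numbers are hypothetical; the
actual comparison should follow the project DQOs and Regional
guidance.

    # Illustrative sketch: screening site results against background.
    # The rule (flag results above the maximum observed background value)
    # and all data are hypothetical placeholders.
    background = [2.1, 1.8, 2.4, 2.0]                  # background results (mg/kg)
    site = {"SS-01": 2.2, "SS-02": 9.7, "SS-03": 1.9}  # site results (mg/kg)

    threshold = max(background)
    for sample_id, result in site.items():
        status = "above background" if result > threshold else "within background"
        print(f"{sample_id}: {result} mg/kg -- {status}")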

EXHIBIT 6 - THE DQO PROCESS
   STEPS IN THE DQO PROCESS

   State Problem  -  Describe the  problem,  possible
   resolutions,  and data collection constraints.

   Identify  Problem -  State  the question that will  be
   answered using environmental data.

   Identify Input Affecting Decision - List the variables to
   be measured and other information needed to make the
   decision.  List procedures for assessing the precision
   and accuracy of the data at the concentration of interest.

   Specify Domain Of Decision - Specify the locations of
   concern  within the  site and describe the  different
   pathways.

   Develop  Logic  Statement  - Develop a quantitative
   statement defining how the data will be summarized and
   used to answer each question.

   Establish Constraints  On Uncertainty - Define the
   procedures for determining total uncertainty in the data,
   and develop data acceptance criteria.

   Optimize Design For Obtaining Data - Develop a
   practical design for the study that is expected to produce
   the necessary data.
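
The "Develop Logic Statement" step can be read as an executable
decision rule.  The sketch below shows one hypothetical rule of this
kind (compare the mean concentration to an action level); the action
level and results are placeholders.

    # Illustrative sketch of a DQO logic statement as an executable rule:
    # "if the mean concentration exceeds the action level, further action
    # is warranted."  Action level and results are hypothetical.
    from statistics import mean

    ACTION_LEVEL = 400.0                     # hypothetical action level (mg/kg)
    results = [220.0, 510.0, 380.0, 640.0]   # hypothetical sample results

    decision = "exceeds" if mean(results) > ACTION_LEVEL else "does not exceed"
    print(f"mean = {mean(results):.0f} mg/kg; {decision} the action level")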

QC samples assist in assessing  the data quality.  Field QC
samples are collected on-site in conjunction with environmental
samples and are used to gather information on the precision
and bias of the sampling process. Types of field QC samples
include double-blind samples (e.g., field evaluation samples,
field matrix spikes, and field duplicates), single-blind samples
(e.g., trip blanks, rinsate blanks), and non-blind samples (e.g.,
laboratory control samples as used in the CLP).

The precise composition and frequency of QC samples are
dependent on the objectives for the sampling event and existing
Regional guidelines.  All field QC samples should be stored,
transported, and analyzed using the same procedures used for
site samples.

Site managers should assess sampling execution by evaluating
the data from field QC samples and observing field activities.
Field duplicate sample results can provide useful QC
information.  However, this field assessment will not provide
real-time data on the precision of the sampling event, since
precision is assessed during data review.
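
A common precision check applied to field duplicates during data
review is the relative percent difference (RPD).  The sketch below
computes RPD for hypothetical duplicate pairs; the acceptance limit
shown is a placeholder, not an Agency criterion.

    # Illustrative sketch: relative percent difference (RPD) between a
    # sample and its field duplicate. The 35% limit is a placeholder,
    # not an Agency acceptance criterion.
    def rpd(x1, x2):
        """RPD = |x1 - x2| / mean(x1, x2) * 100."""
        return abs(x1 - x2) / ((x1 + x2) / 2.0) * 100.0

    pairs = {"SD-01": (12.0, 14.0), "SD-02": (5.0, 5.2)}  # hypothetical results
    for pair_id, (x1, x2) in pairs.items():
        flag = "check precision" if rpd(x1, x2) > 35.0 else "ok"
        print(f"{pair_id}: RPD = {rpd(x1, x2):.1f}% ({flag})")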

On-site observations of field activities are conducted to verify
that the SAP is being followed.  The SAP should specify areas
where flexibility in procedures and/or criteria may be acceptable
and should specify procedures for documenting these changes.
Field documentation  is critical  to a successful QA  program.
The field log book should be legally defensible and all entries
should   be  complete   and  accurate   enough   to  permit
reconstruction of field activities.

Sample Analysis

    Contract Laboratory Program (CLP) or non-CLP
    analytical  services may be procured in support of
    Superfund  activities.     Laboratories  which   can
    demonstrate a successful history of independent audits
    should be selected for use.

Project DQOs and analytical factors dictate the selection of
analytical methods.  The analytical  method and associated QC
should provide data of known quality for the contaminants of
concern. Data users should consider the following factors when
selecting analytical methods:
    Quantitation limit
    Detectable constituents
    Qualitative confidence
    Method precision and bias
    Turnaround time
    Analytical cost
 Once the site manager has evaluated these  factors, analytical
 services may be procured through  either CLP or  non-CLP
 services.  The site manager or other data user is responsible for
 planning,  monitoring,  and  assessing  the  quality  of  data
 regardless of the analytical service procured.

 The CLP  is a  national program of commercial laboratories
 under  contract  to EPA that provides routine or  specialized
 analytical  services.  Routine analytical services (RAS) use a
 limited number of standardized methods and  are designed to
 generate  consistent  results.   Specialized analytical services
 (SAS) provide non-standardized analyses or faster turnaround
 time  and  require the  data user to  specify  the  necessary
 analytical methods and QC requirements. The CLP adheres to
 specific data acceptance criteria that result in data of known and
 documented quality.  However, it cannot be assumed that CLP
 data achieve the DQO requirements established for the project.
 Data quality assessment is still required.

Analytical services procured outside of the CLP are
characterized as non-CLP.  These can be provided by
laboratories that participate in the CLP, use CLP methods, or
generate CLP-type data packages, or by laboratories that have
never participated in a national analytical program.  It is
recommended that non-CLP laboratories be audited to assure
the validity and defensibility of any data generated.

Non-CLP data are generated by laboratories whose proficiency
in the methods of interest may or may not be known.  It is the
responsibility  of  the  data  generator and user  to  select the
method and data  acceptance  criteria that will verify method
and laboratory performance.

Two  categories  of non-CLP services are  available:  fixed
laboratory and field analyses.  Fixed laboratory analyses are
performed by commercial  laboratories selected  by the data
user.   Field  analyses are  commonly  performed  in mobile
laboratories or with fieldable or portable analytical instruments.

In addition to quick turnaround and lower cost, field analyses
can:  (1)  focus  sampling  efforts;  (2) provide  qualitative
information;   (3)  provide   quantitative    results   when
supplemented  by  confirmatory samples  sent  to a  fixed
laboratory; and (4) potentially satisfy project needs.

Analytical QC consists of a planned system of routine
activities for obtaining prescribed standards of performance.
QC frequency, type, and acceptance criteria should correlate
with the study objectives. The type, frequency, sequence, and
control criteria for analytical QC samples are predetermined for
CLP RAS.  Site managers specify the control criteria for both
CLP SAS and non-CLP analyses.

Assessment of Data Quality

   Site managers and other data users assess data quality
   to determine if the  data  are consistent  with project
   objectives and are appropriate for supporting a specific
   decision.

Steps in assessing  data quality may  include  data  review,
uncertainty determination,  and data  useability  assessment.
Benefits data users can obtain from proper assessment of data
quality  include:   (1)  establishment of data useability;  (2)
determination of sufficient data quantity; and (3) improvement
of future data collection efforts by identifying major sources of
error in the data.

Data  Review/Validation:   The first  step in assessing data
quality is data review, also known as data validation.   Data
review/validation is the technical examination of environmental
data and the associated QC data to determine the limitations of
the data. During  this  process, the reviewer applies analytical
criteria  to determine if  analyses were performed  under
controlled conditions  and whether  or  not the data should be
qualified.  Because data review/validation criteria are based on
the methods used to generate the data, the results of a data
review/validation  are  frequently independent  of the intended
use of the data. The data review/validation process establishes
the quality of the data.  Data review must be consistent with
the project DQOs and QAPP.

CLP data review is performed by technical personnel who have
been trained by Regional staff or follow Agency guidance. The
data package is reviewed using EPA's functional guidelines for
evaluating organic and inorganic laboratory data (see Reference
Box  3)  and  Regional  SOPs that comprise standardized
procedures and  criteria based  on  the  associated analytical
methods.   Non-CLP  data  are  reviewed based on  available
information including the sample matrix and analytical method
used  and  in  accordance  with the  procedures and criteria
specified in the DQOs. Data validation procedures must avoid
conflict of interest problems.

Determination of Total Uncertainty:  Each step of the data
acquisition process has an inherent uncertainty associated with
it. The uncertainty acceptance level depends on the purpose for
which the data are being collected.  Total error comprises two
components: sampling variability and measurement error (a
numerical sketch of this decomposition follows Exhibit 7).
Sampling variability is the variation between true sample values
and is a  function of  the  spatial variation  in  the  pollutant
concentrations.   Measurement  error represents the difference
between  the  true  sample  value  and  the  reported  value.
Examples of these  types of errors are provided in Exhibit 7.

Factors that can influence  sampling and measurement errors
include:
    Instrument capabilities
    Variability (media, spatial, temporal, operational)
    Incorrect sample collection coordinates
    Improper decontamination procedures
    Improper sample preservation
    Inadequate storage procedures
    Inappropriate sample preparation or analysis
    Exceeded  holding times

   EXHIBIT 7 - TOTAL ERROR COMPONENTS
      Sampling variability is a function of the spatial variation
      in pollutant concentrations.  For example, landfills may
      have "hot spots" with non-representative concentrations.

      Measurement error, which has components from both
      sampling and analysis, is estimated using the results of
      QC samples. For example, sample results may be biased
      low due to the holding time being exceeded.
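
The numerical sketch below illustrates the two-component error model
in Exhibit 7: measurement variance is estimated from collocated field
duplicate pairs, and sampling (spatial) variance is taken as the
remainder of the total.  All data are hypothetical, and the estimator
shown is one common convention, not a prescribed Agency procedure.

    # Illustrative sketch of the Exhibit 7 error model. Measurement
    # variance is estimated from collocated duplicate pairs as
    # sum((a-b)^2) / (2n); sampling variance is the remainder of the
    # total. All data are hypothetical.
    from statistics import variance

    site_results = [4.1, 8.9, 3.2, 15.6, 6.4, 5.1]            # one result per location
    duplicate_pairs = [(4.1, 4.5), (6.4, 5.9), (15.6, 14.8)]  # collocated duplicates

    total_var = variance(site_results)
    meas_var = sum((a - b) ** 2 for a, b in duplicate_pairs) / (2 * len(duplicate_pairs))
    sampling_var = max(total_var - meas_var, 0.0)

    print(f"total: {total_var:.2f}  measurement: {meas_var:.2f}  sampling: {sampling_var:.2f}")
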
Site managers and other data users should establish procedures
for estimating total uncertainty and data acceptance criteria
during  the  DQO  development stage.   EPA currently  is
developing procedures for determining total  error for soil
analyses.  The Environmental Monitoring Systems Laboratory
in Las Vegas (EMSL/LV) has developed guidance, A
Rationale for the Assessment of Errors in the Sampling of Soils,
to serve this purpose.
Data Useability  Assessment:   After the  data  have been
reviewed and the total uncertainty estimated, the data must be
examined in the context of the DQOs to determine whether
they are valid for their intended use.

Site managers or other data  users assess data useability by
evaluating the sampling and analytical performance against the
quality  indicators specified in the DQOs.  Quality indicators
consist of quantitative statistics and qualitative descriptors and
are used to interpret the degree of acceptability of data to the
user.  The data quality indicators, two of which are illustrated
in the sketch below, are:
   Bias/Accuracy
   Precision
   Comparability
   Completeness
   Representativeness
Site managers may be required to implement corrective action
in  the event the  system fails  to  achieve  the established
performance criteria.
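
The sketch below computes two of these indicators, completeness and
bias (expressed as matrix spike percent recovery), using standard
textbook formulas.  The inputs and the implied acceptance judgments
are hypothetical.

    # Illustrative sketch: two data quality indicators. Formulas are the
    # standard conventions; all inputs are hypothetical.
    def completeness(valid_results, planned_results):
        """Percent of planned measurements that yielded valid, usable data."""
        return 100.0 * valid_results / planned_results

    def percent_recovery(spiked, unspiked, spike_added):
        """Bias indicator: %R = (spiked - unspiked) / amount added * 100."""
        return 100.0 * (spiked - unspiked) / spike_added

    print(f"completeness: {completeness(47, 50):.0f}%")                 # 94%
    print(f"spike recovery: {percent_recovery(18.2, 8.0, 10.0):.0f}%")  # 102%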

EPA has established a Data Useability Workgroup to develop
national guidance for minimum data quality  requirements to
increase  the useability of environmental data  in support of
Superfund.   Within this workgroup, the risk assessment
subgroup  has developed minimum requirements for risk
assessments (see  Guidance  for  Data  Useability in  Risk
Assessment:  Final).    The  site  assessment  subgroup  has
developed similar guidance for site assessments.

Automated Computer Systems

   Automated computer  systems  are  useful  tools  in
   supporting data collection activities.
Several automated computer systems are being developed that
can assist site managers in performing various aspects of data
collection, including developing SAPs, developing and
evaluating sampling strategies, and performing automated data
review.  This section describes some of the systems that are in
the prototype stage of development.  Because these systems
have not been finalized, their current useability cannot be
guaranteed.  Exhibit 8 provides EPA contacts for further
information on each of these systems.

Sampling and Analysis Plan Development:  The Quality
Assurance Sampling Plan for Emergency Response (QASPER)
was  created to  automate the development of a site-specific
SAP for the Removal program.  The system is  implemented
using WordPerfect software. QASPER includes step-by-step
procedures for developing a SAP, from development of DQOs
through data validation.  The system significantly expedites the
SAP development process.

Development and Evaluation of Sampling Strategies: Several
automated systems have been produced to develop and evaluate
sampling strategies.  Each automated system has  specific data
requirements and is based on specific site assumptions.

The  DQO Expert  System  was  developed by  the Quality
Assurance Management Staff (QAMS). It is a training system
that assists in planning environmental investigations based on
the DQO process. The four systems that follow were developed
by EMSL/LV.  The Environmental Sampling Expert System
(ESES) assists  in planning  sample collection.   It includes
models that address statistical design, QC, sampling procedures,
sample handling, budget, and  documentation.   The current
system  addresses  metal contaminants  in a   soil matrix.
Expanded application of this  system is under development. The
Geostatistical Environmental Assessment Software (GEO-EAS)
is  a  collection  of software  tools  for  two-dimensional
geostatistical  analysis of spatially  distributed  data points.
Programs include file management, contour mapping, variogram
analysis, and kriging.  SCOUT Multivariate Statistical Analysis
Package is a collection of statistical programs that accept GEO-
EAS files for multivariate analysis.  The Assessment of Errors
in Sampling of Soils (ASSESS)  system estimates measurement
error variance components.  It presents scatter plots of QC data
and error plots to assist in determining the appropriate number
of QC samples required.
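
To suggest what a variogram analysis of the kind GEO-EAS performs
actually computes, the sketch below builds an empirical semivariogram,
half the mean squared difference between measurements, binned by
separation distance.  This is a toy illustration with hypothetical
points, not GEO-EAS itself.

    # Illustrative sketch: empirical semivariogram gamma(h), binned by
    # separation distance. A toy illustration, not GEO-EAS itself.
    import math
    from collections import defaultdict

    # (x, y, concentration) for a few hypothetical sample points
    points = [(0, 0, 4.1), (0, 10, 5.0), (10, 0, 6.2), (10, 10, 8.9), (20, 0, 9.5)]

    bins = defaultdict(list)  # lag bin -> squared differences of paired values
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            x1, y1, z1 = points[i]
            x2, y2, z2 = points[j]
            h = math.hypot(x2 - x1, y2 - y1)
            bins[round(h / 10.0) * 10].append((z1 - z2) ** 2)

    for lag in sorted(bins):
        gamma = sum(bins[lag]) / (2 * len(bins[lag]))
        print(f"lag ~{lag}: gamma = {gamma:.2f} (pairs = {len(bins[lag])})")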

Automated Data Review:  Automated data evaluation systems
have been developed to reduce the resources and the amount of
time required to review data.  The Computer-Aided Data
Review and Evaluation (CADRE) system developed by
EMSL/LV is an automated evaluation system that assists in the
review of CLP organic RAS data.  CADRE evaluates data
quality according to the QC criteria defined in EPA's
functional guidelines for evaluating inorganic and organic data.
The system is being modified to accept manual entry of other
data sets and to accept user-defined criteria to meet specific
information needs.

The Electronic Data Transfer and Validation System (eDATA)
developed by  the Removal program assists in rapid evaluation
of data in emergency responses.  This system may be applicable
for both CLP and non-CLP data. The system combines DQOs,
pre-established  site  specifications,  QC criteria,  and sample
collection  data with laboratory results to determine useability.
This software consists of three  separate and distinct modules
that reflect the needs  of the  site manager, the laboratory, and
the data validator.  In using eDATA, the site manager specifies
the DQOs associated with  any given batch of samples,  the
choice  of the pre-established  QA/QC criteria, and the limits for
volatile, semivolatile, PCB and pesticide, and metal constituents.
The site manager can also create sets of user-defined criteria to
meet project-specific needs.
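
In the spirit of CADRE and eDATA, the sketch below applies predefined
QC rules to qualify results.  The holding-time limit, blank rule, and
qualifier strings follow common data-validation conventions but are
simplified placeholders, not the criteria those systems actually
apply.

    # Illustrative sketch of rule-based data review. Limits, rules, and
    # qualifiers are simplified placeholders, not CADRE/eDATA criteria.
    HOLDING_TIME_DAYS = 14   # hypothetical holding-time limit
    BLANK_MULTIPLIER = 5.0   # qualify results below 5x the associated blank

    def review(result, holding_days, blank_level):
        qualifiers = []
        if holding_days > HOLDING_TIME_DAYS:
            qualifiers.append("J (estimated: holding time exceeded)")
        if blank_level > 0 and result < BLANK_MULTIPLIER * blank_level:
            qualifiers.append("B (blank contamination suspected)")
        return qualifiers or ["no qualifiers"]

    print(review(result=12.0, holding_days=16, blank_level=0.0))
    print(review(result=3.0, holding_days=10, blank_level=1.0))
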
EXHIBIT 8 - COMPUTER SYSTEM CONTACTS
   ASSESS: Jeff Van Ee, Exposure Assessment Division,
      USEPA EMSL/LV, (702) 798-2367.

   CADRE: John Nocerino, Quality Assurance
      Division, USEPA  EMSL/LV, (702) 798-2110.

   DQO Expert System:  John  Warren,  USEPA Quality
      Assurance Management Staff, (202) 260-9464.

   eDATA:  William  Coakley,   USEPA   Environmental
      Response Team, (908) 906-6921.

   ESES: Jeff  Van  Ee,  Exposure Assessment  Division,
      USEPA EMSL/LV, (702) 798-2367.

   GEO-EAS:   Evan Englund,  Exposure  Assessment
      Division, USEPA EMSL/LV, (702) 798-2248.

   QASPER:  William Coakley,  USEPA  Environmental
      Response Team,  (908) 906-6921.

   SCOUT:  Jeff Van  Ee, Exposure Assessment  Division,
      USEPA EMSL/LV, (702) 798-2367.

REFERENCE BOX 3
    American Public Health Association. 1985. Standard Methods
       for the Examination of Water and Wastewater. 16th Edition.

    Environmental Protection Agency (EPA). 1986. Test Methods for
       Evaluating Solid Waste (SW-846): Physical/Chemical Methods.
       Third Edition. Office of Solid Waste.

    Environmental Protection Agency (EPA). 1987. A Compendium of
       Superfund Field Operations Methods. Office of Emergency and
        Remedial Response. EPA/540/P-87/001.

    Environmental Protection  Agency  (EPA).  1987.  Data  Quality
       Objectives for  Remedial Response Activities:  Development
       Process. EPA/540/G-87/003.

    Environmental Protection  Agency  (EPA).  1987.  Data  Quality
       Objectives for Remedial Response Activities, Example Scenario:
       RI/FS Activities at a Site with Contaminated Soil and Ground
       Water.  Office  of  Emergency   and  Remedial  Response.
       EPA/540/G-87/004.

    Environmental Protection Agency (EPA). 1988. Field Screening
       Methods Catalog. Office of Emergency and Remedial Response.
       EPA/540/2-88/005.

    Environmental Protection Agency (EPA).  1988.  Ground  Water
       Modeling: An Overview and Status Report.  EPA/600/2-89/028.

    Environmental Protection Agency (EPA). 1988. Laboratory Data
       Validation Functional  Guidelines for  Evaluating  Inorganics
       Analysis. Office of Emergency and Remedial Response.

    Environmental Protection Agency (EPA). 1988. Laboratory Data
       Validation Functional  Guidelines  for Evaluating  Organics
       Analysis. Office of Emergency and Remedial Response.

    Environmental Protection Agency (EPA).  1989. Determining Soil
       Response Action Levels  Based  on Potential  Contaminant
       Migration  to Ground Water: A  Compendium  of  Examples.
       EPA/540/2-89/057.

    Environmental Protection  Agency  (EPA). 1989.  Guidance on
       Applying the Data Quality Objectives Process for Ambient Air
       Monitoring Around Superfund Sites (Stages 1 and 2). Office of
       Air Quality and Planning and Standards. EPA/450/4-89/015.

    Environmental Protection Agency (EPA). 1989. Quality Assurance
       Technical Information Bulletin, Vol. 1, No. 2, 11/13/89.

    Environmental Protection Agency (EPA). 1989. Soil Sampling Quality
       Assurance User's Guide. Environmental Monitoring Systems
       Laboratory. Las Vegas, NV. EPA/600/8-89/046.

    Environmental Protection Agency (EPA). 1989. Superfund Ground
       Water  Issue:  Ground Water Sampling for Metals Analyses.
       Office of Research and Development. EPA/540/4-89/001.

    Environmental Protection Agency (EPA). 1990. QA/QC Guidance for
       Remedial Activities.  EPA 540/G-90/004.

    Environmental Protection Agency (EPA). 1990.  A Rationale for the
       Assessment  of Errors in the Sampling  of Soils. Office  of
       Research and Development. EPA/600/4-90/013.
Environmental Protection Agency (EPA). 1991. CLP Analytical
    Results Database (CARD) Quick Reference Fact Sheet.
    Office of Emergency and Remedial Response.

Environmental  Protection  Agency  (EPA).   1991.   CLP
    Statement of Work For Inorganics Analysis. Document
    Number ILM02.0 (or most recent update).

Environmental Protection Agency (EPA). 1991. CLP
    Statement of Work for Organics Analysis. Document
    Number OLM01.1 (or most recent update).

Environmental Protection Agency (EPA). 1991. Compendium
    of ERT Ground Water Sampling Procedures. Emergency
    Response Division Office of Emergency and Remedial
    Response. EPA/540/P-91/007.

Environmental Protection Agency (EPA). 1991. Compendium
    of  ERT  Soil   Sampling  and  Surface   Geophysics
    Procedures.  Emergency Response Division  Office  of
    Emergency and Remedial Response. EPA/540/P-91/006.

Environmental Protection Agency (EPA). 1991. Compendium
    of ERT Surface Water and Sediment Sampling
    Procedures. Emergency Response Division Office of
    Emergency and Remedial Response. EPA/540/P-91/005.

Environmental Protection Agency (EPA).  1991. Compendium
    of  ERT  Waste  Sampling  Procedures.   Emergency
    Response Division Office of Emergency and  Remedial
    Response. EPA/540/P-91/008.

Environmental Protection Agency (EPA). 1991. Management
    of Investigation-Derived Wastes  During Site Inspections.
    EPA/540/G-91/009, May 1991.

Environmental Protection Agency (EPA).  1991.  Practical
    Guide for Groundwater Sampling. EPA 600/2-85/104,
    September 1985.

Environmental   Protection   Agency   (EPA).   1991.
    Representative  Sampling  Guidance,  Vol.  I,  Soil.
    OSWER Directive 9360.4-10.

Environmental Protection Agency (EPA).  1991. Sampler's
    Guide to the Contract Laboratory Program.  Office  of
    Emergency  and Remedial  Response. EPA/540/P-
    90/006.

Environmental  Protection Agency (EPA).  1991. User's
    Guide to the Contract Laboratory  Program. Hazardous
    Site  Evaluation  Division Office  of Emergency  and
    Remedial Response. EPA/540/P-91/002.

Environmental Protection Agency (EPA).  1992. Guidance
    for Data Useability in Risk Assessment: Final. Office
    of  Emergency  and   Remedial  Response.  Part A:
    9285.7-09A.  Part B [radionuclides]: 9285.7-09B

Environmental Protection Agency (EPA). 1992. Preparation
    of Soil Sampling Protocol: Sampling  Techniques and
    Strategies. EPA 600/R-92/128.
GLOSSARY
    Assessment - the evaluation process used to measure the
    performance or effectiveness of a system and its elements.
    Assessment is an all-inclusive term used to denote any of
    the following: audit, performance evaluation, management
    systems review, peer review, inspection or surveillance.

    Audit - a planned and documented investigative evaluation
    of an item  or process to determine the adequacy and
    effectiveness as  well  as compliance with established
    procedures, instructions,  drawings, QAPPs, and/or other
    applicable documents.

    Data Quality Objectives (DQOs) - a statement of the
    precise data,  the  manner in which such data may be
    combined, and the acceptable uncertainty in those data in
    order to resolve an environmental problem or condition.
    This may also include the criteria or specifications needed
    to design a study that resolves  the question or decision
    addressed by the DQO process.

    Data Quality Objectives  Process  -  a Total  Quality
    Management  (TQM)  tool   developed  by   the   U.S.
    Environmental Protection Agency to  facilitate the planning
    of  environmental  data collection activities.   The  DQO
    process asks planners to focus  their planning efforts by
    specifying the use of the  data (the decision), the decision
    criteria, and their tolerance to accept an incorrect decision
    based on the data.  The products of the DQO process are
    the DQOs.

    Data Useability - the  process of ensuring or determining
    whether the quality of  the  data  produced  meets  the
    intended use of the data.

    Detectable  Constituent  - a target analyte that can be
    determined  to  be different  from  zero by  a  single
    measurement at a stated level of probability.

    Management Systems Review (MSR) - the  qualitative
    assessment  of   a  data  collection  operation   and/or
    organization(s) to establish whether the prevailing quality
    management structure, policies, practices, and procedures
    are adequate for ensuring that the type and quality of data
    needed are  obtained.

    Performance Evaluation (PE) - a type of audit in which
    the quantitative data generated in a measurement system
    are obtained independently and  compared with routinely
    obtained data to evaluate the proficiency of an  analyst or
    laboratory.

    Quality - the sum of features and  properties/characteristics
    of a process, item, or service that bears on its ability to
    meet the stated needs and expectations of the user.

    Quality  Assurance (QA)   - an integrated  system  of
    management activities involving planning, implementation,
    assessment, reporting, and quality improvement to ensure
    that a process, item, or service  is of the type and quality
    needed and expected by  the customer.
Quality Assurance Project Plan  (QAPP) - a  formal
document describing in comprehensive detail the necessary
QA,  QC,  and  other  technical activities that must be
implemented to ensure  that  the  results  of  the work
performed will satisfy the stated performance criteria.

Quality Control (QC)  - the overall system of technical
activities that measure the attributes and performance of a
process, item, or service against defined standards to verify
that they meet the stated requirements established by the
customer.

Quality Management Plan (QMP) - a formal document that
describes the quality system in terms of the organizational
structure, functional responsibilities of management and
staff, lines of authority, and required interfaces for those
planning,  implementing,   and  assessing  all  activities
conducted.

Quantitation Limit  - the  maximum  or minimum level or
quantity of a target variable that can be quantified with the
certainty required by the data user.

Sampling and Analysis Plan (SAP) - a formal document
that provides a process for obtaining data of sufficient
quality and quantity to satisfy data needs. A sampling and
analysis plan consists of two parts:

   (a)  The field sampling plan,  which describes the
   number, type, and location of samples and the types of
   analyses; and

   (b)   The  quality  assurance  project  plan,  which
   describes policy, organization, and functional activities
   and  the data  quality  objectives  and  measures
   necessary to achieve adequate data for use in planning
   and documenting the response action.

Technical Review - a documented critical review of work
that has been performed within the state of  the art. The
review is accomplished  by one or more qualified reviewers
who are independent of those who performed the work, but
are collectively equivalent  in technical expertise to those
who performed the original work. The review is an in-depth
analysis and evaluation of documents, activities, material,
data, or items that require technical verification or validation
for applicability, correctness, adequacy, completeness, and
assurance that established requirements are satisfied.

Technical Systems Audit (TSA) - a thorough, systematic,
on-site, qualitative audit of facilities, equipment,  personnel,
training, procedures, record keeping, data validation, data
management, and reporting aspects of a system.

Validation - an activity  that demonstrates or confirms that
a  process,  item,  data  set,  or  service  satisfies  the
requirements defined by the user.