      UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
                        WASHINGTON, D.C. 20460

                                                EPA 823-B-95-001
                                APR 28, 1995

Dear Colleagues:

     The U.S. Environmental Protection  Agency  (EPA)  is pleased to
transmit a copy of the document titled QA/QC Guidance for
Sampling and Analysis of Sediments, Water, and Tissues for
Dredged Material Evaluations: Chemical Evaluations.  This
document was prepared in response to  regional  requests for
quality assurance/quality control  (QA/QC)  guidance  associated
with the testing and evaluation of proposed dredged material
discharges into inland or ocean waters.  The workgroup that
developed this national guidance was  comprised of individuals
from headquarters, field offices, and research laboratories of
EPA and the U.S. Army Corps of Engineers (USACE) with experience
related to dredged material discharge activities.

     EPA and USACE technical guidance for evaluating the
potential for contaminant-related impacts  associated with the
discharge of dredged material into inland  and  ocean waters,
respectively, is found in the documents "Evaluation of Dredged
Material Proposed for Discharge in Waters  of the U.S.—Testing
Manual (Draft)" (the Inland Testing Manual) (U.S. EPA and USACE
1994), and "Evaluation of Dredged Material Proposed for Ocean
Disposal—Testing Manual" (the Ocean Testing Manual) (U.S. EPA and
USACE 1991).  Results of tests conducted using the testing
manuals are the basis of independent evaluations made by EPA and
USACE regarding the suitability of proposed dredged material for
aquatic disposal.

     This QA/QC guidance document serves as a  companion document
to the Inland and Ocean Testing manuals.   The  purpose  of this
document is as follows:  1) to provide  guidance on  the
development of quality assurance project plans for  ensuring the
reliability of data gathered to evaluate dredged material
proposed for discharge under the Clean Water Act or the Marine
Protection, Research, and Sanctuaries Act, 2) to outline procedures
that should be followed when sampling and  analyzing sediments,
water, and tissues, and 3) to provide recommended target
detection limits for chemicals of concern.  This document
pertains largely to physical and chemical  evaluations.   Though it
is directed primarily toward the evaluation of dredged material
for aquatic disposal, it may be useful  in  other areas  of dredged
material assessment and management as well (e.g., disposal site
monitoring or evaluation of alternative disposal options).  The
audience for this document is Federal and State agency personnel
and the public with an interest in the evaluation and management of

                                                     Recycled/Recyclable
                                                     Printed with Soy/Canola Ink on paper that
                                                     contains at least 50% recycled fiber

-------
dredged material.  The information provided herein is for the
purpose of guidance only and does not constitute a regulatory
requirement.

     Requests for copies of this document (EPA document number
EPA 823-B-95-001) should be sent to U.S. Environmental Protection
Agency, National Center for Environmental Publications and
Information, 11029 Kenwood Road, Building 5, Cincinnati, Ohio
45242.

     We appreciate your continued interest in EPA's activities
related to impact assessment of potentially contaminated
sediments.
                         Sincerely,
   Tudor T.  Davies
   Director
   Office of Science
     and Technology
Robert H. Wayland III
Director
Office of Wetlands,
   Oceans and Watersheds
Enclosure

-------
QA/QC Guidance for Sampling and Analysis of Sediments,
   Water, and Tissues for Dredged Material Evaluations

                 Chemical Evaluations

-------
EPA
          United States
          Environmental Protection
          Agency
            Office of Water
            (4305)
EPA 823-B-95-001
April 1995
QA/QC Guidance for Sampling
and Analysis of Sediments,
Water, and Tissues for Dredged
Material Evaluations

Chemical Evaluations

-------
EPA
    United States
    Environmental Protection Agency
    (4305)
    Washington, DC 20460

    Official Business
    Penalty for Private Use
    $300

-------
QA/QC GUIDANCE FOR SAMPLING AND ANALYSIS
     OF SEDIMENTS, WATER, AND TISSUES
    FOR DREDGED MATERIAL EVALUATIONS

           CHEMICAL EVALUATIONS
                  Office of Water
            Office of Science and Technology
           Standards and Applied Science Division
           U.S. Environmental Protection Agency
               Washington, DC 20460
                   April 1995

-------
The policies set out in this document are not final agency action, but are intended
solely as guidance. They are not intended, nor can they be relied upon, to create any
rights enforceable by any party in litigation with the United States.   EPA officials may
decide to follow the guidance provided in this document, or to act at variance with the
guidance, based on  an analysis of specific site circumstances.  The Agency also
reserves the right to change this guidance at  any time without public notice.

-------
CONTENTS
                                                             Page

     LIST OF FIGURES                                            vii

     LIST OF TABLES                                              ix

     ACRONYMS AND ABBREVIATIONS                               xi

     ACKNOWLEDGMENTS                                        xiii

     1.  INTRODUCTION                                           1

        1.1   GOVERNMENT (DATA USER) PROGRAM                   3

        1.2   CONTRACTOR (DATA GENERATOR) PROGRAM             3

     2.  DRAFTING A QUALITY ASSURANCE PROJECT PLAN              6

        2.1   INTRODUCTORY MATERIAL                             6

        2.2   QUALITY ASSURANCE ORGANIZATION AND
             RESPONSIBILITIES                                    7

             2.2.1  Staffing for Quality Assurance                       7
             2.2.2  Statements of Work                               8

        2.3   QUALITY ASSURANCE OBJECTIVES                      14

             2.3.1  Program vs. Project Objectives                     14
             2.3.2  Target Detection Limits for Chemicals                15

        2.4   STANDARD OPERATING PROCEDURES                   16

        2.5   SAMPLING STRATEGY AND PROCEDURES                36

             2.5.1  Review of Dredging Plan                          39
             2.5.2  Site Background and Existing Database               40
             2.5.3  Subdivision of Dredging Area                      42
             2.5.4  Sample Location and Collection Frequency            42
             2.5.5  Sample Designation System                       46
             2.5.6  Station Positioning                              47

-------
     2.5.7   Sample Collection Methods                           50
     2.5.8   Sample Handling, Preservation, and Storage             53
     2.5.9   Logistical Considerations and Safety Precautions         59

2.6  SAMPLE CUSTODY                                        60

      2.6.1   Sample Custody and Documentation                     60
     2.6.2   Storage and Disposal of Samples                      64

2.7  CALIBRATION PROCEDURES AND FREQUENCY              64

     2.7.1   Calibration Frequency                                65
     2.7.2   Number of Calibration Standards                       68
     2.7.3   Calibration Acceptance Criteria                        69

2.8  ANALYTICAL PROCEDURES                                70

     2.8.1   Physical Analysis of Sediment                         70
     2.8.2   Chemical Analysis of Sediment                        71
      2.8.3   Chemical Analysis of Water                           78
     2.8.4   Chemical Analysis of Tissue                           83

2.9  DATA VALIDATION, REDUCTION, AND REPORTING           87
     2.9.1   Data Validation                                      88
     2.9.2   Data Reduction and Reporting                         91

2.10 INTERNAL QUALITY CONTROL CHECKS                     91

     2.10.1  Priority and Frequency of Quality Control Checks         93
      2.10.2  Specifying Quality Control Limits                    96
     2.10.3  Quality Control Considerations for Physical Analysis
            of Sediments                                        99
     2.10.4  Quality Control Considerations for Chemical Analysis of
            Sediments                                          99
     2.10.5  Quality Control Considerations for Chemical Analysis of
            Water                                             100
     2.10.6  Quality Control Considerations for Chemical Analysis of
             Tissue                                            100
                              IV

-------
    2.11 PERFORMANCE AND SYSTEM AUDITS                    101

        2.11.1  Procedures for Pre-Award Inspections of Laboratories    101
        2.11.2  Interlaboratory Comparisons                         102
        2.11.3  Routine System Audits                             104

    2.12 FACILITIES                                            104

    2.13 PREVENTIVE MAINTENANCE                             105

    2.14 CALCULATION OF DATA QUALITY INDICATORS             105

    2.15 CORRECTIVE ACTIONS                                  107

    2.16 QUALITY ASSURANCE REPORTS TO MANAGEMENT         108

        2.16.1  Preparing Basic Quality Assurance Reports            108
        2.16.2  Preparing Detailed Quality Assurance Reports          109

    2.17 REFERENCES                                         110

3.   REFERENCES                                              111

4.   GLOSSARY                                                 123


APPENDIX A  - Example QA/QC Checklists, Forms, and Records

APPENDIX B  - Example Statement of Work for the Laboratory

APPENDIX C  - Description of Calibration, Quality Control Checks,
               and Widely Used Analytical Methods

APPENDIX D  - Standard Operating Procedures

APPENDIX E  - EPA Priority Pollutants and Additional Hazardous Substance
               List Compounds

APPENDIX F  - Example Quality Assurance Reports

APPENDIX G  - Analytical/Environmental Laboratory Audit
               Standard Operating Procedure

APPENDIX H  - Format for the Sediment Testing Report

-------
LIST OF FIGURES
                                                         Page

     Figure 1.     Guidance for data assessment and screening
                for data quality                                92
                                vii

-------
LIST OF TABLES
                                                                        Page

      Table 1.     Checklist of laboratory deliverables for the analysis
                   of organic compounds                                  11

      Table 2.     Checklist of laboratory deliverables for the analysis
                   of metals                                             13

      Table 3.     Routine analytical methods and target detection
                   limits for sediment, water, and tissue                17

      Table 4.     Levels of data quality for historical data            41

      Table 5.     Summary of recommended procedures for sample
                   collection, preservation, and storage                 54

      Table 6.     Example calibration procedures                        66

      Table 7.     PCDD and PCDF compounds determined by
                   Method 1613                                           74

      Table 8.     Polychlorinated biphenyl congeners recommended
                   for quantitation as potential contaminants of
                   concern                                               76

      Table 9.     Methodology for toxicity equivalency factors          79

      Table 10.    Octanol/water partition coefficients for organic
                   compound priority pollutants and 301(h) pesticides    81

      Table 11.    Bioconcentration factors of inorganic priority
                   pollutants                                            85

      Table 12.    Levels of data validation                             90

      Table 13.    Example warning and control limits for calibration
                   and quality control samples                           98

      Table 14.    Sources of standard reference materials              103

                                         ix

-------
ACRONYMS AND ABBREVIATIONS
      AVS       acid volatile sulfide
      BCF       bioconcentration factor
      CLP       Contract Laboratory Program
      CVAA      cold vapor atomic absorption spectrometry
      CWA       Clean Water Act
      EPA       U.S. Environmental Protection Agency
      GC        gas chromatography
      GC/ECD    gas chromatography/electron capture detection
      GC/MS     gas chromatography/mass spectrometry
      GFAA      graphite furnace atomic absorption spectrometry
      ICP       inductively coupled plasma-atomic emission spectrometry
      MPRSA     Marine Protection, Research, and Sanctuaries Act
      PAH       polycyclic aromatic hydrocarbon
      PCB       polychlorinated biphenyl
      PCDD      polychlorinated dibenzo-p-dioxin
      PCDF      polychlorinated dibenzofuran
      QAMP      quality assurance management plan
      QAPP      quality assurance project plan
      QA/QC     quality assurance and quality control
      SRM       standard reference material
      TCDD      tetrachlorodibenzo-p-dioxin
      TDL       target detection limit
      TEF       toxicity equivalency factor
      TOC       total organic carbon
      USACE     U.S. Army Corps of Engineers

                                         xi

-------
ACKNOWLEDGMENTS
     The contributions made by many individuals are gratefully acknowledged.  The
     work group was comprised of individuals from headquarters, field offices, and
     research laboratories of the U.S. Environmental Protection Agency (EPA) and
     the U.S. Army Corps of Engineers (USACE) with experience related to dredged
     material discharge activities.

     Members:         Jim Barron                   EPA/OERR (Superfund)
                      Patricia Boone                EPA/Region 5
                      Suzy Cantor-McKinney         EPA/Region 6
                      Tom Chase                   EPA/OWOW
                      Pat Cotter                    EPA/Region 9
                      Tom Dixon                   EPA/QAMS
                       Bob Engler                   USACE/WES
                      Rick Fox                     EPA/GLNPO
                       Catherine Fox                 EPA/Region 4
                      Bob Graves                   EPA/EMSL Cincinnati
                      Doug Johnson                EPA/Region 4
                      Lloyd Kahn                   EPA/Region 2
                      Linda Kirkland                 EPA/QAMS
                      Mike Kravitz                  EPA/OST (CHAIR)
                      Jim Lazorchak                EPA/EMSL Cincinnati
                      Alex Lechich                 EPA/Region 2
                      John Malek                   EPA/Region 10
                      William Muir                  EPA/Region 3
                      Rich Pruell                   EPA/ORD-N
                      Norm Rubinstein              EPA/ORD-N
                      Brian Schumacher             EPA/EMSL Las Vegas
                      George Schupp               EPA/Region 5
                       Ann Strong                   USACE/WES
                      William Telliard               EPA/OST
                      Dave Tomey                 EPA/Region 1

     This manual also benefitted from contributions made by the following
     individuals: Robert Barrick (PTI  Environmental Services), John Bourbon (EPA
     Region 2), Melissa Bowen (Tetra Tech), Alan Brenner (EPA Region 2), Peter
     Chapman (EVS Consultants), James Clausner (USACE WES), John Dorkin
     (EPA Region 5), Robert Howard (EPA Region 4), Charlie MacPherson  (Tetra
     Tech), Kim Magruder (EVS Consultants), Brian Melzian (EPA ORD-N), Bob
     Runyon (EPA Region 2), Sandra Salazar (EVS Consultants), Jane Sexton (PTI
     Environmental Services), Bruce Woods (EPA Region 10), and  Tom Wright
     (USACE WES).
                                      XIII

-------
1.    INTRODUCTION
     This document provides programmatic and technical guidance on quality
     assurance and quality control (QA/QC) issues related to dredged material
     evaluations. The U.S. Army Corps of Engineers (USACE) and U.S.
     Environmental Protection Agency (EPA) share the Federal responsibility for
     regulating the discharge of dredged material under two major acts of Congress.
     The Clean Water Act (CWA) governs discharges of dredged  material into
     "waters of the  United States," including all waters landward of the baseline of
     the territorial sea. The Marine Protection, Research, and Sanctuaries Act
     (MPRSA) governs the transportation of dredged material seaward of the
     baseline (in ocean waters) for the purpose of disposal.

     EPA and USAGE technical guidance for evaluating the potential for
     contaminant-related impacts associated with the discharge of dredged material
     into inland and ocean waters, respectively, is found in the documents
     "Evaluation of  Predged Material Proposed for Discharge in Waters of the
     U.S.—Testing  Manual (Draft)" (the Inland Testing Manual) (U.S. EPA and
     USACE 1994), and "Evaluation of Dredged Material Proposed for Ocean
     Disposal—Testing Manual" (the Ocean Testing Manual) (U.S. EPA and USACE
     1991).  Results of tests conducted using the testing manuals are the basis of
     independent evaluations made by EPA and USACE regarding the suitability of
     proposed dredged material for aquatic disposal.

     This QA/QC guidance document serves as a companion document to the
     Inland and Ocean Testing manuals. The purpose of this document is  as
     follows:  1) to provide guidance on the development of quality assurance project
     plans for ensuring the reliability of data gathered to evaluate  dredged material
     proposed for discharge under the CWA or the MPRSA, 2) to  outline procedures
     that should be followed when sampling and analyzing sediments, water, and
     tissues, and 3) to provide  recommended target detection limits (TDLs) for
     chemicals of concern.  This document pertains largely to physical and chemical
     evaluations.  Though it is directed primarily toward the evaluation of dredged
     material for aquatic disposal, it may be useful in other areas of dredged
     material assessment and management as well (e.g., disposal site monitoring or
     evaluation of alternative disposal options).

     QA/QC  planning is necessary to ensure that the chemical and biological data
     generated during dredged material evaluations meet overall program and
     specific project needs.  Establishing QA/QC procedures is fundamental to
     meeting project data quality criteria and to providing a basis for good decision-
     making.  The EPA has developed a two-tiered quality management structure

-------
that addresses QA concerns at both the organizational level and at the
technical/project level. QA management plans (known as QAMPs) identify the
mission and customers of the organization, document specific roles and
responsibilities of top management and employees, outline the structure for
effective communications, and define how measures of effectiveness will be
established. The quality standards, goals, performance specifications, and the
QA/QC activities necessary to achieve them, are incorporated into project-
specific QA project plans (known as QAPPs).

QA activities provide a formalized system for evaluating the technical adequacy
of sample collection and laboratory analysis activities.  These QA activities
begin before samples  are collected and  continue after laboratory analyses are
completed, requiring ongoing coordination and oversight.  The QA program
summarized in this document integrates management and technical practices
into a single system to provide environmental data that are sufficient,
appropriate, and of known and documented quality for dredged material
evaluation.

QA project plans (QAPPs) provide a detailed plan for the activities performed at
each stage of the dredged material evaluation (including appropriate sampling
and analysis procedures) and outline project-specific data quality objectives that
should be achieved for field observations and measurements,  physical
analyses,  laboratory chemical analyses,  and biological tests.  Data quality
objectives should be defined prior to initiating a project and adhered to for the
duration of the project to guarantee acquisition of reliable data. This is
accomplished by integrating quality control (QC) into all facets of the project,
including development of the study design, implementation of sample collection
and analysis, and data evaluation. QC is the routine application of procedures
for determining bias and precision.  QC procedures include activities such as
preparation of replicate samples, spiked samples,  blanks;  calibration and
standardization; and sample custody and recordkeeping. Audits, reviews, and
compilation of complete and thorough documentation are QA activities used to
verify compliance with predefined QC procedures. Through periodic reporting,
these QA  activities provide a means for management to track project progress
and milestones, performance of  measurement systems, and data quality.

A complete QA/QC effort for a dredged material testing program has two major
components:  a QA program implemented by the responsible governmental
agency (the data user), and QC  programs implemented by sampling and
laboratory personnel performing  the tests (the data generators). QA programs
are also implemented  by each field contractor and each laboratory. Typically,
all field and laboratory data generators agree to adhere to the QA/QC requirements
of the data user for the contracted project as specified in the project QAPP.  U.S. EPA
(1987a) provides useful guidance and may be followed on all points that are not
in conflict  with the guidance in this document. The guidance provided in this

-------
      document also incorporates information contained in U.S. EPA (1984a, 1991d)
      and U.S. EPA and USACE (1991, 1994).
1.1    GOVERNMENT (DATA USER) PROGRAM

      Because the data generated in a dredged material evaluation are used for
      regulatory purposes, it is important to have proper QA oversight.  The USACE,
      working in conjunction with the appropriate EPA Region(s), should implement a
      QA program to ensure that all program elements and testing activities (including
      field and laboratory operations) in the dredged  material evaluation comply with
      the procedures in the QA project plan or with other specified guidelines for the
      production of environmental data of known quality. This QA  guidance
      document was designed with the assistance of programmatic and scientific
      expertise from both EPA and USACE.  Other qualified sources of QA program
      management should be contacted as appropriate.  Some specific QA
      considerations in contract laboratory selection are discussed by Sturgis (1990)
      and U.S. EPA (1991d).

      The guidance in this document is intended to assist EPA and USACE dredged
      material managers in developing QA project plans to ensure  that:  1) the data
      submitted with dredged material permit applications are of high quality,
      sufficient, and appropriate for determining if dredging and disposal should
      occur; and 2) the contract laboratories comply with QC specifications of the
      regulations and guidelines governing dredged material evaluations. This
      includes  the development of an appropriate QA management plan.
1.2   CONTRACTOR (DATA GENERATOR) PROGRAM

      Each office or laboratory participating in a dredged material evaluation is
      responsible for using procedures which assure that the accuracy (precision and
      bias), representativeness, comparability, and completeness of its data are
      known and documented.  To ensure that this responsibility is met, each
      participating organization should have a project manager and a written QA
      management plan that describes, in specific terms, the management approach
      proposed to assure that each procedure under its direction complies with the
      criteria accepted by EPA and USACE.  This plan should describe a QA policy,
      address the contents and application of specific QA project plans, specify
      training requirements, and include other elements recommended by EPA quality
      assurance management staff (e.g., management system reviews).  All field
      measurements, sampling, and analytical components (physical, chemical, and
      biological) of the dredged material evaluation should be discussed.

-------
For the completion of a dredged material testing project, the project manager of
each participating organization should establish a well-structured QA program
that ensures the following:


    •   Development, implementation, and administration of appropriate
        QA planning documents for each study

    •   Inclusion of routine QC procedures for assessing data quality in all
        field and laboratory standard operating procedures

    •   Performance of sufficiently detailed audits at intervals frequent
        enough to ensure conformance with approved QA project plans
        and standard operating procedures

    •   Periodic evaluation of QC procedures to improve the quality of QA
        project plans and standard operating procedures

    •   Implementation of appropriate corrective actions in a timely
        manner.

The guidance provided in this document is intended to assist the data generator
with the production of high-quality data in the field and in the laboratory (i.e.,
the right type and quality of information is provided to EPA and USACE to
make a decision about the suitability of dredged material for aquatic disposal
with the specified degree of confidence).

-------
2.   DRAFTING A QUALITY ASSURANCE
      PROJECT PLAN
      A formal strategy should always be developed to obtain sufficient and
      appropriate data of known quality for a specific dredged material testing
      program. When the sample collection and laboratory analysis effort is small,
      this strategy may be relatively straightforward.  However,  when the sample
      collection and laboratory analysis effort is significant, the assurance of data
      quality may require the formulation of a formal and often quite detailed  QA
      project plan.  The QA project plan is  a planning and an operational document.

      The QA  project plan should be developed by the applicant or contractor for
      each dredged material evaluation, in  accordance with this document. The QA
      project plan provides an  overall plan  and contains specific guidelines and
      procedures for the activities performed at each stage of the dredged material
      testing program, such as dredging site subdivision, sample collection,
      bioassessment procedures, chemical and physical analyses, data quality
      standards, data analysis, and reporting.  In particular, the QA plan addresses
      required QC checks, performance and system audits, QA reports to
      management, corrective  actions, and assessment of data accuracy (precision
      and bias)1, representativeness, comparability, and completeness. The plan
      should address the quantity of data required to allow confident and justifiable
      conclusions and decisions.

      The following information should be included in each QA  project plan for
      dredged material evaluation unless a more abbreviated plan can be justified
      (see U.S. EPA 1989a):

          •   Introductory material,  including title and signature pages, table of
              contents, and project description

          •   QA organization and responsibilities (the QA organization should
              be designed to operate with  a degree of independence from the
              technical project organization to ensure appropriate oversight)
         1 Historically, "accuracy" and "precision" have often been defined as separate
      and distinct terms.  In particular, accuracy has often been taken to be only a
      measure of how different a value is from the true value (i.e., bias).  However, data
      that have poor precision (i.e., high variability) may  only have low bias  on the
      average (i.e., close agreement to the true value). Therefore, recent literature (e.g.,
      Kirchmer 1988) has defined accuracy as both the precision and bias of the data.
      This definition  of accuracy is used throughout this guidance document.

-------
          •   QA objectives
          •   Standard Operating Procedures
          •   Sampling strategy and procedures
          •   Sample custody
          •   Calibration procedures and frequency
          •   Analytical procedures
          •   Data validation, reduction, and reporting
          •   Internal QC checks
          •   Performance and system audits
          •   Facilities
          •   Preventive maintenance
          •   Calculation of data quality indicators
          •   Corrective actions
          •   QA reports to management
          •   References.
      The remaining sections of this document provide more specific information on
      each of these items.

2.1   INTRODUCTORY MATERIAL
      The following sections should be included at the beginning of every QA project
      plan:
          •   Title and signature pages
          •   Table of contents
          •   Project description
          •   Certification.
      The signature page should be signed and dated by those persons responsible
      for approving and implementing the QA project plan. The applicant's project
      manager's signature should be included even if other persons are primarily
      responsible for QA activities.  The headings in the table of contents should
      match the headings in the QA project plan.  A list of figures, list of tables, and
      list of appendices should be included in the table of contents.

-------
      The goals and objectives of the study project should be outlined in the project
      description.  The project description should illustrate how the project will be
      designed to obtain the information needed to achieve those goals. Sufficient
      detail and information should be included for regulatory agency decision-
      making.

      The QA project plan should include the following certification statement signed
      by a duly authorized representative of the permittee:
              I certify under penalty of law that this document and all
              attachments were prepared under my direction or supervision.
              The information submitted is, to the best of my knowledge and
              belief, true, accurate, and complete.  I am aware there are
              significant penalties for submitting false information,  including
              the possibility of fine and imprisonment for knowing violations.
2.2   QUALITY ASSURANCE ORGANIZATION AND
      RESPONSIBILITIES

      A clear delineation of the QA organization and line of authority is essential for
      the development, implementation, and administration of a QA program. The
      relationship of the QA personnel to the overall project team and their
      responsibilities for implementing the QA program are identified in this section.
      In  addition, guidance is provided for developing statements of work that address
      the responsibilities of contract laboratories used in the project.
2.2.1 Staffing for Quality Assurance

      Organizational charts or tables should be used in the QA project plan to
      describe the management structure, personnel responsibilities, and the
      interaction among functional units. Each QA task should be fully described and
      the responsible individual, their respective telephone number, and the
      associated organization named.  Names of responsible individuals should be
      included for the sampling team, the analytical laboratory, the data evaluation,
      QA/QC effort in the laboratory, and the data analysis effort.  An example of a
      QA organization flow diagram is provided in Appendix A.

      The project manager has overall responsibility for assuring the quality of data
      generated for a project. In most projects, actual QA activities are performed
      independent of the project manager.  However, the project manager does
      ensure the implementation of any corrective actions that are called for during
      sampling, analysis, or data assessment. The writing of a QA project plan can
      usually be accomplished by one person with  assistance as needed from

-------
      technical specialists for details of methods or QC criteria. One person should
      also have primary responsibility for coordinating the oversight of all sampling
      activities, including completion of all documentation for samples sent to the
      laboratory. Coordinating laboratory interactions before and during sample
      analysis is also best performed by one person  to avoid confusion.  Subsequent
      interactions that may be necessary with the laboratory during a QA review of
      the data may involve the persons actually doing the review.

      Additional QC tasks and responsibilities during sampling and analysis are often
      assigned to technicians who collect samples, record field data,  and operate  and
      maintain sampling and analytical equipment. These technicians perform a
      number of  essential day-to-day activities,  which include  calibrating and servicing
      equipment, checking field measurements  and laboratory results, and
      implementing modifications to field or laboratory procedures.  These individuals
      should have training to perform these functions and follow established  protocols
      and guidelines for each of these tasks.

      Technical staff are responsible for the validity and integrity of the data
      produced.  The QA staff should be responsible for ensuring that all personnel
      performing tasks related to data quality are appropriately qualified.  Records of
      qualifications and training of personnel should be kept current for verification by
      internal QA personnel or by regulatory agency  personnel.

      Technical competence and experience of all  contract laboratory staff should  be
      demonstrated.  Staff qualifications should be documented, and training should
      be provided by the laboratory to encourage staff to attain the highest levels of
      technical competence. Staff turnover can affect the ability of a laboratory to
      perform a particular analysis. The experience of current staff with projects of
      similar scope should be assessed during  the laboratory selection process.
      Technical competence and other factors such as the  laboratory setup (including
      quality and capacity of the available analytical equipment), past experience
      (e.g., analysis of appropriate QC check samples and  review of quarterly
      performance evaluation analyses), or an upfront demonstration of performance
      can  be used to influence the project manager's selection. The need to conduct
      a comprehensive evaluation of candidate  laboratories will vary with the project
      and the familiarity with available laboratories.
2.2.2 Statements of Work

      Statements of work are prepared for both field work and laboratory analysis.
      Data quality requirements and analytical methods need to be clearly and
      concisely communicated to either USACE personnel performing the analyses or
      to the laboratory selected by USACE's or the permit applicant's project
      manager. These specifications are best contained  in a written laboratory
      contract. The main body of the contract should consist of general terms and
                                           8

-------
conditions common to any legal contract. A statement of work should be
appended to the contract.  The statement of work should be drafted and
negotiated with the laboratory prior to the start of any analyses.  The statement
of work should be written in clear and concise terms, providing sufficient detail
and references to approved protocols for each required procedure or method to
eliminate any confusion about steps in the analysis. The statement of work
should define all requirements for acceptable  analyses, an important
consideration even when working with a familiar laboratory, and all pertinent
information on the price, timing, and necessary documentation of the analyses.
All available information on the range of concentrations expected and any
special characteristics of the samples to be analyzed should also be contained
in the statement of work.  A generic statement of work for the analysis of most
chemicals in the most commonly analyzed sample matrices is provided in
Appendix B, and is based on the following outline:

    •   A summary of analyses to be performed, including:

        -   A list of all variables to be analyzed for in each sample or
            group of samples

        -   A list of all methods and target detection limits (TDLs)
            (see discussion in Section 2.3.2) for physical and
            chemical analyses and a list of test protocols for biological
            toxicity tests

        -   The total number of samples provided for analysis and
            the associated laboratory QC samples, the cost of each
            analysis, and the total cost of the analytical service
            requested for each sample matrix.

    •   Acceptable procedures for sample delivery and storage, including:

        -   The method of delivery, schedule of delivery, and person
            responsible for notifying the laboratory of any changes in
            the schedule

        -   Requirements for physical storage of samples, holding
            times (consistent with those specified in the QA project
            plan), chain-of-custody, and sample logbook procedures.

    •   Methods to be followed for processing and analyzing samples.

    •   QA/QC requirements, including the data quality objectives
        specified in the QA project plan and appropriate warning and
        control limits.

    •   A list of products to be delivered by the laboratory, specifying the
        maximum time that may elapse between the submittal of samples
        to the laboratory and the delivery of data reports to the agency,

-------
        organization, or industry requesting the analyses.  Penalties for
        late delivery (and any incentives for early delivery) should be
        specified, as should any special requirements for supporting
        documentation and electronic data files. A checklist of the
        laboratory deliverables for analysis of organic compounds,
        pesticides, and polychlorinated biphenyls (PCBs) is presented in
        Table 1.  A checklist of laboratory deliverables for analysis of
        metals is presented in Table 2.

    •   Progress notices (usually necessary only for large projects).

    •   Circumstances under which the laboratory should notify project
        personnel of problems, including, for example, when control limits
        or other performance criteria cannot be met, instrument
        malfunctions are suspected, or holding time limits have expired or will
        shortly expire.

    •   Written authorization for any deviations from the sampling and
        analysis plan should be obtained from EPA and USACE before the
        deviation occurs.

    •   Notice that scheduled and unannounced laboratory visits by the
        project manager or representative may  be conducted.

The following additional information should also  be provided in the laboratory
statement of work:

    •   Requirements that each laboratory submit a QA manual for review
        and approval by the agency, organization, or industry requesting or
        funding the analysis. Each manual should contain a description of
        the laboratory organization and personnel, facilities and equipment,
        analytical methods, and procedures for  sample custody, quality
        control, data handling, and results of previous laboratory audits.

    •   Conditions for rejection or non-analysis  of samples and reanalysis
        of samples.

    •   Required storage time for records and samples prior to disposal.

    •   Terms for payments to the laboratory, including a requirement that
        the quality of data must be acceptable (pending the outcome of
        the QA review) before payment is made.

Including these elements in the statement of work helps to assure that
responsibilities, data requirements, and expectations  for performance are clear.
A copy of the statement of work should be provided to the individual performing
the data assessment to assist in the evaluation of data returned by the
laboratory.
                                    10

-------
        TABLE 1. CHECKLIST OF LABORATORY DELIVERABLES FOR
                 THE ANALYSIS OF ORGANIC COMPOUNDS
[ ]   A cover letter discussing analytical problems (if any) and referencing or
      describing the procedures and instrumentation used.

[ ]   Tabulated results, including final dilution volume of sample extracts, sample
      size, wet-to-dry ratios for solid samples (if requested), concentrations of
      compounds of interest (reported in units identified to two significant figures
      unless otherwise justified), and equations used to perform calculations.
      Concentration units should be µg/kg (dry weight) for sediment, µg/L for
      water, and µg/kg (wet weight) for tissue.  These results should be checked for
      accuracy and the report signed by the laboratory manager or designee.

[ ]   Target detection limits (see discussion in Section 2.3.2 of this document),
      instrument detection limits, and detection limits achieved for the samples.

[ ]   Original data quantification reports for each sample.

[ ]   Method blanks associated with each sample, quantifying all compounds of
      interest identified in these blanks.

[ ]   A calibration data summary reporting the calibration range used.  For the
      analysis of semivolatile organic compounds analyzed by mass spectrometry,
      this summary should include spectra and quantification reports for
      decafluorotriphenylphosphine (DFTPP) or an appropriate substitute standard.
      For volatile organic compounds analyzed by mass spectrometry, the summary
      should include spectra and quantification reports for bromofluorobenzene
      (BFB) or an appropriate substitute standard.

[ ]   Recovery assessments and replicate sample summaries.  Laboratories
      should report all surrogate spike recovery data for each sample, and a
      statement of the range of recoveries should be included in reports using
      these data.

[ ]   All data qualification codes assigned by the laboratory, their description, and
      explanations for all departures from the analytical protocols.

                                         11

-------
TABLE 1.  (cont.)
Additional Deliverables for Volatile or Semivolatile Organic Compound Analyses(a)

     [ ]   Tentatively identified compounds (if requested) and methods of quantification,
           along with the three library spectra that best match the spectra of the
           compound of interest (see Appendix C, Figure 1 for an example of a library
           spectrum).

     [ ]   Reconstructed ion chromatograms for gas chromatography/mass
           spectrometry (GC/MS) analyses for each sample.

     [ ]   Mass spectra of detected compounds for each sample.

     [ ]   Internal standard area summary to show whether internal standard areas
           were stable.

     [ ]   Gel permeation chromatography (GPC) chromatograms (for analyses of
           semivolatile compounds, if performed), recovery assessments, and replicate
           sample summaries.  Laboratories should report all surrogate spike recovery
           data for each sample, and a statement of the range of recoveries should be
           included in reports using these data.

Additional Deliverables for Pesticide and Polychlorinated Biphenyl Analyses(a)

     [ ]   Gas chromatography/electron capture detection (GC/ECD) chromatograms for
           the quantification column and confirmation columns for each sample and for
           all standards analyzed.

     [ ]   GPC chromatograms (if GPC was performed).

     [ ]   An evaluation summary for 4,4'-DDT/endrin breakdown.

     [ ]   A pesticide standard evaluation to summarize retention time shifts of internal
           standards or surrogate spike compounds.

(a) Many of the terms in this table are discussed more completely in Appendix C.

                                             12

-------
            TABLE 2. CHECKLIST OF LABORATORY DELIVERABLES FOR
                             THE ANALYSIS OF METALS
    [ ]    A cover letter discussing analytical problems (if any) and referencing or
           describing the digestion procedures and instrumentation used.

    [ ]    Tabulated results for final dilution volumes of sample digestates, sample size,
           wet-to-dry ratios for solid samples (if requested), and concentrations of
           metals (reported in units identified to two significant figures unless otherwise
           justified).  Concentration units should be µg/kg (dry weight) for sediment, µg/L
           for water, and µg/kg (wet weight) for tissue.(a)  These results should be
           checked for accuracy and the report signed by the laboratory manager or
           designee.

    [ ]    Target detection limits (see discussion in Section 2.3.2 of this document),
           instrument detection limits, and detection limits achieved for the samples.

    [ ]    Method blanks for each batch of samples.

    [ ]    Results for all the quality control checks and initial and continuing calibration
           control checks conducted by the laboratory.

    [ ]    All data qualification codes assigned by the laboratory, their description, and
           explanations for all departures from the accepted analytical protocols.

(a) Most laboratories will report metals data in mg/kg for solid samples.  The specification here
of µg/kg is for consistency with organic chemical analyses, which are typically reported as
µg/kg for solid samples.  If different units are used, care should be taken to ensure that results
are not confused.
                                              13
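
      As a brief, purely illustrative note (not part of the original guidance),
      the short Python fragment below converts hypothetical laboratory metals
      results reported in mg/kg (dry weight) to the µg/kg convention used in
      Table 2.  The metal names and values are invented for the example.

          # Hypothetical example of the unit conversion discussed in the
          # footnote to Table 2; the reported values below are invented.
          def mg_per_kg_to_ug_per_kg(value_mg_per_kg):
              # 1 mg/kg = 1,000 ug/kg
              return value_mg_per_kg * 1000.0

          lab_report_mg_per_kg = {"cadmium": 0.31, "lead": 12.5, "zinc": 86.0}  # mg/kg dry weight
          report_ug_per_kg = {metal: mg_per_kg_to_ug_per_kg(value)
                              for metal, value in lab_report_mg_per_kg.items()}
          print(report_ug_per_kg)  # {'cadmium': 310.0, 'lead': 12500.0, 'zinc': 86000.0}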

-------
2.3   QUALITY ASSURANCE OBJECTIVES

      Data quality objectives are addressed in this section of the QA project plan.
      Data quality objectives define performance-based goals for accuracy (precision
      and bias), representativeness, comparability, and completeness, as well as the
      required sensitivity of chemical measurements (i.e., TDLs). Accuracy is defined
      in terms of bias (how close the measured value is to the true value) and
      precision (how variable the measurements are when repeated) (see footnote at
      the beginning of Section 2).  Data quality objectives for the dredged material
      program are based on the intended use of the data, technical feasibility, and
      consideration of cost.  Therefore, data that meet all data quality objectives
      should be acceptable for unrestricted use in the project and should enable all
      project objectives to be addressed.

      Numerical data quality objectives should be summarized in a table, with all data
      calculated and reported in units consistent with those used by other
      organizations reporting similar data, to allow comparability among databases.
      All measurements should be made so that results are representative of the
      medium (e.g., sediments, water, or tissue) being measured.  Data quality
      objectives for precision and bias established for each measurement parameter
      should be based on prior knowledge of the measurement system employed,
      method validation studies, and the requirements of the specific project.
      Replicate tests should be performed for all test media (e.g., sediments, water,
      or tissue).  Precision of approximately < 30-50 relative percent difference
      between measurements (the random error of measurement) and bias of
      50-150 percent of the true value (the systematic error of measurement) are
      adequate in many programs for making comparisons with regulatory limits.
      Precision may be calculated using three or more replicates to obtain the
      standard deviation and the derived confidence interval.  Bias  may be
      determined with standard reference material (SRM) or by spiking analyte-free
      samples.
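
      The following sketch is offered only as an illustration (it is not part
      of the original guidance, and all replicate and reference values in it
      are hypothetical).  It shows, in Python, one common way to compute the
      indicators described above: relative percent difference for a pair of
      duplicates, relative standard deviation for three or more replicates,
      and percent recovery against a certified SRM value.

          from statistics import mean, stdev

          def relative_percent_difference(x1, x2):
              # Relative percent difference (RPD) between duplicates, in percent
              return abs(x1 - x2) / ((x1 + x2) / 2.0) * 100.0

          def relative_std_deviation(replicates):
              # Precision from three or more replicates, as percent RSD
              return stdev(replicates) / mean(replicates) * 100.0

          def percent_recovery(measured, true_value):
              # Bias expressed as a percentage of the certified (SRM) or spiked value
              return measured / true_value * 100.0

          # Hypothetical triplicate sediment cadmium results (ug/kg dry weight)
          # and a hypothetical standard reference material result
          replicates = [310.0, 345.0, 330.0]
          srm_measured, srm_certified = 295.0, 320.0

          print(round(relative_percent_difference(replicates[0], replicates[1]), 1))  # ~10.7 percent RPD
          print(round(relative_std_deviation(replicates), 1))                         # ~5.3 percent RSD
          print(round(percent_recovery(srm_measured, srm_certified), 1))              # ~92.2 percent recovery

      In an actual project, acceptance of such results would be judged against
      the limits specified in the project QA project plan rather than the
      illustrative targets noted in the comments.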

      These data quality objectives define the acceptability of laboratory
      measurements and  should also include criteria for the maximum allowable time
      that samples or organisms can be held prior to analysis by a  laboratory. An
      example of a data quality objectives summary for laboratory measurements is
      provided in Appendix A.
2.3.1 Program vs. Project Objectives

      This document provides general guidance for QA activities conducted during
      dredged material evaluations.  However, specific project needs will affect the
      kinds of chemical analyses that are requested by the project manager.  Special
      project needs should be identified during preparation of the QA project plan and
      should be documented in this section of the plan.  For  example, a preliminary
                                         14

-------
      reconnaissance of a large area may only require data from simple and quick
      checks performed in the field. In contrast, a complete characterization of
      contamination in a sensitive area may require specialized laboratory methods,
      lower TDLs, and considerable documentation of results.

      Before defining the analyses that should be performed to meet the data quality
      objectives established on a project-specific basis, a thorough review of all
      historical data associated with the site (if applicable) should be performed (see
      discussions in U.S. EPA and USACE 1991, 1994).  A review of the historical
      data should be conducted in response to data needs in the testing program.  A
      comprehensive review of all historical data should eliminate unnecessary
      chemical analyses and assist in  focusing the collection of chemical-specific data
      that are needed.  A more thorough discussion of how to review and use
      historical data is provided in Section 2.5.2.
2.3.2 Target Detection Limits for Chemicals

      Different analytical methods are capable of detecting different concentrations of
      a chemical in a sample.  In general, as the sensitivity and overall accuracy of a
      technique increases, so does the cost. Recommended TDLs that are judged to
      be feasible by a variety of methods, cost effective, and to meet the
      requirements for dredged material evaluations are summarized in Table 3 (at
      the end of Section 2.4), along with example analytical methods that are capable
      of meeting the TDLs.  However, any method that can achieve these TDLs is
      acceptable, provided that the appropriate documentation of the method
      performance is generated for the  project.

      The TDL is a performance goal set between the lowest, technically feasible,
      detection limit for  routine analytical methods and available regulatory criteria or
      guidelines for evaluating dredged material (see summaries in McDonald et al.
      [1992]; PSEP [1991]).  The TDL is, therefore, equal  to or greater than the
      lowest amount of  a chemical that can be reliably detected based on the
      variability of the blank  response of routine analytical methods (see
      Section 2.10.1 for discussion of method blank response).  However, the
      reliability of a chemical measurement generally increases as the concentration
      increases.  Analytical costs may also  be lower at higher detection limits.  For
      these reliability, feasibility, and cost reasons, the TDLs in Table 3 have been set
      at not less than 10 times lower than available regional or international dredged
      material guidelines for potential biological effects associated with sediment
      chemical contamination.  In many cases, lower detection limits than the TDL
      can be obtained and may be desired for some regional programs (e.g., for
      carefully documenting  changes in conditions at a relatively pristine site).
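
      As a purely illustrative aside (not part of the original guidance), the
      sketch below expresses the relationship described above as a simple
      numerical check: a detection level estimated from the variability of
      replicate method blanks (taken here as three standard deviations above
      the mean blank, one common convention among several), a candidate TDL,
      and the lowest applicable guideline value.  All numbers are invented for
      the example.

          from statistics import mean, stdev

          # Hypothetical replicate method blank results (ug/kg)
          blank_results = [0.8, 1.1, 0.9, 1.3, 1.0, 0.7, 1.2]
          detection_level = mean(blank_results) + 3 * stdev(blank_results)

          tdl = 5.0          # ug/kg, candidate target detection limit
          guideline = 50.0   # ug/kg, lowest applicable dredged material guideline

          # The TDL should be no lower than what the routine method can reliably
          # detect and low enough to allow meaningful comparison with the guideline.
          assert detection_level <= tdl <= guideline
          print(round(detection_level, 2))  # blank-based detection level (~1.65 ug/kg)
          print(round(guideline / tdl))     # margin between guideline and TDL (10x here)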

      All data generated for  dredged material evaluation should meet the TDLs in
      Table 3 unless a regional requirement is made or  sample-specific interferences
                                          15

-------
      occur. Any sample-specific interferences should be well documented by the
      laboratory. If significantly higher or lower TDLs are required to meet rigorously
      defined data quality objectives (e.g., for human health risk assessments) for a
      specific project, then on a project-specific basis, modification to existing
      analytical  procedures may be necessary. Such modifications should be
      documented in the QA  project plan. An experienced  analytical chemist should
      be consulted so the most appropriate  method modifications can be assessed,
      the appropriate coordination with the analytical laboratory can  be implemented,
      and the data quality objectives can  be met. A more detailed discussion of
      method modifications is provided in Section 2.8.2.2.
2.4   STANDARD OPERATING PROCEDURES

      Standard operating procedures are written descriptions of routine methods and
      should be provided for as many methods used during the dredged  material
      evaluation as possible.  A large number of field and laboratory operations can
      be standardized and presented as standard operating procedures.  Once these
      procedures are  specified, they can be referenced or provided in an appendix of
      the QA project plan. Only modifications to standard operating procedures  or
      non-standard procedures need to be explained in the main body of the QA
      project plan (e.g., sampling or analytical procedure summaries discussed in
      Sections 2.5 and 2.8, respectively).

      General types of procedures that benefit from standard operating procedures
      include field measurements ancillary to sample collection (e.g., depth of
      overlying water, sampling depth, water quality measurements or  mixing model
      input measurements), chain-of-custody, sample handling and shipment, and
      routine analytical methods for chemical analyses.  Standard operating
      procedures ensure that all persons conducting work are following the same
      procedures and that the procedures do not change over time.  All personnel
      should be thoroughly familiar with the standard operating procedures before
      work is initiated. Deviations from standard operating procedures may affect
      data quality and integrity. If it is necessary to deviate from approved standard
      operating procedures, these deviations should be documented and approved
      through an appropriate  chain-of-command.  Personnel responsible for ensuring
      the standard operating procedures are adhered to should be identified in the
      QA project plan. Example standard operating procedures are provided in
      Appendix D.
                                         16

-------
TABLE 3. ROUTINE ANALYTICAL METHODS AND TARGET DETECTION LIMITS FOR SEDIMENT,
WATER, AND TISSUE (parts per billion, unless otherwise noted)

Chemical | Example Sediment Method(a) | Recommended Sediment TDL | Example Tissue Method(a) | Recommended Tissue TDL(b) | Example Water Method(a) | Recommended Water TDL(b)

Inorganic Chemicals
Aluminum | 3050A/6010A; U.S. EPA (1993a) | 50,000(c) | 200.8; U.S. EPA (1993a) | 1,000 | 202.2; 6010A/200.7 | 40
Antimony | 3050A; 7040/7041; U.S. EPA (1993a); PSEP (1990a) | 2,500 | 200.8; 7040/7041; U.S. EPA (1993a) | 100 | 7041; 204.2 | 3
Arsenic | 7061; 7060A; 3050A; U.S. EPA (1993a); PSEP (1990a); EPRI (1986) | 5,000 | 200.8/7061; 7060A; U.S. EPA (1993a) | 100 | 3010; 7061; 206.2; 206.3; EPRI (1986) | 1
Beryllium | 200.8; 7090/7091; U.S. EPA (1993a) | 2,500(d) | 200.8; 7090/7091; U.S. EPA (1993a) | 100 | 7091; 210.2; 6010A/200.7; 200.8 | 0.2
Cadmium | 3050A; 6010A; 7131A/7130; U.S. EPA (1993a); PSEP (1990a) | 300 | 200.8; 7131A; 7130; U.S. EPA (1993a) | 100 | 213.2; 7131A; 3010; 6010A/200.7; 200.8 | 1
Chromium | 3050A/7191; 7190; 6010A; U.S. EPA (1993a) | 5,000 | 200.8/7191; 7190; U.S. EPA (1993a) | 100 | 7191; 200.8; 218.2; 3010; 6010A/200.7 | 1
Cobalt | 7201 | 100 | 200.8; 7201 | 100 | 219.2 | 4
Copper | 3050A/7211; 7210; 6010A; U.S. EPA (1993a); PSEP (1990a) | 5,000(f) | 200.8/7211; 7210; U.S. EPA (1993a) | 100 | 7211; 200.8; 220.1; 220.2; 3010; 6010A/200.7 | 1
Hexavalent chromium | -- | -- | -- | -- | 7197; 218.5 | 50
Iron | 3050A/7381; U.S. EPA (1993a) | 50,000 | 200.8; 7381; 6010A; U.S. EPA (1993a) | 10,000 | 6010A/200.7; 3010; 7381; 236.2 | 10
Lead | 3050A/7421; 7420; 6010A; U.S. EPA (1993a); PSEP (1990a) | 5,000 | 200.8/7421; 7420; U.S. EPA (1993a) | 100 | 7421; 239.2 | 1
Manganese | 3050A/7461; U.S. EPA (1993a) | 5,000(c) | 200.8/7461; U.S. EPA (1993a) | 500 | 6010A/200.7; 243.2; 3010 | 1
Mercury | 7471; U.S. EPA (1993a) | 200 | 7471; U.S. EPA (1993a) | 10 | 7471; 245.1; 245.2 | 0.2
Nickel | 3050A/6010A; 7521; 7520; U.S. EPA (1993a); PSEP (1990a) | 5,000 | 200.8/6010A; 7521; 7520; U.S. EPA (1993a) | 100 | 6010A; 7521; 249.2 | 1
Selenium | 7741; 7740; U.S. EPA (1993a); EPRI (1986) | 1,000(e) | 200.8/7741; 7740; U.S. EPA (1993a) | 200 | 7740; 7741; 270.2; 270.3; EPRI (1986) | 2
Silver | 3050A/7761; 7760; U.S. EPA (1993a); PSEP (1990a) | 200 | 200.8/7761; 7760; U.S. EPA (1993a) | 100 | 7761; 272.2 | 1
Thallium | 7840/7841; U.S. EPA (1993a) | 200 | 200.8; 7840; 7841; U.S. EPA (1993a) | 100 | 7840; 7841; 279.2 | 1
Tin | U.S. EPA (1993a) | 500(e) | 200.8; U.S. EPA (1993a) | 100 | 282.2 | 5
Zinc | 3050A/7951; 7950; U.S. EPA (1993a); PSEP (1990a) | 15,000 | 200.8/7951; 7950; U.S. EPA (1993a) | 2,000 | 7951; 289.2; 200.7; 3010; 6010A | 1
Organotin | NCASI (1986); Uhler and Durrel (1989); Rice et al. (1987) | 10 | NCASI (1986); Rice et al. (1987); Uhler et al. (1989) | 10 | NCASI (1986); Rice et al. (1987); Uhler and Durrel (1989) | 0.0-

Nonionic Organic Compounds

LPAH Compounds
Naphthalene | 1625C; 3540A; 3550A/8100; 8250; 8270A; 8310; NOAA (1989); U.S. EPA (1993a) | 20 | 1625C; 8100; 8250; 8270A; 8310; U.S. EPA (1993a); Sloan et al. (1993); NOAA (1989) | 20 | 1625C; 3510A; 3520A/8100; 8250; 8270A; 8310 | 10
Acenaphthylene | 1625C; 3540A; 3550A/8100; 8250; 8270A; 8310; U.S. EPA (1993a) | 20 | 1625C; 8100; 8250; 8270A; 8310; U.S. EPA (1993a); Sloan et al. (1993); NOAA (1989) | 20 | 1625C; 3510A; 3520A/8100; 8250; 8270A; 8310 | 10
Acenaphthene | 1625C; 3540A; 3550A/8100; 8250; 8270A; 8310; U.S. EPA (1993a) | 20 | 1625C; 8100; 8250; 8270A; 8310; U.S. EPA (1993a); Sloan et al. (1993); NOAA (1989) | 20 | 1625C; 3510A; 3520A/8100; 8250; 8270A; 8310 | 10
Fluorene | 1625C; 3540A; 3550A/8100; 8250; 8270A; 8310; U.S. EPA (1993a) | 20 | 1625C; 8100; 8250; 8270A; 8310; U.S. EPA (1993a); Sloan et al. (1993); NOAA (1989) | 20 | 1625C; 3510A; 3520A/8100; 8250; 8270A; 8310 | 10
Phenanthrene | 1625C; 3540A; 3550A/8100; 8250; 8270A; 8310; U.S. EPA (1993a) | 20 | 1625C; 8100; 8250; 8270A; 8310; U.S. EPA (1993a); Sloan et al. (1993); NOAA (1989) | 20 | 1625C; 3510A; 3520A/8100; 8250; 8270A; 8310 | 10
Anthracene | 1625C; 3540A; 3550A/8100; 8250; 8270A; 8310; U.S. EPA (1993a) | 20 | 1625C; 8100; 8250; 8270A; 8310; U.S. EPA (1993a); Sloan et al. (1993); NOAA (1989) | 20 | 1625C; 3510A; 3520A/8100; 8250; 8270A; 8310 | 10
1-Methylnaphthalene | 1625C; 3540A; 3550A/8100; 8250; 8270A; 8310; U.S. EPA (1993a) | 20 | 1625C; 8100; 8250; 8270A; 8310; U.S. EPA (1993a); Sloan et al. (1993); NOAA (1989) | 20 | 1625C; 3510A; 3520A/8100; 8250; 8270A; 8310 | 10
2-Methylnaphthalene | 1625C; 3540A; 3550A/8100; 8250; 8270A; 8310; U.S. EPA (1993a) | 20 | 1625C; 8100; 8250; 8270A; 8310; U.S. EPA (1993a); Sloan et al. (1993); NOAA (1989) | 20 | 1625C; 3510A; 3520A/8100; 8250; 8270A; 8310 | 10

HPAH Compounds
Fluoranthene | 1625C; 3540A; 3550A/8100; 8250; 8270A; 8310; U.S. EPA (1993a) | 20 | 1625C; 8100; 8250; 8270A; 8310; U.S. EPA (1993a); Sloan et al. (1993); NOAA (1989) | 20 | 1625C; 3510A; 3520A/8100; 8250; 8270A; 8310 | 10
Pyrene | 1625C; 3540A; 3550A/8100; 8250; 8270A; 8310; U.S. EPA (1993a) | 20 | 1625C; 8100; 8250; 8270A; 8310; U.S. EPA (1993a); Sloan et al. (1993); NOAA (1989) | 20 | 1625C; 3510A; 3520A/8100; 8250; 8270A; 8310 | 10
Benz[a]anthracene | 1625C; 3540A; 3550A/8100; 8250; 8270A; 8310; U.S. EPA (1993a) | 20 | 1625C; 8100; 8250; 8270A; 8310; U.S. EPA (1993a); Sloan et al. (1993); NOAA (1989) | 20 | 1625C; 3510A; 3520A/8100; 8250; 8270A; 8310 | 10
Chrysene | 1625C; 3540A; 3550A/8100; 8250; 8270A; 8310; U.S. EPA (1993a) | 20 | 1625C; 8100; 8250; 8270A; 8310; U.S. EPA (1993a); Sloan et al. (1993); NOAA (1989) | 20 | 1625C; 3510A; 3520A/8100; 8250; 8270A; 8310 | 10
Benzo(b&k)fluoranthenes | 1625C; 3540A; 3550A/8100; 8250; 8270A; 8310; U.S. EPA (1993a) | 20 | 1625C; 8100; 8250; 8270A; 8310; U.S. EPA (1993a); Sloan et al. (1993); NOAA (1989) | 20 | 1625C; 3510A; 3520A/8100; 8250; 8270A; 8310 | 10
Benzo[a]pyrene | 1625C; 3540A; 3550A/8100; 8250; 8270A; 8310; U.S. EPA (1993a) | 20 | 1625C; 8100; 8250; 8270A; 8310; U.S. EPA (1993a); Sloan et al. (1993); NOAA (1989) | 20 | 1625C; 3510A; 3520A/8100; 8250; 8270A; 8310 | 10
Indeno[1,2,3-c,d]pyrene | 1625C; 3540A; 3550A/8100; 8250; 8270A; 8310; U.S. EPA (1993a) | 20 | 1625C; 8100; 8250; 8270A; 8310; U.S. EPA (1993a); Sloan et al. (1993); NOAA (1989) | 20 | 1625C; 3510A; 3520A/8100; 8250; 8270A; 8310 | 10
Dibenz[a,h]anthracene | 1625C; 3540A; 3550A/8100; 8250; 8270A; 8310; U.S. EPA (1993a) | 20 | 1625C; 8100; 8250; 8270A; 8310; U.S. EPA (1993a); Sloan et al. (1993); NOAA (1989) | 20 | 1625C; 3510A; 3520A/8100; 8250; 8270A; 8310 | 10
Benzo[g,h,i]perylene | 1625C; 3540A; 3550A/8100; 8250; 8270A; 8310; U.S. EPA (1993a) | 20 | 1625C; 8100; 8250; 8270A; 8310; U.S. EPA (1993a); Sloan et al. (1993); NOAA (1989) | 20 | 1625C; 3510A; 3520A/8100; 8250; 8270A; 8310 | 10

Chlorinated Benzenes
1,3-Dichlorobenzene | 1625C; 3540A; 3550A/8100; 8240A; 8250; 8260; 8270A | 20 | 1625C; 8240A; 8250; 8270A; Sloan et al. (1993) | 20 | 1625C; 3510A; 3520A/8100; 8240A; 8250; 8260; 8270A | 10
1,4-Dichlorobenzene | 1625C; 3540A; 3550A/8100; 8240A; 8250; 8260; 8270A | 20 | 1625C; 8100; 8240A; 8250; 8270A; Sloan et al. (1993); NOAA (1989) | 20 | 1625C; 3510A; 3520A/8100; 8240A; 8250; 8260; 8270A | 10
1,2-Dichlorobenzene | 1625C; 3540A; 3550A/8100; 8240A; 8250; 8260; 8270A | 20 | 1625C; 8240A; 8250; 8270A; Sloan et al. (1993) | 20 | 1625C; 3510A; 3520A/8100; 8240A; 8250; 8260; 8270A | 10
1,2,4-Trichlorobenzene | 1625C; 3540A; 3550A/8250; 8260; 8270A | 10(f) | 1625C; 8250; 8260; 8270A; Sloan et al. (1993) | 20 | 1625C; 3510A; 3520A/8250; 8260; 8270A | 10
Hexachlorobenzene | 1625C; 3540A; 3550A/8250; 8260; 8270A | 10(f) | 1625C; 8250; 8260; 8270A; Sloan et al. (1993) | 20 | 1625C; 3510A; 3520A/8250; 8260; 8270A | 10

Phthalate Esters
Dimethyl phthalate | 1625C; 3540A; 3550A/8060; 8100; 8250; 8270A | 50 | 1625C; 8060; 8100; 8250; 8270A; Sloan et al. (1993); NOAA (1989) | 20 | 1625C; 3510A; 3520A/8060; 8100; 8250; 8270A | 10
Diethyl phthalate | 1625C; 3540A; 3550A/8060; 8100; 8250; 8270A | 50 | 1625C; 8060; 8100; 8250; 8270A; Sloan et al. (1993); NOAA (1989) | 20 | 1625C; 3510A; 3520A/8100; 8060; 8250; 8270A | 10
Di-n-butyl phthalate | 1625C; 3540A; 3550A/8060; 8100; 8250; 8270A | 50 | 1625C; 8060; 8100; 8250; 8270A; Sloan et al. (1993); NOAA (1989) | 20 | 1625C; 3510A; 3520A/8100; 8060; 8250; 8270A | 10
Butyl benzyl phthalate | 1625C; 3540A; 3550A/8060; 8100; 8250; 8270A | 50 | 1625C; 8060; 8100; 8250; 8270A; Sloan et al. (1993); NOAA (1989) | 20 | 1625C; 3510A; 3520A/8100; 8060; 8250; 8270A | 10
Bis[2-ethylhexyl]phthalate | 1625C; 3540A; 3550A/8060; 8100; 8250; 8270A | 50 | 1625C; 8060; 8100; 8250; 8270A; Sloan et al. (1993); NOAA (1989) | 20 | 1625C; 3510A; 3520A/8100; 8060; 8250; 8270A | 10
Di-n-octyl phthalate | 1625C; 3540A; 3550A/8060; 8100; 8250; 8270A | 50 | 1625C; 8060; 8100; 8250; 8270A; Sloan et al. (1993); NOAA (1989) | 20 | 1625C; 3510A; 3520A/8100; 8060; 8250; 8270A | 10

Miscellaneous Extractable Compounds
Benzyl alcohol | 1625C; 3540A; 3550A/8250; 8270A | 50 | 1625C; 8250; 8270A | 100 | 1625C; 3510A; 3520A/8250; 8270A | 50
Benzoic acid | 1625C; 3540A; 3550A/8250; 8270A | 100 | 1625C; 8250; 8270A | 100 | 1625C; 3510A; 3520A/8250; 8270A | 50
Dibenzofuran | 1625C; 3540A; 3550A/8250; 8270A | 50 | 1625C; 8100; 8250; 8270A; Sloan et al. (1993); NOAA (1989) | 20 | 1625C; 3510A; 3520A/8250; 8270A | 10
Hexachloroethane | 1625C; 3540A; 3550A/8250; 8270A | 100 | 1625C; 8250; 8270A | 40 | 1625C; 3510A; 3520A/8250; 8270A | 50
Hexachlorobutadiene | 1625C; 3540A; 3550A/8250; 8270A | 20 | 1625C; 8250; 8270A | 40 | 1625C; 3510A; 3520A/8250; 8270A | 50
N-Nitrosodiphenylamine | 1625C; 3540A; 3550A/8250; 8270A | 20 | 1625C; 8250; 8270A | 20 | 1625C; 3510A; 3520A/8250; 8270A | 50
Methylethyl ketone | 1624C; 3540A; 3550A/8250; 8240A; 8260; 8270A | 20 | 1624C; 8250; 8270A | 20 | 1624C; 3510A; 3520A/8250; 8240A; 8260; 8270A | 50

Polychlorinated Dibenzofurans
Tetrachlorinated furans | 1613; 8290 | 0.001 | 8290 | 0.001 | 1613; 8290 | 0.00001
Pentachlorinated furans | 1613; 8290 | 0.0025 | 8290 | 0.0025 | 1613; 8290 | 0.000025
Hexachlorinated furans | 1613; 8290 | 0.005 | 8290 | 0.005 | 1613; 8290 | 0.00005
Heptachlorinated furans | 1613; 8290 | 0.005 | 8290 | 0.005 | 1613; 8290 | 0.00005
Octachlorinated furans | 1613; 8290 | 0.01 | 8290 | 0.01 | 1613; 8290 | 0.0001

Polychlorinated Dibenzo-p-dioxins
2,3,7,8-TCDD | 1613; 8290 | 0.001 | 8290 | 0.001 | 1613; 8290 | 0.00001
Other tetrachlorinated dioxins | 1613; 8290 | 0.001 | 8290 | 0.001 | 1613; 8290 | 0.00001
Pentachlorinated dioxins | 1613; 8290 | 0.0025 | 8290 | 0.0025 | 1613; 8290 | 0.000025
Hexachlorinated dioxins | 1613; 8290 | 0.005 | 8290 | 0.005 | 1613; 8290 | 0.00005
Heptachlorinated dioxins | 1613; 8290 | 0.005 | 8290 | 0.005 | 1613; 8290 | 0.00005
Octachlorinated dioxins | 1613; 8290 | 0.01 | 8290 | 0.01 | 1613; 8290 | 0.0001

PCBs
PCB congeners | Sloan et al. (1993); NOAA (1989); U.S. EPA (1993a) | 1 | NOAA (1989); U.S. EPA (1993a) | 2 | NOAA (1989) | 0.01

Pesticides
Aldrin | 3540A; 3550A/8080; NOAA (1985) | 10 | 8080; NOAA (1985) | 10 | 608; 3510A; 3520A/8080 | 0.04
Chlordane and derivatives | 3540A; 3550A/8080; NOAA (1985) | 10 | 8080; NOAA (1985) | 10 | 608; 3510A; 3520A/8080 | 0.14
Dieldrin | 3540A; 3550A/8080; NOAA (1985) | 10 | 8080; NOAA (1985) | 10 | 608; 3510A; 3520A/8080 | 0.02
4,4'-DDD | 3540A; 3550A/8080; NOAA (1985) | 10 | 8080; NOAA (1985) | 10 | 608; 3510A; 3520A/8080 | 0.1
4,4'-DDE | 3540A; 3550A/8080; NOAA (1985) | 10 | 8080; NOAA (1985) | 10 | 608; 3510A; 3520A/8080 | 0.1
4,4'-DDT | 3540A; 3550A/8080; NOAA (1985) | 10 | 8080; NOAA (1985) | 10 | 608; 3510A; 3520A/8080 | 0.1
Endosulfan and derivatives | 3540A; 3550A/8080; NOAA (1985) | 10 | 8080; NOAA (1985) | 10 | 608; 3510A; 3520A/8080 | 0.1
Endrin and derivatives | 3540A; 3550A/8080; NOAA (1985) | 5 | 8080; NOAA (1985) | 10 | 608; 3510A; 3520A/8080 | 0.1
Heptachlor and derivatives | 3540A; 3550A/8080; NOAA (1985) | 10 | 8080; NOAA (1985) | 10 | 608; 3510A; 3520A/8080 | 0.1
gamma-Hexachlorocyclohexane (lindane) | 3540A; 3550A/8080; NOAA (1985) | 10 | 8080; NOAA (1985) | 10 | 608; 3510A; 3520A/8080 | 0.1
Toxaphene | 3540A; 3550A/8080 | 50 | 8080 | 50 | 608; 3510A; 3520A/8080 | 0.5
Methoxychlor | 3540A; 3550A/8080 | 10 | 8080 | 10 | 608; 3510A; 3520A/8080 | 0.5
Chlorbenside | 3540A; 3550A/8080; NOAA (1985) | 2 | 8080; NOAA (1985) | 2 | 608; 3510A; 3520A/8080 | 0.002
Dacthal | 3540A; 3550A/8080; NOAA (1985) | 2 | 8080; NOAA (1985) | 2 | 608; 3510A; 3520A/8080 | 0.03
Total chlorinated pesticides | 3540A; 3550A/8080; NOAA (1985) | 20 | 8080; NOAA (1985) | 20 | 608; 3510A; 3520A/8080 | 0.02
Malathion | 3540A; 3550A/8141 | 5 | 8141 | 5 | 3510A; 3520A/8141 | 0.8
Parathion | 3540A; 3550A/8141 | 6 | 8141 | 6 | 3510A; 3520A/8141 | 0.8

Volatile Organic Compounds
Benzene | 1624C; 8240A; 8260 | 10 | 1624C; 8240A; 8260 | 10 | 1624C; 8240A; 8260 | 5
Chloroform | 1624C; 8240A; 8260 | 10 | 1624C; 8240A; 8260 | 10 | 1624C; 8240A; 8260 | 5
Ethylbenzene | 1624C; 8240A; 8260 | 10 | 1624C; 8240A; 8260 | 10 | 1624C; 8240A; 8260 | 5
Toluene | 1624C; 8240A; 8260 | 10 | 1624C; 8240A; 8260 | 10 | 1624C; 8240A; 8260 | 5
Trichloroethene | 1624C; 8240A; 8260 | 10 | 1624C; 8240A; 8260 | 10 | 1624C; 8240A; 8260 | 5
Tetrachloroethene | 1624C; 8240A; 8260 | 10 | 1624C; 8240A; 8260 | 10 | 1624C; 8240A; 8260 | 5
Total xylenes | 1624C; 8240A; 8260 | 10 | 1624C; 8240A; 8260 | 10 | 1624C; 8240A; 8260 | 5

Ionizable Organic Compounds

Phenols
Phenol | 1625C; 3540A; 3550A/8040A; 8250; 8270A; 9066 | 100 | 1625C; 8040A; 8270A | 20 | 1625C; 3510A; 3520A/8040A; 8250; 8270A | 10
2-Methylphenol | 1625C; 3540A; 3550A/8040A; 8250; 8270A | 50 | 1625C; 8040A; 8270A | 20 | 1625C; 3510A; 3520A/8040A; 8250; 8270A | 10
4-Methylphenol | 1625C; 3540A; 3550A/8040A; 8250; 8270A | 100 | 1625C; 8040A; 8270A | 20 | 1625C; 3510A; 3520A/8040A; 8250; 8270A | 10
2,4-Dimethylphenol | 1625C; 3540A; 3550A/8040A; 8250; 8270A | 20 | 1625C; 8040A; 8270A | 20 | 1625C; 3510A; 3520A/8040A; 8250; 8270A | 10
Pentachlorophenol | 1625C; 3540A; 3550A/8040A; 8250; 8270A | 100 | 1625C; 8040A; 8270A | 100 | 1625C; 3510A; 3520A/8040A; 8250; 8270A | 50
Resin Acids and Guaiacols | 1625C; 3540A; 3550A; 8250; 8270A | 10 | --(g) | -- | -- | --

Other Analyses
Ammonia | 350.1; 350.2; Plumb (1981) | 100 | -- | -- | 350.1; 350.2; 350.3 | 30
Cyanides | 9010A; 9012 | 2,000 | 9010A; 9012 | 1,000 | 335.2 | 5,000
Total organic carbon | PSEP (1986); U.S. EPA (1987a, 1992b) | 0.1% | -- | -- | 9060; 415.1; APHA 5310D | 0.1%
Total petroleum hydrocarbons | 9070; 418.1 | 5,000 | 418.1 | 100,000 | 418.1 | 100
Total recoverable petroleum hydrocarbons | 418.1 | 5,000 | -- | -- | 418.1 | 500
Total phenols | 8040A | 1,000 | 8040A | 10,000 | 420.1; 625; 8040A | 50
Acid-volatile sulfides | Cutter and Oatts (1987); U.S. EPA (1991a); Di Toro et al. (1990) | 0.1 umole/g | -- | -- | -- | --
Total sulfides | 9030; Plumb (1981) | 100 | -- | -- | 376.2 | 100
Grain size | Plumb (1981); ASTM (1992) | 1% | -- | -- | -- | --
Total suspended solids | -- | -- | -- | -- | 160.2; APHA 2510D | 1.0 mg/L
Total settleable solids | -- | -- | -- | -- | 160.5; APHA 2540B | 0.05 mL/L
Total solids/dry weight | 160.3; Plumb (1981); PSEP (1986) | 0.1% | -- | -- | -- | --
Total volatile solids | 160.4; Plumb (1981); APHA 2540E; PSEP (1986) | 0.1% | -- | -- | -- | --
Specific gravity | Plumb (1981) | 0.01 mg/L | -- | -- | -- | --
pH | 9045; Plumb (1981) | 0.1 pH units | -- | -- | Plumb (1981) | 0.1 pH units
Total water content of test species | -- | -- | U.S. EPA (1986b, 1987a) | 0.1% | -- | --
Total lipid | -- | -- | Bligh and Dyer (1959); Folch et al. (1957) | 0.1% | -- | --

Note:  HPAH - high molecular weight polycyclic aromatic hydrocarbon
       LPAH - low molecular weight polycyclic aromatic hydrocarbon
       TCDD - tetrachlorodibenzo-p-dioxin
       TDL  - The target detection limit is a performance goal set between the lowest, technically feasible,
              detection limit for routine analytical methods and available regulatory criteria or guidelines for
              evaluating dredged material.  The target detection limit is, therefore, equal to or greater than the
              lowest amount of a chemical that can be reliably detected based on the variability of the blank
              response of routine analytical methods.  However, the reliability of a chemical measurement
              generally increases as the concentration increases.  Analytical costs may also be lower at higher
              detection limits.  For these reasons, the target detection limit has been set not less than 10 times
              lower than available dredged material guidelines.

a  Numbered methods are found in references as listed on the following page.
b  Determined by work group discussion; no or few effects guidelines are available for comparison.
c  No sediment screening or adverse effects guidelines are available for comparison.
d  Less than 1/10 of available sediment guidelines for screening concentrations or potential adverse effects, but
   still cost effective and feasible to attain with a range of routine analytical methods.
e  TDL may restrict use of some routine analytical methods, but reflects work group consensus.
f  Sediment TDL slightly exceeds one available sediment guideline (Washington Sediment Management Standards) at
   low organic carbon content (< 2 percent TOC).
g  -- Not applicable.

-------
REFERENCES CONTAINING NUMBERED METHODS IN TABLE 3

Reference       Methods

U.S. EPA 1983   160.2; 160.3; 160.4; 160.5; 200.7; 200.8; 202.2; 204.2; 206.2; 206.3; 210.2; 213.2;
                218.2; 218.5; 219.2; 220.1; 220.2; 236.2; 239.2; 243.2; 245.1; 245.2; 249.2; 270.2;
                270.3; 272.2; 279.2; 282.2; 289.2; 335.2; 350.1; 350.2; 350.3; 376.2; 415.1; 418.1;
                420.1

U.S. EPA 1982   608; 625

U.S. EPA 1989b  8290

U.S. EPA 1990   1613

U.S. EPA 1989c  1624C; 1625C

U.S. EPA 1986a  3010; 3050A; 3510A; 3520A; 3540A; 3550A; 6010A; 7040; 7041; 7060A; 7061; 7090; 7091;
                7130; 7131A; 7190; 7191; 7197; 7201; 7210; 7211; 7381; 7420; 7421; 7461; 7471; 7520;
                7521; 7740; 7741; 7760; 7761; 7840; 7841; 7950; 7951; 8040A; 8060; 8080; 8100; 8141;
                8240A; 8250; 8260; 8270A; 8310; 9010A; 9012; 9030; 9045; 9060; 9066; 9070

APHA 1989       APHA 2510D; APHA 2540B; APHA 2540E; APHA 5310D

                          35

-------
2.5   SAMPLING STRATEGY AND PROCEDURES

      A sampling strategy should be developed to ensure that the sampling design
      supports the planned data use.  For example, a project that was planned to
      characterize a specific area would have different sampling design requirements
      than a project that was screening for a suspected  problem chemical.  The
      sampling strategy will strongly affect the representativeness, comparability, and
      completeness that might be expected for field measurements.  In addition, the
      strategy for collecting field QC samples (e.g., replicates) will assist in the
      determination of how well the total variability of a field measurement can be
      documented. Therefore, development of the sampling  strategy should be
      closely coordinated with development of QA objectives discussed in Section
      2.3.

      Specific procedures  for collecting each kind of sediment, water, tissue, or
      biological sample are described in this section of the QA project plan.  The level
      of detail can range from a brief summary of sampling objectives, containers,
      special sample handling procedures (including compositing and subsampling
      procedures, if appropriate), and storage/sample  preservation to a complete
      sampling plan that provides all details necessary to implement the field
      program. Standard operating procedures do not require elaboration in this
      section of the QA  project plan (see Section 2.4).

      If complete sampling details are  not provided in  the QA project plan, then
      reference should be  made to  the sampling plan  that does provide all details.
      The QA project plan may  be an appendix of the sampling plan, or specific
      sampling details may be provided as an appendix  of the QA project plan.  For
      smaller projects, a single planning document may  be created that combines a
      work plan (project rationale and schedule for each task), detailed sampling plan
      (how project tasks are implemented), and the QA project plan. For larger
      projects, the QA project plan and the detailed sampling plan may be two
      separate documents.

      This section of the document provides basic guidance for assuring sample
      quality from collection to delivery to the laboratory  and guidance on items to
      consider when designing a sampling plan.  A well-designed sampling plan is
      essential when evaluating the potential impact of dredged material discharge  on
      the aquatic environment.  The purpose of the sampling plan is to provide a
      blueprint for all fieldwork by defining in detail the appropriate sampling  and data
      collection methods (in accordance with the established  QA objectives; see
      Section 2.3). Before any sampling is initiated, the  sampling plan  should meet
      clearly defined objectives for individual dredging projects.  Factors such as the
      availability and content of  historical data, the degree of sediment heterogeneity,
      the volume of sediment proposed to be dredged, the areal extent of the
      dredging project, the number and geographical distribution of sample collection
      sites, potential  contaminant sources, and the procedures for collection,
preservation, storage, and tracking of samples should be carefully considered
and are necessary for adequate QA/QC of the data. The magnitude of the
dredging project and its time and budgetary constraints should also be
considered.

The following kinds of information should be reviewed for assistance in
designing the sampling plan:

    •   Geochemical and hydrodynamic data—The grain size, specific
         density, water content, total organic carbon (TOC), and
        identification of sediment horizons are helpful in making
        operational decisions. Areas of high tidal currents and high wave
        energy tend to have sediments with larger grain sizes than do
        quiescent areas.  Many contaminants have a greater affinity for
        clay and silt than for sand.  Horizontal and vertical gradients may
        exist within the sediment.  If the sediments are  subject to frequent
        mixing by wave action, currents or prop wash, the sediments are
        likely to be relatively homogenous. Local groundwater quality and
        movement should be determined if groundwater is a potential
        source of contamination.

    •   Quality and age of available data—Reviewing the results of
        chemical analyses performed in past studies can help in selecting
        the appropriate contaminants of concern and in focusing plans for
        additional analyses.  In particular, analytical costs can be reduced
        if historical results can substitute for new analyses. Collecting
        these data is only the initial step, however.  Assessing their
        usefulness to the  current project should always be performed
        before substantial effort is spent on incorporating historical results
        into a project database.  If it is determined that the historical data
        are of questionable use for a specific project, then the
        determination of the most appropriate chemical analyses that will
        meet the project needs,  including the level of effort necessary, will
        need to be assessed.

    •   Spill data—Evidence of a contaminant spill within or near the
        dredging  area may be an important consideration in  identifying
        locations for sampling.

    •   Dredging history—Knowledge of prior dredging may dramatically
        affect sampling plans. If the area is frequently dredged (every 1-2
        years) or if the area is subject to frequent ship traffic, the
        sediments are likely to be relatively homogenous.  Assuming that
        there is no major contaminant input, the sampling effort may be
        minimal.  However, if there is information regarding possible
        contamination, a more extensive sampling effort may be indicated.
        New excavations  of material unaffected by anthropogenic  input
        may require less intensive sampling than  maintenance dredging.

                                    37

-------
An acceptable sampling plan, including QA/QC requirements, should be in
place before sampling begins.  Regional guidance from governmental agencies
(i.e., EPA and USACE) is required for developing these project-specific
sampling plans.  The sampling plan should be written so that a field sampling
team unfamiliar with the site would be able to gather the necessary samples
and field information.

Addressing quality assurance in the sampling plan includes designating field
samples to be collected and used for assessing the quality of sampling and
analysis, and ensuring that quality assurance is included in standard operating
procedures for field measurements.  The quality of the information obtained
through the testing process  is impacted by the following four factors:

    •   Collecting representative samples

    •   Collecting an appropriate number of samples

    •   Using appropriate sampling techniques

    •   Protecting or preserving the samples until they are tested.

Ideally, the importance of each of these four factors will be fully understood and
appropriately implemented; in practice, however, this is not always the case.
There may be occasions when time or other resource constraints will limit the
amount of information that should or can be gathered. When this is the case,
the relative importance of each of these factors has  to be carefully considered
in light of the specific study  purposes.

An important component of any field  sampling program is a preproject meeting
with all concerned personnel.  As with the drafting of the QA project plan,
participation by several  individuals may be necessary when developing the
sampling plan.  Personnel involved may include management, field personnel,
laboratory personnel, data management/analysis personnel, and representatives
of regulatory agencies, the permit applicant, and the dredging company. To
assure sampling quality, at least one individual familiar with the study area
should be included in the preproject meeting.  The purposes  of the meeting
include:

    •   Defining the  objectives of the sampling program

    •   Ensuring communication among participating groups

    •   Ensuring agreement on methods and contingency plans.

The more  explicitly the objectives of a testing program can be stated (including
QA objectives), the easier it will be to design an appropriate sampling plan.  A
complete sampling plan will  result in a level of detail such that all sampling
procedures and locations are clearly defined, sample volumes are clearly
established, and all logistical concerns are fully addressed.
                                   38

-------
      To ensure an adequate level of confidence in the data produced and in the
      comparability of the data to information collected by other sampling teams, the
      sampling plan should adhere to published sampling protocols and guidance.
      Descriptions of widely used sampling methods can be found in several EPA
      publications, many of which are cited in this section.

      The sampling plan should include the following specific sections:

          •   Summary of dredging plan, including the dimensions of the
              dredging area, the dredging depths, side-slopes, and the volume of
              sediment for disposal (including overdredged material)

          •   Site background and existing database for the  area, including
              identification of 1) relevant data, 2) need for additional data, and 3)
              areas of potential  environmental concern within the confines of the
              dredging project

          •   Subdivision of dredging area into project segments, if appropriate,
              based on an assessment (review of historical data and past
              assessment work) of the level of environmental concern  within the
              dredging area

          •   Sample location and sample collection frequency, including
              selection of methods and equipment for positioning vessels at
              established stations

          •   Sample designation system (i.e., a description  of how each
              independently collected sample will be identified)

          •   Sample collection methods for all test media (e.g., sediment,
              water, or tissue)

          •   Procedures for sample handling (including container types  and
              cleaning procedures), preservation, and storage, and (if applicable)
              field or shipboard  analysis

          •   Logistical considerations and safety precautions.

      The subsections that follow discuss each of these steps and provide general
      guidance for their conduct.
2.5.1 Review of Dredging Plan

      A review of the plan for the dredging project provides a basis for determining
      the sampling  strategy (see summary discussion in Section 2.3).  The volume of
      material to be dredged  and the method of dredging are two important factors
      that will help to determine the number of samples required. The number of
      samples required is generally a judgment that considers the cost, resolution,
      and the risk of an incorrect decision regarding the volume of material to be
      dredged.  Knowledge of the depth, volume, and physical characteristics of the
      material to be dredged will help to determine the kind of sampling equipment
      that is required.  The boundaries of the dredging area have to be known
      (including the toe and the top of all side-slopes) to ensure that the number and
      location of samples are appropriate.  Sampling should generally extend to below
      the project depth plus any overdredging.

2.5.2 Site Background and Existing Database

      As previously stated, reviewing the results of chemical analyses performed in
      past studies can help in  selecting the appropriate contaminants of concern and
      in focusing plans for additional analyses.  The level of data quality for historical
      data will affect the  selection of station locations. Examples of four levels of
      data quality that can be  assigned to historical results are summarized in Table
      4.  Labeling each set of  results with a data quality level is also a simple way to
      summarize the relative quality of the data set for future use. This classification
      provides a useful summary of data quality when making conclusions and writing
      up the results of the project.  The example classification in Table 4 considers
      the following factors when  determining the suitability of historical  results for a
      particular project:

          •   The analytical methods used and their associated detection limits—
              Analytical  methods often improve over time.  For example, as late
              as the 1970s, concentrations of many organic compounds in
              sediment samples were difficult to measure routinely, accurately,
              or sensitively. However, as better preparation methods and  more
              sensitive analytical techniques have  been developed, the ability to
              distinguish these compounds from other substances and the
              overall sensitivity of analyses have substantially improved.
              Methods are now available that afford detection limits that are well
              within the  range of documented adverse biological effects.

          •   QA/QC  procedures and documentation—The usefulness of data will
              depend  on whether appropriate QA procedures have been used
              during analysis  and if the data have  been properly validated  (see
              Section  2.9.1) and documented.  Because more  rigorous methods
              to analyze samples and document data quality have  been required
              by environmental scientists over the  past decade, only well-
              documented data that have been produced by laboratories using
              acceptable data quality controls should be considered to have no
              limitations.  Historical data produced by even the best laboratories
              often may lack complete documentation, or the documentation
              may be  difficult  to  obtain.   However,  historical data with incomplete
              documentation could still be used for projects with certain
              objectives  (e.g.,  screening-level studies).
                                         40

-------
 TABLE 4.  LEVELS OF DATA QUALITY FOR HISTORICAL DATA
Level 1   Data are acceptable for all project uses.

         The data are supported by appropriate documentation that
         confirms their comparability to data that will be generated in
         the current project.
Level 2  Data are acceptable for most project uses.

         Appropriate documentation may not be available to confirm
         conclusions on data quality or to support legal defensibility.
         These data are supported by a summary of quality control
         information, and the environmental distribution of
         contamination suggested by  these data is comparable to the
         distribution suggested by an  independent analytical tech-
         nique. The data are thus considered reliable and potentially
         comparable to data that will be produced in the  project.
Level 3  Data are acceptable for reconnaissance-level analyses.

         The data can be used to estimate the nature and extent of
         contamination.  No supporting quality control information is
         available, but standard methods were used,  and there is no
         reason to suspect a problem with the data based on 1) an
         inspection of the data, 2) their environmental distribution
         relative to data produced by an independent analytical
         technique, or 3) supporting technical reports. These data
         should be considered estimates and used only to provide an
         indication of the nature and possible extent of contamination.
Level 4  Data are not acceptable for use in the current project.

         The data may have been acceptable for their original use.
         However, little or no supporting information is available to
         confirm the methods used, no quality control information is
         available, or there are documented reasons in technical
         reports that suggest the data may not be comparable to
         corresponding data to be collected in the current project.
                                  41

-------
2.5.3 Subdivision of Dredging Area

      Sediment characteristics may vary substantially within the limits of the area to
      be dredged as a result of geographical and hydrological features. Many
      dredging projects can be subdivided into project segments (horizontal and/or
      vertical) that can be treated as separate management units. A project segment
      is an area expected to have relatively consistent characteristics that differ
      substantially from the characteristics of adjacent segments.  Project segments
      may be sampled with various intensities as warranted by the study objectives
      and testing results.

      Any  established sampling program should  be sufficiently flexible to allow
      changes based on field observations.  However, any deviations from
      established procedures should be documented, along with the rationale for such
      deviations. An alteration checklist form is generally appropriate to implement
      required changes. An example of such a checklist is provided in Appendix A.
2.5.4 Sample Location and Collection Frequency

      The method of dredging, the volume of sediment to be removed, the areal
      extent of the dredging project, and the horizontal and vertical heterogeneity of
      the sediment are key to determining station locations and the number of
      samples to be collected for the total dredging project. When appropriate to
      testing objectives, samples may be composited prior to analysis (with attention
      to the discussion in Section 2.5.4.8). The appropriate number of samples and
      the proper use of compositing should be determined for each operation on a
      case-by-case basis.

      Using pertinent available information to determine station locations within the
      dredging area is both cost effective and technically efficient. If a review of
      historical data (see Section 2.5.2) identifies possible sources of contamination,
      skewing the sampling effort toward those areas may be justified to thoroughly
      characterize those areas, but can lead to an incomplete assessment of
      contamination in the whole study area.  In areas of unequally distributed
      contamination, the total sampling effort  should be increased to ensure
      representative, but not necessarily equal, sampling of the entire site. The
      following factors should  be among those considered when selecting  sampling
      stations and patterns: objectives of the testing program, bathymetry, area of
      the dredging project, accessibility, flows (currents and tides), mixing  (hydrology),
      sediment heterogeneity, contaminant source locations, land use activities,
      available personnel and facilities, and other physical characteristics of the
      sampling site.  A discussion of locating  appropriate stations, sample collection,
      and sample  handling procedures is provided in the following sections.
                                          42

-------
2.5.4.1    Station Locations

Station locations within the dredging area should include locations downstream
from major point sources and in quiescent areas, such as turning basins, side
channels, and inside channel bends, where fine-grained sediments and
associated contaminants are most likely to settle.  Information that should help
to define the representativeness of stations within a dredging area includes:

    •   Clearly defined distribution of sediments to be dredged (i.e., project
        depth, overdredged depth, and side slopes)

    •   Clearly defined area to be sampled

    •   Correctly distributed sampling locations within each  dredging area.

If sample variability is  suspected within the dredging area, then multiple
samples should be collected. When sediment variability is unknown, it may be
necessary to conduct  a preliminary survey of the dredging area to better define
the final sampling program.
2.5.4.2     Sample Replication

Within a station, samples may be collected for replicate testing. Sediment
testing is conducted on replicate samples, for which laboratory replicates
(subsamples of a composite sample of the replicates) are generally
recommended as opposed to  field replicates (separate samples for each
replicate).  The former involves pseudo-replication but is more appropriate for
dredged material evaluations where sediments will be homogenized by the
dredging and discharge process. The latter involves true replication but is more
appropriate for field investigations of the extent and degree of variability of
sediment toxicity.
2.5.4.3     Depth Considerations

Sediment composition can vary vertically as well as horizontally.  Samples
should be collected over the entire dredging depth (including over-dredging),
unless the sediments are known to be vertically homogenous or there are
adequate data to demonstrate that contamination does not extend throughout
the depth to be excavated.  Separate analyses of defined sediment horizons or
layers may be useful to determine the vertical distribution of contamination.
                                    43

-------
2.5.4.4    Sampling Bias

Ideally, the composition of an area and the composition of the samples
obtained from that area will be the same.  However, in practice, there often are
differences due to bias in the sampling program, including disproportionate
intensity of sampling in different parts of the dredging area and equipment
limitations.

In some cases, to minimize bias, it may be useful to develop a sampling grid.
The horizontal dimensions may be subdivided into grid cells of equal size,
which are numbered sequentially.  Cells are then selected for sampling either
randomly or in a stratified random manner. It can be important to collect more
than the minimum number of samples required, especially in areas suspected
of having high or highly variable contamination. In some cases extra samples
may be archived (for long time periods in the case of physical characterization
or chemical analyses and for short time periods in the case of biological tests)
should reexamination of particular stations be warranted.
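
The grid-cell selection described above can be sketched in a few lines; the cell
counts and numbers of samples below are hypothetical and are shown only to
illustrate the difference between simple random and stratified random selection:

    import random

    rows, cols = 4, 10                        # 40 equal-sized grid cells
    cells = list(range(1, rows * cols + 1))   # cells numbered sequentially

    # Simple random selection of 8 cells from the whole dredging area
    simple_random = sorted(random.sample(cells, 8))

    # Stratified random selection: two cells drawn from each row (stratum)
    stratified = []
    for r in range(rows):
        stratum = cells[r * cols:(r + 1) * cols]
        stratified.extend(sorted(random.sample(stratum, 2)))

    print("Simple random cells:    ", simple_random)
    print("Stratified random cells:", stratified)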

In other cases, a sampling grid may not be desirable. This is particularly the
case where dredging sites are not continuous open areas, but are rather a
series of separate humps, bumps, reaches, and pockets with varying depths
and surface areas.  In these latter cases, sample distribution is commonly
biased with intent.
2.5.4.5    Level of Effort

In some cases, it may be advisable to consider varying the level of sampling
effort. Dredging areas suspected or known to be contaminated may be
targeted for an increased level of effort so that the boundaries and
characteristics of the contamination can be identified. A weighting approach
can be applied whereby specific areas are ranked in increasing order of
concern, and level of concern can then be used as a factor when  determining
the number of samples within each area.
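
A minimal sketch of such a weighting approach follows; the segment names, ranks,
and sample budget are hypothetical and serve only to show how a fixed number of
samples can be allocated in proportion to each area's level of concern:

    concern_rank = {"Segment A": 1, "Segment B": 2, "Segment C": 3}   # 3 = highest concern
    total_samples = 18

    total_weight = sum(concern_rank.values())
    allocation = {segment: round(total_samples * rank / total_weight)
                  for segment, rank in concern_rank.items()}
    print(allocation)   # {'Segment A': 3, 'Segment B': 6, 'Segment C': 9}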
2.5.4.6    Number of Samples

In general, the number of samples that should be collected within each
dredging area is inversely proportional to the amount of known information, and
is proportional to the level of confidence that is desired in the results and the
suspected level of contamination.  No specific guidance can be provided, but
the following factors should be considered:

    •   The greater the number of samples collected, the better the areal
        and vertical definition
                                   44

-------
    •   Single measurements are inadequate to describe variability

    •   The means of several measurements at each station within a
        dredging area are generally less variable than individual
        measurements at each station (illustrated in the sketch below).
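
The effect of replication on variability can be illustrated with the standard
error of the mean, s divided by the square root of n; the replicate
concentrations below are made up solely for illustration:

    import statistics

    replicates_ppb = [310, 285, 342, 298, 327]   # hypothetical results for one station

    s = statistics.stdev(replicates_ppb)
    n = len(replicates_ppb)
    standard_error = s / n ** 0.5                # variability of the station mean

    print(f"single-measurement variability (s): {s:.1f} ppb")
    print(f"variability of the mean of {n} measurements: {standard_error:.1f} ppb")
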
2.5.4.7     Time and Funding Constraints

In all cases, the ultimate objective is to obtain sufficient information to evaluate
the environmental impact of a dredged material disposal operation.  The
realities of time and funding constraints have to be recognized, although such
do not justify inadequate environmental evaluation. Possible responses to cost
constraints have  been discussed by Higgins (1988).  If the original sampling
design does not seem to fit time or funding constraints,  several options are
available, all of which increase the risk of an incorrect decision.  For example,
the number of  segments into which the project is divided can be reduced, but
the total number  of samples remains the same. This option results in fewer
segments and  maintains the power of station-to-station comparisons. This may,
however, provide a poor assessment of spatial variability because of reduced
stratification. Another example would  be to maintain (or even increase) the
number of stations sampled, and composite multiple samples from within a
segment.  This option results in a lower number of analyses being performed
per segment, but may provide a poor assessment of spatial variability within
each segment.
2.5.4.8    Sample Compositing

The objective of obtaining an accurate representation and definition of the
dredging area has to be satisfied when compositing samples.  Compositing
provides a way to control cost while analyzing sediments from a large number
of stations.  Compositing results in a less detailed description of the variability
within the area sampled than would individual analysis at each station.  However,
if, for example, five analyses can be performed to characterize a project
segment, the increased coverage afforded by collecting 15 individual samples
and combining sets of three into five composite samples for analysis may justify
the increased time and cost of collecting the extra 10 samples. Compositing
can also provide the large sample volumes required for some biological tests.
Composite samples represent the "average" of the characteristics of the
individual samples making up the composite and are generally appropriate for
logistical and other reasons; however, they are not recommended where they
could serve to "dilute" a highly toxic but localized sediment "hot spot." Further,
composite samples are not recommended for stations with very different grain
size characteristics.
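
For example, the 15-sample, 5-composite design mentioned above can be laid out as
follows (the station identifiers are hypothetical and follow the example
designation format described in Section 2.5.5):

    station_samples = [f"BHS{i:02d}C" for i in range(1, 16)]   # 15 individually collected samples

    # Combine sets of three stations into five composites for analysis
    composites = {f"COMP-{k + 1}": station_samples[k * 3:(k + 1) * 3] for k in range(5)}

    for composite_id, members in composites.items():
        print(composite_id, "<-", ", ".join(members))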
                                   45

-------
      2.5.4.9    Sample Definition

      When a sediment sample is collected, a decision has to be made as to whether
      the entire sediment volume is to be considered as the sample or whether the
      sediment volume represents separate samples.  For instance, based on
      observed stratification, the top 1 m of a core might be considered to be a
      separate sample from the remainder of the core.  After the sediment to be
      considered as a sample is identified, it should be thoroughly homogenized.
      Samples may be split before compositing, with a  portion of the original
      sediment archived for possible later analysis, and the remainder combined with
      parts of other samples. These are then thoroughly homogenized (using clean
      instruments until color and textural homogeneity are  achieved),  producing the
      composite sample.
2.5.5 Sample Designation System

      Information on the procedures used to designate the sampling location and type
      of sample collected should be clearly stated in the field sampling plan. The
      sampling stations should be named according to the site and the type of
      station.  Each sample should be assigned an identifier that describes the
      station, type of sample, and field replicate. An example sample designation
      format is as follows:

          •   The first two characters of the station name could identify the site
              (e.g., BH = Boston Harbor).

          •   The third character of the station name  could identify the type of
              station (e.g., S = site station, P = perimeter station,  or R =
              reference station).

          •   The fourth and fifth characters of the station name could consist of
              a sequential number (e.g., 01, 02, or 03) that would be assigned to
              distinguish between different stations of the same type.

          •   The sixth character of the station name could describe the type of
              sample (e.g., C = sediment for chemistry and bioassay analyses, B
              = bioaccumulation, or I = benthic infauna).

          •   The resulting sample identifier would be:  BHS01C.

      When field replicates are collected  (i.e., for benthic samples), the replicate
      number should be appended to the sample identifier.  All field replicates from
      the same station should have the same sample identifier.  The sample identifier
      and replicate number should be linked by a dash to form a single identifier for
      use on sample labels.  The sample date should  also be recorded on the sample
      label.
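
      A short sketch of this example designation format follows; the site codes,
      station types, and replicate handling are the hypothetical choices described
      above, not a required convention:

          STATION_TYPES = {"S": "site", "P": "perimeter", "R": "reference"}
          SAMPLE_TYPES = {"C": "sediment chemistry/bioassay", "B": "bioaccumulation", "I": "benthic infauna"}

          def sample_identifier(site, station_type, station_number, sample_type, replicate=None):
              """Build an identifier such as BHS01C or, for a field replicate, BHS01I-2."""
              identifier = f"{site}{station_type}{station_number:02d}{sample_type}"
              return f"{identifier}-{replicate}" if replicate is not None else identifier

          print(sample_identifier("BH", "S", 1, "C"))      # BHS01C
          print(sample_identifier("BH", "R", 3, "I", 2))   # BHR03I-2, second benthic replicate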
                                         46

-------
2.5.6 Station Positioning
      The type of positioning system used during sample collection and detailed
      procedures for station positioning should be clearly stated in the sampling plan.
      No single positioning method will be appropriate for all sampling scenarios.
      U.S. EPA (1987b), PSEP (1990b), and USACE (1990) provide useful information
      on positioning systems and procedures.  Guidance in these publications may be
      followed on all points that do not conflict with this document.
      2.5.6.1     Selection of Station Positioning System

      Available systems should be evaluated based on positioning requirements and
      project-specific constraints to select the most appropriate station positioning
      method for the project.  Specific design and location factors that may affect
      station positioning include physical conditions (e.g., weather and currents) and
      topography of the study site, proposed equipment and analyses, minimum
      station separation, station reoccupation, and program-imposed constraints.
      U.S. EPA's (1993b) locational data policy implementation guidance calls for
      positioning accuracy within 25 m.

      There are many methods available for navigating and positioning sampling
      vessels.  These methods range from simple extensions of well-established
      onshore survey techniques using theodolites to highly sophisticated electronic
      positioning systems. A general discussion of a few of the station positioning
      methods available for dredged material evaluations is provided in the following
      sections.  U.S. EPA (1987b), PSEP (1990b), USAGE (1990), and current
      literature from the manufacturers of station positioning systems should be
      thoroughly reviewed during the selection process to choose the most
      appropriate project-specific positioning system.

          Optical Positioning Techniques

      Optical positioning requires visual sighting to determine alignment on two or
      more ranges, or the distances and angles between the vessel and shore
      targets.

      Intersecting ranges can be used when a number of established landmarks
      permit easy selection of multiple ranges that intersect at the desired sampling
      point, and accuracy is not critical.  One of the more traditional optical
      positioning systems is the theodolite system. Position of the sampling vessel
      can be established using theodolites by two onshore  observers who
      simultaneously measure the  angle between a reference object or shore traverse
      and the vessel. Using a theodolite with an accuracy of ±15 seconds for a
      single angle measurement at an intercept angle of approximately 45° and a
range of 5 km could potentially yield a positioning error of < ±1 m (Ingham
1975). Although the accuracy of this method is good under optimal conditions,
its use in open waters has several disadvantages such as limited line-of-sight,
limits on intersection of angles, requirement of two manned shore stations,
simultaneous measurements, and target movement and path interferences
(e.g., fog, heavy rain, or heat waves).
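
The two-theodolite fix reduces to intersecting two lines of position.  The sketch
below is illustrative only; the shore-station coordinates (meters on a local grid)
and the simultaneously observed azimuths are hypothetical:

    import math

    def intersect(station_a, azimuth_a_deg, station_b, azimuth_b_deg):
        """Intersect two lines of position defined by shore-station coordinates
        and azimuths measured clockwise from grid north."""
        ax, ay = station_a
        bx, by = station_b
        dax, day = math.sin(math.radians(azimuth_a_deg)), math.cos(math.radians(azimuth_a_deg))
        dbx, dby = math.sin(math.radians(azimuth_b_deg)), math.cos(math.radians(azimuth_b_deg))
        denom = dax * dby - day * dbx                      # zero only if the lines are parallel
        t = ((bx - ax) * dby - (by - ay) * dbx) / denom    # distance along line A to the fix
        return ax + t * dax, ay + t * day

    vessel = intersect((0.0, 0.0), 63.4, (1000.0, 0.0), 296.6)
    print(f"vessel position: {vessel[0]:.1f} m east, {vessel[1]:.1f} m north")
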
    Electronic Positioning Techniques

Electronic positioning systems use the transmission of electromagnetic waves
from two or more stations and a vessel transmitter to define a vessel's location.
Under routine sampling conditions, which may disfavor optical positioning, and
at their respective maximum ranges, electronic positioning methods have
greater accuracy than optical positioning methods (U.S. EPA 1987b).

LORAN-C is one type of electronic positioning system.  Based on the signal
properties of received transmissions from land-based transmitters, the LORAN-
C receiver can be used to locate an approximate position, with a repeatable
accuracy that varies from 15 to 90 m (U.S. EPA 1987b), depending on the
weather and the geometry of the receiver within the LORAN-C station network.
Although the LORAN-C system is not limited by visibility or range restrictions
and does not require additional personnel to monitor onshore stations (as the
theodolite system does), the LORAN-C system does experience interferences in
some geographic areas and is more appropriately used to reposition on a
previously sampled station.

Microwave  positioning systems are typically effective between 25 and  100 km
offshore, depending on antenna heights and power outputs, and have
accuracies  of 1-3 m.  Microwave systems consist of two or more slave shore
stations positioned over known locations and a master receiver on the vessel.
By accurately measuring the travel time of the microwaves between the two
known shore points and the vessel receiver, the position of the vessel can be
accurately determined.  The shore stations, typically tripod-mounted antennas
powered by 12-volt batteries, are very susceptible to vandalism.

The global positioning system (GPS) is another electronic system that can
determine station positions by receiving digital codes from three or more
satellite systems, computing time and distance, and then calculating an earth-
based position.  Two levels of positioning accuracy are achievable with the GPS
system. The positional accuracy of standard GPS is approximately 50-100 m
(U.S.  EPA 1987b).  The accuracy can be improved to between 0.5-5 m by
differential GPS (U.S. EPA 1987b).  In differential GPS, two receivers are used.
The master receiver is placed at a known location. Its position is computed
based on satellite data, and a correction is computed to account for the errors
in the satellite-derived position.  This correction is then sent via radio link or
satellite to vessel-mounted receivers.
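
The differential correction described above can be summarized in a short sketch.
The coordinate handling below is simplified to planar (x, y) offsets in meters and is
illustrative only; operational systems may apply corrections to the satellite range
measurements themselves or to geodetic coordinates, and all values shown are hypothetical:

    def dgps_correction(base_known, base_computed):
        # Correction = surveyed base-station position minus its GPS-computed position
        return (base_known[0] - base_computed[0], base_known[1] - base_computed[1])

    def apply_correction(rover_fix, correction):
        # Apply the base-station correction to the vessel (rover) GPS fix
        return (rover_fix[0] + correction[0], rover_fix[1] + correction[1])

    corr = dgps_correction((1000.0, 2000.0), (1012.0, 1991.0))
    print(apply_correction((1500.0, 2500.0), corr))   # (1488.0, 2509.0)
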
    Hybrid Positioning Techniques

A number of hybrid positioning systems combine positional data from various
sources to obtain fixes.  Such systems usually involve the intersection of a
visual line-of-position with an electronic line-of-position.  Of particular interest to
coastal monitoring programs are dynamic positioning systems that require only
a single shore station and that use the simultaneous measurement of angle
from a known direction and range to the survey vessel.  These range-azimuth
systems are characterized by their operating medium (optical, microwave, or
laser) and/or procedure (i.e., manual or automatic tracking).
2.5.6.2     Physical Conditions at the Study Site

The ability of a positioning method to achieve its highest projected accuracy
depends, in part, on site-specific conditions. Weather, currents and other
physical factors may reduce the achievable accuracy of a positioning method.
For example, the relative drift of the sampling equipment away from the boat
under strong currents or winds can increase with depth. Resulting positioning
errors in sample location (as opposed to boat location) may exceed acceptable
limits for the study if effects of site location on positioning accuracy are not
considered during design of the sampling program.
2.5.6.3     Quality Assurance Considerations

Once the positioning method has been selected for the specific dredged
material evaluation, the proper setup, calibration, and operational procedures
must be followed to achieve the intended accuracy.  At least one member of
the field crew should be familiar with the selected positioning method.

Recordkeeping requirements should be established to ensure that station
locations are accurately occupied and that adequate  documentation is available.
Adequate information to ensure consistent positioning and to allow reoccupation
of stations for replicate sample collection or time-series monitoring should be
kept in a field logbook.  Entries should be initialed  by the person entering the
data. Required entries into the field logbook include  the following:

    •   Initial Survey Description—The positioning method and
        equipment used, all changes or modifications to standard methods,
        names of persons who set up and operate the station positioning
        equipment, location of on-board equipment and the reference point
        (e.g., antennae, sighting position),  the type of map used for
               positioning and its identification number or scale should be
               recorded in the field logbook.  In addition, a complete copy of the
               survey notes (if appropriate) should be included in the field
               logbook.

           •   Day Log Entries—The same information that is included in the
               initial survey description is also recorded  on a daily basis in the
               day log.  In addition, all problems or irregularities, any weather or
               physical conditions that may affect achievable accuracy, and all
               calibration  data should be recorded in the day log.

           •   Station Log Entries—Each station location should be recorded in
               the coordinates or readings of the method used for positioning in
               sufficient detail to allow reoccupation of the station in the future.
               The positioning information should be recorded at the time of
               sample collection (versus time of equipment deployment) and for
               every reoccupation of that station, even during consecutive
               replicate sampling.  In  addition, supplemental positioning
               information that would  define the station location or help
                subsequent relocation (e.g., anchored, tied to northwest corner of
                pier, buoy) should be recorded. If photographs are to be used for
                a posteriori plotting of stations, the roll and frame numbers should
                be recorded. Depth, time (tidal height), ship heading, and wire
                angle estimation should also be recorded for each occupation of a
               station.

      Sampling reports should include the type of positioning method used during
      data collection. Any specific problems (e.g., wind, currents, waves, visibility,
      electronic interferences) that resulted in positioning problems and those stations
      affected should be identified in the sampling report.  Estimates of the accuracy
      achieved for station positioning  should be included.  Station locations should be
      reported in appropriate units (e.g., latitude  and longitude  to the nearest second).
      Coordinates do not  need to be  reported for each replicate collected; a single set
      of coordinates for the station is  sufficient.  Depth corrected to  mean lower low
      water should also be supplied for each station.
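
A station log entry of the kind described above can also be captured electronically.
The following Python sketch is a hypothetical record structure; the field names and
example values are illustrative and are not prescribed by this guidance:

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class StationLogEntry:
        station_id: str
        position: str            # coordinates/readings of the positioning method used
        recorded_at: datetime    # at the time of sample collection, not equipment deployment
        depth_m: float
        tidal_height_m: float
        ship_heading_deg: float
        wire_angle_deg: float
        supplemental_notes: str  # e.g., "anchored; tied to northwest corner of pier"
        recorded_by: str         # initials of the person entering the data

    entry = StationLogEntry("ST-01", "47 36.20 N, 122 20.75 W", datetime(1995, 4, 28, 9, 30),
                            12.4, 1.8, 215.0, 5.0, "buoy on station", "ABC")
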
2.5.7 Sample Collection Methods

      Detailed procedures for performing all sampling and equipment decontamination
      should be clearly stated in the sampling plan and can be included as standard
      operating  procedures (see Appendix D). Sample collection requires an
      experienced crew, an adequate vessel equipped with navigational and
      supporting equipment appropriate to the site and the study, and
      noncontaminating sampling apparatus capable of obtaining relatively
      undisturbed and representative samples. To assure sampling quality, at least
      one individual familiar with the study area should be present during the
sampling activities. Sampling effort for a proposed dredging project is primarily
oriented toward collection of sediment samples for physical and chemical
characterization and for biological tests. Collection of water samples is also
required to evaluate potential water column impact.  Collection of organisms
near the disposal site might be necessary if there is a need to characterize
indigenous populations or to assess concentrations of contaminants in tissues.
Organisms for use in toxicity and bioaccumulation tests may also be field-
collected.

In general, a hierarchy for sample collection should be established to prevent
contamination from the previous sample, especially when using the same
sampling apparatus to collect samples for different analyses. Where possible,
the known or expected least contaminated stations should be sampled first.  At
a station where water and sediment are to be collected, water samples should
be collected prior to sediment samples.  The vessel should ideally be positioned
downwind or downcurrent of the sampling device. When lowering and retrieving
sampling devices, care should be taken to avoid visible surface slicks and the
vessel's exhaust. The deck and sample handling area should be kept clean to
help reduce the possibility of contamination.
2.5.7.1    Sediment Sample Collection

Mudroch and MacKnight (1991) provide useful reference information for
sediment sampling techniques. Higgins and Lee (1987) provide a perspective
on sediment collection as commonly practiced by USACE. ASTM (1991b) and
Burton (1991) provide guidelines for collecting sediments for toxicological
testing.  Guidance provided in these publications may be followed on all points
that do not conflict with this document.

Care should be taken to avoid contamination of sediment samples during
collection and handling.  A detailed procedure for handling sampling equipment
and sample containers should be clearly stated in the sampling plan associated
with a specific project; this may be accomplished by using standard operating
procedures.  For example, samples designated for trace metal analysis should
not come into contact with metal surfaces (except stainless steel, unless
specifically prohibited for a project), and samples designated for organic
analysis should not come into contact with plastic surfaces.

A coring device is recommended whenever sampling to depth is required.  The
choice of corer design depends on factors such as the objectives of the
sampling program, sediment volumes required for testing, sediment
characteristics, water depth, sediment depth, and currents or tides.  A gravity
corer may be limited to cores of 1-2 m in depth, depending on sediment grain
size, degree of sediment compaction, and velocity of the drop.  For  penetration
greater than 2 m, a vibratory corer or a piston corer  is generally preferable.
These types of coring devices are generally limited to soft, unconsolidated
sediments. A split-spoon corer may be used for more compacted sediment.
The length of core that can be collected is usually limited  to 10 core diameters
in sand substrate and 20 core diameters in clay substrate. Longer cores can
be obtained, but substantial sample disturbance results from internal friction
between the sample and the core liner.

Gravity corers can cause compaction of the vertical structure of sediment
samples if they freefall into the sediment.  Therefore, if the vertical stratification
in a core sample is of interest, a piston corer or vibra corer should be used.
The piston corer uses both gravity and hydrostatic pressure. As the cutting
edge penetrates the sediments, an internal piston remains at the level of the
sediment/water interface, preventing sediment compression and overcoming
internal friction. The vibra corer is a more complex piece of equipment but is
capable of obtaining 3- to 7-m cores in a wide range of sediment types by
vibrating a large diameter core barrel through the sediment column with little
compaction.  If the samples will not be sectioned prior to analysis, compaction
is not a problem, and noncontaminating gravity (freefall) corers may be the
simplest alternative.

Corers are the samplers of preference in most cases because of the variation in
contamination with depth that can occur in sediment deposits.  Substantial
variation with depth is less likely in shallow channel areas that have frequent
ship traffic, are dredged at short intervals, and lack major direct contaminant
inputs.  Generally, in these situations, bottom sediments are frequently
resuspended and mixed by ship scour and turbulence, effectively preventing
stratification.  In such cases, surface grab samples can be representative of the
mixed sediment column, and corers are necessary only if excavation of
infrequently disturbed sediments below the mixed layer is
planned. Grab samplers are also appropriate for collecting surficial samples of
reference or control sediments.

Grab samplers and gravity corers can either be Teflon®-coated or be made of
stainless steel to prevent potential contamination of trace metal samples.  The
sampling device should at least be rinsed with clean water between samples.
More thorough cleaning will be required for certain analyses; for instance,
analyses performed for chlorinated dioxins require that all equipment and
sample containers be scrupulously cleaned with pesticide-grade solvents or
better because of the low detection limits required for these compounds. It is
recommended that a detailed  standard operating procedure specifying all
decontamination procedures be included in the  project sampling plan.
2.5.7.2     Water Sample Collection

If water samples are necessary, they should be collected with either a
noncontaminating pump or a discrete water sampler. When sampling with a
pump, the potential for contamination can be minimized by using a peristaltic or
a magnetically coupled  impeller-design pump.  These kinds of pumps provide
barriers between the sample and the surfaces of the pump (e.g., motor or fan)
      that would cause contamination.  The system should be flushed with the
      equivalent of 10 times the volume of the collection tubing.  Also, any
      components within several meters of the sample intake should be
      noncontaminating (i.e., sheathed  in  polypropylene or epoxy-coated or made of
       Teflon®). Potential sources of sample contamination, including vessel
       emissions and other sampling apparatus, must be avoided.

      A discrete water sampler should be of the close/open/close type so that only
      the target water sample comes into contact with internal sampler surfaces.
      Water samplers should be made  of stainless steel or acrylic plastic.  Seals
      should be Teflon®-coated whenever possible. Water sampling devices should
      be acid-rinsed (1:1 nitric acid) prior to use for collection of trace-metal samples,
      and solvent-rinsed (assuming the sampler material is compatible) prior to
      collection of samples for organic analyses.
      2.5.7.3     Organism Collection

      Collection methods for benthic organisms may be species-specific and can
      include, but are not limited to, bottom trawling, grabs, or cores. If organisms
      are to be maintained alive, they should be transferred immediately to containers
      with clean, well-oxygenated water, and sediment, as appropriate.  Care must be
      taken to prevent organisms from coming into contact with natural predators and
      potentially contaminated areas or fuels, oils, natural rubber, trace metals, or
      other contaminants (U.S. EPA 1990a, 1992a).
2.5.8 Sample Handling, Preservation, and Storage

       Detailed procedures for sample handling, preservation, and storage should be
      part of the project-specific protocols and standard operating procedures
      specified for each sampling operation and included in the sampling plan.
      Samples are subject to chemical, biological, and physical changes as soon as
      they are collected. Sample handling, preservation, and storage techniques
      have to be designed to minimize any changes in composition of the sample by
      retarding chemical and/or biological activity and by avoiding contamination.
      Collection methods, volume requirements, container specifications, preservation
      techniques, storage conditions, and holding times (from the time of sample
      collection) for sediment, water, and tissue samples are discussed  below and
      summarized in Table 5.  Exceedance of the holding times presented in Table 5
      would not necessarily result in qualification of the data during data validation.
      However, technical reasons justifying acceptance of data that exceed the
      holding time should be provided on a compound class basis.
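
A simple holding-time check against the recommendations in Table 5 can flag analyses
that may need such justification.  The sketch below is illustrative only; the analyte
groups and day counts shown are a hypothetical subset of Table 5, with "6 months"
approximated as 180 days:

    from datetime import date

    HOLDING_TIME_DAYS = {"mercury": 28, "other_metals": 180, "organic_compounds": 14}

    def holding_time_exceeded(analysis, collected, analyzed):
        # True when the elapsed time exceeds the recommended holding time
        return (analyzed - collected).days > HOLDING_TIME_DAYS[analysis]

    # Organics analyzed 20 days after collection would be flagged for case-by-case
    # technical justification, not automatic rejection.
    print(holding_time_exceeded("organic_compounds", date(1995, 5, 1), date(1995, 5, 21)))  # True
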
TABLE 5. SUMMARY OF RECOMMENDED PROCEDURES FOR SAMPLE COLLECTION,
         PRESERVATION, AND STORAGE

Notes a, b, c, and d (following the table) apply to the collection method, sample
volume, container, and holding time entries, respectively.

Sediment: Chemical/Physical Analyses

Metals
    Collection/volume/container:  Grab/corer; 100 g; precleaned polyethylene jar
    Preservation/storage:  Dry ice or freezer storage for extended storage, otherwise
        refrigerate; ≤4°C
    Holding times:  Hg, 28 days; others, 6 months

Organic compounds (e.g., PCBs, pesticides, polycyclic aromatic hydrocarbons)
    Collection/volume/container:  Grab/corer; 250 g; solvent-rinsed glass jar with
        Teflon® lid
    Preservation/storage:  Dry ice or freezer storage for extended storage, otherwise
        refrigerate; ≤4°C/dark
    Holding times:  14 days

Particle size
    Collection/volume/container:  Grab/corer; 100 g; Whirl-pac bag
    Preservation/storage:  Refrigerate; <4°C
    Holding times:  Undetermined

Total organic carbon
    Collection/volume/container:  Grab/corer; 50 g; heat-treated glass vial with
        Teflon®-lined lid
    Preservation/storage:  Dry ice or freezer storage for extended storage, otherwise
        refrigerate; ≤4°C
    Holding times:  14 days

Total solids/specific gravity
    Collection/volume/container:  Grab/corer; 50 g; Whirl-pac bag
    Preservation/storage:  Refrigerate; <4°C
    Holding times:  Undetermined

Miscellaneous
    Collection/volume/container:  Grab/corer; SSOg; Whirl-pac bag
    Preservation/storage:  Refrigerate; <4°C
    Holding times:  Undetermined

Sediment from which elutriate is prepared
    Collection/volume/container:  Grab/corer; depends on tests being performed; glass
        with Teflon®-lined lid
    Preservation/storage:  Completely fill and refrigerate; 4°C/dark/airtight
    Holding times:  14 days

Sediment: Biological Tests

Dredged material
    Collection/volume/container:  Grab/corer; 12-15 L per sample; plastic bag or container
    Preservation/storage:  Completely fill and refrigerate; sieve; 4°C/dark/airtight
    Holding times:  14 days

Reference sediment
    Collection/volume/container:  Grab/corer; 45-50 L per test; plastic bag or container
    Preservation/storage:  Completely fill and refrigerate; sieve; 4°C/dark/airtight
    Holding times:  14 days

Control sediment
    Collection/volume/container:  Grab/corer; 21-25 L per test; plastic bag or container
    Preservation/storage:  Completely fill and refrigerate; sieve; 4°C/dark/airtight
    Holding times:  14 days
TABLE 5. (cont.)

Water and Elutriate: Chemical/Physical Analyses

Particulate analysis
    Collection/volume/container:  Discrete sampler or pump; 500-2,000 mL; plastic or glass
    Preservation/storage:  Lugol's solution and refrigerate; 4°C
    Holding times:  Undetermined

Metals
    Collection/volume/container:  Discrete sampler or pump; 1 L; acid-rinsed polyethylene
        or glass jar
    Preservation/storage:  pH < 2 with HNO3; refrigerate; 4°C
    Holding times:  Hg, 14 days; others, 6 months

Total Kjeldahl nitrogen
    Collection/volume/container:  Discrete sampler or pump; 100-200 mL; plastic or glass
    Preservation/storage:  H2SO4 to pH < 2; refrigerate; 4°C
    Holding times:  24 hours

Chemical oxygen demand
    Collection/volume/container:  Discrete sampler or pump; 200 mL; plastic or glass
    Preservation/storage:  H2SO4 to pH < 2; refrigerate; 4°C
    Holding times:  7 days

Total organic carbon
    Collection/volume/container:  Discrete sampler or pump; 100 mL; plastic or glass
    Preservation/storage:  H2SO4 to pH < 2; refrigerate; 4°C
    Holding times:  <48 hours

Total inorganic carbon
    Collection/volume/container:  Discrete sampler or pump; 100 mL; plastic or glass
    Preservation/storage:  Airtight seal; refrigerate; 4°C
    Holding times:  6 months

Phenolic compounds
    Collection/volume/container:  Discrete sampler or pump; 1 L; glass
    Preservation/storage:  0.1-1.0 g CuSO4 and H2SO4 to pH < 2; refrigerate; 4°C
    Holding times:  24 hours

Soluble reactive phosphates
    Collection/volume/container:  Discrete sampler or pump; —; plastic or glass
    Preservation/storage:  Filter; refrigerate; 4°C
    Holding times:  24 hours

Extractable organic compounds (e.g., semivolatile compounds)
    Collection/volume/container:  Discrete sampler or pump; 4 L; amber glass bottle
    Preservation/storage:  pH < 2 with 6N HCl; airtight seal; refrigerate; 4°C
    Holding times:  7 days for extraction; 40 days for analysis of sample extract

Volatile organic compounds
    Collection/volume/container:  Discrete sampler or pump; 80 mL; glass vial
    Preservation/storage:  pH < 2 with 1:1 HCl; refrigerate in airtight, completely
        filled container; 4°C
    Holding times:  14 days for sample analysis, if preserved

Total phosphorus
    Collection/volume/container:  Discrete sampler or pump; —; plastic or glass
    Preservation/storage:  H2SO4 to pH < 2; refrigerate; 4°C
    Holding times:  7 days
TABLE 5. (cont.)

Water and Elutriate: Chemical/Physical Analyses (cont.)

Total solids
    Collection/volume/container:  Discrete sampler or pump; 200 mL; plastic or glass
    Preservation/storage:  Refrigerate; 4°C
    Holding times:  7 days

Volatile solids
    Collection/volume/container:  Discrete sampler or pump; 200 mL; plastic or glass
    Preservation/storage:  Refrigerate; 4°C
    Holding times:  7 days

Sulfides
    Collection/volume/container:  Discrete sampler or pump; —; plastic or glass
    Preservation/storage:  pH > 9 with NaOH (ZnAc); refrigerate; 4°C
    Holding times:  24 hours

Water and Elutriate: Biological Tests

Site water
    Collection/volume/container:  Grab; depends on tests being performed; plastic carboy
    Preservation/storage:  Refrigerate; <4°C
    Holding times:  14 days

Dilution water
    Collection/volume/container:  Grab or makeup; depends on tests being performed;
        plastic carboy
    Preservation/storage:  Refrigerate; <4°C
    Holding times:  14 days

Tissue

Metals
    Collection/volume/container:  Trawl/Teflon®-coated grab; 5-10 g; double Ziploc®
    Preservation/storage:  Handle with nonmetallic forceps and plastic gloves; dry ice;
        ≤ -20°C or freezer storage
    Holding times:  Hg, 28 days; others, 6 months

PCBs and chlorinated pesticides
    Collection/volume/container:  Trawl/Teflon®-coated grab; 10-25 g; hexane-rinsed
        double aluminum foil and double Ziploc®
    Preservation/storage:  Handle with hexane-rinsed stainless steel forceps; dry ice;
        ≤ -20°C or freezer storage
    Holding times:  14 days

Volatile organic compounds
    Collection/volume/container:  Trawl/Teflon®-coated grab; 10-25 g; heat-cleaned
        aluminum foil and watertight plastic bag
    Preservation/storage:  Covered ice chest; ≤ -20°C or freezer storage
    Holding times:  14 days

Semivolatile organic compounds
    Collection/volume/container:  Trawl/Teflon®-coated grab; 10-25 g; hexane-rinsed
        double aluminum foil and double Ziploc®
    Preservation/storage:  Handle with hexane-rinsed stainless steel forceps; dry ice;
        ≤ -20°C or freezer storage
    Holding times:  14 days

Lipids
    Collection/volume/container:  Trawl/Teflon®-coated grab; part of organic analyses;
        hexane-rinsed aluminum foil
    Preservation/storage:  Handle with hexane-rinsed stainless steel forceps; quick
        freeze; ≤ -20°C or freezer storage
    Holding times:  14 days

Note:  This table contains only a summary of collection, preservation, and storage
       procedures for samples.  The cited references should be consulted for a more
       detailed description of these procedures.
TABLE 5. (cent.)
PCS - polychlorinated biphenyl
8 Collection method should include appropriate liners.
" Amount of sample required by the laboratory to perform the analysis (wet weight or volume provided, as appropriate).  Miscellaneous sample size for sediment should be
increased if auxiliary analytes that cannot be included as part of the organic or metal analyses are added to the list.  The amounts shown are not intended as firm values;
more or less tissue may be required depending on the analytes, matrices, detection limits, and particular analytical laboratory.
c All containers should be certified as clean according to U.S. EPA (1990c).
* These holding times are for sediment, water, and tissue based on guidance that is sometimes administrative rather than technical in nature. There are no promulgated,
scientifically based holding time criteria for sediments, tissues, or elutriates.  References should be consulted if holding times for sample extracts are desired. Holding
times  are from the time of sample collection.
8 NOAA (1989).
1 Tetra Tech (1986a).
9 Sample may be held for up to 1  year if«»-20°C.
h Polypropylene should be used if phthalate bioaccumulation is of concern,
' Two weeks is recommended; sediments  must not be held for longer than 8 weeks prior to biological testing.
' U.S.  EPA (1987a); 40 CFR Part 136, Table  III.
" Plumb (1981).
' If samples are not preserved to pH < 2, then aromatic compounds must be analyzed within 7 days.
m Tetra Tech (1986b).

-------
2.5.8.1     Sample Handling

Sufficient sample volume should be collected to:

    •    Perform the necessary analyses

    •    Partition  the samples, either in the field or as soon as possible
         after sampling, for respective storage and analytical requirements
         (e.g., freezing for trace metal analysis or refrigeration for
         bioassays)

    •    Archive portions of the sample for possible later analysis

    •    Provide sample for replicate or QA analyses, if specified.

Sample handling is project- and analysis-specific, as well as being based on
what is practical and possible. Generally, samples to be analyzed for trace
metals should not come into contact with metals, and samples to be analyzed
for organic compounds should not come into contact with plastics. All sample
containers should be scrupulously cleaned (acid-rinsed for analysis of metals,
solvent-rinsed for analysis of organic compounds).

For analysis of volatile compounds, samples should completely fill the storage
container, leaving no airspace.  These samples should be refrigerated but never
frozen or the containers will crack.  Samples for other kinds of chemical
analysis  are sometimes frozen. Only wide-mouth ("squat") jars should be used
for frozen samples; narrow-mouth jars are  less resistant to cracking.  If the
sample is to be frozen, sufficient air space should be left to allow expansion to
take place (i.e., the wide-mouth sample container should be filled to  no more
than the  shoulder of the bottle [just below the neck of the bottle] and the
container should be frozen at an angle). Container labels have to withstand
soaking,  drying, and freezing without becoming detached or illegible. The
labeling system should be tested prior to use in the field.

Sediment samples for biological testing should have larger (possibly predatory)
animals removed from the sediment by screening or press sieving prior to
testing.  Other matter retained on the screen with the organisms, such as shell
fragments, gravel, and debris, should be recorded and discarded.  Prior to use
in bioassays, individual test sediments should be thoroughly homogenized with
clean instruments (until color and textural homogeneity is achieved).
2.5.8.2    Sample Preservation

Preservation steps should be taken immediately upon sediment collection.
There is no universal preservation or storage technique, although storage in the
dark at 4°C is generally used for all samples held for any length of time prior to
processing, and for some samples after processing.  A technique for one group


      of analyses may interfere with other analyses.  This problem can be overcome
      by collecting sufficient sample volume to use specific  preservation or storage
      techniques for specific analytes or tests.  Preservation, whether by refrigeration,
      freezing, or addition of chemicals, should be accomplished as soon as possible
      after collection, onboard the collecting vessel whenever possible. If final
      preservation techniques cannot be implemented in the field, the sample should
      be temporarily preserved in a manner that retains its integrity.

      Onboard refrigeration is easily accomplished with coolers and ice; however,
      samples should be segregated from melting ice and cooling water. Sediment
      samples that are to be frozen on board may be stored in an onboard freezer or
      may simply be placed in a  cooler with dry ice or blue  ice. Sample containers to
       be frozen (wide-mouth jars; see Section 2.5.8.1) should not be filled completely
      because expansion of the sample could cause the container to break.
      Sediment samples for biological analysis should be preserved at 4°C, never
      frozen or dried. Additional  guidance on sample preservation is given in Table 5.
      2.5.8.3    Sample Storage

      The elapsed time between sample collection and analysis should be as short as
      possible. Sample holding times for chemical evaluations are analysis-specific
      (Table 5). Sediments for bioassay (toxicity and/or bioaccumulation) testing
       should be tested as soon as possible, preferably within 2 weeks of collection.
       Studies to date suggest that sediment storage time should never exceed
       8 weeks (at 4°C, in the dark, excluding air) because sediment toxicity may
       change with storage time (Becker and Ginn 1990; Tatem et al. 1991).  Sample
       storage conditions (e.g., temperature, location of samples) should be
       documented.

2.5.9 Logistical Considerations and Safety Precautions

      A number of frustrations in sample collection and handling can  be minimized by
      carefully thinking through the process and requirements before  going to the
      field.  Contingency plans are essential. Well-trained, qualified, and experienced
      field crews should be used. Backup  equipment and sampling gear, and
      appropriate repair parts, are advisable. A surplus of sampling containers and
      field data sheets should be available. Sufficient ice and adequate ice chest
      capacity should be provided, and the necessity of replenishing ice before
      reaching the laboratory should be considered.  A vessel with adequate deck
      space is safer and allows for more efficient work than an overcrowded vessel.
      Unforeseeable circumstances  (e.g., weather delays) are to be expected during
      field sampling, and time to  adequately accommodate the unforeseen has to be
      included in sampling schedules.
      Appropriate safety and health precautions must be observed during field
      sampling and sample processing activities. The EPA Standard Operating
      Safety Guides (U.S. EPA 1984b) should be used as a guidance document to
      prepare a site-specific health and safety plan.  The health and safety plan
      should be prepared as a separate document from the QA project plan.
      Requirements implementing the Occupational Safety and Health Act at 29 CFR
      §1910.120 (Federal Register, Vol.  54, No. 43) should be met for medical
      surveillance, personal protection, respirator fit testing (if applicable), and
      hazardous waste operations training (if applicable) by all personnel working in
      contaminated areas or working with contaminated media.

      The procedures and practices established in the site-specific health and safety
      plan should be observed by all individuals participating in the field activities.
      Safety requirements should also be met by all observers present during field
      audits and inspections.  The plan should include the following information:

          •   Site location and history

          •   Scope of work

          •   Site control

          •   Hazard assessment (chemical and physical hazards)

          •   Levels of protection and required  safety equipment

          •   Field monitoring requirements

          •   Decontamination

          •   Training and medical monitoring requirements

          •   Emergency planning and emergency contacts.


2.6   SAMPLE CUSTODY

      Recordkeeping procedures are described in detail in this section of the QA
      project plan, including specific procedures  to document the physical possession
      and condition of samples during their transport and storage. This section also
      describes  how excess or used samples will be disposed of at the end  of the
      project.


2.6.1 Sample Custody and Documentation

      Sample custody and documentation are vital components of all dredged
      material evaluations, particularly if any of the data may be used in a court of
      law.  It is important to record all events associated with a sample so that the
      validity of the resulting data may be properly interpreted. Thorough

documentation provides a means to track samples from the field through the
laboratory and prevent sample loss.  The contents and location of all
documents related to dredged sediment samples should be specified, and
access to the samples should be controlled.

The possession of samples should be documented from sample collection
through laboratory analysis. Recording basic information during sample
handling is good scientific practice even if formal custody procedures are not
required. Sample custody procedures, including examples of forms to be used,
should be described in the QA project plan.  Minimum requirements for
documentation of sample handling and custody on simple projects should
include the following information:

    •   Sample location, project name, and  unique sample number

    •   Sample collection  date (and time if more than one sample may be
        collected at a location in a day)

    •   Any special notations on sample characteristics or problems

    •   Initials of the person collecting the sample

    •   Date sample sent to the laboratory

    •   Conditions under which the samples were sent to the laboratory.

For large or sensitive projects that may result in enforcement actions or other
litigation, a strict system for tracking sample custody should be used to assure
that one individual has responsibility for a set of samples at all times.  For these
projects,  only data that have clear documentation of custody can be accepted
without qualification.
A strict system of sample custody implies the following conditions:

    •   The sample is possessed by an individual and secured so that no
        one can tamper with it

    •   The location and condition of the sample is known and
        documented at all times

    •   Access to the sample is restricted to authorized personnel only.

Where samples may be needed for potential litigation, chain-of-custody
procedures should be followed. Chain-of-custody procedures are initiated
during sample collection. Chain-of-custody forms are often used to document
the transfer of a sample from collection to receipt by the laboratory (or between
different facilities of one laboratory). Although not always required, these forms
provide an easy means of recording information that may be useful weeks or
months after sample collection. When these forms are used, they are provided
to field technicians at the beginning of a project. The completed forms
accompany the samples to the laboratory and are signed by the relinquisher
and receiver every time the samples change hands.  After sample analysis, the
original chain-of-custody form is returned by the laboratory. The form is filed
and becomes part of the permanent project documentation.  An example of a
chain-of-custody form is provided in Appendix A. Additional custody
requirements for field and laboratory operations should be described in the QA
project plan, when appropriate.
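
Whether recorded on paper forms or electronically, each transfer should identify the
relinquisher, the receiver, and the time of transfer.  The following Python sketch is a
hypothetical electronic analogue of a chain-of-custody form, not a required format:

    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass
    class CustodyTransfer:
        relinquished_by: str
        received_by: str
        transferred_at: datetime

    @dataclass
    class CustodyRecord:
        sample_id: str
        project: str
        transfers: list = field(default_factory=list)

        def transfer(self, relinquished_by, received_by):
            # Each change of hands is acknowledged by both parties and time-stamped
            self.transfers.append(CustodyTransfer(relinquished_by, received_by, datetime.now()))

    record = CustodyRecord("SED-001", "Hypothetical harbor dredging evaluation")
    record.transfer("field technician", "courier")
    record.transfer("courier", "laboratory sample custodian")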

When in doubt about the level of documentation required for sampling and
analysis, a strict system of documentation using standard forms should  be
used.  Excess documentation can be discarded; lack of adequate
documentation in even  simple projects sometimes creates the unfortunate
impression that otherwise reasonable data are unusable or limited.  Formal
chain-of-custody procedures are outlined briefly in the statements of work for
laboratories conducting analyses of organic and inorganic contaminants under
EPA's Contract Laboratory Program  (CLP) (U.S. EPA 1990d,e).
2.6.1.1     Field Operations

The potential for sample deterioration and/or contamination exists during
sample collection, handling, preservation, and storage. Approved protocols and
standard operating procedures should be followed to ensure all field sampling
equipment is acceptably calibrated and to prevent deterioration or
contamination.  Experienced personnel should be responsible for maintaining
the sample integrity from collection through analysis, and field operations should
be overseen by the project manager. A complete record of all field procedures,
an inventory log, and a tracking log should be maintained. A field tracking
report (see example in Appendix A) should identify sample custody and
conditions in the field prior to shipment.

Dates and times of collection, station locations, sampling methods, and sample
handling, preservation, and storage procedures should be documented
immediately, legibly, and indelibly so that they are easily traceable.  Any
circumstances potentially affecting sampling procedures  should be documented.
The data recorded should be thorough enough to allow station relocation and
sample tracking. An example of a station location log is provided in
Appendix A.  Any field preparation of samples should also be described. In
addition,  any required calibration performed for field instruments should be
documented in the field logbook. Samples should be identified with a
previously prepared label (see example in Appendix A) containing at least the
following information:

    •   Project title

    •   Sample identification number
    •   Location (station number) and depth

    •   Analysis or test to be performed

    •   Preservation and storage method

    •   Date and time of collection

    •   Special remarks if appropriate

    •   Initials of person collecting the sample

    •   Name of company performing the work.


2.6.1.2    Laboratory Operations

Documentation is necessary in the laboratory where chemical and biological
analyses are performed. A strict system of sample custody for laboratory
operations should include the following items:

    •   Appointment of a sample custodian, authorized to check the
        condition of and sign for incoming field samples, obtain documents
        of shipment, and verify sample custody records

    •   Separate custody procedures for sample handling, storage, and
        disbursement for analysis in the laboratory

    •   A sample custody log consisting of serially numbered, standard
        laboratory tracking report sheets.

A laboratory tracking  report (Appendix A) should be prepared for each sample.
The location of samples processed through chain-of-custody must be known at
all times. Samples to be used in a court of law must be stored in a locked
facility to prevent tampering or alteration.

A procedure should be established for the retention of all field and laboratory
records and samples as various tasks or phases are completed.  Replicates,
subsamples of analyzed samples, or extra unanalyzed samples should be kept
in a storage bank.  These samples can be used to scrutinize anomalous results
or for supplemental analyses, if additional information is needed.  All samples
should be properly stored and inventoried. The retention and archiving
procedure should indicate the storage requirements, location, indexing codes,
retention time, and security requirements for samples and data.
2.6.2  Storage and Disposal of Samples

       In the statement of work, the laboratory should be instructed to retain all
       remaining sample material (under appropriate temperature and light conditions)
       at least until after the QA review has been completed. In addition, sample
       extracts or digestates should be appropriately stored until disposal is approved
       by the project manager. With proper notice, most laboratories are willing to
       provide storage for a reasonable time period (usually on the order of weeks)
       following analysis.  However, because of limited space at the laboratory, the
       project manager may need to make arrangements for long-term storage at
       another facility.

       Samples must be properly disposed when  no longer needed. Ordinary sample-
       disposal methods are usually acceptable, and special precautions are seldom
        appropriate.  Under Federal law [40 CFR 261.5(a)], where highly contaminated
        wastes are involved, if the waste generated is less than 100 kg per month, the
        generator is conditionally exempt as a small-quantity generator and may
        accumulate up to 1,000 kg of waste on the property without being subject to
       the requirements of Federal hazardous waste regulations. However, State and
       local regulations may require special handling and disposal of contaminated
       samples. When samples have to be shipped, 49 CFR 100-177 should be
       consulted for current Department of Transportation regulations on packing and
       shipping.

       Over the last few years, there has been a growing awareness of the ecological
       and economic damage  caused by introduced species.  Because both east and
        west coast species are often used in bioaccumulation tests, there is a real
        potential for introducing non-indigenous test species or associated fauna and
        flora (e.g., pathogens, algae used in transporting the worms) into local
        waters.  It is the
       responsibility of the persons conducting the bioaccumulation or toxicity tests to
       assure that no non-indigenous species are released.  The general procedures
       to contain non-indigenous species are to collect and then poison all water,
       sediment, organisms and associated  packing materials (e.g., algae, sediment)
       before disposal.  Chlorine bleach can be used as the poison. A double
       containment system is used to keep any spillage from going down the drain.
       Guidance on  procedures used in toxicity tests can be found in Appendix B of
       DeWitt et al. (1992a).  Flow-through tests can generate large quantities of
      water, and researchers  should plan on having sufficient storage facilities.
2.7   CALIBRATION PROCEDURES AND FREQUENCY

      Procedures for minimizing bias and properly maintaining the precision of each
      piece of equipment to be used in the field or laboratory are detailed in this
      section of the QA project plan.  Procedures are also described for obtaining,
      using, and storing chemical standards of known purity used to quantify
      analytical results, and reference chemicals used as positive controls in toxicity
      tests.  Instruments that require  routine calibration include, for example,
      navigation devices, analytical balances, and water quality meters.

      Calibration of analytical instruments is a high priority and is always required for
      any project requiring quantitative data (even if only estimated quantities are
      necessary).  Calibration is essential because it is the means by which
      instrument responses are properly translated into chemical concentrations.
      Instrument calibration is performed before sample analysis begins and is
      continued during sample analysis at intervals specified in each analytical
      method to ensure that the data quality objectives established for a project are
      met.

      Because there are several analytical techniques that can be used for the same
      target analyte, each of which may provide different guidance for performing
      instrument calibration, it is important to establish a minimum calibration
      procedure for any chemical analysis that will be performed.  Uniform adherence
      to a minimum calibration procedure will also improve the comparability of data
      generated by multiple laboratories that may be used  for a specific project or
      among projects. All requirements for performing instrument calibrations should
      be clearly stated in the QA project plan and the laboratory statement of work
      prepared for any project.

      In addition to performing instrument calibrations, the  acceptability of the
      calibrations performed should be evaluated.  To provide control over the
      calibration process, specific guidelines should be specified. The basic elements
      of the calibration process include  the calibration frequency, number of
      calibration standards and their concentrations, and the calibration acceptance
      criteria. A summary of these elements is provided below.
      Examples of the differences in calibration procedures (specifically for the
      analysis of organic  compounds) for different analytical methods are provided in
      Table 6.
2.7.1 Calibration Frequency

      The general process of verifying that an instrument is functioning acceptably is
      to perform initial and continuing calibrations.  Initial calibration should be
      performed prior to sample analysis to determine whether the response of the
      instrument is linear across a range of target analyte concentrations (i.e., the
      working linear range).  In addition to establishing the initial calibration for an
      instrument, it is critical that the stability of the instrument response be verified
      during the course of ongoing sample analyses.  The verification of instrument
      stability is assessed by analyzing continuing calibration standards at regular
       intervals during the period that sample analyses are performed.

TABLE 6.  EXAMPLE CALIBRATION PROCEDURES

For each calibration criterion, requirements are listed for SW-846 methods for organic
compounds (note a) and EPA CLP methods for organic compounds (note b).

Number of standards for initial calibration
    SW-846:   Minimum of five for all methods
    EPA CLP:  Five for all GC/MS analyses; three for pesticides; one for PCBs and
              multicomponent pesticides

Concentration of lowest initial calibration standard
    SW-846:   All target analytes near, but above, the TDL
    EPA CLP:  Contractually set (e.g., 10 µg/L for volatile organic compounds)

Concentrations for initial calibration to establish the instrument's working linear range
    SW-846:   1. Bracket the expected concentration range of analytes expected in samples
              2. Bracket the full instrument/detector linear range
    EPA CLP:  Contractually set (e.g., 10, 50, 100, 150, and 200 µg/L for volatile
              organic compounds)

Concentration of continuing calibration standards
    SW-846:   Not specified, except for GC/MS methods
    EPA CLP:  Contractually set (e.g., 50 µg/L for all GC/MS analyses)

Frequency of calibrations
    SW-846:   Repeat when acceptance criteria not met
    EPA CLP:  Repeat when acceptance criteria not met

Acceptance criteria for initial calibration (c)
    SW-846:   Calculate analyte RRFs or RFs; RSD should be < 30 percent for GC/MS
              methods and < 20 percent for all other methods.
              Alternative: generate a least squares linear regression (peak height/area
              vs. concentration) and use the equation to calculate sample results
    EPA CLP:  Calculate analyte RRFs or RFs; RSD should be < 30 percent for GC/MS
              methods and < 20 percent for pesticides.
              Alternative: none

Acceptance criteria for continuing calibration (c)
    SW-846:   Calculate analyte RRFs or RFs; the difference from the mean RRF or RF of
              the initial calibration should be < 25 percent for GC/MS methods and
              < 15 percent for all other methods.
              Alternative: none
    EPA CLP:  Calculate analyte RRFs or RFs; the difference from the mean RRF or RF of
              the initial calibration should be < 25 percent for GC/MS methods and
              < 15 percent for pesticides.
              Alternative: none

Note:  CLP   - Contract Laboratory Program
       GC/MS - gas chromatography/mass spectrometry
       PCB   - polychlorinated biphenyl
       RF    - response factor (i.e., calibration factor)
       RRF   - relative response factor
       RSD   - relative standard deviation
       TDL   - target detection limit

a  U.S. EPA (1986a).
b  U.S. EPA (1990b).
c  The acceptance criteria for instrument calibration (i.e., initial and continuing
   calibration) may not be available for all organic compounds listed in Table 3
   (e.g., resin acids and guaiacols).  The determination of acceptable instrument
   calibration criteria for organic compounds not specifically stipulated in SW-846 or
   EPA CLP methods should be assessed using best professional judgment (e.g., < 50
   percent RSD).

       Although each analytical method provides guidance for the frequency at which
       continuing calibration standards should be analyzed, it is recommended that at a
       minimum these standards be analyzed at the beginning of each analytical
       sequence, after every tenth sample, and at the end of the analytical
      sequence for all organic and inorganic compound analyses performed.  The
      concentration of the continuing calibration standard should be equivalent to the
      concentration of the midpoint established during initial calibration of the working
      linear range of the instrument.
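
The recommended frequency can be expressed as a simple scheduling rule.  The sketch
below builds an analytical sequence with continuing calibration standards at the start,
after every tenth sample, and at the end; the sample and standard names are hypothetical:

    def build_analytical_sequence(samples, ccal="CCAL"):
        # Continuing calibration at the start, after every tenth sample, and at the end
        sequence = [ccal]
        for i, sample in enumerate(samples, start=1):
            sequence.append(sample)
            if i % 10 == 0 and i != len(samples):
                sequence.append(ccal)
        sequence.append(ccal)
        return sequence

    run = build_analytical_sequence([f"S{i:02d}" for i in range(1, 13)])
    print(run)  # ['CCAL', 'S01', ..., 'S10', 'CCAL', 'S11', 'S12', 'CCAL']
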
2.7.2 Number of Calibration Standards

      Specific instrument calibration procedures are provided in most analytical
      methods; however, a wide variation exists in the number of calibration
      standards specified for different analyses. To ensure that consistent and
      reliable data are generated, a minimum number of calibration standards should
      be required for all laboratories performing chemical analyses.

      Typically, as the number of calibration standards increases, the reliability of the
      results increases for concentrations detected above the TDL. The specific
      standards that are selected for calibration can have a significant impact on the
      validity of the data generated.  Calibration standards should be established with
      respect to the  range of standards required, the TDLs selected, and the linear
      range of the target analytes desired.  Specific requirements for establishing the
      number of calibration standards, including recommendations on the
      concentrations to use, will be different for organic and inorganic analyses;
      however, some general recommended guidelines are provided below.

      The working linear range of an instrument should be established prior to
      performing sample analyses. A minimum of five calibration standards for the
      analysis of organic compounds and three calibration standards for the analysis
      of inorganic compounds should  be used when establishing the working linear
      range for all target analytes of concern. Generally, the working linear range of
      an instrument for a specific analysis should bracket the expected concentrations
      of the target analyte in  the samples to be analyzed.  In some instances,
      however, it may  not be known what analyte concentrations to expect. A 5-point
      initial calibration  sequence is recommended to establish  the working linear
      range for organic chemical analyses.

      In addition to the number of standards analyzed, the difference between the
      concentration of  the lowest standard and the TDL and the difference between
      each standard used to  establish the initial calibration are critical.  The selection
      of the lowest initial calibration standard concentration will provide more
      confidence in the documented bias of results reported as undetected at the TDL
      or any results reported  at very low concentrations. The selection of this
      standard will also ensure that target analytes can  be  reliably detected above
      instrument background  noise and potential matrix interferences. For the
                                          68

-------
      dredged material program, this standard should be no lower than the TDL
      provided in Table 3.

      The decision as to which specific concentrations (i.e., calibration range) should
      be used for a multipoint calibration requires careful consideration.  While
      methods established by EPA CLP protocols provide stringent requirements for
      calibration analyses, these requirements are not clearly specified for other
      analytical methods (e.g., SW-846 methods) (see Table 6).  A 5-point initial
      calibration sequence is recommended for all non-CLP methods. The
      concentrations of all  standards should range from the lowest concentration
      meeting the requirements suggested above to the highest standard
      concentration equivalent to the upper linear range of the instrument/detector
      configuration.  The concentrations of the remaining three standards should  be
      evenly distributed between these concentrations.  The calibration standards
      used to establish the working linear range should encompass a factor of 20
      (i.e., 1  to 20, with the lowest concentration equal to 1 and the highest
      concentration equal to 20 times the concentration of the lowest concentration
      used).
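
Read as arithmetic (evenly spaced) levels, this recommendation can be sketched as
follows.  Interpreting "evenly distributed" as arithmetic rather than geometric spacing
is an assumption, and the 10 µg/L starting concentration is hypothetical:

    def five_point_standards(lowest, span_factor=20.0):
        # Lowest standard at (or above) the TDL; highest at span_factor times the lowest;
        # the middle three levels evenly spaced between the two extremes.
        highest = lowest * span_factor
        step = (highest - lowest) / 4.0
        return [lowest + i * step for i in range(5)]

    print(five_point_standards(10.0))  # [10.0, 57.5, 105.0, 152.5, 200.0]
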
2.7.3 Calibration Acceptance Criteria

      Once the initial calibration has been performed, the acceptability of the
      calibration should be assessed to ensure that the bias of the data generated
      will be acceptable; this assessment should be performed by all laboratories
      prior to the analysis of any sample. In addition, the acceptability of all
      continuing calibrations should be assessed.

      Although each analytical method provides guidance for determining the
      acceptability of instrument calibrations, there are multiple options  available (e.g.,
      least squares linear regression, percent relative standard deviations, and
      percent differences).  A specific set of acceptance criteria should  be determined
      prior to sample analysis, and these criteria should be contractually binding to
      avoid unnecessary qualification or rejection of the data generated. A summary
      of the most widely used calibration acceptance criteria currently in use for
      organic analyses is provided in Table 6.  Calibration acceptance criteria should
       be used to assess the acceptability of the initial calibration sequence in terms of
      the relationship between the intercept of the calibration curve (i.e., the x-y
      intercept) and the predetermined TDLs and the overall reliability of the working
      linear range established.

      The general criteria specified by SW-846 methods are typically more stringent
      for organic analyses than the EPA CLP requirements. Acceptance criteria, as
      summarized in Table 6, should be clearly defined before sample analyses are
       performed. All specific acceptance criteria for calibrations should be stated in
      the QA project plan and the laboratory statement of work.
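
For the percent RSD and percent difference criteria in Table 6, the acceptance checks
reduce to two short calculations.  The sketch below uses the GC/MS limits (30 percent
RSD for initial calibration, 25 percent difference for continuing calibration) and
hypothetical response factors:

    import statistics

    def initial_calibration_ok(rrfs, max_rsd_percent=30.0):
        # Percent RSD of the response factors across the calibration levels
        rsd = 100.0 * statistics.stdev(rrfs) / statistics.mean(rrfs)
        return rsd <= max_rsd_percent

    def continuing_calibration_ok(ccal_rrf, mean_initial_rrf, max_diff_percent=25.0):
        # Percent difference of the continuing RRF from the mean initial RRF
        return 100.0 * abs(ccal_rrf - mean_initial_rrf) / mean_initial_rrf <= max_diff_percent

    rrfs = [0.95, 1.02, 0.98, 1.05, 1.00]
    print(initial_calibration_ok(rrfs))                            # True (RSD ~ 3.8 percent)
    print(continuing_calibration_ok(0.80, statistics.mean(rrfs)))  # True (difference ~ 20 percent)
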
2.8   ANALYTICAL PROCEDURES

      The methods cited in this section may be used to meet general data quality
      objectives for dredged material evaluations. However, other methods may
      provide similar results, and the final choice of analytical procedures should be
      based on the needs of each evaluation.  In all cases, proven, current methods
      should be used; EPA-approved methods, if available, are preferred.  Sample
      analysis procedures are identified in this section by reference to established,
      standard methods. Any modifications to these procedures and any specialized,
      nonstandard procedures are also described in detail. When preparing a QA
      project plan, only modifications to standard operating procedures or details of
      non-standard procedures need to be described in this section of the plan.

      Any dredged material from estuarine or marine areas contains salt, which can
      interfere with the results obtained from some analytical methods. Any methods
      proposed for the analysis of sediment and water from estuarine or marine
      environments should explicitly address steps taken to control salt interference.

      The following sections provide guidance on the selection of physical and
      chemical analyses to aid in evaluating dredged material proposed for disposal,
      and on the methods used to analyze these parameters.  Information on the
      chemicals on the EPA priority pollutant and hazardous substance lists is
      provided in Appendix E.
2.8.1 Physical Analysis of Sediment

      Physical characteristics of the dredged material must be determined to help
      assess the impact of disposal on the benthic environment and the water column
      and to help determine  the appropriate dredging methods. This is the first step
      in the overall process of sediment characterization, and also helps to identify
      appropriate control and reference sediments for biological tests. In addition,
      physical analyses can  be helpful in evaluating the results of analyses and tests
      conducted later in the  characterization process.

      The general analyses may include grain size distribution, total solids content,
      and specific gravity. Grain size analysis defines the frequency distribution of
      the size ranges of the  particles that make up the sediment (e.g., Plumb 1981;
      Folk 1980).  The general size classes of gravel, sand, silt, and clay are the
      most useful in describing the size distribution of particles in dredged material
      samples. Use of the Unified Soil Classification System (USCS) for physical
       characterization is recommended for the purpose of consistency with USACE
      engineering evaluations (ASTM 1992).

      Measurement of total solids is a gravimetric determination of the organic and
      inorganic material remaining in a sample after it has been dried at a specified


      temperature.  The total solids values generally are used to convert
       concentrations of contaminants from a wet-weight to a dry-weight basis.
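
The conversion is a simple division by the total solids fraction, as in the following
illustrative sketch (the values shown are hypothetical):

    def dry_weight_concentration(wet_weight_conc, total_solids_fraction):
        # Dry-weight concentration = wet-weight concentration / fraction of solids
        return wet_weight_conc / total_solids_fraction

    # 2.0 mg/kg (wet weight) in a sediment that is 40 percent solids
    print(dry_weight_concentration(2.0, 0.40))  # 5.0 mg/kg dry weight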

       The specific gravity of a sample is the ratio of the mass of a given volume of
      material to an equal volume of distilled water at the same temperature (Plumb
      1981). The specific gravity of a dredged material sample helps to predict the
      behavior (i.e., dispersal and settling characteristics) of dredged material after
      disposal.

       Other physical/engineering properties (e.g., Atterberg limits, settling properties)
      may be needed to evaluate the quality of any effluent discharged from confined
      disposal facilities. QA considerations for physical analysis of sediments are
      summarized in Section 2.10.3.
2.8.2 Chemical Analysis of Sediment

      Chemical analysis provides information about the chemicals present in the
      dredged material that, if biologically available, could cause toxicity and/or be
      bioaccumulated. This information is valuable for exposure assessment and for
      deciding which of the contaminants present in the dredged  material to measure
      in tissue samples.  This section discusses the selection of target analytes and
      techniques for sediment analyses. QA considerations are summarized in
      Section 2.10.4.
      2.8.2.1    Selection of Target Analytes

      If the review of data from previous studies suggests that sediment contaminants
      may be present (see Section 2.5.2), but fails to produce sufficient information to
      develop a definitive list of potential contaminants, a list of target analytes should
      be compiled.  Target analytes should be selected from, but not necessarily
      limited to, those listed  in Table 3. The target analyte list should also include
      other contaminants that historical information or commercial and/or agricultural
      applications suggest could be present at a specific dredging site (e.g., tributyltin
      near shipyards, berthing areas, and marinas where these compounds have
      been applied). Analysis of polycyclic aromatic hydrocarbons (PAHs) in dredged
      material should focus on those PAH compounds listed in Table 3.

      All PCB analyses should be made using congener-specific methods.  The sum
      of the concentrations of specific congeners is an appropriate measure of total
      PCBs (NOAA, 1989).  Congener-specific analyses also provide data that can be
      used for specialized risk assessments that reflect the widely varying toxicity of
       different PCB congeners.

Sediments should be analyzed for TOC.  This is particularly important if there
are hydrophobic organic compounds on the target analyte list.  The TOC
content of sediment is a measure of the total amount of oxidizable organic
material in a sample and also affects contaminant bioaccumulation by, and
effects to, organisms (e.g., DeWitt et al. 1992b; Di Toro et al. 1991).

Sediments in which metals are suspected to be contaminants of concern may
also be analyzed for acid volatile sulfide (AVS) (Di Toro et al. 1990; U.S. EPA
1991 a). Although acceptable guidance on the interpretation of AVS
measurements is not yet available, and AVS measurements are not generally
required at this time, such measurements can provide information on  the
bioavailability of metals in anoxic sediments.
2.8.2.2    Selection of Analytical Techniques

Once the list of project-specific target analytes for sediments has been
established, appropriate analytical methods should be determined (see Section
2.3).  The analytical methods selected must be able to meet the TDLs
established to meet the requirements of the intended uses of the data. Also,
the methods selected will, to some degree, dictate the amount of sediment
sample required for each analysis.  Examples of methods that can be used to
meet TDLs for dredged material  evaluations are provided in Table 3.  General
sample sizes are provided in Table 5, and include possible requirements for
more than one analysis for each group  of analytes. The amount of sample
used in an analysis affects the detection limits attainable by a particular
method.  The following overview summarizes various factors to be considered
when selecting analytical methods for physical, inorganic, and organic analyses.

TOC analyses  should be based on high-temperature combustion rather than on
chemical oxidation, because some classes of organic compounds are not fully
degraded by chemical/ultraviolet techniques. The volatile and nonvolatile
organic components make up  the TOC of a sample.  Because inorganic carbon
(e.g., carbonates and bicarbonates) can be a significant proportion of the total
carbon in some sediment, the  sample has to be treated with  acid to remove the
inorganic carbon prior to TOC  analysis. The method of Plumb (1981)
recommends the use of hydrochloric acid.  An alternative choice might be
sulfuric acid because it is nonvolatile, is used as the preservative, and does not
add to the chloride burden of the sample.  However, some functional groups
(e.g., carboxylic acids) can be oxidized during carbonate removal, whether a
non-oxidizing or an oxidizing acid is used.  Whatever acid is used, it has
to be demonstrated on sodium chloride blanks (for all marine samples) that
there is no interference  generated from the combined action of acid and salt in
the sample. Acceptable methods for TOC analysis are provided in PSEP
(1986) and U.S. EPA (1992b).

For many metals analyses in marine/estuarine areas, the concentration of salt
may be much greater than the concentration of the analyte of interest, and can
cause unacceptable interferences in certain analytical techniques. In such
cases,  the freshwater approach of acid digestion followed by inductively coupled
plasma-atomic emission spectrometry (ICP) or graphite furnace atomic
absorption spectrometry (GFAA) should be coupled with appropriate techniques
for controlling this interference.  For example, the mercury method in U.S. EPA
(1986a; Method 7471) may be used for the analysis of mercury in sediment.
Tributyltin may be analyzed by the methods of Rice et al. (1987)  and NCASI
(1986), and selenium and arsenic by the method of EPRI (1986).  Total
digestion of metals is not necessary for dredged material evaluations, although
this technique is used for complete chemical characterizations in  some national
programs (e.g., NOAA Status and Trends).  The standard aqua regia extraction
yields consistent and reproducible results.

The recommended method for
analysis of semivolatile and volatile priority pollutants in sediments is described
in Tetra Tech (1986a), and is a modified version of established EPA analytical
methods  designed to achieve lower and more reliable detection limits. Analysis
for organic compounds should always use capillary-column gas chromatography
(GC): gas chromatography/mass spectrometry (GC/MS) techniques for
semivolatile and volatile priority pollutants, and dual-column gas
chromatography/electron-capture detection (GC/ECD) for pesticides and PCBs
(NOAA 1989).  Alternatively, GC/MS  using selected ion monitoring can be used
for PCB and pesticide analysis. These analytically sound techniques yield
accurate  data on the concentrations of chemicals in the sediment matrix.  The
analytical techniques for semivolatile  organic compounds generally involve
solvent extraction  from the sediment matrix and subsequent analysis, after
cleanup,  using GC or GC/MS. Extensive cleanup  is necessitated by the
likelihood of 1) biological macromolecules, 2) sulfur from sediments with low or
no oxygen, and 3) oil and/or grease in the sediment. The analysis of volatile
organic compounds incorporates purge-and-trap techniques with analysis by
either GC or GC/MS.

If dioxin (i.e., 2,3,7,8-tetrachlorodibenzo-p-dioxin [TCDD])
analysis is being performed, the methods of Kuehl et al. (1987), Smith et al.
(1984), U.S.  EPA  (1989b;  Method 8290), or U.S. EPA (1990f; Method 1613)
should  be consulted. EPA Method 1613 is the recommended procedure for
measuring the tetra- through octa- polychlorinated  dibenzo-p-dioxins (PCDDs)
and polychlorinated dibenzofurans (PCDFs).  This  method has been developed
for analysis of water, soil, sediment, sludge, and tissue.  Table 7  shows the 17
compounds determined by Method 1613.

Techniques for analysis of chemical contaminants  have some inherent
limitations for sediment samples.  Interferences encountered as part of the
sediment matrix, particularly in samples from heavily contaminated areas,  may
limit the ability of a method to detect or quantify some analytes.  The most
selective methods using GC/MS techniques are recommended for all
nonchlorinated organic compounds because such analysis can often avoid
problems due to matrix interferences.

         TABLE 7.  PCDD and PCDF Compounds Determined by Method 1613
Native Compound1
      2,3,7,8-TCDF
      2,3,7,8-TCDD
   1,2,3,7,8-PeCDF
   2,3,4,7,8-PeCDF
   1,2,3,7,8-PeCDD
  1,2,3,4,7,8-HxCDF
  1,2,3,6,7,8-HxCDF
  2,3,4,6,7,8-HxCDF
  1,2,3,4,7,8-HxCDD
  1,2,3,6,7,8-HxCDD
  1,2,3,7,8,9-HxCDD
  1,2,3,7,8,9-HxCDF
1,2,3,4,6,7,8-HpCDF
1,2,3,4,6,7,8-HpCDD
1,2,3,4,7,8,9-HpCDF
            OCDD
            OCDF
1 Polychlorinated Dioxins and Furans

TCDD   =   Tetrachlorodibenzo-p-dioxin
TCDF   =   Tetrachlorodibenzofuran
PeCDD  =   Pentachlorodibenzo-p-dioxin
PeCDF  =   Pentachlorodibenzofuran
HxCDD  =   Hexachlorodibenzo-p-dioxin
HxCDF  =   Hexachlorodibenzofuran
HpCDD  =   Heptachlorodibenzo-p-dioxin
HpCDF  =   Heptachlorodibenzofuran
OCDD  =   Octachlorodibenzo-p-dioxin
OCDF  =   Octachlorodibenzofuran

GC/ECD methods are recommended by
the EPA as the primary analytical tool for all PCB and pesticide analyses
because GC/ECD analysis (e.g., NOAA 1989) will result in lower detection
limits. The analysis and identification of PCBs by GC/ECD methods are based
upon relative retention times and peak shapes. Matrix interferences may result
in the reporting of false negatives, although the congener-specific PCB analysis
reduces this concern relative to use of the historical Aroclor®-matching
procedure.

For dredged material  evaluations, the concentration of total PCBs should be
determined by summing the concentrations of specific individual PCB
congeners identified in the sample (see Table 8).  The minimum number of
PCB congeners that should be analyzed are listed in the first column of Table 8
(i.e., the "summation" column) (NOAA 1989).  This summation is considered the
most accurate representation of the PCB concentration in samples.  Additional
PCB congeners are also listed in Table 8.  McFarland and Clarke (1989)
recommend these PCB congeners for analysis based on environmental
abundance,  persistence, and  biological importance. Sample preparation for
PCB congener analysis should follow the techniques described in Tetra Tech
(1986a) or U.S. EPA  (1986a), but with instrumental analysis and quantification
using standard capillary GC columns on  individual  PCB isomers according to
the methods reported by NOAA (1989) (see also Dunn et al. 1984; Schwartz et
al. 1984; Mullin et al.  1984; Stalling  et al. 1987).
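
As an illustration of the summation approach (a sketch only; the data, the treatment of
nondetects, and the function name below are hypothetical and should follow the project
QA plan rather than this example), total PCBs are simply the sum of the reported
congener concentrations:

    def total_pcbs(congener_results, nondetect_value=0.0):
        """Sum congener-specific PCB concentrations (e.g., ug/kg dry weight).

        congener_results maps IUPAC congener number to a measured value, or to
        None for congeners reported as not detected.  How nondetects are treated
        (zero, half the detection limit, etc.) should follow the QA project plan.
        """
        return sum(value if value is not None else nondetect_value
                   for value in congener_results.values())

    # Hypothetical results for a few of the summation congeners in Table 8
    results = {8: 1.2, 18: 0.8, 28: None, 52: 2.4, 101: 3.1, 153: 5.0}
    print(total_pcbs(results))          # nondetects counted as zero
    print(total_pcbs(results, 0.25))    # nondetects counted as half a 0.5 ug/kg DL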

Although the methods mentioned above  are adequate for detecting and
quantifying concentrations of those PCB congeners comprising the majority of
total PCBs in environmental samples,  they are not appropriate for separating
and quantifying PCB congeners which may coelute with other congeners and/or
may be present at relatively small concentrations in the total PCB mixture.
Included in this latter  group of compounds, for example, are PCBs 126 and
169, two of the more  toxic nonortho-substituted PCB congeners (Table 8). In
order to separate these (and other toxic  nonortho-substituted congeners), it is
necessary to initially utilize an enrichment step with an activated carbon column
(Smith 1981).  Various types of carbon columns have been used,  ranging from
simple gravity columns (e.g., in a Pasteur pipette) to more elaborate (and
efficient) columns using high-pressure liquid chromatography (HPLC) systems
(see Schwartz et al. 1993). The preferred  method of separation and
quantitation of the enriched PCB mixture has been via high resolution GC/MS
with isotope dilution (Kuehl et al. 1991; Ankley et al. 1993; Schwartz et al.
1993).  However, recent studies have  shown that if the carbon enrichment is
done via HPLC, the nonortho-substituted PCB congeners of concern also may
be quantifiable via more widely available GC/ECD systems (Schwartz et al.
1993).

         TABLE 8.  POLYCHLORINATED BIPHENYL CONGENERS RECOMMENDED FOR
                   QUANTITATION AS POTENTIAL CONTAMINANTS OF CONCERN

                                                              Congener Number (b)
                                              ------------------------------------------------
PCB Congener (a)                              Summation (c)   Highest         Second
                                                              Priority (d)    Priority (e)

2,4'-Dichlorobiphenyl                              8
2,2',5-Trichlorobiphenyl                          18                             18
2,4,4'-Trichlorobiphenyl                          28
3,4,4'-Trichlorobiphenyl                                                         37
2,2',3,5'-Tetrachlorobiphenyl                     44                             44
2,2',4,5'-Tetrachlorobiphenyl                                                    49
2,2',5,5'-Tetrachlorobiphenyl                     52                             52
2,3',4,4'-Tetrachlorobiphenyl                     66
2,3',4',5-Tetrachlorobiphenyl                                                    70
2,4,4',5-Tetrachlorobiphenyl                                                     74
3,3',4,4'-Tetrachlorobiphenyl                     77              77
3,4,4',5-Tetrachlorobiphenyl                                                     81
2,2',3,4,5'-Pentachlorobiphenyl                                   87
2,2',4,4',5-Pentachlorobiphenyl                                   99
2,2',4,5,5'-Pentachlorobiphenyl                  101             101
2,3,3',4,4'-Pentachlorobiphenyl                  105             105
2,3,4,4',5-Pentachlorobiphenyl                                                  114
2,3',4,4',5-Pentachlorobiphenyl                  118             118
2,3',4,4',6-Pentachlorobiphenyl                                                 119
2',3,4,4',5-Pentachlorobiphenyl                                                 123
3,3',4,4',5-Pentachlorobiphenyl                  126f            126f
2,2',3,3',4,4'-Hexachlorobiphenyl                128             128
2,2',3,4,4',5'-Hexachlorobiphenyl                138             138
2,2',3,5,5',6-Hexachlorobiphenyl                                                151
2,2',4,4',5,5'-Hexachlorobiphenyl                153             153
2,3,3',4,4',5-Hexachlorobiphenyl                                 156
2,3,3',4,4',5'-Hexachlorobiphenyl                                               157
2,3,3',4,4',6-Hexachlorobiphenyl                                                158
2,3',4,4',5,5'-Hexachlorobiphenyl                                               167
2,3',4,4',5',6-Hexachlorobiphenyl                                               168
3,3',4,4',5,5'-Hexachlorobiphenyl                169f            169f
2,2',3,3',4,4',5-Heptachlorobiphenyl             170             170
2,2',3,4,4',5,5'-Heptachlorobiphenyl             180             180
2,2',3,4,4',5',6-Heptachlorobiphenyl                             183
2,2',3,4,4',6,6'-Heptachlorobiphenyl                             184
2,2',3,4',5,5',6-Heptachlorobiphenyl             187                            187
2,3,3',4,4',5,5'-Heptachlorobiphenyl                                            189
2,2',3,3',4,4',5,6-Octachlorobiphenyl            195
2,2',3,3',4,5,5',6'-Octachlorobiphenyl                                          201
2,2',3,3',4,4',5,5',6-Nonachlorobiphenyl         206
2,2',3,3',4,4',5,5',6,6'-Decachlorobiphenyl      209

Note:  PCB = polychlorinated biphenyl

a  PCB congeners recommended for quantitation, from dichlorobiphenyl through decachlorobiphenyl.

b  Congeners are identified by their International Union of Pure and Applied Chemistry (IUPAC) number,
   as referenced in Ballschmiter and Zell (1980) and Mullin et al. (1984).

c  These congeners are summed to determine total PCB concentration using the approach in
   NOAA (1989).

d  PCB congeners having highest priority for potential environmental importance based on potential for
   toxicity, frequency of occurrence in environmental samples, and relative abundance in animal tissues
   (McFarland and Clarke 1989).

e  PCB congeners having second priority for potential environmental importance based on potential for
   toxicity, frequency of occurrence in environmental samples, and relative abundance in animal tissues
   (McFarland and Clarke 1989).

f  To separate PCBs 126 and 169, it is necessary to first use an enrichment step with an activated
   carbon column (Smith 1981).

      The overall toxicity of nonortho-substituted PCBs at a site can be assessed
      based on a comparison with the toxicity of 2,3,7,8-TCDD.  A similar procedure
      can be used for assessing the toxicity of a mixture of dioxins and furans.  In
      this "toxicity equivalency factor" (TEF) approach, potency values of individual
      congeners (relative to TCDD) and their respective sediment concentrations are
      used to derive a summed 2,3,7,8-TCDD equivalent (U.S. EPA 1989d; Table 9).
      EPA and the  USAGE are developing guidance on the use of this approach.
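
      As an arithmetic illustration of the TEF approach (a minimal sketch with hypothetical
      concentrations; it is not the forthcoming EPA/USAGE guidance), each congener
      concentration is multiplied by its TEF from Table 9 and the products are summed:

          # TEFs from Table 9 (subset shown for illustration)
          TEF = {"2,3,7,8-TCDD": 1.0, "2,3,7,8-TCDF": 0.1,
                 "2,3,4,7,8-PeCDF": 0.5, "OCDD": 0.001}

          def tcdd_equivalents(concentrations):
              """Sum of congener concentration x TEF, expressed as
              2,3,7,8-TCDD equivalents.  Congeners without an assigned
              TEF contribute zero."""
              return sum(conc * TEF.get(name, 0.0)
                         for name, conc in concentrations.items())

          # Hypothetical sediment results (pg/g dry weight)
          sample = {"2,3,7,8-TCDD": 1.5, "2,3,7,8-TCDF": 8.0, "OCDD": 250.0}
          print(tcdd_equivalents(sample))   # 1.5(1.0) + 8.0(0.1) + 250(0.001) = 2.55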

      To  ensure that contaminants not included in the list of target analytes are not
      overlooked in the chemical characterization of the dredged material, the
      analytical results should also be scrutinized by trained personnel.  The
      presence of persistent unknown analytes should be noted.  Methods involving
      GC/MS techniques for organic compounds are recommended for the
      identification of any unknown analytes.
2.8.3 Chemical Analysis of Water

      Analysis to determine the potential release of dissolved contaminants from the
      dredged material (standard elutriate) may be necessary to make determinations
      of water column toxicity (see U.S. EPA and USAGE 1994). Elutriate tests
      involve  mixing dredged material with dredging site water and allowing the
      mixture to settle. The portion of the dredged material that is considered to have
      the potential to impact the  water column is the supernatant remaining after
      undisturbed settling and centrifugation. Chemical analysis of the elutriate allows
      a direct comparison, after allowance for mixing, to applicable water quality
      standards.  When collecting samples for elutriate testing, consideration should
      be given to the large volumes of water and sediment required to prepare
      replicate samples for analysis. In some instances,  when there is poor settling,
      the elutriate preparation has to be performed successively several times to
      accumulate enough water for testing.  The following sections discuss the
      selection of target analytes and techniques for water analyses. QA
      considerations are  summarized in Section 2.10.5.
      2.8.3.1     Selection of Target Analytes

      Historical water quality information from the dredging site should be evaluated
      along with data obtained from the chemical analysis of sediment samples to
      select target analytes. Chemical evaluation of the dredged material provides a
      known  list of contaminants that might affect the water column.  All target
      analytes identified in the sediment should initially be considered potential
      targets for water analysis.  Nonpriority pollutant chemical components which are
      found in measurable concentrations in the sediments should be included as
      target analytes if review of the literature indicates that these analytes have the
      potential to bioaccumulate in animals (i.e., have a high Kow or bioconcentration
      factor [BCF]) and/or are of toxicological concern (Table 10).

              TABLE 9.  Methodology for Toxicity Equivalency Factors

Because toxicity information on some dioxin and furan species is scarce, a structure-activity
relationship has been assumed.  The toxicity of each congener is expressed as a fraction of the
toxicity of 2,3,7,8-TCDD.

     Compound                    TEF

     2,3,7,8-TCDD                1
     other TCDDs                 0
     2,3,7,8-PeCDDs              0.5
     other PeCDDs                0
     2,3,7,8-HxCDDs              0.1
     other HxCDDs                0
     2,3,7,8-HpCDDs              0.01
     other HpCDDs                0
     OCDD                        0.001
     2,3,7,8-TCDF                0.1
     other TCDFs                 0
     1,2,3,7,8-PeCDF             0.05
     2,3,4,7,8-PeCDF             0.5
     other PeCDFs                0
     2,3,7,8-HxCDFs              0.1
     other HxCDFs                0
     2,3,7,8-HpCDFs              0.01
     other HpCDFs                0
     OCDF                        0.001

2.8.3.2    Selection of Analytical Techniques

In contrast to freshwater, there generally are no EPA-approved methods for
analysis of saline water although widely accepted methods have existed for
some time (e.g., Strickland and Parsons 1972; Grasshof et al. 1983; Parsons et
al. 1984). Application of the freshwater methods to saltwater will frequently
result in higher detection limits than are common for freshwater unless care is
taken to control the effects of  salt on the analytical signal. Modifications or
substitute methods (e.g., additional extract concentration steps,  larger sample
sizes, or concentration of extracts to smaller volumes) might be necessary to
properly determine analyte concentrations in saltwater or to meet the desired
TDLs.  It is extremely important to ascertain a laboratory's ability to execute
methods and attain acceptable TDLs in matrices containing up to 3 percent
sodium chloride.

Once the list of target analytes for water has been established, analytical
methods should be determined. The water volume required  for specific
analytical methods may vary.  A minimum of 1 L of elutriate  should be prepared
for metals analysis (as little as 100 ml may be analyzed). One liter of elutriate
should be analyzed for organic compounds. Sample size should also include
the additional volume required for the matrix spike and matrix spike duplicate
analyses, required for analysis of both metals and organic compounds.  Sample
size is one of the limiting factors in determining detection limits for water
analyses, but TDLs below the water quality standard should  be the goal in all
cases. Participating laboratories should routinely report detection limits
achieved for a given analyte.

Detailed methods for the analysis of organic and inorganic priority pollutants in
water are referenced in 40 CFR 136 and in U.S.  EPA (1983). Additional
approved methods include U.S. EPA (1986a,b; 1988a,b,c; 1990d,e), APHA
(1989), ASTM (1991 a), and Tetra Tech (1985).  Analysis of the semivolatile
organic priority pollutants involves a solvent extraction of water with an optional
sample cleanup procedure and analysis using GC or GC/MS. The volatile
priority pollutants are determined by using purge-and-trap techniques and are
analyzed by either GC or GC/MS. If dioxin (i.e., 2,3,7,8-TCDD) analysis is
necessary, Kuehl et al. (1987), Smith et al. (1984), U.S. EPA (1989b; Method
8290), or U.S. EPA (1990f;  Method 1613) should be consulted.  EPA Method
1613 is the recommended procedure for measuring the tetra- through octa-
PCDDs and PCDFs.

A primary requirement for analysis of inorganic and organic priority pollutants is
to obtain detection limits that will result in usable, quantitative data that can
subsequently be compared against applicable water quality standards or
criteria, as appropriate.

              TABLE 10.  OCTANOL/WATER PARTITION COEFFICIENTS FOR ORGANIC
                      COMPOUND PRIORITY POLLUTANTS AND 301(h) PESTICIDES

                                                        Octanol/Water Partition
     Pollutant                                          Coefficient (log Kow)

     Di-n-octyl phthalate                                   9.2
     Indeno[1,2,3-cd]pyrene                                 7.7
     Benzo[ghi]perylene                                     7.0
     PCB-1260                                               6.9
     Mirex**                                                6.9
     Benzo[k]fluoranthene                                   6.8
     Benzo[b]fluoranthene                                   6.6
     PCB-1248                                               6.1
     2,3,7,8-TCDD (dioxin)                                  6.1
     Benzo[a]pyrene                                         6.0
     Chlordane                                              6.0
     PCB-1242                                               6.0
     4,4'-DDD                                               6.0
     Dibenz[a,h]anthracene                                  6.0
     PCB-1016                                               5.9
     4,4'-DDT                                               5.7
     4,4'-DDE                                               5.7
     Benz[a]anthracene                                      5.6
     Chrysene                                               5.6
     Endrin aldehyde                                        5.6
     Fluoranthene                                           5.5
     Hexachlorocyclopentadiene                              5.5
     Dieldrin                                               5.5
     Heptachlor                                             5.4
     Heptachlor epoxide                                     5.4
     Hexachlorobenzene                                      5.2
     Di-n-butyl phthalate                                   5.1
     4-Bromophenyl phenyl ether                             5.1
     Pentachlorophenol                                      5.0
     4-Chlorophenyl phenyl ether                            4.9
     Pyrene                                                 4.9
     2-Chloronaphthalene                                    4.7
     Endrin                                                 4.6
     PCB-1232                                               4.5
     Phenanthrene                                           4.5
     Fluorene                                               4.4
     Anthracene                                             4.3
     Methoxychlor*                                          4.3
     Hexachlorobutadiene                                    4.3
     1,2,4-Trichlorobenzene                                 4.2
     Bis[2-ethylhexyl]phthalate                             4.2
     Acenaphthylene                                         4.1
     Butyl benzyl phthalate                                 4.0
     PCB-1221                                               4.0
     Hexachloroethane                                       3.9
     Acenaphthene                                           3.9
     α-Hexachlorocyclohexane                                3.8
     δ-Hexachlorocyclohexane                                3.8
     β-Hexachlorocyclohexane                                3.8
     γ-Hexachlorocyclohexane                                3.8
     Parathion**                                            3.8
     Chlorobenzene                                          3.8
     2,4,6-Trichlorophenol                                  3.7
     β-Endosulfan                                           3.6
     Endosulfan sulfate                                     3.6
     α-Endosulfan                                           3.6
     Naphthalene                                            3.6
     Fluorotrichloromethane**                               3.5
     1,4-Dichlorobenzene                                    3.5
     1,3-Dichlorobenzene                                    3.4
     1,2-Dichlorobenzene                                    3.4
     Toxaphene                                              3.3
     Ethylbenzene                                           3.1
     N-Nitrosodiphenylamine                                 3.1
     p-Chloro-m-cresol                                      3.1
     2,4-Dichlorophenol                                     3.1
     3,3'-Dichlorobenzidine                                 3.0
     Aldrin                                                 3.0
     1,2-Diphenylhydrazine                                  2.9
     4-Nitrophenol                                          2.9
     Malathion*                                             2.9
     Tetrachloroethene                                      2.9
     4,6-Dinitro-o-cresol                                   2.8
     Tetrachloroethene                                      2.6
     Bis[2-chloroisopropyl]ether                            2.6
     1,1,1-Trichloroethane                                  2.5
     Trichloroethene                                        2.4
     2,4-Dimethylphenol                                     2.4
     1,1,2,2-Tetrachloroethane                              2.4
     Bromoform                                              2.3
     1,2-Dichloropropane                                    2.3
     Toluene                                                2.2
     1,1,2-Trichloroethane                                  2.2
     Guthion**                                              2.2
     Dichlorodifluoromethane**                              2.2
     2-Chlorophenol                                         2.2
     Benzene                                                2.1
     Chlorodibromomethane                                   2.1
     2,4-Dinitrotoluene                                     2.1
     2,6-Dinitrotoluene                                     2.0
     trans-1,3-Dichloropropene                              2.0
     cis-1,3-Dichloropropene                                2.0
     Demeton**                                              1.9
     Chloroform                                             1.9
     Dichlorobromomethane                                   1.9
     Nitrobenzene                                           1.9
     Benzidine                                              1.8
     1,1-Dichloroethane                                     1.8
     2-Nitrophenol                                          1.8
     Isophorone                                             1.7
     Dimethyl phthalate                                     1.6
     Chloroethane                                           1.5
     2,4-Dinitrophenol                                      1.5
     1,1-Dichloroethylene                                   1.5
     Phenol                                                 1.5
     1,2-Dichloroethane                                     1.4
     Diethyl phthalate                                      1.4
     N-Nitrosodipropylamine                                 1.3
     Dichloromethane                                        1.3
     2-Chloroethyl vinyl ether                              1.3
     Bis[2-chloroethoxy]methane                             1.3
     Acrylonitrile                                          1.2
     Bis[2-chloroethyl]ether                                1.1
     Bromomethane                                           1.0
     Acrolein                                               0.9
     Chloromethane                                          0.9
     Vinyl chloride                                         0.6
     N-Nitrosodimethylamine                                 0.6

Source:  Tetra Tech (1985)

Note:  Mixtures, such as PCB Aroclors®, cannot have discrete Kow values; however, the value given is a
rough estimate for the mean.  [It is recommended that all PCB analyses use congener-specific methods.
All PCB congeners have a log Kow > 4 (L. Burkhardt, EPA Duluth, pers. comm.).]

*   301(h) pesticides not on the priority pollutant list.
**  No longer on the priority pollutant or 301(h) list.

      Analysis of saline water for metals is subject to matrix
      interferences from salts, particularly sodium and chloride ions, when the
      samples are concentrated prior to instrumental analysis.  The gold
      amalgamation method using cold-vapor atomic absorption spectrometry (CVAA)
      analysis is recommended to eliminate saline water matrix interferences for
      mercury analysis.  Methods using solvent extraction and atomic absorption
      spectrometry analysis may be required to reduce saline water matrix
      interferences for other target metals. Other methods appropriate for  metals
      include: cadmium, copper, lead,  iron, zinc, silver (Danielson et al. 1978);
      arsenic (EPRI 1986); selenium and antimony (Sturgeon et al. 1985);  low levels
      of mercury (Bloom et al. 1983); and tributyltin (Rice et al. 1987).  GFAA
      techniques after extraction are recommended for the analysis of metals, with
      the exception of mercury. All PCB and pesticide analyses should be performed
      using GC/ECD methods because such analysis (e.g., NOAA 1989) will result in
      lower detection  limits.  PCBs should be quantified as specific congeners (Mullin
      et al. 1984; Stalling et al. 1987) and as total  PCBs based on the summation of
      particular congeners (NOAA 1989).
2.8.4 Chemical Analysis of Tissue

      This section discusses the selection of target analytes and techniques for tissue
      analyses.  QA considerations are summarized in Section 2.10.6.
      2.8.4.1    Selection of Target Analytes

      Bioaccumulation is evaluated by analyzing tissues of test organisms for
      contaminants determined to be of concern for a specific dredged material.
      Sediment contaminant data and available information on the bioaccumulation
      potential of those analytes  have to be interpreted to establish target analytes.

      The n-octanol/water partition coefficient (Kow) is used to estimate the BCFs of
      chemicals in organism/water systems (Chiou et al. 1977; Kenaga and Goring
      1980; Veith et al. 1980; Mackay 1982). The potential for bioaccumulation
      generally increases as Kow increases, particularly for compounds with log Kow
      less than approximately 6.  Above this value, there is less of a tendency for
      bioaccumulation potential to increase with increasing Kow.  Consequently, the
      relative potential for bioaccumulation of organic compounds can be estimated
      from the Kow of the compounds.  U.S. EPA (1985) recommends that compounds
      for which the log Kow is greater than 3.5 be considered for further evaluation of
      bioaccumulation potential.  The organic compound classes of priority pollutants
      with the greatest potential to bioaccumulate are  PAHs, PCBs, pesticides, and
      some phthalate esters.  Generally, the volatile organic, phenol, and
      organonitrogen priority pollutants are not readily bioaccumulated, but exceptions
include the chlorinated benzenes and the chlorinated phenols.  Table 10
provides data for organic priority pollutants based on Kow.  Specific target
analytes for PCBs and PAHs are discussed in Section 2.8.2. The water content
and percent lipids in tissue should be routinely determined as a part of tissue
analyses for organic contaminants.

Table 11 ranks the bioaccumulation potential of the inorganic priority pollutants
based on calculated BCFs.  Dredged material contaminants with BCFs greater
than 1,000 (log BCF > 3) should be further evaluated for bioaccumulation
potential.
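
The screening logic described above (log Kow greater than 3.5 for organic compounds;
log BCF greater than 3 for inorganic analytes) can be expressed in a few lines.  The
sketch below is illustrative only; the thresholds come from the text, but the function
name and example values are hypothetical, and the caveats in the following paragraph
still apply.

    LOG_KOW_THRESHOLD = 3.5   # organic compounds (U.S. EPA 1985)
    LOG_BCF_THRESHOLD = 3.0   # inorganic analytes (BCF > 1,000)

    def flag_for_bioaccumulation(organics_log_kow, inorganics_log_bcf):
        """Return analytes that warrant further evaluation of bioaccumulation
        potential based on the screening thresholds above."""
        flagged = [a for a, k in organics_log_kow.items() if k > LOG_KOW_THRESHOLD]
        flagged += [a for a, b in inorganics_log_bcf.items() if b > LOG_BCF_THRESHOLD]
        return flagged

    # Hypothetical screening values drawn from Tables 10 and 11
    organics = {"Fluoranthene": 5.5, "Phenol": 1.5}
    inorganics = {"Copper": 3.1, "Nickel": 1.7}
    print(flag_for_bioaccumulation(organics, inorganics))   # ['Fluoranthene', 'Copper']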

Tables 10 and 11  should be used with caution because they are based on
calculated bioconcentration from water.  Sediment bioaccumulation tests, in
contrast, are concerned with accumulation from a complex medium via all
possible routes of uptake. The appropriate use of the tables is to help  in
selecting contaminants of concern for bioaccumulation analysis by providing a
general indication of the relative potential for various chemicals to accumulate in
tissues.

The strategy for selecting contaminants for tissue analysis should include three
considerations:

    •    The target analyte is a contaminant of concern and is present in
         the sediment as determined by sediment chemical analyses

    •    The target analyte has a high potential to accumulate and persist
         in tissues

    •    The target analyte is of toxicological concern.

Contaminants with a lower potential to bioaccumulate, but which  are present at
high concentrations in the sediments, should also be included in  the target list
because bioavailability can increase with concentration. Conversely,
contaminants with a high accumulation potential and of high toxicological
concern  should be considered as target analytes, even if they are only  present
at low concentrations in the sediments.  Nonpriority-pollutant contaminants that
are found in measurable concentrations in the sediments should  be included as
targets for tissue analysis if they have the potential to bioaccumulate and
persist in tissues, and are of toxicological concern.
2.8.4.2     Selection of Analytical Techniques

At present, formally approved standard methods for the analysis of priority
pollutants and other contaminants in tissues are not available.

              TABLE 11.  BIOCONCENTRATION FACTORS (BCF)
                  OF INORGANIC PRIORITY POLLUTANTS

     Inorganic Pollutant                 Log BCF

     Metals
        Methylmercury                      4.6
        Phenylmercury                      4.6
        Mercuric acetate                   3.5
        Copper                             3.1
        Zinc                               2.8
        Arsenic                            2.5
        Cadmium                            2.5
        Lead                               2.2
        Chromium VI                        2.1
        Chromium III                       2.1
        Mercury                            2.0
        Nickel                             1.7
        Thallium                           1.2
        Antimony                           ND
        Silver                             ND
        Selenium                           ND
        Beryllium                          ND

     Nonmetals
        Cyanide                            ND
        Asbestos                           ND
Source: Tetra Tech (1986b)

Note: ND - no data

However, studies conducted for EPA and other agencies have developed
analytical methods capable of identifying and quantifying most organic and inorganic
priority pollutants in tissues.  The amount of tissue required for analysis is
dependent on the analytical procedure and the tissue moisture content.
General guidance, but not firm recommendations, for the amount of tissue
required is provided in Table 5. The required amounts may vary depending on
the analytes, matrices, detection limits, and particular analytical laboratory.
Tissue moisture content should be determined for each sample to enable data
to be converted from a wet-weight to a dry-weight basis for some data users.

Detection limits depend on the sample size as well as the specific analytical
procedure. Recommended TDLs for dredged material evaluations are provided
in Section 2.3.2 (see Table 3). TDLs should be specified based on the
intended use of the data and specific needs of each evaluation.

The recommended methods for the analysis of semivolatile organic pollutants
are described in  NOAA (1989). The procedure involves serial extraction of
homogenized tissue samples with methylene chloride, followed by alumina and
gel-permeation column cleanup procedures that remove co-extracted lipids.  An
automated gel-permeation procedure described by Sloan et al. (1993) is
recommended for rapid,  efficient, reproducible sample cleanup.  The extract is
concentrated and analyzed for semivolatile organic pollutants using GC with
capillary fused-silica columns to achieve sufficient analyte resolution.  If dioxin
(i.e., 2,3,7,8-TCDD) analysis  is being performed,  the methods of Mehrle et al.
(1988), Smith et al. (1984), Kuehl et al. (1987), U.S. EPA (1989b; Method
8290), or U.S. EPA (1990f; Method 1613)  should be consulted.  EPA Method
1613 is the recommended procedure for measuring the tetra- through octa-
PCDDs and  PCDFs.

Chlorinated hydrocarbons (e.g., PCBs and chlorinated pesticides) should be
analyzed by GC/ECD. PCBs should be quantified as specific congeners (Mullin
et al. 1984; Stalling et al. 1987) and not by industrial formulations (e.g.,
Aroclors®) because the levels of PCBs in tissues result from complex
processes, including selective accumulation and metabolism (see the discussion
of PCBs in Section 2.8.2.2).  Lower detection limits and positive identification of
PCBs and pesticides can be obtained by using chemical  ionization mass
spectrometry.

The same tissue  extract is analyzed for other semivolatile pollutants (e.g.,
PAHs, phthalate esters, nitrosamines,  phenols) using GC/MS as described by
NOAA (1989), Battelle (1985), and Tetra Tech  (1986b). These GC/MS
methods are similar to EPA Method 8270 for solid wastes and soils (U.S. EPA
1986a).  Lowest detection limits are achieved by  operating the mass spectro-
meter in the selective ion monitoring mode. Decisions to perform analysis of
nonchlorinated hydrocarbons and resulting data interpretation should consider
that many of these analytes are readily metabolized by most fish and many
      invertebrates.  Analytical methods for analysis of tissue samples for volatile
      priority pollutants are found in Tetra Tech (1986b).

      Tissue lipid content is of importance  in the interpretation of bioaccumulation
      information.  A lipid determination should be performed on all biota submitted
      for organic analysis if 1) food chain models will be used, 2) test organisms
      could spawn during the test, or 3) special circumstances occur, such as those
      requiring risk assessment. Bligh  and Dyer (1959) provide an acceptable
      method, and the various available methods are evaluated by Randall et al.
      (1991).

      Analysis for priority pollutant metals involves a nitric acid or nitric acid/perchloric
      acid digestion of the tissue sample and subsequent analysis  of the acid extract
      using atomic absorption spectrometry or ICP techniques.  Procedures in Tetra
      Tech (1986b) are generally recommended.  NOAA (1989) methods may also be
      used and are recommended when low detection  levels are required.  Microwave
      technology may be used for tissue digestion to reduce contamination and  to
      improve recovery of metals (Nakashima et al. 1988). This methodology is
      consistent with tissue analyses performed by NOAA (1989), except for the
      microwave heating steps.  Mercury analysis requires the use of CVAA methods
      (U.S. EPA 1991c).  The matrix interferences encountered in analysis of metals
      in tissue may require case-specific techniques for overcoming interference
      problems. If tributyltin analysis is being performed, the methods of Rice et al.
      (1987), NCASI (1986), or Uhler et al. (1989) should be consulted.
2.9   DATA VALIDATION, REDUCTION, AND REPORTING

      This section describes procedures for compiling and verifying data before they are
      accepted for use in making technical conclusions.  In addition, special equations
      may be required  and used to make calculations, models may be used in data
      analysis,  criteria  may be used to validate the integrity of data that support final
      conclusions, and methods may be used to identify and treat data that may not
      be representative of environmental conditions.

      The following specific information should be  included in the QA project plan:

          •   The principal criteria that will  be used  to validate data integrity
              during collection and reporting of data (the criteria selected will
              depend  on the level of validation required to meet the data quality
              objectives)

          •   The data reduction scheme planned for collected data, including all
              equations  used to calculate the concentration or value of the
              measured parameter and reporting units

          •   The methods used to identify and treat outliers (i.e., data that fall
              outside the upper and lower limits such as ±3 standard deviations
              of the mean value) and nondetectable data

          •   The data flow or reporting scheme from collection of original data
              through storage of validated concentrations (a flowchart is usually
              necessary)

          •   Statistical formulas and sample calculations planned for collected
              data

          •   Key individuals who will handle the data in this reporting scheme.

      QC procedures designed to eliminate errors during the mathematical and/or
      statistical reduction of data should also be included in the QA project plan. QC
      in data processing may include both manual and automated review.  Input data
      should be checked and verified to confirm compatibility and to flag outliers for
      confirmation (i.e., verify that data are outliers and not data for highly contam-
      inated sediment, water, or tissue).  Computerized data plots can be  routinely
      used as a tool for rapid identification of outliers that can then be verified using
      standard statistical procedures.
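
      As one example of such a check (a minimal sketch only, using the ±3 standard
      deviation rule noted above; the limits and the treatment of flagged values should
      come from the QA project plan), suspect values can be flagged for verification:

          from statistics import mean, stdev

          def flag_outliers(values, n_sd=3.0):
              """Flag values more than n_sd standard deviations from the mean.
              Flagged values are candidates for verification, not automatic
              rejection (they may reflect genuinely contaminated samples)."""
              if len(values) < 3:
                  return []
              m, s = mean(values), stdev(values)
              if s == 0:
                  return []
              return [v for v in values if abs(v - m) > n_sd * s]

          # Hypothetical copper results (mg/kg dry weight); only 310 is flagged
          data = [40, 41, 42, 43, 44, 39, 38, 45, 40, 42, 41, 310]
          print(flag_outliers(data))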
2.9.1 Data Validation

      Once the laboratory has completed the requested sample analyses, the
      analytical results are compiled, printed out, and submitted as a data package,
      which has been signed by the laboratory's project manager. This package may
      include computer disks, magnetic tape, or other forms of electronically stored
      information.  Data packages may range in size from a few pages to several
      cartons of documents, depending on the nature and extent  of the analyses
      performed. The cost of this documentation can vary from no charge (in cases
      where only the final results of an analysis are  reported) to hundreds of dollars
      over the cost of reporting only the final results of an analysis.

      The data and information collected during the  dredged material evaluation
      should be carefully reviewed as to their relevancy, completeness, and quality.
      The data must be relevant to the overall objective of the project.  Data quality
      should be verified by comparing reported detection limits and QC results to
      TDLs and QC limits, respectively, specified for the current dredged material
      evaluation.
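
      A simple programmatic version of this comparison might look like the following
      sketch (illustrative only; the analytes and limits shown are placeholders, not the
      recommendations of Table 3):

          def limits_exceeding_tdl(reported_dl, target_dl):
              """Return analytes whose laboratory-reported detection limit
              exceeds the project target detection limit (TDL)."""
              return {a: (dl, target_dl[a])
                      for a, dl in reported_dl.items()
                      if a in target_dl and dl > target_dl[a]}

          # Hypothetical values (ug/kg dry weight)
          reported = {"Benzo[a]pyrene": 40.0, "4,4'-DDT": 5.0}
          targets  = {"Benzo[a]pyrene": 20.0, "4,4'-DDT": 10.0}
          print(limits_exceeding_tdl(reported, targets))
          # {'Benzo[a]pyrene': (40.0, 20.0)}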

      As soon as new data packages are received from the laboratory, they should
      be checked for completeness and data usability and, ideally, dated and
      duplicated. Dating is important for establishing the laboratory's adherence to
      schedules identified in the statement of work.  Duplication assures that a clean
      reference copy is always kept on file.  Checking each element of the data
package for completeness of information, precision of analytical methods, and
bias of all measurements helps to determine whether acceptable data from
each type of analysis have been supplied by the laboratory.

Screening for data quality requires knowledge of the sample holding times and
conditions, the types of analyses requested, and the form in which data were to
be delivered by the laboratory. Review of the statement of work is essential to
determine any special conditions or requests that may have been stated at the
onset of the analyses.  Recommended lists of laboratory deliverables for  dif-
ferent types of chemical analyses are  provided in Tables 1 and 2.  This initial
screening of data can be performed by appropriate staff or the project manager.

Data validation, or the process of assessing data quality, can begin after
determining that the data package  is complete.  Analytical laboratories strive to
produce data that conform to the requested statement of work, and they
typically perform internal checks to assure that the data meet a standard level
of quality.  However, data validation is an independent check on laboratory
performance and is intended to assure that quality of reported data meets the
needs identified  in the QA project plan.

Data validation involves all procedures used to accept or reject data after
collection and prior to use.  These include screening, editing, verifying, and re-
viewing through external performance evaluation audits.  Data validation
procedures ensure that objectives for data precision and bias were met, that
data were generated in accordance with the QA project plan and standard
operating procedures, and that data are  traceable and defensible.  All chemical
data should be reported with their associated analytical sensitivity, precision,
and bias. In addition, the quantification level achieved by the laboratory should
be compared to  specific TDLs.

The QA project plan should also specify an appropriate level of data validation
for the intended  data use. Examples of four alternative levels of validation
effort for chemical data are summarized in Table 12.  These four data validation
levels range from complete, 100-percent review of the data package (Level 1)
to acceptance of the data package without any evaluation (Level 4).

The QA project plan should also specify who will perform the evaluations called
for in Levels 1, 2, or 3. The following  options should be considered for
chemical data:

    •   Perform a brief assessment and rely on specialists to resolve
        outstanding concerns. This assessment is equivalent to Level 3
        (Table 12).

    •   Perform a complete review for Level 1 or 2 using qualified staff
        and technical guidelines for QA specialists (see Footnote a in
        Table 12).

                      TABLE 12.  LEVELS OF DATA VALIDATION
     Level 1  100 percent of the data (including data for laboratory quality control
             samples) are independently validated using the data quality objectives
             established for the project.  Calculations and the possibility of transcription
             errors are checked.  Instrument performance and original data for the
             analytical standards used to calibrate the method are evaluated to ensure
             that the values reported for detection limits and data values are
             appropriate. The bias and precision of the data are calculated and a
              summary of corrections and data quality is prepared.a

     Level 2  20 percent of the sample data and 100 percent of the laboratory quality
             control samples are validated.  Except for the lower level of effort in
             checking data for samples, the same checks conducted in Level 1  are
             performed.  If transcription errors or other concerns (e.g., correct
             identification of chemicals in the samples) are found in the initial check on
             field samples, then data for an additional 10-20 percent  of the samples
             should be reviewed.  If numerous errors are found, then  the entire data
             package should be reviewed.

     Level 3  Only the summary results of the laboratory analyses are  evaluated. The
             data values  are assumed to be correctly reported by the  laboratory.  Data
             quality is assessed by comparing summary data  reported by the laboratory
             for blanks, bias, precision, and  detection limits with data  quality objectives
              in the QA project plan.  No checks on the calibration of the method are
             performed, other than comparing the laboratory's summary of calibrations
             with limits specified in the QA project plan.

     Level 4  No additional validation of the data is performed. The internal reviews
              performed by the laboratory are judged adequate for the project.

a  Screening checks that can be easily performed by the project manager are provided in U.S.
EPA (1991d).  Step-by-step procedures used by quality assurance specialists to validate data
for analyses of organic compounds and metals can be found in EPA's functional  guidelines for
data review (U.S. EPA 1988a,b). These guidelines were developed for analyses conducted
according to the statements of work for EPA's Contract Laboratory Program and  are updated
periodically.  Regional interpretation of these detailed procedures is also contained  in Data
Validation Guidance Manual for Selected Sediment Variables (PTI 1989b), a draft report
released by the Washington Department of Ecology's Sediment Management Unit in June
1989. A simplified version of this guidance is provided in Data  Quality  Evaluation for
Proposed Dredged Material Disposal Projects (PTI 1989a), another report released by the
Sediment Management Unit in June 1989.

          •   Send the data package to an outside technical specialist for
              review, specifying either Level 1, 2, or 3.

      Providing instructions for conducting a thorough technical validation of chemical
      data is beyond the scope of this document.  Examples of detailed technical
      guidance of  this nature can be found in a pair of publications, Laboratory Data
      Validation:  Functional Guidelines for Evaluating Inorganics Analyses (U.S. EPA
      1988a) and Laboratory Data Validation: Functional Guidelines for Evaluating
      Organics Analyses (U.S. EPA 1988b).  Examples of simple evaluations that can
      be conducted by a project manager are also provided in U.S. EPA (1991d).
      The evaluation criteria in Figure 1 (abstracted from U.S. EPA [1991d]) provide
      several signs that should alert a project manager to  potential problems with
      data acceptability.
2.9.2 Data Reduction and Reporting

      The QA project plan should summarize how validated data will be analyzed to
      reach conclusions, including major tools that will be used for statistical
      evaluations. In this section, a flow chart is useful to show the reduction of
      original laboratory data to final tabulated data in the project report. A summary
      should also be provided of the major kinds of data analyses that will  be
      conducted (e.g., health risk assessments, mapping of chemical distributions).
      In addition, the format, content, and distribution of any data reports for the
      project should be summarized.
2.10  INTERNAL QUALITY CONTROL CHECKS

      The various control samples that will be used internally by the laboratory or
      sample collection team to assess quality are described in this section of the QA
      project plan.  For most environmental investigations, 10-30 percent of all
      samples may be analyzed specifically for purposes of quality control.  In some
      special cases (e.g., when the number of samples is small and the need to
      establish the validity of analytical data is large), as  many as 50 percent of all
      samples are used for this purpose.  These QC samples may be used to check
      the bias and precision of the overall analytical system and to evaluate the
      performances of individual analytical instruments or the technicians that operate
      them.

      In addition to calibration procedures described in Section 2.7, this section of the
      guidance document (and Appendix C) summarizes  the most widely used QC
      samples as follows:

          •   Blanks

      [Figure 1 (flowchart) summarizes the screening of incoming information and analytical
      data:  depending on whether the information is complete and whether the analytical data
      and supporting documentation are within limits, marginally outside limits, or severely
      outside limits, data are accepted for use, accepted with appropriate qualifications,
      referred to an expert, or rejected (with reanalysis considered).]

           Figure 1.  Guidance for data assessment and screening for data quality.

          •   Matrix spike samples

          •   Surrogate spike compounds

          •   Check standards, including:

              -   Spiked method blanks

              -   Laboratory control samples

              -   Reference materials

          •   Matrix replicates (split in the laboratory from one field sample)

          •   Field replicates (collected as separate field samples from one
              location).

      QC procedures for sediment, water, and tissue analyses are discussed in more
      detail in the following sections.  Field QC results are not used to qualify data,
      but only to help support conclusions arrived at by the review of the entire data
      set.
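
      Bias and precision estimated from these QC samples are commonly expressed as
      percent recovery and relative percent difference, respectively.  The sketch below
      is illustrative only (the formulas are standard QA conventions, and the function
      names and example results are hypothetical; project-specific acceptance limits
      belong in the QA project plan):

          def percent_recovery(spiked_result, unspiked_result, amount_spiked):
              """Percent recovery of a matrix or surrogate spike (a measure of bias)."""
              return 100.0 * (spiked_result - unspiked_result) / amount_spiked

          def relative_percent_difference(a, b):
              """RPD between duplicate results (a measure of precision)."""
              return 100.0 * abs(a - b) / ((a + b) / 2.0)

          # Hypothetical matrix spike / matrix spike duplicate results (ug/kg)
          print(percent_recovery(spiked_result=185.0, unspiked_result=40.0,
                                 amount_spiked=150.0))        # about 96.7 percent
          print(relative_percent_difference(185.0, 172.0))     # about 7.3 percent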

      The government authorities for the program may require that certain samples
      be submitted on a routine basis to government laboratories for analysis, and
      EPA or USAGE may participate in some studies.  These activities provide an
      independent QA check on activities being performed and on data being
      generated and are discussed in Section 2.11 (Performance and System Audits).


2.10.1   Priority and Frequency of Quality Control Checks

      Which QC samples will  be used in  analyses should be determined during
      project planning.  The frequency of QC procedures is dependent upon the type
      of analysis and the objectives of the project (as established in Section 2.3).
      The statements of work for EPA's CLP (U.S. EPA  1990d,e) specify the types of
      checks to be used during sample analysis.  Determining the actual numbers of
      samples and how often  they must be used is also  a part of this process. These
      specifications, called QC sample frequencies, represent the minimum levels of
      effort for a project.  Increasing the frequency of QC samples may be an
      appropriate measure when the expected concentrations of chemicals are close
      to the detection limit, when data on  low chemical concentrations are needed,
      when there is a suspected problem with the laboratory, or when existing data
      indicate elevated chemical concentrations such that removal or other actions
      may be required. In such cases, the need for increased precision may justify
      the cost of extra QC samples.
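
      For planning purposes, the minimum number of QC analyses implied by a given
      frequency can be tallied directly (a sketch only; the 1-in-20 frequency used in the
      example reflects the suggested frequencies discussed below, and the function name
      is hypothetical):

          import math

          def minimum_qc_analyses(n_field_samples, frequency=20):
              """Minimum number of QC analyses (e.g., matrix spikes, duplicates,
              reference materials) at a rate of 1 per 'frequency' field samples."""
              return math.ceil(n_field_samples / frequency)

          # Example: 45 field samples at 1 in 20 -> at least 3 of each QC sample type
          print(minimum_qc_analyses(45))   # 3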

The relative importance, rationale, and relative frequency of calibration and
each kind of QC sample are discussed in Appendix C.  The following priority,
rationale, and frequency of use is recommended for each procedure:

    1.   Method blank samples are one of the highest priority checks on
        QC, because they provide an assessment of possible laboratory
        contamination (and the means to correct results for such contam-
        ination), and are used to determine the detection limit. As a result,
        method blank analyses are always required; at least one analysis
        is usually performed for each group of samples that are processed
        by a laboratory. In contrast, the need for other kinds of blank
        samples (bottle, transport, or field equipment) is usually project-
        specific and depends on the likelihood of contamination from
         solvents, reagents, and instruments used in the project; the matrix
        being analyzed; or the contaminants of concern. A bottle blank
        consists of an unopened empty sampling bottle that is prepared
         and retained in the field laboratory.  A trip (travel) blank consists of
        deionized water and preservative (as added to the samples) that is
        prepared in the laboratory and transported to the sampling site. A
        field or decontamination blank consists of deionized water from the
        sample collection device and preservative  (as added to the
        samples) that is prepared at the sampling site.

    2.   Matrix spike samples are high-priority checks on QC and should
        always be analyzed to indicate the bias of analytical
        measurements due to interfering substances or matrix effects.
        The suggested frequency is 1 matrix spike for every 20 samples
        analyzed.  If more than 1 matrix type is present (e.g., samples
         containing primarily sand and samples containing primarily silt
        within the same group), then each matrix type should be spiked at
        the suggested frequency. Duplicate matrix spike samples
        analyzed at a frequency of 1 duplicate for every 20 samples can
        serve as an acceptable means of indicating both the bias and
        precision of measurement for a particular sample.  Duplicate matrix
        spike samples may provide the only information on precision for
        contaminants that are rarely detected in samples.

   3.   Surrogate spike compounds are high-priority checks on QC that are
        used to evaluate analytical recovery (e.g., sample extraction
        efficiency) of organic compounds of interest from individual
        samples. Surrogate spike compounds should be added to every
        sample, including blanks and matrix spike samples, prior to
        performing sample processing, to monitor extraction efficiency on a
        sample-by-sample basis. This kind of check is only used when the
        identity of the surrogate compound can be reasonably confirmed
         (e.g., by mass spectrometry).  Because a surrogate compound is
    chemically similar to the associated compound of interest and is
    added to the sample in a known amount, its known recovery is
    indicative of that of the compound of interest.

    Variations in recovery that can be seen using  surrogate spike
    compounds with each sample will not necessarily be reflected in
    duplicate matrix spike analyses conducted on  only a few of the
    samples. The reasons for possible differences between surrogate
    spike analyses and matrix spike analyses relate to sample
    heterogeneity and how these QC samples are prepared. For
    example, matrix spike analyses provide an indication of chemical
    recovery for the general sample matrix tested. However, this
    matrix may differ among individual samples leading to a range of
    recoveries for surrogate spike compounds  among samples.  In
    addition, surrogate spike compounds are often added at a lower
    concentration than matrix spike compounds.  This difference in
    spiking concentration sometimes results in reasonable recovery of
    the higher-concentration matrix spike compounds but poorer
    recovery of the lower-concentration surrogate  spike compounds.
    Finally, matrix spike compounds are typically identical to
    compounds of interest in the samples, while surrogate spike
    compounds are usually selected because they are not present in
    environmental samples, but still mimic the  behavior of compounds
    of interest.  Therefore, there can be more uncertainty in quantifying
    the recovery of matrix spike compounds (after subtracting the
    estimated concentration of the compounds of  interest in the
    sample) than the recovery of surrogate  spike compounds.

4.   Check standards should be used whenever available as a high-
    priority check on laboratory performance. Check standards include
    laboratory control samples, reference materials prepared by an
    independent testing facility, and spiked  method blanks prepared by
    the laboratory. By comparing the results of check standards with
    those of sample-specific measurements (e.g.,  matrix spike
    samples and surrogate compound recovery), an overall
    assessment of bias and precision can be obtained.  The laboratory
    should  be contacted prior to analysis to determine what laboratory
    control samples can be used.  Catalogues from organizations such
    as the National Institute for Standards and Technology and the
    National Research Council of Canada are available that list
    reference materials for different sediment, water, and tissue
    samples (see Section 2.11.2).

    Reference materials provide a standardized basis for comparison
    among laboratories or  between different rounds of analysis at one
    laboratory.  Therefore, reference materials should always be used
              when comparison of results with other projects is an intended data
              use.  At least 1 analysis of a reference material for every 20
              samples is recommended for this purpose. Similarly, spiked
              method blanks should be used as acceptable checks on laboratory
              performance whenever a new procedure is used or when
              laboratories with no established track record for a standard or
              nonstandard procedure will be performing the analysis.

          5.  Analytical replicate samples should be included as a medium-
              priority check on laboratory precision. Analytical replicate samples
              better indicate the precision of measurements on actual samples
              than do matrix spike duplicates because the contaminants have
              been incorporated into the sample by environmental processes
              rather than having been spiked in  a laboratory setting. The
              suggested frequency is 1 replicate sample for every 20 samples
              for each matrix type analyzed. For organic analyses, analysis of
              matrix spike duplicate samples is sometimes a higher priority
              than analytical replicate samples if budgets are limited.  The reason
              for this preference is that many organic compounds of interest
              may not be present in samples unless they are added as spiked
              compounds.

          6.  Field replicate samples should  be included if measuring sampling
              variability is a critical component of the study design.  Otherwise,
              collection of field replicate samples is discretionary and a lower
              priority than the other QC samples. Field replicate samples should
              be submitted to the  laboratory as blind samples.  When included,
              the suggested frequency is at least 1 field replicate for every 20
              samples  analyzed.  One of the field replicate samples should also
              be split by the laboratory into analytical duplicates so that both
              laboratory and laboratory-plus-sampling  variability  can be
              determined on the same sample.   By obtaining both measures on
              the same sample, the influence of sampling variability can be
              better discerned. It is possible that analytical variability can mask
              sampling variability at a location.  A brief sketch following this list
              illustrates how the suggested QC frequencies translate into analysis
              counts for a batch of samples.
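
      As a planning illustration only, the sketch below (Python) converts the
      1-per-20 frequencies suggested above into QC analysis counts for a batch.  The
      sample counts, matrix labels, and the inclusion of field replicates are
      hypothetical; actual QC frequencies should follow the project-specific QA
      project plan.

          import math

          def qc_counts(samples_per_matrix, samples_per_qc=20):
              # Number of QC analyses implied by a 1-per-20-samples frequency,
              # computed separately for each matrix type present in the batch.
              counts = {}
              for matrix, n in samples_per_matrix.items():
                  per_matrix = math.ceil(n / samples_per_qc)  # at least 1 per 20 samples
                  counts[matrix] = {
                      "matrix spikes": per_matrix,
                      "matrix spike duplicates": per_matrix,
                      "analytical replicates": per_matrix,
                      # Field replicates are discretionary unless sampling variability
                      # is a stated study objective (item 6 above).
                      "field replicates": per_matrix,
                  }
              return counts

          # Hypothetical batch: 35 predominantly sandy and 12 predominantly silty samples
          print(qc_counts({"sand": 35, "silt": 12}))

      For the hypothetical batch above, the sketch calls for two sets of QC analyses
      for the sandy samples and one set for the silty samples.
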
2.10.2   Specifying Quality Control Limits

      Prior to performing a chemical analysis, recognized limits on analytical per-
      formance should be established.  These limits are established largely through
      the analysis of QC samples. QC limits apply to all internal QC checks for field
      measurements, physical characterizations, bioaccumulation studies, and toxicity
      tests.  Many laboratories have established limits that are applicable to their own
      measurement systems.  These limits should be  evaluated to ensure that they
      are at least as stringent as general guidelines or that the reasons for a less
stringent limit are acceptable. Also, if a laboratory has consistently
demonstrated better performance than indicated by general guidelines, limits
tied to this better performance should be used to indicate when there may be a
problem at that laboratory. For example, if surrogate recoveries for benzene in
sediment samples have consistently been between 85 and 105 percent, a
recovery of 70 percent indicates an analytical problem that should be
investigated even if the general guideline for acceptable recovery is 50 percent.
It may be useful  to establish  different kinds of limits when working with labor-
atories.  For example, the following two kinds of limits are used by PSEP
(1990c) and are  similar to limits used in EPA's CLP.

Warning limits are values indicating that data from the analysis of QC samples
should be qualified (e.g., that they represent estimated or questionable values)
before they can be relied upon in a project.  These limits serve to  warn the
project staff that  the analytical system, instrument, or method may not be
performing normally and that data should be qualified as "estimated" before
using the results for technical analysis.  The standard values for warning limits
are ±2 times the standard deviation (U.S. EPA 1979).  Examples of warning
limits used by the Puget Sound Estuary Program are provided  in Table 13.
Such limits provide a means  of ensuring that reported data are consistently
qualified, an important consideration when combining data in a regional
database.
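
Because warning and control limits are defined in terms of standard deviations, a
laboratory's own QC history can be turned into numerical limits directly.  The sketch
below (Python) derives warning limits at ±2 standard deviations and control limits at
±3 standard deviations (the conventions cited here and in the discussion that follows)
from historical recoveries, and assigns the corresponding qualifier to a new result.
The recovery values are hypothetical, and the sketch is an illustration of the
convention, not a prescribed implementation.

    from statistics import mean, stdev

    def qc_limits(historical_recoveries):
        # Laboratory-specific limits: warning at +/-2 standard deviations,
        # control at +/-3 standard deviations around the historical mean.
        m, s = mean(historical_recoveries), stdev(historical_recoveries)
        return {"warning": (m - 2 * s, m + 2 * s),
                "control": (m - 3 * s, m + 3 * s)}

    def qualify(recovery, limits):
        # Within warning limits: acceptable as reported.
        # Outside warning but within control limits: qualify as estimated.
        # Outside control limits: candidate for rejection or reanalysis.
        lo_w, hi_w = limits["warning"]
        lo_c, hi_c = limits["control"]
        if lo_w <= recovery <= hi_w:
            return "acceptable"
        if lo_c <= recovery <= hi_c:
            return "estimated (qualified)"
        return "exceeds control limit"

    # Hypothetical surrogate recoveries (percent) from earlier batches
    history = [88, 92, 95, 90, 97, 93, 89, 96, 91, 94]
    limits = qc_limits(history)
    print(limits, qualify(70, limits))

In this hypothetical case, a recovery of 70 percent falls outside the laboratory-
specific control limits even though it would satisfy a general 50 percent guideline,
consistent with the benzene example above.
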

If necessary to meet project goals, project managers may specify warning limits
as more stringent contractual requirements in laboratory statements of work.
For example, Puget Sound Estuary Program guidelines for organic compound
analyses state that the warning limits for the minimum recovery of surrogate
spike and matrix spike compounds are 50 percent of the amount added prior to
sample extraction.  Data that do not meet this minimum requirement would
normally be qualified as estimates.  However, the project manager could apply
more stringent criteria and decide to reject data that do not meet warning  limits,
which would require reanalysis of the samples associated with  those
QC samples that do not meet these limits.  These more stringent criteria are
termed control limits.

Control limits are limits placed on the acceptability of data from the analysis of
QC samples. Exceedance of control limits informs the analyst  and the project
manager that the analytical system or instrument is performing  abnormally and
needs to be corrected.  Control limits should be contractually binding on
laboratories, and statements  of work should provide the project manager or
designee with sole discretion in enforcing the limits. Data obtained under  these
circumstances should be corrected before they are resubmitted by the
laboratory.  Data that exceed control limits are often rejected and excluded from
a project database, although  there may be special circumstances that warrant
acceptance of the data as estimated values. The reasons for making such an
            TABLE 13.  EXAMPLE WARNING AND CONTROL LIMITS FOR
                 CALIBRATION AND QUALITY CONTROL SAMPLES(a)

 Analysis type           Recommended warning limit             Recommended control limit
 ----------------------  ------------------------------------  --------------------------------------
 Ongoing calibration     Project manager decision(b)           > ±25 percent of the average response
                                                               measured in the initial calibration

 Surrogate spikes        < 50 percent recovery(c)              Follow EPA Contract Laboratory
                                                               Program guidelines

 Method blanks           Exceeds the TDL                       Exceeds 5 times the TDL

 Reference materials     95 percent confidence interval,       To be determined
                         if certified

 Matrix spikes           50-150 percent recovery               To be determined(d)

 Spiked method blanks    50-150 percent recovery               To be determined
 (check standards)

 Matrix replicates       35 percent coefficient of variation   > ±50 percent coefficient of variation
                                                               (or a factor of 2 for duplicates)

 Field replicates        Project manager decision              Project manager decision

 Note: TDL - target detection limit

 (a) Warning and control limits used in the Puget Sound Estuary Program for the analysis of
     organic compounds (PSEP 1990c).

 (b) See U.S. EPA (1991d) for specific examples of project manager decisions for warning or
     control limits.

 (c) Except when using the isotope dilution technique.

 (d) Zero percent spike recovery requires rejection of data.
      exception should always be documented in a QA report for the data (see
      Appendix F).

      Unlike warning limits, control limits and appropriate corrective actions (such as
      instrument recalibration, elimination of sources of laboratory contamination, or
      sample reanalysis) should be clearly identified  in the statement of work.  The
      standard values for control limits are ±3 times the standard deviation (U.S. EPA
      1979). Examples of regional control limits used by the Puget Sound Estuary
      Program are also provided in Table 13.  In those cases that require a project
      manager's decision to determine the appropriate control limit, it is
      recommended that the associated warning limit be used as a control limit to
      produce data that will have broad applicability (including use in enforcement
      proceedings). Control limits should be enforced with discretion because some
      environmental samples are inherently difficult to analyze.  Recommended
      actions under different circumstances are provided below.
2.10.3   Quality Control Considerations for Physical Analysis of
         Sediments

      The procedures used for the physical analysis of sediments should include a
      QC component.  QC procedures for grain size analysis and total solids/specific
      gravity determinations are necessary to ensure that the data meet acceptable
      criteria for precision and bias. To measure precision, triplicate analyses should
      be performed for every 20 samples analyzed. TOC is a special case, where all
      samples should be analyzed in triplicate, as recommended by the analytical
      method.  In addition, 1 procedural blank per 20 samples should be run, and the
      results reported for TOC analysis.  Standards used for TOC determinations
      should be verified by independent check standards to confirm the bias of the
      results. Quality control limits should be agreed upon for each analytical
      procedure, and should be consistent with the overall QA project plan.
2.10.4   Quality Control Considerations for Chemical Analysis of
         Sediments

      Methods for the chemical analysis of priority pollutants in sediments should
      include detailed QC procedures and requirements that should be followed
      rigorously throughout the evaluation. General procedures include the analysis
      of a procedural blank, a matrix duplicate, and a matrix spike for every 10-20
      samples processed, as well as surrogate spike compounds.  All analytical instruments
      should be calibrated at  least daily (see Section 2.7.1).  All calibration data
      should be submitted to  the laboratory project QA coordinator for review.  The
      QA/QC program should document the ability of the selected methods to
      address the high salt content of sediments from marine and estuarine areas.
      Analytical precision can be measured by analyzing 1 sample in duplicate or
      triplicate for every 10-20 samples analyzed. If duplicates are analyzed, the
      relative percent difference should be reported; however, if triplicates are
      analyzed, the percent relative standard deviation should be reported.
2.10.5   Quality Control Considerations for Chemical Analysis of
         Water

      Methods recommended for the chemical analysis of priority pollutants in water
      include detailed QC procedures and requirements that should be followed
      closely throughout the evaluations.  General procedures should include the
      analysis of a procedural blank, a matrix duplicate, a matrix spike for every
      10-20 samples processed, and surrogate spike compounds (for organic
      analyses only). Analytical precision can be measured by analyzing 1 sample in
      triplicate or duplicate for every 10-20 samples analyzed.  If duplicates are
      analyzed, the relative percent difference should be reported; however, if
      triplicates are analyzed, the percent relative standard deviation should be
      reported. Analytical bias can be measured by analyzing an SRM, a matrix
      containing a known amount of a pure reagent.  Recoveries of surrogate spikes
      and matrix spikes should be used to measure precision and bias; results
      from these  analyses should be well documented.  Special quality control is
      required for ICP and GC/MS analyses. Initial calibrations  using three or five
      standards (varying in concentration) are required for analyses of inorganic and
      organic compounds, respectively, before analyzing samples (see Section 2.7.2).
      Subsequent calibration checks should be performed for every 10-20 samples
      analyzed.
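
      The ongoing calibration check can be expressed numerically against the Table 13
      control limit of ±25 percent of the average response measured in the initial
      calibration.  The sketch below (Python) uses a simple average-response-factor
      model with hypothetical responses and concentrations; the calibration model
      actually applied should be the one specified in the analytical method.

          def response_factor(response, concentration):
              # Response factor for a single calibration standard.
              return response / concentration

          def ongoing_calibration_ok(initial_pairs, check_pair, tolerance=0.25):
              # Flag an ongoing calibration whose response factor differs from the
              # mean initial-calibration response factor by more than the tolerance
              # (25 percent, per the Table 13 control limit).
              rfs = [response_factor(r, c) for r, c in initial_pairs]
              mean_rf = sum(rfs) / len(rfs)
              check_rf = response_factor(*check_pair)
              drift = abs(check_rf - mean_rf) / mean_rf
              return drift <= tolerance, drift

          # Hypothetical 5-point initial calibration (response, concentration) and
          # one ongoing-calibration check standard
          initial = [(105, 1.0), (520, 5.0), (1010, 10.0), (2540, 25.0), (5100, 50.0)]
          ok, drift = ongoing_calibration_ok(initial, check_pair=(960, 10.0))
          print(ok, round(drift, 3))
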
2.10.6   Quality Control Considerations for Chemical Analysis of
         Tissue

      Methods recommended for the chemical analysis of priority pollutants in tissue
      include detailed QC procedures and requirements that should be followed
      closely throughout the evaluations. General procedures should include the
      analysis of a procedural blank, a matrix duplicate, a matrix spike for every
      10-20 samples processed, and surrogate spike compounds  (for organic
      analyses only). Analytical precision can be measured by analyzing 1 sample in
      triplicate or duplicate for every 10-20 samples analyzed.  If duplicates are
      analyzed, the relative percent difference should be reported; however, if
      triplicates are analyzed, the percent relative standard deviation should be
      reported. Analytical bias can be measured with the appropriate SRMs.
      Precision and bias determinations should be performed with  the same
      frequency as the blanks and matrix spikes.
2.11  PERFORMANCE AND SYSTEM AUDITS

      Procedures to determine the effectiveness of the QC program and its
      implementation are summarized in this section of the QA project plan.  Each
      QA project plan should describe the various audits required to monitor the
      capability and performance of all measurement systems.  Audits include a
      careful evaluation of both field and laboratory QC procedures. They are an
      essential part of the field and laboratory QA program and consist of two basic
      types:  performance audits and system audits.  For example, analyses of
      performance evaluation  samples may simply be used for comparison with the
      results of independent laboratories (a form of performance audit), or
      comprehensive audits of the entire field or laboratory operation may be
      conducted by the government (a system audit).

      Performance and system audits should be conducted by individuals not directly
      involved in the measurement process. A performance auditor independently
      collects data using performance evaluation samples, field blanks, trip blanks,
      duplicate samples, and spiked samples.  Performance audits may be conducted
      soon after the measurement systems begin generating data. They may be
      repeated periodically as required by task needs, duration, and cost. U.S. EPA
      (1991e) should be reviewed for auditing the performance of laboratories
      performing aquatic toxicity tests.

      A systems audit consists of a review of the total data production process.  It
      includes onsite reviews of field and laboratory operational systems. EPA and/or
      USAGE will develop and conduct external system audits based on the approved
      project plan. Examples of systems audit checklists are provided in
      Appendices A and G.
2.11.1   Procedures for Pre-Award Inspections of Laboratories

      The pre-award inspection is a kind of system audit for assessing the labor-
      atory's overall capabilities.  This assessment includes a determination that the
      laboratory personnel are appropriately qualified and that the required equipment
      is available and is adequately maintained.  It establishes the groundwork
      necessary to ensure that tests will be conducted properly, provides the initial
      contact between government and laboratory staff, and emphasizes the
      importance that government places on quality work and products.

      The purpose of the  pre-award inspection is to verify the following:

          •  The laboratory has an independent QA/QC program

          •  Written work plans are available for each test that describe the
             approach to be used in storing, handling, and analyzing samples
          •   Technically sound, written standard operating procedures are
              available for all study activities

          •   Qualifications and training of staff are appropriate and documented

          •   All equipment is properly calibrated and maintained

          •   Approved analytical procedures are being followed.


2.11.2   Interlaboratory Comparisons

      It is important that data collected and processed at various laboratories be
      comparable.  As part of the performance audit process, laboratories may be
      required to participate in analysis of performance evaluation samples related to
      specific projects.  In particular, laboratory  proficiency should be demonstrated
      before a laboratory negotiates a contract and yearly thereafter.  Each laboratory
      participating in a proficiency test is required to analyze samples prepared to a
      known concentration. Analytes used in preparation of the samples should
      originate from a recognized source of SRMs such as the  National Institute for
      Standards and Technology. Proficiency testing programs already established
      by the government may be used (e.g., the EPA Environmental Monitoring
      Systems Laboratory scoring system), or a program may be designed
      specifically for dredged material  evaluations.

      In addition, the performance evaluation samples prepared by the EPA
      Environmental Monitoring Systems Laboratory (Las Vegas, Nevada) for the
      CLP may be used to assess interlaboratory comparability. Analytical results are
      compared with predetermined criteria of acceptability (e.g., values that fall
      within the 95 percent confidence interval are considered acceptable). The QA
      project plan should indicate, where applicable, scheduled participation in all
      interlaboratory calibration exercises.
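
      As a minimal illustration of the acceptance criterion noted above, the following
      sketch (Python, with hypothetical certified and measured values) flags whether a
      result for a performance evaluation sample or reference material falls within the
      certified value's 95 percent confidence interval.

          def within_certified_interval(measured, certified_value, ci_95):
              # Acceptable if the measured result lies within the certified
              # value plus or minus its 95 percent confidence interval.
              return (certified_value - ci_95) <= measured <= (certified_value + ci_95)

          # Hypothetical certified value of 4.57 ± 0.12 and a measured result of 4.66
          print(within_certified_interval(4.66, certified_value=4.57, ci_95=0.12))
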

      Reference materials are substances with well-characterized properties that are
      useful for assessing the bias of an analysis and auditing analytical
      performances among laboratories. SRMs are certified reference materials
      containing precise concentrations of chemicals, accurately determined by a
      variety of technically valid procedures,  and are issued by  the National Institute
      of Standards and Technology.  Currently, SRMs are not available for physical
      measurements or for all pollutants in sediments; however, where possible,
      available SRMs or other regional reference materials that have been repeatedly
      tested should be analyzed with every 20 samples processed.

      SRMs for most organic  compounds are not currently available for seawater,  but
      reference materials for many inorganic chemicals may be obtained from the
      organizations listed in Table 14.  Seawater matrix spikes of target analytes
      (e.g., seawater spiked with National Institute for Standards and Technology
	TABLE 14. SOURCES OF STANDARD REFERENCE MATERIALS

 PCBs
     National Research Council of Canada      Marine sediment          HS-1 and HS-2

 PAHs
     National Research Council of Canada      Marine sediment          HS-3, HS-4, HS-5, HS-6
     National Institute for Standards and     Sediment                 SRM #1647 and SRM #1597
       Technology

 Metals
     National Bureau of Standards             Estuarine sediment       SRM #1646
     National Research Council of Canada      Marine sediment          MESS-1, BCSS-1, PACS-1
                                              Dogfish liver            DOLT-1
                                              Dogfish muscle           DORM-1
                                              Lobster hepatopancreas   TORT-1
     International Atomic Energy Agency       Marine sediment          SD-N-1/2(TM)
                                              Fish flesh               MA-A-2(TM)
                                              Mussel tissue            MAL-1(TM)

 Standard reference materials (SRMs) may be obtained from the following organizations:

 Organic Constituents

       U.S. Department of Commerce
       National Institute for Standards and Technology
       Office of Standard Reference Materials
       Room B3111 Chemistry Building
       Gaithersburg, Maryland 20899
       Telephone: (301) 975-6776

       Marine Analytical Chemistry Standards Program
       National Research Council of Canada
       Atlantic Research Laboratory
       1411 Oxford Street
       Halifax, Nova Scotia, Canada B3H 3Z1
       Telephone: (902) 426-3280

 Inorganic Constituents

       U.S. Department of Commerce
       National Institute for Standards and Technology
       Office of Standard Reference Materials
       Room B3111 Chemistry Building
       Gaithersburg, Maryland 20899
       Telephone: (301) 975-6776

       Marine Analytical Chemistry Standards Program
       National Research Council of Canada
       Division of Chemistry
       Montreal Road
       Ottawa, Ontario, Canada K1A 0R9
       Telephone: (613) 993-12359
      SRM 1647 for PAH) should be used to check analytical bias.  Some available
      SRMs for priority pollutant metals in seawater are National Research Council of
      Canada seawater CASS-1 and seawater NASS-2.

      SRMs for organic priority pollutants in tissues are currently not available. The
      National  Institute of Standards and Technology is presently developing SRMs
      for organic analytes.  Tissue matrix spikes of target analytes should be used to
      fulfill analytical accuracy requirements for organic analyses.

      Because new SRMs become available regularly, current listings from the appropriate
      agencies should be consulted frequently. SRMs that are readily available and commonly
      used are included in Table 14.
2.11.3   Routine System Audits

      Routine system audits during the technical evaluation ensure that laboratories
      are complying with the QA project plan. It is suggested that checklists be
      developed for reviewing training records, equipment specifications, QC
      procedures for analytical tasks, management organization, etc. The government
      should also establish laboratory review files for quick assessment of the labor-
      atory's activity on a study, and to aid in monitoring the overall quality of the
      work. Procedures for external system audits by the government are similar to
      the internal systems audits conducted by the laboratories themselves.
2.12  FACILITIES

      The QA Project Plan should provide a complete, detailed description of the
      physical layout of the laboratory, define space for each test area, describe
      traffic-flow patterns, and document special laboratory needs.  The design and
      layout of laboratory facilities are important to maintain sample integrity and
      prevent cross-contamination. The specific areas to be used for the various
      evaluations should  be identified. Aspects of the dredging study that warrant
      separate facilities include the following:

          •   Receiving

          •   Sample storage

          •   Sample preparation

          •   Sample testing

          •   Reagent storage

          •   Data reduction and analysis.
2.13  PREVENTIVE MAINTENANCE

      Procedures for maintaining field and laboratory equipment in a ready state are
      described in this section, including identification of critical spare parts that must
      be available to ensure that data completeness will not be jeopardized by
      equipment failure. Regular servicing must be implemented and documented.

      The QA project plan should describe how field  and laboratory equipment
      essential to sample collection and analysis will  be maintained in proper working
      order.  Preventive maintenance may be in the form of: 1) scheduled
      maintenance activities to minimize costly downtime and ensure accuracy of
      measurement systems, and 2) available spare parts, backup systems, and
      equipment.  Equipment should be subject to regular inspection and preventive
      maintenance procedures to ensure proper working order.  Instruments should
      have periodic calibration and preventive maintenance performed by qualified
      technical personnel, and a permanent record should  be kept of calibrations,
      problems diagnosed, and  corrective actions applied.  An acceptance testing
      program for key materials used in the performance of environmental
      measurements (chemical and biological materials) should be applied prior to
      their use.
2.14  CALCULATION OF DATA QUALITY INDICATORS

      Specific equations or procedures used to assess the precision, bias, and
      completeness of the data are identified in this section.

      The calculations and equations used routinely in QA review (e.g., relative
      percent difference of duplicates) as well as the type of samples (e.g., blanks,
      replicates) analyzed to assess precision, bias, and completeness of the data
      must be presented in the QA project plan.  Routine procedures for  measuring
      precision and bias include the use of replicate analyses, SRMs, and matrix
      spikes.  The following routine procedures can be used to measure precision
      and bias (a computational sketch of these indicators follows the list):

      1.   Replicate analysis

          Precision for duplicate chemical analyses will be calculated as  the relative
          percent difference:

                                                 abs(D1 - D2)
                   Relative percent difference = ------------- x 100
                                                  (D1 + D2)/2
          where:

              D1  =   sample value
              D2  =   duplicate sample value
              abs =   absolute value.

    Precision for the replicate will be calculated as the relative standard
    deviation:
                                                         s
                  Percent relative standard deviation = --- x 100
                                                         x
    where:
        x = mean of three or more results
        s = standard deviation of three or more results:

            s = sqrt[ sum of (xi - x)^2 / (n - 1) ]
2.  Matrix and surrogate spikes

    Bias of these measurements will be calculated as the ratio of the measured
    value to the known spiked quantity:

                               spiked result - unspiked result
            Percent recovery = ------------------------------- x 100
                                         spike added

3.  Method blank

    Method blank results are assessed to determine the existence and
    magnitude of contamination.  Guidelines for evaluating blank results and
    specific actions to be taken are identified in U.S. EPA (1988a,b). Sample
    results will not be corrected by subtracting a blank value.

4.  Laboratory control sample

    Bias of these measurements will be calculated as the ratio of the measured
    value to the referenced value:

                                    measured value
                 Percent recovery = ---------------- x 100
                                    referenced value

5.  Completeness

    Completeness will be measured for each set of data received by dividing
    the number of valid (i.e., accepted) measurements actually obtained by the
    number of measurements that were planned:
                                   valid data points obtained
                   Completeness = ---------------------------- x 100
                                   total data points planned

          To be considered complete, the data set should also contain all QC check
          analyses that verify the accuracy (precision and bias) of the results.
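
      The sketch below (Python, with hypothetical values) implements the data quality
      indicator equations above exactly as written.  Method blank evaluation (item 3)
      is omitted because it is a comparison against the cited guidelines rather than
      a single formula.

          from math import sqrt

          def relative_percent_difference(d1, d2):
              # Item 1: precision of duplicate analyses.
              return abs(d1 - d2) / ((d1 + d2) / 2.0) * 100.0

          def percent_relative_standard_deviation(results):
              # Item 1: precision of triplicate (or more) analyses, using the
              # n-1 form of the standard deviation given above.
              n = len(results)
              mean = sum(results) / n
              sd = sqrt(sum((x - mean) ** 2 for x in results) / (n - 1))
              return sd / mean * 100.0

          def spike_percent_recovery(spiked_result, unspiked_result, spike_added):
              # Item 2: bias of matrix and surrogate spike measurements.
              return (spiked_result - unspiked_result) / spike_added * 100.0

          def control_sample_percent_recovery(measured, referenced):
              # Item 4: bias relative to a laboratory control sample.
              return measured / referenced * 100.0

          def percent_completeness(valid_points, planned_points):
              # Item 5: completeness of a data set.
              return valid_points / planned_points * 100.0

          # Hypothetical values for illustration
          print(relative_percent_difference(10.0, 12.0))                  # about 18.2
          print(percent_relative_standard_deviation([9.8, 10.4, 10.1]))   # about 3.0
          print(spike_percent_recovery(14.2, 4.1, 10.0))                  # 101.0
          print(control_sample_percent_recovery(4.66, 4.57))              # about 102.0
          print(percent_completeness(95, 100))                            # 95.0
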

2.15  CORRECTIVE ACTIONS

      Major problems that could arise during field or laboratory operations,
      predetermined corrective  actions for these problems, and the individual
      responsible for each corrective action are identified in this section.

      One purpose of any QA program is to identify nonconformance as quickly as
      possible. A  nonconformance event is defined as any event that does not follow
      defined methods,  procedures, or protocols, or any occurrence that may affect
      the quality of the data or study.  A QA program should have a corrective action
      plan and should provide feedback to appropriate management authority defining
      how all nonconformance events were addressed and corrected.

      Corrective actions fall into two categories:  1) handling of analytical or
      equipment malfunctions, and 2) handling of nonconformance or noncompliance
      with the QA  requirements that have been established.  During field and
      laboratory operations, the supervisor is  responsible for correcting equipment
      malfunctions. All corrective measures taken should be documented (e.g., a
      written standard operating procedure for the corrective action) and, if required,
      an alteration checklist should be completed.

      Corrective action procedures should be described for each project and include
      the following elements:

          •   Procedures for corrective actions when predetermined limits for
              data acceptability are exceeded (see DQO discussion in Section
              2.3)

          •   For each measurement system, the individual responsible for
              initiating the corrective action and the individual responsible for
              approving the corrective action.

      Corrective actions for field procedures should be described in a separate
      section from the corrective actions that  would apply to the data or laboratory
      analysis. Corrective actions may be initiated as a result of other QA activities
      including performance audits, system audits, interlaboratory/interfield
      comparison studies, and QA program audits. An example of a corrective
      actions checklist is provided in Appendix A.
2.16  QUALITY ASSURANCE REPORTS TO MANAGEMENT

      The process of assuring data quality does not end with the data review.  A
      report summarizing the sampling event (see Appendix H) and the QA review of
      the analytical data package should be  prepared, samples should be properly
      stored or disposed of, and laboratory data should  be archived in a storage file
      or database. Technical interpretation of the data begins after the  QA review
      has been completed.  Once data interpretation is complete, the results of the
      project should be carefully examined to determine how closely the original
      project goals and objectives were met. QA reviews  are particularly useful for
      providing data users with a written record of data concerns and a  documented
      rationale for why certain data were accepted as estimates or were rejected.

      QA project plans provide a mechanism for periodic reporting to management on
      the performance of measurement systems and data  quality. At a  minimum,
      these reports should include:

          •   Periodic assessment of measurement data  accuracy (precision and
              bias) and completeness

          •   Results of performance and system audits

          •   Significant QA problems and recommended solutions.

      The individuals responsible for preparing the periodic reports should be
      identified.  The final report for each project should include a separate QA
      section that summarizes data quality information contained in the periodic
      reports.  These reports may be prepared by the project manager if a brief
      evaluation was conducted, or by QA specialists  if a detailed review was
      requested by the project manager.
2.16.1   Preparing Basic Quality Assurance Reports

      Basic QA reports should summarize all conclusions concerning data
      acceptability and should note significant QA problems that were found.  The
      table of contents for a basic QA report should include the following:

          •  Data summary—The data summary section should discuss the
             number of samples collected, the laboratory(s) that analyzed the
             samples, and a summary of the data that were qualified during the
             QA review.

          •  Holding times—The holding time section should briefly discuss the
             holding time requirements and holding time exceedances.
          •   Analytical methods—The analytical methods section should briefly
              describe the methods of analysis, any departures from the
              methods, and any calibration or instrument-specific QC criteria
              exceedances.

          •   Accuracy—The accuracy section should include a discussion of
              QC criteria and exceedances for 1) analytical bias (surrogate
              compound, laboratory control sample, matrix spike, and reference
              material recoveries) and 2) precision of matrix replicates (and
               matrix spike duplicates for organic compounds).

          •   Method blanks—The method blank section should  include a brief
              discussion of method blank QC criteria and exceedances.

      QA reviews are usually included as appendices to technical project reports. In
      any case, the QA review becomes part of the documented project file, which
      also includes the original data package and any computer files used in data
      compilation and analysis.
2.16.2   Preparing Detailed Quality Assurance Reports

      Depending on the project objectives, a more detailed QA report may be
      desired. An example of a detailed QA review for a metals data package is
      provided in Appendix F.  In addition to the sections outlined for the basic QA
      report, the detailed QA report should also include:

          •   Introduction—The introduction should give a brief overview of the
              purpose of data collection and brief summaries of how the samples
              were collected and processed in the field.

          •   Sample set description—The sample set section should describe
              the  number of samples sent to each laboratory, including the
              number of field blanks, field replicates, SRMs, and interlaboratory
              split samples.

          •   Sample delivery group description—The sample delivery group
              section should briefly describe how the samples were sorted by
              the  analytical laboratories (how many sample delivery groups were
              returned by the laboratory), and whether or not the QC criteria
              were performed at the correct frequency for each sample delivery
              group.

          •   Field QC summary—The field QC section should discuss the
              evaluation of the field blank and replicate results for the sample
              survey.
          •   Interlaboratory comparison—The interlaboratory section, where
              applicable, should describe the evaluation of the split samples as
              compared to the corresponding samples analyzed by the contract
              laboratory.

          •   Field results description—The field results section, where
              applicable, should present tabular summaries of all data with
              appropriate qualifiers.

      For organic analyses, a discussion of the results of instrument tuning (if
      applicable), instrument calibration analyses, internal standard performance (if
      applicable), and a summary of any factors that could affect overall data quality
      (e.g., system degradation) should also be included in the detailed QA report.
2.17 REFERENCES

      References cited in the QA project plan should be provided at the end of the
      plan.
3.   REFERENCES
      Ankley, G.T., G.J. Niemi, K.B. Lodge, H.J. Harris, D.L. Beaver, D.E. Tillitt, T.R.
      Schwartz, J.P. Giesy, P.D. Jones, and C. Hagley.  1993. Uptake of planar
      polychlorinated biphenyls and 2,3,7,8-substituted polychlorinated dibenzofurans
      and dibenzo-p-dioxins by birds nesting in the lower Fox  River and Green Bay,
      Wisconsin.  Submitted to Arch. Environ.  Contam. Toxicol.

      APHA.  1989. Standard methods for the examination of water and wastewater.
      17th ed. American Public Health Association, American Water Works
      Association, Water Pollution Control Federation, Washington,  DC.

      ASTM.  1991a. Annual book of standards. Volume II, Water.  American Society
      for Testing and Materials, Philadelphia, PA.

      ASTM.  1991b. Standard guide  for collection, storage, characterization, and
      manipulation of sediment for toxicological testing.  Method E1391-90.  In:
      Annual  Book of ASTM Standards, Water, and Environmental Technology,
      Volume 11.04.  American Society for Testing and Materials, Philadelphia, PA.

      ASTM.  1992. Standard test method for classification of soils for engineering
      purposes. In: Annual Book of ASTM Standards, D  2487, Volume 04.08.
      American Society for Testing and Materials, Philadelphia, PA.

      Ballschmiter, K., and M. Zell.  1980.  Analysis of polychlorinated biphenyls
      (PCBs) by glass capillary gas chromatography, composition of technical Aroclor-
      and Clophen-PCB mixtures.  Fresenius Anal. Chem. 302:20-31.

      Battelle. 1985. Method for semivolatile  organic priority  pollutants  in fish.  Final
      report prepared for the U.S. Environmental Protection Agency under Contract
      No. 68-03-1760.

      Becker, D.S., and T.C. Ginn.  1990.  Effects of sediment holding time on
      sediment toxicity.  Prepared for U.S.  Environmental Protection Agency Region
      10,  Seattle,  WA.  PTI Environmental  Services, Bellevue, WA.

      Bligh, E.G.,  and W.J. Dyer.  1959.  A rapid method of total lipid extraction and
      purification.  Can J.  Biochem. Physiol. 37:911-917.

      Bloom,  N.S., E.A. Crecelius,  and S. Berman.  1983.  Determination of mercury
      in seawater at sub-nanogram per liter levels. Marine Chem. 14:49-59.
Burton, G.A. Jr.  1991.  Assessing the toxicity of freshwater sediments.
Environ. Toxicol. Chem. 10:1587-1627.

Chiou, C.T., V.H. Freed, D.W. Schmedding, and R.L. Kohnert. 1977.  Partition
coefficient and bioaccumulation of selected organic chemicals. Environ. Sci.
Technol. 11:475-478.

Cutter, G.A., and T.J. Gates.  1987.  Determination of dissolved sulfide and
sedimentary sulfur speciation using gas chromatography-photoionization
detection. Anal. Chem. 59:717-721.

Danielson, L., B. Magnussen, and S. Westerland.  1978. An improved metal
extraction procedure for determination of trace metals in seawater by atomic
absorption spectrometry with electrothermal atomization. Anal. Chem. Acta
98:47-55.

DeWitt, T.H., M.S. Redmond, J.E. Sewall, and R.C. Swartz.  1992a.
Development of a chronic sediment toxicity test for marine benthic amphipods.
Report prepared for U.S. Environmental Protection Agency, Newport, OR.
Contract No. CR-8162999010.

DeWitt, T.H., R.J. Ozretich, R.C. Swartz, J.O. Lamberson, D.W. Schults, G.R.
Ditsworth, J.K.P. Jones, L. Hoselton, and L.M. Smith. 1992b. The influence of
organic matter quality on the toxicity and partitioning of sediment-associated
fluoranthene. Environ. Toxicol. Chem. 11:197-208.

Di Toro, D.M., J.D.  Mahony, D.J.  Hansen, K.D. Scott, M.B. Hicks, S.M. Mayr,
and M.S. Redmond.  1990.  Toxicity of cadmium in sediments: role of acid
volatile sulfide.  Environ. Toxicol.  Chem. 9:1487-1502.

Di Toro, D.M., C.S. Zarba, D.J. Hansen, W.J. Berry, R.C. Swartz, C.E. Cowan,
S.P. Pavlou, H.E. Allen, N.A. Thomas, and P.R. Paquin.  1991.  Technical basis
for establishing sediment quality criteria for nonionic organic chemicals using
equilibrium partitioning.  Environ. Toxicol. Chem. 10:1541-1583.

Dunn, W.J., III, D.L. Stalling, T.R. Schwartz, J.W. Hogan, J.D. Petty, E.
Johanson, and S. Wold. 1984. Pattern recognition for classification and
determination of polychlorinated biphenyls in  environmental samples. Anal.
Chem. 56:1308-1313.

Engler, R.M., L. Saunders, and T. Wright.  1991.  The nature of dredged
material.  Environ. Profess. 13:313-316.

EPRI.  1986. Speciation of selenium and arsenic in natural waters and sed-
iments. Vol. 2.  Prepared by Battelle Pacific  Northwest Laboratories. EPRI
EA-4641.  Prepared for Electrical Power Research Institute.
Folch, J., M. Lees, and G.A. Sloane-Stanley.  1957.  A simple method for the
isolation and purification of total lipids from animal tissues.  J. Biol. Chem.
226:497-509.

Folk, R.L.  1980.  Petrology of sedimentary rocks. Hemphill Publishing Co.,
Austin, TX. 182 pp.

Grasshoff, K., M. Ehrhardt, and K. Kremling. 1983. Methods of sea water
analysis. 2nd, revised and extended version. Verlag Chemie, Weinheim, 419
pp.

Higgins, T.R.  1988. Techniques for reducing the costs of sediment evaluation.
Tech. Note EEDP-06-2.  U.S. Army Corps of Engineers, Waterways
Experiment Station, Vicksburg, MS.

Higgins, T.R., and C.R. Lee.  1987. Sediment collection and analysis methods.
Tech. Note EEDP-06-1.  U.S. Army Corps of Engineers, Waterways
Experiment Station, Vicksburg, MS.

Ingham, A.E.  1975. Sea surveying. John Wiley and Sons, New York, NY.
306 pp.

Kenaga, E.E., and C.A.I. Goring.  1980. Relationship between water solubility,
soil sorption, octanol-water partitioning, and concentration of chemicals in biota.
pp. 78-115. In: Aquatic Toxicology. ASTM Spec. Tech. Publ. 707.  J.G.
Eaton, P.R. Parish, and A.C.  Hendricks (eds). American Society for Testing
and Materials,  Philadelphia, PA.

Kirchmer, C. 1988.  Estimation of  detection limits for environmental analytical
procedures: A tutorial.  In: Detection of Analytical Chemistry. American
Chemical Society. Washington, DC.

Kuehl, D.W., P.M. Cook, A.R. Batterman, D. Lothenbach, and B.C. Butterworth.
1987.  Bioavailability of polychlorinated dibenzo-p-dioxins and dibenzofurans
from contaminated Wisconsin River sediment to carp. Chemosphere
16:667-679.

Kuehl, D.W., B.C.  Butterworth, J. Libal, and P. Marquis. 1991. An isotope
dilution high resolution gas chromatographic-high resolution mass spectrometric
method for the determination  of coplanar polychlorinated biphenyls: application
to fish and marine mammals. Chemosphere 22:849-858.

Mackay, D. 1982. Correlation of bioconcentration factors.  Environ. Sci.
Technol. 5:274-278.
McDonald, D.D., S.L. Smith, M.P. Wong, and P. Mudroch.  1992. The
development of Canadian marine environmental quality guidelines. Marine
Environmental Quality Series No. 1. Environment Canada, Ecosystem
Sciences and Evaluation Directorate, Eco-Health Branch, Ottawa, Ontario.

McFarland, V.A., and J.U. Clarke. 1989.  Environmental occurrence,
abundance, and potential toxicity of polychlorinated biphenyl congeners:
considerations for a congener-specific analysis. Environ. Health Perspect.
81:225-239.

Mehrle, P.M., D.R. Buckler, E.E. Little, L.M. Smith, J.D. Petty, P.M. Peterman,
D.L. Stalling, G.M. DeGraeve, J.J. Coyle, and W.J. Adams.  1988. Toxicity and
bioconcentration of 2,3,7,8-tetrachlorodibenzodioxin and
2,3,7,8-tetrachlorodibenzofuran in rainbow trout. Environ. Toxicol. Chem.
7:47-62.

Mudroch, A., and S.D. MacKnight.  1991.  Handbook of techniques for aquatic
sediments sampling.  CRC Press, Boca Raton,  FL.  210 pp.

Mullin, M.D., C.M. Pochini, S. McCrindle, M. Romkes, S.H. Safe, and L.M. Safe.
1984.  High-resolution PCB analysis: synthesis and chromatographic properties
of all 209 PCB congeners.  Environ. Sci. Technol. 18:468-476.

Nakashima, S., R.E. Sturgeon, S.N. Willie, and S.S. Berman. 1988. Acid
digestion of marine samples for trace element analysis using microwave heating.
Analyst Vol.  113.

NCASI. 1986.  Procedures for the analysis of resin and fatty acids in pulp mill
effluents.  Technical Bulletin No. 501. National  Council for Air and Stream
Improvement, New York, NY.

NOAA.  1985.  Standard analytical procedures of the NOAA Analytical Facility,
1985-1986.  NOAA Tech. Memo. NMFS F/NWC-92. U.S. Department of
Commerce, National Oceanic and Atmospheric Administration, National Marine
Fisheries Service, National Analytical Facility, Environmental Conservation
Division, Northwest and Alaska Fisheries Center, Seattle, WA.

NOAA.  1989.  Standard analytical procedures of the NOAA National Analytical
Facility. 2nd ed. NOAA Tech. Mem. NMFS F/NWC-92, 1985-86. Contact:
National Status and Trends Program, National Oceanic and Atmospheric
Administration, NOAA N/OMA32, 11400 Rockville Pike, Rockville, MD 20852.

Parsons, T.R., Y. Maita, and C.M. Lalli. 1984. A manual of chemical and
biological methods for seawater analysis.  Pergamon Press, N.Y., N.Y.  173 pp.
Plumb, R.H., Jr. 1981. Procedure for handling and chemical analysis of
sediment and water samples.  Technical Report EPA/CE-81-1.  Prepared by
State University College at Buffalo, Great Lakes Laboratory, Buffalo, NY.  U.S.
Environmental Protection Agency and U.S. Army Corps of Engineers,
Waterways Experiment Station, Vicksburg, MS.

PSEP.  1986. Recommended protocols for measuring conventional sediment
variables in Puget Sound.  In: Recommended Protocols and Guidelines for
Measuring Selected Environmental Variables in Puget Sound. U.S.
Environmental Protection Agency Region 10, Puget Sound Estuary Program,
Seattle, WA. (Looseleaf.)

PSEP.  1990a (revised). Recommended  protocols for measuring metals in
Puget Sound water, sediment, and tissue samples.  Prepared by PTI
Environmental Services, Bellevue, WA. In:  Recommended Protocols and
Guidelines for Measuring Selected Environmental Variables in Puget Sound.
U.S. Environmental Protection Agency Region 10, Puget Sound  Estuary
Program, Seattle, WA.  (Looseleaf.)

PSEP.  1990b (revised). Recommended  protocols for station positioning in
Puget Sound. Prepared by PTI Environmental Services, Bellevue, WA. In:
Recommended Protocols and Guidelines  for Measuring Selected Environmental
Variables in Puget Sound.  U.S. Environmental Protection Agency Region 10,
Puget Sound Estuary Program, Seattle, WA. (Looseleaf.)

PSEP.  1990c (revised). Recommended  guidelines for measuring organic
compounds in Puget Sound sediments and tissue samples. Prepared by PTI
Environmental Services, Bellevue, WA. In:  Recommended Protocols and
Guidelines for Measuring Selected Environmental Variables in Puget Sound.
U.S. Environmental Protection Agency Region 10, Puget Sound  Estuary
Program, Seattle, WA.  (Looseleaf.)

PSEP.  1991. Pollutants of concern in Puget Sound. EPA 910/9-91-003.
Prepared for U.S. Environmental Protection Agency Region 10, Office of
Coastal Waters, Seattle, WA.  Puget Sound Estuary Program. PTI
Environmental Services, Bellevue, WA.

PTI. 1989a. Data quality evaluation for proposed dredged material disposal
projects.  Prepared for Washington Department of Ecology, Sediment
Management Unit. PTI Environmental Services, Bellevue, WA.

PTI. 1989b. Data validation guidance manual for selected sediment variables.
Draft Report.  Prepared for Washington Department of Ecology, Sediment
Management Unit. PTI Environmental Services, Bellevue, WA.
Randall, R.C., H. Lee II, and R.J. Ozretich.  1991. Evaluation of selected lipid
methods for normalizing pollutant bioaccumulation.  Environ. Toxicol. Chem.
10:1431-1436.

Rice, C., F. Espourteille, and R. Huggett. 1987.  A method for analysis of
tributyltin in estuarial sediments and oyster tissue, Crassostrea virginica. Appl.
Organomet.  Chem. 1:541-544.

Schwartz, T.R., R.D. Campbell, D.L. Stalling, R.L. Little, J.D. Petty, J.W. Hogan,
and  E.M. Kaiser.  1984. Laboratory data base for isomer-specific determination
of polychlorinated biphenyls. Anal. Chem. 56:1303-1308.

Schwartz, T.R., D.E. Tillit, K.P. Feltz, and P.M. Peterman. 1993.  Determination
of mono- and non-o,o'-chlorine substituted polychlorinated biphenyls in Aroclors
and  environmental samples.  Submitted to Chemosphere. 26:1443-1460.

Sloan,  C.A., N.G. Adams, R.W. Pearce, D.W. Brown, and S-L. Chan. 1993.
Northwest Fisheries Science Center organic analytical procedures. In:  U.S.
Department of Commerce, NOAA Tech. Memo. NOS ORCA 71.  Sampling and
Analytical  Methods of the National Status and Trends Program, National
Benthic Surveillance and Mussel Watch Projects  1984-1992. Volume IV.
Comprehensive Descriptions of Trace Organic Analytical Methods. Editors:
G.G. Lauenstein and A.Y. Cantillo. p. 53-97.

Smith,  L.M.  1981. Carbon dispersed on glass fibers as an adsorbent for
contaminant enrichment and fractionation.  Anal. Chem. 53:2152-2154.

Smith,  L.M.,  D.L. Stalling, and J.L. Johnson.  1984. Determination of part-per-
trillion levels of polychlorinated dibenzofurans and dioxins in environmental
samples.  Anal. Chem. 56:1830-1842.

Stalling, D.L., T.R. Schwartz, W.J. Dunn, III, and S. Wold. 1987.  Classification
of polychlorinated  biphenyl residues.  Anal. Chem. 59:1853-1859.

Strickland, J.D.H. and T.R. Parsons.  1972.  A practical  handbook of seawater
analysis. Bull. Fish. Res. Bd. Canada. 122:1-172.

Sturgeon, R., S. Willie, and S. Berman. 1985. Preconcentration of selenium and
antimony from seawater for determination by graphite furnace atomic absorption
spectrometry.  Anal. Chem. 57:6-9.

Sturgis, T.C.  1990.  Guidance for contracting biological  and chemical
evaluations of dredged material. Tech. Rep.  D-90-XX.  U.S. Army Corps of
Engineers, Waterways Experiment Station, Vicksburg, MS.
Tatem, H.E., D.L. Brandon, C.R. Lee, A.S. Jarvis, and R.G. Rhett.  1991.
Effects of storage on sediment toxicity, bioaccumulation potential and chemistry.
Misc. Paper EL-91-2. U.S. Army Corps of Engineers, Waterways  Experiment
Station, Vicksburg,  MS.  62 pp.

Tetra Tech.  1985.  Bioaccumulation monitoring guidance:  1.  Estimating the
potential  for bioaccumulation  of priority pollutants and 301 (h) pesticides
discharged into marine and estuarine waters.  Final Report.  Prepared for U.S.
Environmental Protection Agency. Tetra Tech, Inc., Bellevue, WA.

Tetra Tech.  1986a. Analytical methods for U.S. EPA priority pollutants and
301 (h) pesticides in estuarine and marine sediments.  Final Report. Prepared
for U.S. Environmental Protection Agency. Tetra Tech, Inc., Bellevue, WA.

Tetra Tech.  1986b. Bioaccumulation monitoring guidance: 4. Analytical
methods  for U.S. EPA priority pollutants and 301 (h) pesticides in tissues from
estuarine and marine organisms.  Final Report.  Prepared for U.S.
Environmental Protection Agency. Tetra Tech, Inc., Bellevue, WA.

Uhler, A.D., and G.S. Durell.  1989.  Measurement of butyltin species in
sediments by n-pentyl derivatization with gas chromatography/flame photometric
detection (GC/FPD). Battelle Ocean Sciences Project No. N-0519-6100,
Duxbury,  MA.

Uhler, A.D., T.H. Coogan, K.S. Davis, G.S. Durell, W.G. Steinhauer, S.Y.
Freitas, and P.D. Boehm. 1989.  Findings of tributyltin, dibutyltin, and
monobutyltin in bivalves from selected U.S. coastal waters. Environ. Toxicol.
Chem. 8:971-979.

USAGE.  1986.  Development of a modified elutriate test for estimating the
quality of effluent from confined dredged material disposal areas. Appendix A:
recommended procedure for conducting the modified elutriate test.  U.S. Army
Corps of  Engineers, Waterways Experiment Station, Vicksburg, MS.

USAGE.  1990. Engineering  and design: hydrographic surveying.  Draft
Report. EM 1110-2-1003.  U.S. Army Corps of Engineers, Washington, DC.

U.S. EPA.  1979. Handbook for analytical QA in water and wastewater
laboratories. Chapter  6.  EPA 600/4-79-019.  U.S. Environmental Protection
Agency, Washington, DC.

U.S. EPA.  1982. Test methods.  Methods for organic chemical analysis of
municipal and industrial wastewater.  EPA 600/4-82-057. U.S. Environmental
Protection Agency,  Environmental Monitoring and Support Laboratory,
Cincinnati, OH.
U.S. EPA.  1983.  Methods for the chemical analysis of water and wastes.
EPA 600/4-79-020. U.S. Environmental Protection Agency, Environmental
Monitoring and Support Laboratory, Cincinnati, OH. 460 pp.

U.S. EPA.  1984a.  Guidance for the preparation of combined work/quality
assurance project plans for environmental monitoring. OWRS QA-1. U.S.
Environmental Protection Agency, Office of Water Regulations and Standards,
Washington, DC.

U.S. EPA.  1984b.  Standard operating safety guides.  U.S. Environmental
Protection Agency, Office of Emergency and Remedial Response, Washington,
DC.

U.S. EPA.  1985. Technical support document for water quality-based toxics
control.  EPA 440/4-85-032. U.S. Environmental Protection Agency, Office of
Water Enforcement and Permits, Washington, DC.

U.S. EPA.  1986a.  Test methods for evaluating solid waste (SW-846): phys-
ical/chemical methods.  U.S. Environmental Protection Agency, Office of Solid
Waste and Emergency Response, Washington, DC.

U.S. EPA.  1986b (revised July 1987).  U.S. EPA Contract Laboratory
Program—statement of work for organics analysis, multi-media, multi-
concentration. IFB WA87K236-IFB WA87K238.  U.S. Environmental Protection
Agency, Washington,  DC.

U.S. EPA.  1987a.  Quality assurance/quality control (QA/QC) for 301 (h) mon-
itoring programs: guidance on field and laboratory methods.  EPA
430/9-86-004.  NTIS Number PB87-221164. Prepared by Tetra Tech, Inc.,
Bellevue, WA. U.S. Environmental Protection Agency,  Office of Marine and
Estuarine Protection, Washington, DC.

U.S. EPA.  1987b.  Evaluation of survey positioning methods for nearshore
marine and estuarine  waters. EPA 430/9-86-003. U.S. Environmental
Protection Agency, Office of  Marine and Estuarine Protection, Washington, DC.
54 pp. + appendices.

U.S. EPA.  1988a.  Laboratory data validation: functional guidelines for
evaluating inorganics  analyses.  U.S. Environmental Protection Agency, Office
of Emergency and Remedial Response, Washington, DC.

U.S. EPA.  1988b.  Laboratory data validation: functional guidelines for
evaluating organics analyses. U.S.  Environmental Protection Agency, Office of
Emergency and  Remedial Response, Washington, DC.
U.S. EPA. 1988c.  U.S. EPA Contract Laboratory Program statement of work
for organics analyses, multi-media, multi-concentration.  U.S. Environmental
Protection Agency, Washington, DC.

U.S. EPA. 1989a.  Preparing perfect project plans: a pocket guide for the
preparation of quality assurance project plans.  U.S. Environmental Protection
Agency, Risk Reduction Engineering Laboratory, Cincinnati,  OH.

U.S. EPA. 1989b.  Test methods for evaluating solid waste.  Determination of
polychlorinated dibenzo-p-dioxins and polychlorinated dibenzofurans by high
resolution gas chromatography/high resolution mass spectrometry.  EPA
Method 8290.  SW-846, Revision 0. U.S. Environmental Protection Agency,
Washington,  DC.

U.S. EPA. 1989c.  Method 1624, Revision C:  Volatile organic compounds by
isotope dilution GCMS.  Method 1625, Revision C:  Semivolatile organic
compounds by isotope dilution GCMS. U.S. Environmental Protection Agency,
Office of Water, Office of Water Regulations and Standards, Industrial
Technology Division, Washington, DC.

U.S. EPA. 1989d.  Interim procedures for estimating risks with exposures to
mixtures of chlorinated dibenzo-p-dioxins and dibenzofurans (CDDs and CDFs)
and 1989 update.  EPA/625/3-89/016. U.S. Environmental Protection Agency,
Risk Assessment Forum, Washington, DC.

U.S. EPA. 1990a.  Macroinvertebrate field and laboratory methods for
evaluation of the biological integrity of surface waters.  EPA  600/4-90-030.  U.S.
Environmental  Protection Agency, Office of Research and Development,
Washington,  DC.

U.S. EPA. 1990b.  Methods for measuring the acute toxicity of receiving waters
to freshwater and marine organisms.  EPA 600/4-90-027. U.S. Environmental
Protection Agency, Office of Research and Development, Washington, DC.  239
pp.

U.S. EPA. 1990c.  Specifications and guidance for obtaining contaminant-free
sampling containers.  Directive #9240.0-05, April 1990.  U.S. Environmental
Protection Agency, Office of Solid Waste and Emergency Response,
Washington,  DC.

U.S. EPA. 1990d.  U.S. EPA Contract Laboratory Program statement of work
for organic analyses, multi-media, multi-concentration.  Document #OLM01.8.
U.S. Environmental Protection Agency, Washington, DC.
                                  119

-------
U.S. EPA.  1990e.  U.S. EPA Contract Laboratory Program statement of work
for inorganics analyses, multi-media, multi-concentration.  Document #ILM02.0.
U.S. Environmental Protection Agency, Washington, DC.

U.S. EPA.  1990f. Tetra- through octa- chlorinated dioxins and furans by
isotope dilution HRGC/HRMS. Method 1613, Revision A. U.S. Environmental
Protection Agency, Office of Water, Washington, DC.

U.S. EPA.  1991a.  Draft analytical method for determination of acid volatile
sulfide in sediment.  U.S. Environmental Protection Agency, Office of  Science
and Technology, Washington, DC.

U.S. EPA.  1991b.  Technical support document for water quality-based toxics
control. EPA 505/2-90-001. U.S.  Environmental Protection Agency,  Office of
Water, Washington, DC.

U.S. EPA.  1991c.  Methods for the determination of metals in environmental
samples.  EPA/600-4-91-010.  U.S. Environmental Protection Agency,
Environmental Monitoring Systems Laboratory, Office of Research and
Development, Cincinnati, OH.

U.S. EPA.  1991d.  A project manager's guide to requesting and evaluating
chemical analyses. EPA 910/9-90-24.  Prepared by PTI Environmental  Serv-
ices, Bellevue, WA.  U.S. Environmental Protection Agency Region 10, Puget
Sound Estuary Program, Seattle, WA.

U.S. EPA.  1991e.  Manual for the evaluation of laboratories performing aquatic
toxicity tests.  EPA/600-4-90-031.  U.S. Environmental Protection Agency,
Office of Research and  Development, Washington, DC.

U.S. EPA.  1992a.  Fish field and laboratory methods for evaluation of the
biological integrity of surface waters. EPA 600/R-92-111.  U.S. Environmental
Protection Agency, Office of Research and  Development, Washington, DC.

U.S. EPA.  1992b.  Methods for the determination of chemical substances in
marine and estuarine environmental samples. Method 440.0.  EPA 600/R-92-
121. U.S. Environmental Protection Agency, Environmental Monitoring Systems
Laboratory, Cincinnati, OH.

U.S. EPA.  1993a.  Recommended analytical techniques and quality
assurance/quality control guidelines for the  measurement of organic and
inorganic analytes in marine sediment and tissue samples.  Prepared  by  U.S.
Environmental Protection Agency, Environmental Research Laboratory,
Narragansett, RI.
                                  120

-------
U.S. EPA. 1993b.  Locational data policy implementation guidance. EPA 2180.
U.S. Environmental Protection Agency, Administration and Resources
Management, Information Management Services Division, Washington, DC.

U.S. EPA and USAGE.  1991.  Evaluation of dredged material proposed for
ocean disposal—testing manual.  EPA 503/8-91-001.  U.S. Environmental
Protection Agency and U.S. Army Corps of Engineers, Washington, DC.

U.S. EPA and USAGE.  1994.  Evaluation of dredged material proposed for
discharge in waters of the U.S.—testing manual (Draft).  U.S. Environmental
Protection Agency and U.S. Army Corps of Engineers, Washington, DC.

Veith, G.D., K.J. Macek, S.R. Petrocelli, and J. Carroll.  1980.  An evaluation
using partition coefficients and water solubility to estimate bioconcentration
factors for organic chemicals in fish. pp. 116-129. In:  Aquatic Toxicology.
ASTM Spec. Tech. Publ. 707. J.G. Eaton,  P.R. Parish, and A.C. Hendricks
(eds). American Society for Testing and Materials, Philadelphia, PA.
                                  121

-------

-------
4.    GLOSSARY
      Accuracy
      Acid Volatile Sulfide
      Analyte
      Bias
      Bioaccumulation
      Bioassay
      Bioconcentration Factor
The ability to obtain precisely a nonbiased (true)
value.  Accuracy as used in this document is the
combined measure of precision and bias (see
footnote at beginning of Section 2).

The sulfides removed from sediment by cold acid
extraction, consisting mainly of H2S and FeS.  AVS
is a possible predictive tool for divalent metal
sediment toxicity.

The specific component measured in a chemical
analysis.

Deviation of the measurement from the true value.
Usually expressed as the percent recovery of a
known amount of a chemical added to a sample at
the start of a chemical analysis.  Bias (along with
precision) is a component of the overall accuracy
of a system.

The accumulation of contaminants in the tissue of
organisms through any route, including respiration,
ingestion, or direct contact with contaminated
water, sediment, pore water, or dredged material.

A bioassay is a test using a biological system.  It
involves exposing an organism to a test material
and determining a response.  There are two major
types of bioassays differentiated by response:
toxicity tests which measure an effect (e.g., acute
toxicity, sublethal/chronic toxicity) and
bioaccumulation tests which measure a
phenomenon (e.g., the uptake of contaminants into
tissues).

The degree to which an organism uptakes a
substance from water.
                                        123

-------
Blanks
Calibration
Chromatography
Cleanup
Comparability
Completeness
Confined Disposal
Facility

Contaminant
QC samples that are processed with the samples
but contain only reagents.  They are used to obtain
the response of an analysis in the absence of a
sample, including assessment of contamination
from sources external to the sample.

The systematic determination of the relationship of
the response of the measurement system to the
concentration of the analyte of interest. Instrument
calibration performed before any samples are
analyzed is called the initial calibration.
Subsequent checks on the instrument calibration
performed throughout the analyses of samples are
called continuing calibration.

The process of selectively separating a mixture into
its component compounds.  The compounds are
measured and presented graphically in the form of
a chromatogram and digitally as a quantification
report.

The process of removing certain components from
sample extracts, performed to improve instrument
sensitivity.

Reflects the confidence with which one data set
can be compared with others and the expression of
results consistent with other organizations reporting
similar data.  Comparability of analytical
procedures also implies using analytical
methodologies that produce results comparable in
terms of precision, bias, and effective range of
calibration.

A measure of the amount of valid data obtained vs.
the amount of data originally intended to be
collected.

A diked area, either in-water or upland, used to
contain dredged material.

A chemical or biological substance in a form that
can be incorporated into, onto, or  be ingested by
and that harms aquatic organisms, consumers of
aquatic organisms,  or users of the aquatic
environment, and includes but is not limited to the
                                   124

-------
Control Limit
Control Sediment
Data Package
Data Quality Indicators
Data Quality Objectives
(DQOs)
substances on the 307(a)(1) list of toxic pollutants
promulgated on January 31, 1978 (43 Federal
Register 4109).

A value for data from the analysis of QC checks
indicating that a system or a method is not
performing normally and that an appropriate
corrective action should be taken. When control
limits are exceeded, analyses should be halted;
samples analyzed since the last QC sample may
need reanalysis.

A sediment used to confirm the biological
acceptability of the test conditions and to help
verify the health of the organisms during the test.
Control sediment is essentially free of
contaminants and compatible with the biological
needs  of the test organisms such that it has no
discernable influence on the response being
measured in the test. Test procedures are
conducted with  the control sediment in the same
way as the reference sediment and dredged
material. Control sediment may be the sediment
from which the test organisms are collected or a
laboratory sediment.  Excessive mortality in the
control sediment indicates a problem with  the test
conditions or organisms, and can invalidate the
results of the corresponding dredged material test.

The results of chemical analyses completed by a
laboratory, compiled, printed out, and presented to
the agency or individual requesting the analyses.
The data package should include chromatograms,
calculations, and tuning and calibration summaries,
where  appropriate. Also included in the data
package may be computer disks, magnetic tape, or
other forms of electronically stored data.

Surrogate spike recoveries, matrix spike
recoveries, analytical values obtained for blanks,
standard reference material, and performance
evaluation samples for  each parameter in  each
matrix.

Qualitative and  quantitative statements of the
overall uncertainty that  a decision maker is willing
to accept in results or decisions derived from
                                   125

-------
Detector
Digestion
Disposal Site
Dredged Material
Dredged Material
Discharge
Elutriate



Evaluation


Extraction
Interference
environmental data.  DQOs provide the framework
for planning environmental data operations
consistent with the data user's needs.

A device used in conjunction with an analytical
instrument to determine the components of a
sample.

A process used prior to analysis that breaks down
samples using acids (or bases).  The end product
is called a digestate. Other chemicals, called
matrix modifiers, may be added to improve the
final digestate.

That portion of inland or ocean waters where specific
disposal activities are permitted.  It consists of a
bottom surface area and any overlying volume of
water.

Material excavated or dredged from waters of the
United States. A general discussion of the nature
of dredged material is provided by Engler et al.
(1991).

Any addition of dredged material into waters of the
United States, including:  open water discharges;
discharges from unconfined disposal operations
(such as beach nourishment or other beneficial
uses); discharges from  confined disposal facilities
which enter waters of the United States (such as
effluent, surface runoff,  or leachate); and overflow
from dredge hoppers, scows, or other transport
vessels.

Material prepared from the sediment and dilution water
and used for chemical analyses and toxicity
testing.

The process of judging  data in order to reach a
decision.

A chemical or mechanical procedure to remove
semivolatile organic compounds from a sample
matrix. The end product of extraction is called an
extract.

Unwanted elements or compounds in a sample
                                   126

-------
Ion
Matrix
Matrix Effects
Matrix Spike Samples
Metals
that have properties similar to those of the
chemical of interest and that collectively cause
unacceptable levels of bias in the results of a
measurement or in sensitive measurements.
Unless removed by an appropriate cleanup
procedure, the interferant is carried along with the
chemical of interest through the analytical
procedure.

An atom or group of atoms that carries a positive
or negative electric charge as a result of having
lost or gained one or more electrons.

The sample material (e.g., water, sediment, tissue)
in which the chemicals of interest are found.
Matrix refers to the physical structure of a sample
and how chemicals are bound within this structure.
At a gross level, tissue is one kind of sample
matrix and soil is another. At a finer level, a
sediment sample of silty sand containing large
amounts of calcium carbonate from the shells of
aquatic  organisms represents a different sample
matrix than a sediment sample of clayey silt
containing a large amount of organic carbon from
decaying vegetation.

Matrix effects are physical or chemical interactions
between the sample material and the chemical of
interest  that can bias chemical measurements in
either a negative or positive direction. Because
matrix effects can vary from sample to sample and
are often not well understood, they  are a major
source of variability in chemical analyses.

QC check samples created by adding known
amounts of chemicals of interest to actual samples,
usually prior to extraction or digestion. Analysis of
matrix spikes and matrix spike duplicates will
provide  an indication of bias due to matrix effects
and an estimation of the precision of the results.

A group of naturally occurring elements.  Certain
metals (such as mercury, lead, nickel, zinc, and
cadmium) can be of environmental concern when
they are released to the environment in  unnaturally
high amounts. This group usually includes the
metalloid arsenic.
                                   127

-------
Organic Compounds
Performance Audit
Precision
Quality Assurance
Quality Assurance
Management Plan
Quality Assurance
Project Plan
Quality Control
Quality Control
Checks
Carbon-based substances commonly produced by
animals or plants.  Organic chemicals are
chemical compounds based on carbon chains or
rings and also containing hydrogen with or without
oxygen, nitrogen, or other elements. Organic
chemicals may be produced naturally by plants and
animals or processed artificially using various
chemical reactions.

Audit of a laboratory's performance by testing a
standard reference material.  The test results are
evaluated by the auditor.

The ability to replicate a value; the degree to which
observations or measurements of the same
property, usually obtained under similar conditions,
conform to themselves.  Usually expressed as
standard deviation, variance, or range.  Precision,
along with bias, is a component of the overall
accuracy of a system.

The total integrated program for assuring the
reliability of data.  A system for integrating the
quality planning,  quality control, quality
assessment, and quality improvement efforts to
meet user requirements  and defined standards of
quality with a stated level of confidence.

A detailed document specifying guidelines and
procedures to assure data quality at the program
level (i.e., multiple projects).

A detailed, project-specific document specifying
guidelines and procedures to assure data  quality
during  data collection, analysis, and  reporting.

The overall system of technical activities for
obtaining prescribed standards of performance in
the monitoring and measurement process  to meet
user requirements.

Blanks, replicates, and other samples used to
assess the overall analytical system  and to
evaluate the performances of individual analytical
instruments or the technicians that operate them.
                                   128

-------
Reference Materials
Reference Sediment
Replicates
Representativeness
Sediment
Semivolatile
Organic
Compound
Spectrometry
Materials or substances with well-characterized
properties that are useful for assessing the
accuracy of an analysis and comparing analytical
performances among laboratories.

A sediment that serves as a point of comparison to
identify potential effects of contaminants in the
dredged material (see Inland and Ocean Testing
manuals for further discussion).

One of several identical samples. When two
separate samples are taken from the same station,
or when one sample is split into two separate
samples and analyzed, these samples are called
duplicates. When three identical samples are
analyzed, these samples are called triplicates.

The degree to which sample data depict an
existing environmental condition; a measure of the
total variability associated with sampling and
measuring that includes the two major error
components:  systematic error (bias) and random
error. Sampling representativeness is
accomplished through proper selection of sampling
locations and sampling techniques, and collection
of sufficient number of samples.

Material, such as sand, silt, or clay, suspended in
or settled on the bottom of a water body.  The term
dredged material refers to material which has been
dredged from a water body (see definition of
dredged material), while the term sediment refers
to material in a water body  prior to the dredging
process.

An organic compound with moderate vapor
pressure that can be extracted from samples using
organic solvents and analyzed by gas
chromatography.

The use of spectrographic techniques for deriving
the  physical constants of materials. Four basic
forms of spectrometry commonly used are atomic
absorption spectrometry (AA), inductively coupled
plasma-atomic emission spectrometry (ICP) for
metals, and ultraviolet spectrometry (UV) and
                                  129

-------
Spiked Method
Blanks
Standard Operating
Procedure
Standard Reference
Material
Statement of Work
Surrogate Spike
Compounds
Target Detection Limit
(TDL)
fluorescence emission or excitation spectrometry
for organic compounds.

Method blanks to which known amounts of
surrogate compounds and analytes have been
spiked. Such samples are useful to verify
acceptable method performance prior to and during
routine analysis of samples containing organic
compounds. Also known as check standards in
some methods; independently prepared standards
used to check for bias and to estimate the
precision of measurements.

A written document which details an operation,
analysis, or action whose mechanisms are
thoroughly prescribed and which is commonly
accepted as the method for performing certain
routine or repetitive tasks.

Standard reference materials are certified
reference materials containing precise
concentrations of chemicals, accurately determined
by a variety of technically valid procedures.

A contract addendum used as a legally binding
agreement between the individual or organization
requesting an analysis and the individual,
laboratory, or organization performing the actual
tasks.

Compounds with characteristics similar to those of
compounds of interest that are added to a sample
prior to extraction. They are used to  estimate the
recovery of organic compounds in a sample.

A performance goal set by consensus between the
lowest, technically feasible, detection limit for
routine analytical methods and available regulatory
criteria or guidelines for evaluating dredged
material. The TDL is, therefore,  equal to or greater
than the lowest  amount of a chemical that can be
reliably detected based on the variability of the
blank response of routine analytical methods.
However, the reliability of a chemical  measurement
generally increases as the concentration increases.
Analytical costs  may also be lower at higher
detection limits.  For these reasons, a TDL is
                                   130

-------
Tests/Testing
Toxicity Test
Volatile Organic
Compound

Warning Limit
Water Quality Standard
typically set at not less than 10 times lower than
available dredged material guidelines for potential
biological effects associated with sediment
chemical contamination.

Specific procedures which generate biological,
chemical, and/or physical data to be used in
evaluations. The data are usually quantitative but
may be qualitative (e.g., taste, odor, organism
behavior).

A bioassay which measures an effect (e.g., acute
toxicity, sublethal/chronic toxicity).  Not a
bioaccumulation test (see definition of bioassay).

An organic compound with a high vapor pressure
that tends to evaporate readily from a sample.

A value indicating that data from the analysis of
QC checks are subject to qualification before they
can be used in a project. When two or more
sequential QC results fall outside of the warning
limits, a systematic problem is  indicated.

A law or regulation that consists of the beneficial
designated use or uses of a water body, the
numeric and narrative water quality criteria that are
necessary to protect the use or uses of that
particular water body, and an anti-degradation
statement.
                                    131

-------
      APPENDIX A

Example QA/QC Checklists,
   Forms, and Records

-------
CONTENTS
                                                      Page

    QA PROGRAM ORGANIZATION FLOW DIAGRAM                 A-1

    EXAMPLE DATA QUALITY OBJECTIVES FOR ACCURACY
    AND COMPLETENESS                                   A-2

    ALTERATION CHECKLIST                                A-3

    CHAIN-OF-CUSTODY RECORD                             A-4

    FIELD TRACKING REPORT FORM                           A-5

    LABORATORY TRACKING REPORT FORM                     A-5

    GENERAL SAMPLE LABEL                                A-6

    STATION LOCATION LOG                                A-7

    SYSTEMS AUDIT CHECKLIST                              A-8

    CORRECTIVE ACTIONS CHECKLIST                         A-9
                              A-iii

-------
           QA PROGRAM ORGANIZATION FLOW DIAGRAM
  [Flow diagram: example QA program organization showing a Program Manager,
  Regulatory Officers, a Project Manager, an Assistant Project Manager, a
  Project QA Coordinator, QA Chemistry, and QA Data Analysis.]
                           A-1

-------
EXAMPLE DATA QUALITY OBJECTIVES FOR
    ACCURACY AND COMPLETENESS

                                      Target                                                                                      Maximum
                                      Detection   Bias                                                                            Holding
  Variable     Matrix      Units      Limit       (%)     Precision   Completeness   Method               Reference               Time

  Volatiles    Sediment    µg/kg      10          ±50%    ±30%        99%            Purge & Trap/GC-MS   EPA abc/x-cc-yy (1975)  14 days

  Grain Size   Sediment    Percent    0.01        --      ±5%         99%            Sieve & Pipet        --                      Undetermined

-------
                                  ALTERATION CHECKLIST



Sample Program Identification: 	



Material to be Sampled: 	



Measurement Parameter: 	
Standard Procedure for Analysis:
Reference:
Variation from Standard Procedure:
Reason for Variation:
 Resultant Change in Field Sampling Procedure:
 Special Equipment, Material, or Personnel Required:
 Author's Name: _ _ Date:



 Approval:  _ _ Title:



 Date: _
                                              A-3

-------
                                   CHAIN OF CUSTODY RECORD

  PROJ. NO.          PROJECT NAME

  SAMPLERS:

  STA. NO.    DATE    TIME          STATION LOCATION          NO. OF CONTAINERS    REMARKS

  Relinquished by:              Date/Time              Received by:

  Relinquished by:              Date/Time              Received by:

  Relinquished by:              Date/Time              Received by:

  Relinquished by:              Date/Time              Received for Laboratory by:              Date/Time      Remarks

  Distribution: Original accompanies shipment; copy to Coordinator Field Files

-------
   FIELD TRACKING REPORT FORM

  W/O No. ________                                                              Page ____

  FIELD TRACKING REPORT:

  FIELD SAMPLE CODE     BRIEF DESCRIPTION
  (FSC)                 (LOC-SN)                    DATE        TIME        SAMPLER



  LABORATORY TRACKING REPORT FORM

  W/O No. ________                                                              Page ____

  LABORATORY TRACKING REPORT:

                        PREP/ANAL                   RESPONSIBLE     DATE            DATE
  FRACTION CODE         REQUIRED (LOC-SN)           INDIVIDUAL      DELIVERED       COMPLETED



                                                A-5

-------
       GENERAL SAMPLE LABEL
(NAME OF SAMPLING ORGANIZATION)



PROJECT:	




DATE:  	



TIME:  	
SAMPLE ID NO.:



MEDIA:  	
STATION NUMBER:



DEPTH:  	
PRESERVATION:
ANALYSES TO BE PERFORMED:



SAMPLED BY: 	



LAB NO.:  	



REMARKS:  	
                A-6

-------
                             STATION LOCATION LOG




                                                              DATE:




PROJECT:	
STATION LOCATION:
DESCRIPTION OF SAMPLES COLLECTED:
SPC ZONE: 	(N/S)      EAST: 	 NORTH:
LOCATION:



      Bottom Depth: 	(ft) 	(m) Tide: ±	(m)  MLLW:	(ft)	(m)




      LORAN C: LOP 1 ________  LOP 2 ________



      Variable Radar Range:  ________
       Visual Fixes: (Note: Please tape any drawings to back of this sheet)
       Photos - Roll: 	 Pictures:



       PID Reading:  ________



       Comments: 	
 RECORDER:	 SIGNATURE: 	 ORG. CODE:	DATE:
                                       A-7

-------
                        SYSTEMS AUDIT CHECKLIST
SAMPLE PROGRAM IDENTIFICATION:




SAMPLING DATES: 	
MATERIAL TO BE SAMPLED:
MEASUREMENT PARAMETER:
SAMPLING AND MONITORING EQUIPMENT IN USE:
AUDIT PROCEDURES AND FREQUENCY:
FIELD CALIBRATION PROCEDURES AND FREQUENCY:
SIGNATURE OF QA COORDINATOR:



DATE: 	

-------
                      CORRECTIVE ACTIONS CHECKLIST
SAMPLE PROGRAM IDENTIFICATION:




SAMPLING DATES:	
MATERIAL TO BE SAMPLED:
MEASUREMENT PARAMETER:



ACCEPTABLE DATA RANGE:
 CORRECTIVE ACTIONS INITIATED BY:	




 TITLE: 	




 DATE: 	




 PROBLEM AREAS REQUIRING CORRECTIVE ACTION:
 MEASURES TO CORRECT PROBLEMS:
 MEANS OF DETECTING PROBLEMS (FIELD OBSERVATIONS, SYSTEMS AUDIT, ETC.):
 APPROVAL FOR CORRECTIVE ACTIONS:




 TITLE: 	
 SIGNATURE:



 DATE: 	
                                  A-9

-------
      APPENDIX B

Example Statement of Work
    for the Laboratory

-------
PREFACE
     This appendix contains a generic statement of work for the analysis of most chemicals
     in the most commonly analyzed sample matrices.
                                     B-iii

-------
CONTENTS
                                                       Page

    PREFACE                                             B-iii

    STATEMENT OF WORK                                   B-1

       SUMMARY OF ANALYSES AND SERVICES                  B-1

       SAMPLE DELIVERY AND STORAGE                       B-1

       METHODS                                         B-1

       QUALITY ASSURANCE AND QUALITY CONTROL REQUIREMENTS  B-5

       DELIVERABLES                                      B-6

          Laboratory Data Reports                             B-6

       TURNAROUND TIME                                  B-9

       PROGRESS REPORTS, PROBLEM NOTIFICATION, AND
       PROJECT AUDITS                                    B-9
                              B-v

-------
STATEMENT OF WORK
      The following tasks shall be performed by	as extensions to work
      identified as part of Contract No.	between Contractor and	.
SUMMARY OF ANALYSES AND SERVICES

      The Laboratory shall perform quantitative analyses for the analytes listed in Table 1 on
      sediment, water, and tissue samples collected from in and around	.  The
      analyses shall be  conducted according to	sampling and analysis plan
      (SAP), the project work plan, and	.
SAMPLE DELIVERY AND STORAGE

      Sampling will begin  approximately 	,  and continue for  a period of
      approximately	.  Contractor will provide samples to the Laboratory no
      earlier than	.  Table 2 summarizes the maximum number of samples
      the Laboratory could receive each month and the associated analyses.  The actual number
      of samples that will be delivered to the Laboratory may vary from these estimates.

      Samples will be sent from the site to the Laboratory's facilities via United Parcel Service
      or equivalent  carrier.  Contractor may choose to use the Laboratory's courier service if
      the Laboratory provides such  a service.  Contractor will coordinate with the Laboratory
      for final disposition of the samples after analysis. All samples shall be maintained under
      strict chain of custody at all times, including documentation of any transfers among
      facilities.
METHODS

      The Laboratory shall perform the analyses according to the specified	,
      or other  Contractor-specified protocols.  Table  1 provides a list of specific method
      references, holding times, and data quality objectives.

      The Laboratory shall promptly notify the  Contractor Quality Assurance and Quality
      Control (QA/QC)  Coordinator prior to any  deviation from these methods. Further, the
      Laboratory  shall immediately notify the  Contractor QA/QC Coordinator as soon as it
      becomes  apparent  that the data quality objectives cannot be met for a set of samples.
                                           B-1

-------
                        TABLE B-1. SUMMARY OF ANALYSES AND DATA QUALITY OBJECTIVES

                                                            Target                                                        Holding
                                                            Detection   Bias   Precision   Completeness   Method         Time
  Analyte                             Matrix    Units       Limit       (%)    (%)         (%)            Reference      (days)

  Organic Analyses

    TCL(a) semivolatile organic       Solids
    compounds                         Water
                                      Tissue

    TCL volatile organic              Solids
    compounds                         Water
                                      Tissue

    TCL pesticides and PCBs(b)        Solids
                                      Water
                                      Tissue

    Lipids                            Tissue

  Metals Analyses

    Copper                            Solids
                                      Water

    Mercury                           Solids
                                      Water
                                      Tissue

    TAL(c) metals                     Solids
                                      Water
                                      Tissue    µg/kg

  Conventional and Nutrient-Related Analyses

    Acid-volatile sulfide             Solids    µmoles/g

    Total organic carbon              Solids    % carbon
                                      Water     mg/L

    Dissolved organic carbon          Water     mg/L

-------
TABLE B-1. (cont.)

                                                            Target                                                        Holding
                                                            Detection   Bias   Precision   Completeness   Method         Time
  Analyte                             Matrix    Units       Limit       (%)    (%)         (%)            Reference      (days)

  Physical Analyses

    Grain size                        Solids    g dry wt.
    Percent moisture                  Solids    % moisture
    Total suspended solids            Water     mg/L

  (a)  Target compound list.
  (b)  Polychlorinated biphenyl.
  (c)  Target analyte list.

-------
                  TABLE B-2. ESTIMATED MAXIMUM NUMBER OF SAMPLES BY MONTH AND ANALYTE TYPE

                                                  (date)               (date)               (date)               (date)               Total
                                                  Maximum              Maximum              Maximum              Maximum              Maximum
  Analyte                                         Solids Water Tissue  Solids Water Tissue  Solids Water Tissue  Solids Water Tissue  Solids Water Tissue

  Organic Analyses
    TCL(a) semivolatile organic compounds
    TCL volatile organic compounds
    TCL pesticides and PCBs(b)
    Lipids

  Metals Analyses
    Copper
    Mercury
    TAL(c) metals

  Conventional and Nutrient-Related Analyses
    Acid-volatile sulfide
    Total organic carbon
    Dissolved organic carbon

  Physical Analyses
    Grain size
    Percent moisture
    Total suspended solids

  (a)  Target compound list.
  (b)  Polychlorinated biphenyl.
  (c)  Target analyte list.

-------
QUALITY ASSURANCE AND QUALITY CONTROL REQUIREMENTS

       The Laboratory shall implement the following procedures to assess quality during sample
       analysis (a worked example of the resulting QC frequencies follows the list):

          •   Calibration Verification—Initial calibration of instruments shall be per-
               formed at the start of the project and when any ongoing calibration does
               not meet control criteria.  The number of points used in the initial calibra-
               tion  is  defined in each analytical method  (e.g., Contract Laboratory
               Program [CLP]).  Ongoing calibration verification shall be performed as
               specified in the analytical methods to monitor instrument performance. In
               the event that an ongoing calibration is out of control, analysis of project
               samples shall be suspended until the source of the control failure is either
               eliminated or reduced to within control specifications. Any project samples
               analyzed while the instrument was out of control shall be reanalyzed at
               Laboratory's expense.

          •   Surrogate Spike  Compounds—The Laboratory shall spike all project
               samples to be analyzed for organic compounds with appropriate surrogate
               compounds as defined in the analytical methods (e.g., CLP). Recoveries
               determined using  these  surrogate  compounds shall be  reported by the
               Laboratory; however, the Laboratory shall not correct sample results using
               these recoveries.

           •   Method Blanks—The Laboratory shall not apply blank corrections to
               original data.  For organic analyses, a minimum of 1 method blank shall
               be analyzed for every extraction batch, or 1  for every 20 samples, whichev-
               er is more frequent. For metals and conventional analyses, 1 method blank
               shall be analyzed  for every digestion batch, or 1 for every 20  samples,
               whichever is more frequent.

           •   Matrix Spike Samples—For organic analyses and metals, the Laboratory
               shall analyze  a minimum of 1 matrix  spike  for each group of samples
               extracted or  digested, or 1  for every 20 samples,  whichever is  more
               frequent.  For organic analyses, 1 matrix  spike duplicate shall either be
               analyzed for each group of samples extracted or for every 20  samples,
               whichever is more frequent.

          •   Laboratory Control Samples—When available, the Laboratory  shall use
               laboratory control  samples (LCS). For metals and applicable conventional
               parameters, 1 LCS shall either be analyzed  for every digestion batch or for
               every 20 samples, whichever is more frequent.  The source of the LCS
               must be included in the data package.

          •   Laboratory Duplicates—The Laboratory shall perform duplicate analyses
               as indicators of laboratory precision. For metals analyses (except mercury)
               and conventional  analyses, the Laboratory  shall analyze 1  laboratory
               duplicate either for every digestion batch or for every 20 samples, whichev-
               er is  more frequent.

                                            B-5

-------
               Sample Container Preparation—Sample containers shall be prepared by
               the Laboratory and delivered to the project site, as required.   Sampling
               personnel shall discard any containers that have visible signs of dirt or
               contamination.  Documentation of the preparation of sample containers
               shall be prepared, signed, and dated by Laboratory personnel and included
               with the sample container shipment.
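
       The blank, matrix spike, laboratory control sample, and laboratory duplicate requirements
       above all reduce to the same rule: at least one QC analysis per preparation batch, or one
       per 20 samples, whichever is more frequent.  The following sketch is illustrative only and
       is not part of the SOW; the function name, the assumed default batch size, and the example
       sample count are hypothetical.

import math

def min_qc_analyses(n_samples, batch_size=20):
    """Minimum number of QC analyses implied by the rule 'one per
    preparation batch, or one per 20 samples, whichever is more
    frequent'.  The batch size is an assumed value; actual batches are
    defined by the laboratory's extraction or digestion groups."""
    if n_samples <= 0:
        return 0
    per_batch = math.ceil(n_samples / batch_size)   # one per preparation batch
    per_twenty = math.ceil(n_samples / 20)          # one per 20 samples
    return max(per_batch, per_twenty)               # whichever is more frequent

# Hypothetical example: 35 samples submitted for metals analysis
for qc_check in ("method blanks", "matrix spikes", "laboratory control samples",
                 "laboratory duplicates"):
    print(qc_check, "required:", min_qc_analyses(35))
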
DELIVERABLES

       The Laboratory shall report results that are supported by sufficient backup  data and
       quality assurance results to enable reviewers to conclusively determine the quality of the
       data.  The data and supporting documents shall be provided to the Contractor QA/QC
       Coordinator.  The Laboratory shall not divulge outside of Contractor any data or other
       information obtained or generated by the Laboratory with respect to the work  specified
       herein. Data reporting requirements are summarized below.
Laboratory Data Reports

       All data reports shall include the following:

       A.  General

           1.  A cover letter documenting all sample preparation and analytical protocols used
               and explaining any variance from protocols contained in the appropriate EPA
               statement of work (SOW) or this SOW.

           2.  Copies of completed chain-of-custody records and sample analysis request
               forms.

           3.  A cross-referenced table of Contractor and Laboratory identification numbers,
               and full  explanation  of  all data qualifier  symbols in accordance with  the
               appropriate EPA SOW.

           4.  Tabulated results in units specified in the appropriate EPA SOW or this SOW.

           5.  A table of sample preparation data,  including initial weights or volumes of
               samples,  final dilution volumes,  and  digestion or preparation reagents.  Data
               must  be  grouped by preparation date and include  the identity of all quality
               control checks  associated with each preparation batch.  If subsets of a large
               number of samples are prepared or digested at separate times, then each sample
               subset is  defined as a batch. Data provided in this  table must be sufficient to
               unequivocally match each field sample with the corresponding quality control
               check samples.
                                             B-6

-------
B.  Quality Control Results
    1.   For the analyses of inorganic compounds, the following summary results should
         be tabulated in the format of the appropriate indicated EPA form:
         a.   Initial and ongoing calibration verifications
         b.   Initial and ongoing calibration blanks and preparation blanks
         c.   Inductively coupled plasma-atomic emission spectrometry (ICP) interfer-
             ence checks
         d.   Matrix spike sample recoveries
         e.   Duplicate samples
         f.   Laboratory control sample recoveries
         g.   Method of standard additions, if performed
         h.   ICP serial dilution
         i.   Mercury  holding times, if performed
         j.   Instrument detection limits
         k.   ICP interelemental correction factors
         l.   ICP linear ranges.
    2.   For all other analyses, the following tabulated summaries of all quality control
         checks for each analyte should be included:
         a.   Initial and ongoing calibration verifications
         b.   Initial and ongoing calibration blanks and preparation blanks
         c.   Matrix spike sample recoveries
         d.   Duplicate samples
         e.   Independent standards.
C.  Original Data
    1.   Legible photocopies of all original data, including Laboratory notebook pages,
         computer printouts, and stripcharts, with sufficient information to unequivocally
         identify the following:
         a.   Calibration and ongoing calibration results
         b.   Surrogate spike compound recoveries
                                      B-7

-------
         c.   Samples and all dilutions
         d.   Results of all method blanks
         e.   Results of all matrix spikes and matrix spike duplicates
         f.   Results and origin of LCS analyses
         g.   Results of Laboratory duplicates and triplicates
         h.   Origin of all reference materials
         i.   Any instrument adjustments or apparent anomalies on the measurement
             record.
    2.   The following information should be shown  on the first page of each set of
         original data sheets pertaining to a particular protocol (e.g., ICP  computer
         printout):
         a.   A statement documenting the analyte(s) and the exact protocol used
         b.   The date of analysis
         c.   Typed name and signature of the analyst.
    3.   Copies of all sample container preparation documentation.
D.  Electronic Deliverables
All data reported on the EPA forms must also be submitted as a diskette deliverable. The
data should be in Format A (on an MS-DOS diskette),  as defined by the SOW.
E.  Other Information
Although not required as a deliverable for every data package, the following documenta-
tion must be available at the request of the Contractor QA/QC Coordinator as part of the
Laboratory's standard QA/QC procedures:
    •   All original data
    "   Sample receipt and storage logbooks
    •   Record of sample holding time
    •   Storage temperature logbooks
    •   Conductivity of distilled/deionized water
    "   Analytical  balance  annual and  routine (Class S  weights)  calibration
         logbooks
                                      B-8

-------
               Standard preparation and tracking logbooks, including purity of chemicals
               used to prepare standards

               Instrument calibration protocols and service record logbooks, including
               preventive maintenance

               Evidence of spot-checking of data handling

               In-house quality control charts.
TURNAROUND TIME

      Schedules for delivery of results may vary, but shall not exceed a turnaround time of
      	calendar days. Generally, a turnaround time of	days will be desired. For data that
      are delivered late, the Laboratory will be subject to, at the discretion of the Contractor,
      a penalty of	percent per calendar day for each day the data are late up to a maximum
      of	percent of the total cost of the analyses.
PROGRESS REPORTS, PROBLEM NOTIFICATION,
AND PROJECT AUDITS

       A verbal progress report to the Contractor QA/QC Coordinator is required each week for
      the duration of the project.  Immediate notification of the Contractor QA/QC Coordinator
       is required when the Laboratory identifies a problem that could prevent any QA/QC
       requirement or data quality objective, including required detection limits, from being met for
      the final data.  Contractor may conduct  onsite audits of the Laboratory's facilities during
      the period of analysis to assess implementation of QA/QC requirements. The Laboratory
      shall maintain records to support an audit of the technical quality of all analyses and shall
      provide all such records to Contractor upon request.
                                            B-9

-------
        APPENDIX C

   Description of Calibration,
  Quality Control Checks, and
Widely Used Analytical Methods

-------
CONTENTS
                                                                Page

     DESCRIPTION OF CALIBRATION, QUALITY CONTROL SAMPLES, AND
         WIDELY USED ANALYTICAL METHODS                        C-1

         INTRODUCTION                                           C-1

         CALIBRATION                                             C-1

         QUALITY CONTROL SAMPLES                               C-3

            Blanks                                                C-3
            Matrix Spikes                                          C-4
            Surrogate Spikes                                       C-4
            Check Standards                                       C-5
            Laboratory Control Samples                              C-5
            Spiked Method Blanks                                   C-5
            Reference Materials                                     C-5
            Replicates                                             C-6

         COMMON ANALYTICAL METHODS                           C-7

            Gas Chromatography                                    C-7
            Gas Chromatography/Mass Spectrometry                   C-7
            Gas Chromatography/Electron Capture Detection              C-9
            Gas Chromatography/Flame Ionization Detection              C-9
            High Pressure Liquid Chromatography                     C-10
            Atomic Absorption Spectrometry                         C-10
            Inductively Coupled Plasma-Atomic Emission Spectrometry    C-11
                                   C-iii

-------
DESCRIPTION OF CALIBRATION, QUALITY
CONTROL  SAMPLES, AND WIDELY USED
ANALYTICAL  METHODS
INTRODUCTION

      The relative importance, rationale, and recommended frequency of calibration and each
      of the quality control samples are discussed in the following sections. A summary of the
      major considerations in applying these procedures is provided in the main text  (see
      Section 2.7).

      The concepts of calibration and quality control samples apply to dozens of analytical
      methods that are currently used by  laboratory technicians.   Selection of appropriate
      methods for particular types of analyses is based on the list of chemicals for analysis and
      the required detection limits. Some of the widely used analytical methods are described
      below, along with technical issues that should be considered when choosing  individual
      methods.
CALIBRATION

      Calibration of analytical instruments is a critical element of quality control because the
      procedures used for calibration will determine  both the accuracy and precision  of
      analytical results.  Gas chromatography/mass spectrometry,  or any other  analytical
      technique, measures the magnitude of an unknown concentration of an analyte relative to
      a known concentration of the analyte or a similar analyte in a  standard.   Such relative
      measurements are meaningless unless the responsiveness of the  analytical instrument can
      be determined over a range of analyte concentrations. Through calibration, this level of
      responsiveness can be determined. The relationship between response and concentration
      is generally expressed as an analytical curve. For the analysis of organic compounds in
      samples, response factors (RFs) for analytes relative to  standards at various concen-
       trations may be established from this analytical curve. The degree to which incremen-
      tal concentrations of an analyte produce constant increments of response is  called
      linearity.
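
       As an illustration of these concepts (not a prescribed procedure), the sketch below fits a
       linear analytical curve to a set of calibration standards and computes response factors to
       gauge linearity.  The concentrations, instrument responses, and the use of the percent
       relative standard deviation of the response factors are hypothetical example choices, not
       method requirements.

# Hypothetical calibration standards: concentration (e.g., ug/L) and the
# corresponding instrument response (e.g., peak area counts).
conc = [1.0, 5.0, 10.0, 50.0, 100.0]
resp = [980.0, 5050.0, 10100.0, 49800.0, 101500.0]

# Least-squares fit of the analytical curve: response = slope * conc + intercept
n = len(conc)
mean_c = sum(conc) / n
mean_r = sum(resp) / n
slope = (sum((c - mean_c) * (r - mean_r) for c, r in zip(conc, resp))
         / sum((c - mean_c) ** 2 for c in conc))
intercept = mean_r - slope * mean_c

# Response factor (response per unit concentration) for each standard; a small
# relative standard deviation of the RFs indicates linearity over the range.
rfs = [r / c for c, r in zip(conc, resp)]
mean_rf = sum(rfs) / len(rfs)
rsd = 100.0 * (sum((rf - mean_rf) ** 2 for rf in rfs) / (len(rfs) - 1)) ** 0.5 / mean_rf

print(f"analytical curve: response = {slope:.1f} x concentration + {intercept:.1f}")
print(f"mean response factor = {mean_rf:.1f}, RSD = {rsd:.1f}%")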

      Guidelines for instrument calibration must be included in  the statement of work for the
      laboratory performing the analysis. Examples of these guidelines are given in Methods
      for Chemical Analysis of Water and Wastes (U.S. EPA 1983).  Project managers should
      ensure that the statement of work addresses the following points:
                                         C-1

-------
Instruments should be calibrated at the beginning of the project before any
samples are analyzed, after each major disruption in analytical procedures,
and whenever action limits are exceeded for certain samples. This type of
calibration is  called the initial  calibration of the instrument.  Through
initial calibration, an analytical curve based on the absorbance, emission
intensity,  or other measured characteristics of known standards  can be
established. Data from subsequent analyses are considered valid as long
as the values fall within the linear range of this curve.

In some analytical  programs, the  accuracy of the initial calibration is
verified and documented for every analyte by analyzing U.S. Environ-
mental Protection Agency (EPA) quality control  solutions immediately
following the initial calibration.  If immediate verification is not required,
then the  verification may be conducted after several samples have been
analyzed.  When a  certified solution  of an analyte is not available from
EPA or any other source, analyses should be conducted on an independent
standard  at a concentration other than that used for calibration, but within
the calibration range.  When measurements for the certified components
exceed the action limits, the analysis should be terminated, the problem
corrected, the  instrument recalibrated, and the recalibration verified.

The validity of the original calibration curve should be confirmed through-
out the analyses of samples. This process is called continuing calibration.
However, unless required by a specific method, the continuing calibration
results should not  be used to quantify sample results (use the average
response from  the initial calibration instead). For gas chromatography/mass
spectrometry (GC/MS) analyses of samples containing organic compounds,
calibration should be checked at the beginning of each work shift, at least
once every 12 hours (or every  10-12 analyses, whichever is more fre-
quent), and after the last sample  analysis  of each work  shift.  For gas
chromatography/electron capture detection analyses, calibration should be
checked at the beginning of each shift, every 6 hours (or every 6 samples,
whichever is less frequent), and after the last sample analysis of each shift.

For analyses with inductively coupled argon plasma emission spectrometry
and atomic absorption spectrometry, all work should be performed using
continuing calibration.  A procedure for  conducting these calibrations is
outlined in EPA's  Contract Laboratory Program statement of work for
inorganic chemicals (U.S. EPA 1990e).  Frequency of continuing calibra-
tion of these instruments is 10 percent of the samples or every 2 hours
during an analysis run, whichever is more frequent.
                              C-2

-------
QUALITY CONTROL SAMPLES

Blanks

        Blanks are quality control samples that are processed with the samples but contain only
        reagents. They are used to obtain the response of an analysis in the absence of a sample,
        including assessment of contamination from sources external to the sample.  Contamina-
        tion can arise from sources such as the reagents themselves, sample or reagent contain-
        ers, and equipment used for sampling, sample storage, and analysis.  The types of
        analytical blanks used to identify each of these potential sources of contamination are
        described below (a simple screening sketch follows the list):

           •   Method blanks (also called preparation blanks or reagent blanks) are used
               to identify any contamination that may have been contributed by laborato-
               ries during sample preparation. A method blank should be required  for
               each batch of samples prepared for analysis, except in the case of volatile
               organic analyses (VOAs),  in  which case, method  blanks  should  be
                analyzed at least once every 12 hours.  Because method blanks are usually
               included in the cost of sample analysis, they should not place an additional
               cost burden on a project.

           •   Bottle blanks are used to determine whether sample containers are sources
               of contamination.  One bottle blank should be prepared for each lot of
               sample containers. Large increases in the contaminant level for the bottle
               blank compared with  the method blank  indicate a potential  container
               problem.  Laboratories usually provide clean containers for performing
               bottle blank analyses at no additional cost.  For most sampling efforts,
               precleaned containers from a chemical supply company can be obtained at
               reasonable cost.  The use of precleaned bottles may eliminate the need to
               have bottle blanks analyzed.

            •   Transport blanks (also called trip blanks) are used to detect contamination
               arising during sample shipping, handling, and storage. These blanks are
               taken from clean containers filled with deionized water, transported to the
               field, and stored and  shipped with the samples.  One transport blank
               should be included with each shipping container.  A contaminant level for
               the  transport  blank  that greatly exceeds  the  contaminant level of the
               method blank indicates a  potential field handling, container,  or storage
               problem.   Transport blanks  are important only for projects  involving
               analysis  of  volatile organic compounds, which  may migrate from  one
               container to another.

           •   Field equipment blanks (also called decontamination checks) are used to
               detect contamination arising from field sampling equipment. At least one
               field equipment blank should be required for each medium that is sampled
               during a sampling effort.
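
       The blank comparisons described above can be reduced to a simple screen.  The sketch
       below is purely illustrative: the blank results and the factor of 5 used to represent
       "greatly exceeds" are assumptions made for the example, not criteria from this document.

# Hypothetical blank results for one sample shipment (ug/L).
method_blank = 0.02
other_blanks = {
    "bottle blank": 0.03,            # container check
    "transport blank": 0.25,         # shipping/handling/storage check
    "field equipment blank": 0.04,   # sampling equipment check
}

SCREEN_FACTOR = 5   # assumed factor for "greatly exceeds"; illustrative only
for name, result in other_blanks.items():
    if result > SCREEN_FACTOR * method_blank:
        print(f"{name}: {result} ug/L exceeds {SCREEN_FACTOR}x the method blank -- "
              "investigate containers, handling/storage, or sampling equipment")
    else:
        print(f"{name}: {result} ug/L is comparable to the method blank")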

                                             C-3

-------
Matrix Spikes
       Matrix spike samples are used to provide an indication of the bias due to matrix effects
       and an estimation of the precision of results.  They can also provide indications of how
       tightly an analyte is bound to its matrix, such as soil or tissue.  Matrix spike samples are
       created by adding known amounts of  chemicals of interest to actual samples, prior to
       extraction and usually prior to digestion. The addition of these chemicals is commonly
       called spiking.  The matrix spike is analyzed using the same  analytical procedure used
       for samples.   The results are then compared with the results from the  analysis of a
       replicate, unspiked sample.  In this way the effect of the particular sample matrix on the
       recovery of chemicals of concern can be evaluated. By spiking and analyzing the sample
       after digestion, an analyst can determine whether spike analysis results have been affected
       by matrix binding or by sample preparation procedures.  This postdigestion spiking is
       only used for metals analyses.

       Matrix spike samples should include a wide range of chemical types.  For example, a
       matrix spike sample for analysis of semivolatile organic compounds may include spiking
       with three neutral compounds, two organic acid compounds, and two  organic base
       compounds.   Ideally, samples should be spiked either at approximately 5 times  the
       expected chemical concentration in a  sample  or  at 5 times the target detection  limit,
       whichever is higher. Spiking at this concentration reduces the possibility for any increase
       in  random  error during the matrix  spike analysis  and  eliminates  any masking  of
       interferences at representative chemical concentrations.

       One matrix spike sample and  one matrix spike duplicate sample should be analyzed for
       every set of twenty or fewer samples or with each  sample preparation lot.  If 20 or more
       samples are submitted, 1 matrix spike duplicate pair should be run for each set of 20
       samples.  Analysis of matrix  spikes and matrix spike duplicates is often  performed to
       assess the precision and bias of one set of results.
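
       The calculations described above can be summarized in a short sketch.  The helper names
       and the copper example values below are hypothetical, and the recovery and relative
       percent difference formulas are standard calculations shown for illustration rather than
       requirements taken from this document.

def spike_level(expected_conc, target_detection_limit):
    """Spiking concentration: 5 times the expected sample concentration or
    5 times the target detection limit, whichever is higher."""
    return 5.0 * max(expected_conc, target_detection_limit)

def percent_recovery(spiked_result, unspiked_result, amount_added):
    """Matrix spike recovery as a percentage of the amount added."""
    return 100.0 * (spiked_result - unspiked_result) / amount_added

def relative_percent_difference(a, b):
    """RPD between matrix spike and matrix spike duplicate recoveries,
    used as an estimate of precision."""
    return 100.0 * abs(a - b) / ((a + b) / 2.0)

# Hypothetical copper example (mg/kg dry weight)
amount_added = spike_level(expected_conc=2.0, target_detection_limit=0.5)   # 10 mg/kg
ms = percent_recovery(11.3, 1.9, amount_added)    # about 94 percent
msd = percent_recovery(10.8, 1.9, amount_added)   # about 89 percent
print(f"spike level: {amount_added} mg/kg")
print(f"MS recovery {ms:.0f}%, MSD recovery {msd:.0f}%, "
      f"RPD {relative_percent_difference(ms, msd):.1f}%")
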
Surrogate Spikes

       Surrogate spike compounds can be used to estimate the recovery of organic compounds
       in a sample.  Surrogates are compounds with characteristics similar to those of com-
       pounds of interest that are  added to a sample before it undergoes  the  process  of
       extraction.  Surrogates  should be compounds that are not expected to be present in the
       samples, but  they should  have characteristics similar  to  the compounds of concern.
       Compounds labeled with stable isotopes (that is, where normal carbon or hydrogen atoms
       in the molecule have been replaced with isotopes of carbon or hydrogen) are commonly
       used as surrogates.  However, all surrogates need not be isotopically labeled.  They need
       only  be compounds that are physically and chemically similar  to  the  chemicals  of
       interest.  For example, dibromooctafluorobiphenyl is used by some laboratories  as a
       surrogate for polychlorinated biphenyls (PCBs), although this compound is not identical
       in structure to a PCB.
                                             C-4

-------
       Because surrogate compounds are the only means of checking method performance on
       a sample by sample basis, they should be used whenever possible.  A minimum of five
       surrogate spikes (three neutral and two acid compounds) should be added to each sample
       when  analyzing for semivolatile organic compounds.  These surrogate spikes should
       cover  a wide range of compound classes. At least three surrogate compounds should be
       used for the analysis of volatile organic compounds, and at least one surrogate compound
       should be used in each extracted  sample as a check on recovery of pesticides. A separate
       surrogate compound should be used in each extracted sample to check the recovery of
       PCB mixtures.
Check Standards

       Check standards contain known amounts of analyte and are analyzed along with the
       samples.  Check standard  results are used to indicate bias due to sample preparation
       and/or calibration and to control precision.
Laboratory Control Samples

       Laboratory control samples are check standards used to assess precision in the analytical
       procedures for metals.  Like reference materials, these  samples can  be  acquired from
       EPA.  Often they are routinely analyzed  by the laboratory at no extra cost.
Spiked Method Blanks

       In  certain organic methods, surrogate spikes are added to the check standards; these
       quality control samples are called spiked method blanks.  The different compounds and
        their required amounts are specified in EPA's guidelines for the Contract Laboratory
       Program  (U.S. EPA 1990d,e) and other regional guidelines.  Such analyses are useful
       to verify acceptable method performance prior to and during routine analysis of samples
       containing organic compounds.  Spiked method blanks do not take into account sample
       matrix effects, but can be used to identify basic problems in procedural steps.  Spiked
       method blanks can also be used to provide minimum recovery data when no suitable
       reference material is available or when sample size is insufficient for matrix spikes.  A
       spiked method blank should be analyzed whenever a method is used for the first time in
       a project  and each time that a method is modified.   In these instances, analysis of the
       spiked method blank should take place before analysis of any samples.
Reference Materials

       Reference materials are substances with well-characterized properties that are useful for
       assessing the bias of an analysis and auditing analytical performances among laboratories.
       SRMs are certified reference materials containing precise concentrations of chemicals,


                                             C-5

-------
       accurately determined by a variety of technically valid procedures, and are issued by the
       National Institute of Standards and Technology. Currently, SRMs are not available for
       the physical measurements or all pollutants  in sediments; however, where  possible,
       available SRMs or other regional reference materials that have been repeatedly tested
       should be analyzed with every 20 samples processed.  Further information on SRMs is
       provided in the main text (see Section 2.11.2).
Replicates
       Replicates are two or more identical samples that are analyzed to provide an estimate of
       the overall precision of sampling or analytical procedures.  When two separate samples
       are taken  from the same field station, or when  one  sample is split into two separate
       samples, these replicate samples are specifically called duplicates. Duplicates are usually
       sufficient  when using an  analytical procedure that is well proven in the laboratory.
       Analyzing three replicate samples (called triplicates)  yields more meaningful statistical
       measures  of variability  than analyzing duplicate samples.    However, statistically
       combining the variance of duplicate sample results across several sets of duplicates is also
       an effective way of evaluating variability.
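
        For example, the pooled standard deviation and relative standard deviation across several
        duplicate pairs can be computed as in the sketch below (Python).  The duplicate results
        shown are hypothetical and serve only to illustrate the pooling calculation.

            import math

            # Hypothetical duplicate results (e.g., ug/g) from several sets of duplicates.
            duplicate_pairs = [(10.2, 9.8), (5.1, 5.5), (20.3, 19.1), (7.7, 8.0)]

            # For a duplicate pair, the sample variance is (difference squared) / 2.
            # Averaging those variances across pairs pools the precision estimate.
            pooled_variance = sum((a - b) ** 2 / 2.0 for a, b in duplicate_pairs) / len(duplicate_pairs)
            pooled_sd = math.sqrt(pooled_variance)

            grand_mean = sum(a + b for a, b in duplicate_pairs) / (2 * len(duplicate_pairs))
            pooled_rsd = 100.0 * pooled_sd / grand_mean

            print(pooled_sd, pooled_rsd)   # roughly 0.48 and 4.5 percent for these values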

       Replicate samples are commonly used for the following purposes:

            •   Analytical (or laboratory) replicates measure the precision of sample
               analyses. To prepare analytical replicates, the sample is homogenized by
               the laboratory and divided into two subsamples.  The subsamples are then
               independently analyzed.   If  five or fewer samples are submitted for
               analysis, a minimum of one analytical replicate is recommended, the exact
               number to be determined by the project manager. If more than 5 but less
               than 20  samples are submitted, at least 1 analytical replicate should be
                analyzed.  A general rule is 1 analytical replicate for every batch of up to
               20 samples analyzed together (e.g., U.S. EPA 1990d).

           "   Field replicates measure sampling variability. These samples are collect-
               ed at the same time and location as other samples and are submitted for
               analysis  along with the other samples. Field replicates should be coordi-
               nated with analysis of laboratory replicates  so that both  sampling varia-
               bility and analytical variability can be measured for the same station. The
               project manager or coordinator usually determines  the  frequency with
               which field replicates are collected and sent to the laboratory. If funds are
               limited,  a single laboratory replicate to measure analytical variability is
               preferred over a field replicate.

            •   Blind replicates are samples submitted to the laboratory without the
               laboratory's prior knowledge.  Data from these  blind replicates can  be
               used to detect potential laboratory bias when  compared with data from the
               analysis of analytical replicates. In this manner, blind replicates can serve
                                             C-6

-------
               as laboratory quality control samples.  However, the results for these
               samples are  subject to errors introduced by the process of splitting the
               sample and by preservation, transportation, and storage procedures as well
               as analytical errors.   Analysis of 1 set of blind replicates  should  be
               performed whenever 20 or more samples are submitted.  At least one
               triplicate set is recommended for analysis of more than 20 samples.
COMMON ANALYTICAL METHODS


Gas Chromatography

      Gas chromatography is a technique  used  to  separate a complex mixture of organic
      materials  into its components (for example,  an extract of oil  or smoke, which may
      contain hundreds, even thousands, of compounds).  To do this, the sample  extract is
      injected into a heated chamber, in which the mixture of compounds is  concentrated at the
      head of a separating column. The mixture is then carried through the  column by an inert
      gas (called the mobile phase).  As the column is heated, the analytes pass through
      absorbent materials (called the stationary phase).  Different analytes move at different
      rates and appear one after another, along with any interfering substances for a  particular
      analyte, at the effluent end of the column.  Here they are measured by a detector.  The
      detector sends information as  an electronic signal  to  an integrator,  chart recorder, or
      computer.  The signals are then interpreted and presented graphically in the  form of a
      chromatogram and digitally as a quantification report.

      Using  the chromatogram and the  digital information contained in  the quantification
      report, many analytes contained in the sample can be accurately identified and quantified.
       Several different gas chromatograph/detector combinations are commonly used for the
      analysis of volatile  and semivolatile organic compounds, which include pesticides and
      PCBs.  Three of these combinations are described  in the following sections.
Gas Chromatography/Mass Spectrometry

      GC/MS  enables positive  identification of a  compound that has  eluted from  a gas
      chromatographic column. In the GC/MS chamber, separated compounds are bombarded
      by electrons  and broken into characteristic fragments called ions.   The mass of the
      charged ions  (i.e., their molecular weight) can be sensed by a detector that accumulates
      data on ionization current over a wide range of masses.  The more ions of a particular
      mass, the greater the ionization current that is recorded for that mass.  At any one time,
      the relative intensity of this current over all the different masses recorded for a particular
       compound gives rise to its mass spectrum (Figure C-1).  The pattern of fragmentation
      ions in a mass spectrum is used to distinguish one compound from another. In addition,
      the intensity of the current recorded for one characteristic ion over time gives rise to its
       mass chromatogram, which is used to quantify the concentration of the analyte as it
                                            C-7

-------
              Figure C-1. Example mass spectrum for benz(a)anthracene identified in a sample sediment extract (upper) and
                          authentic spectrum stored in computerized GC/MS library (lower).

-------
       elutes from the gas chromatograph.  This characteristic ion is called the quantification
       ion.  The mass chromatograms for all ions detected  can  be superimposed  into a
        reconstructed ion chromatogram (RIC), also called a total ion chromatogram.  The RIC
       is a graphic display of the total ionization current resulting from all mass fragments for
       all compounds detected from the start to the finish of the analysis.   The RIC  can be
       compared with the chromatograms produced by other detectors  and provides an indication
        of the relative composition of components in the sample mixture analyzed by GC/MS.
       The mass spectrometer is a selective detector that allows for  the positive identification
       of many compounds.  Other kinds of detectors may be more sensitive in detecting PCBs
       and other chlorinated compounds.
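
        The comparison of a sample spectrum with a library spectrum (as in Figure C-1) can be
        viewed as scoring the similarity of two lists of ion abundances.  The sketch below
        (Python) uses a simple cosine similarity score and abbreviated, hypothetical spectra; it
        illustrates the concept only and is not the matching algorithm of any particular GC/MS
        data system.

            import math

            def cosine_match(sample_spectrum, library_spectrum):
                """Score two mass spectra (m/z -> relative abundance) on a 0-to-1 scale."""
                masses = set(sample_spectrum) | set(library_spectrum)
                dot = sum(sample_spectrum.get(m, 0.0) * library_spectrum.get(m, 0.0) for m in masses)
                norm_s = math.sqrt(sum(v * v for v in sample_spectrum.values()))
                norm_l = math.sqrt(sum(v * v for v in library_spectrum.values()))
                return dot / (norm_s * norm_l)

            # Abbreviated, hypothetical spectra (m/z: relative abundance).
            sample = {228: 100, 226: 25, 114: 12, 101: 8}
            library = {228: 100, 226: 22, 114: 14, 113: 6}

            print(round(cosine_match(sample, library), 3))   # close to 1.0 for a good match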


Gas Chromatography/Electron Capture Detection

       Gas chromatography/electron capture detection (GC/ECD) is useful for detecting analytes
       such as  pesticides, PCBs, and other similarly structured chemical compounds that contain
        chlorine.  The ECD measures the total concentration of a chemical in a sample, but it
       cannot  distinguish  one  individual chemical from  others.  Verification of  individual
       chemicals is accomplished by comparing the order in which the chemicals appear (called
       the elution order) and the time that  passed before they appeared (called the retention
       time) with the elution orders and retention times of certain analytical standards.  The
       identity of a chemical is verified when the elution orders and retention times match on
       two columns  of different stationary  phases.  This verification technique, called dual
       dissimilar column confirmation, is useful because two chemicals that may have the same
       elution  orders and retention times on one column will  have different characteristics on
       the second column.
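
        In practice, dual dissimilar column confirmation reduces to checking that the unknown
        peak matches the standard's retention time, within a small window, on both columns.
        The sketch below (Python) shows that logic with hypothetical retention times and an
        assumed window width; actual windows are set by the laboratory's own procedures.

            def confirmed(rt_col1, rt_col2, std_rt_col1, std_rt_col2, window=0.05):
                """Require a retention time match (within the window, minutes) on both columns."""
                return (abs(rt_col1 - std_rt_col1) <= window and
                        abs(rt_col2 - std_rt_col2) <= window)

            # Hypothetical retention times (minutes) for one pesticide on two dissimilar columns.
            print(confirmed(18.42, 21.07, std_rt_col1=18.40, std_rt_col2=21.05))   # True
            print(confirmed(18.42, 20.61, std_rt_col1=18.40, std_rt_col2=21.05))   # False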


Gas Chromatography/Flame Ionization Detection

       Gas chromatography/flame ionization detection (GC/FID) can be used to detect organic
       compounds that can be converted  to ions during exposure to flame.  This kind of detector
       is especially sensitive to molecules that  contain  carbon and hydrogen, just  as  the
       GC/ECD is especially sensitive to molecules containing chlorine.  Because the GC/FID,
       like the GC/ECD,  cannot distinguish between individual chemicals,  dual  dissimilar
       column confirmation must also  be  performed for each sample analyzed.  Related
       detectors that use flame for analyzing organic samples include  the nitrogen flame ioniza-
       tion detector (NFID), which is especially sensitive to nitrogen- and phosphorus-containing
       molecules, and the  flame photometric detector (FPD), which is especially sensitive to
       organophosphorus pesticides and other compounds containing sulfur.
                                            C-9

-------
                               PACKED VS. CAPILLARY COLUMNS

           Different kinds of separating columns will yield different results.  Packed columns
           have been used routinely in the past for the analysis of PCBs, pesticides, and
           volatile organic compounds.  Packed columns produce chromatograms of fairly low
           resolution, although the results may be reproducible (i.e., precise).  However, a
           large quantity of the sample extract can be analyzed without overloading the
           instrument.  More exacting analysis is afforded by either megabore capillary or
           fused silica capillary columns.  Pesticides and PCBs can now be routinely analyzed
           using megabore columns.  Analysis of volatile organic compounds can be conducted
           on capillary columns.  However, because the entire sample purge is used for
           volatile analyses, a packed column with high loading capacity may still be preferred
           if high resolution is not essential.  If project results are dependent on detailed
           recognition of contaminant mixtures (as is the case with PCBs and toxaphene),
           laboratories equipped with capillary columns should be selected to perform
           analytical tasks.
High Pressure Liquid Chromatography

        Like gas chromatography, high pressure liquid chromatography (HPLC) is a technique
       used to separate a complex mixture into its component compounds. The compounds are
       carried as a liquid through solid absorbent phases and  are sensed at the effluent end of
       the column by a specialized  detector sensitive to,  for example, ultraviolet, fluorescent,
        or infrared signals.  This technique (described in EPA's laboratory manual Test Methods
        for Evaluating Solid Waste [U.S. EPA 1986a]) is useful for analyzing polycyclic aromatic
       hydrocarbon (PAH) compounds in samples because many  interferents on other instru-
       ments do not emit ultraviolet or fluorescent spectra, thereby increasing the sensitivity of
       the ultraviolet/fluorescent detector to many  PAH compounds.  However, some  com-
       pounds of interest also do not emit these characteristic  spectra. It is for this reason that
       EPA's Contract Laboratory Program statement of work  for organic analysis recommends
       GC/MS over HPLC using ultraviolet/fluorescent  detectors.  However, HPLC can be
       useful as a way to screen samples for PAH  contamination.  Because it removes  some
       interferents and separates the sample into components that can be individually collected
       and analyzed, HPLC can also be used as a powerful cleanup technique.
Atomic Absorption Spectrometry

       Two basic methods of spectrometry are commonly used  to  identify and  measure
       concentrations of metals in a  sample.   Using  the  first  method, atomic  absorption
       spectrometry, the digested sample is first vaporized and then exposed to a light source
       emitting a  spectrum characteristic of the target analyte.   A  portion  of the light  is
       absorbed by the analyte in the sample.  The remaining light is measured by a photoelec-
       tric detector and assigned a numerical value.  Because the intensity of light absorbed by
       the sample is proportional to the quantity of the target analyte present in the light's path,
                                             C-10

-------
       this value represents the concentration of a metal in the sample.  Several different forms
       of atomic absorption are frequently used:

           •   Graphite furnace atomic absorption spectrometry (GFAA) determinations
               are completed as single element analyses.  With this technique, sample
               digestates are vaporized in an electrically heated graphite furnace.  The
               furnace is designed to gradually  heat the digestates in several stages,
               allowing an experienced analyst to remove unwanted matrix components
                and select the optimum final temperature for the metal being analyzed.
                The major advantage of this technique is that it affords extremely low
                detection limits, which are particularly essential in the analysis of arsenic,
               cadmium, selenium, or lead.  Samples must be relatively clean for GFAA
               to produce usable data.

           •   Hydride generation atomic absorption (HGAA) spectrometry uses a chemi-
               cal reaction to  separate arsenic or selenium selectively from a sample
                digestate.  This technique removes these two elements from the sample
               matrix, minimizing interferences and improving instrument sensitivity.

           •   Cold  vapor atomic absorption  (CVAA) spectrometry uses a chemical
               reaction to release mercury from the digestate as a vapor, which is then
               analyzed by atomic absorption.  This method should be used whenever
               analysis of mercury in samples is required.

           •   Flame atomic absorption (FLAA)  spectrometry determinations are nor-
               mally  completed as single element analyses, following exposure  of the
               vaporized samples  to  either a  nitrous oxide/acetylene or air/acetylene
                flame.  Data produced using this technique are relatively free of interfer-
                ents; however, instrument sensitivity is not as great as with other forms of
               atomic absorption.
Inductively Coupled Plasma-Atomic Emission Spectrometry

       The second widely used and cost-effective form of spectrometry is inductively coupled
       plasma-atomic emission spectrometry (ICP).  Using ICP, the digested sample is first
       turned  into  an aerosol,  then  subjected  to extremely  high  temperatures  within  the
       instrument.  The  high  temperature  ionizes  the atoms,  which produce ionic emission
       spectra uniquely characteristic of specific metals.  The wavelengths of these spectra can
       then be used to identify one or many different metals in the sample, while the intensity
       of light can be used to determine metals concentrations.

       The primary advantage of ICP is that it allows simultaneous or rapid sequential determi-
       nation of many different metals, reducing the time and cost of individual metals analyses.
       The primary  disadvantage of ICP, however,  is its lower degree  of sensitivity.   The
       detection limit associated  with ICP analysis is  often higher than the detection limit that
       can be obtained through the use of a graphite furnace or several other forms of atomic
                                             C-11

-------
absorption spectrometry.  Although all ICP instruments use high-resolution optics and
background  corrections to minimize interferences, analysis for traces of metals in the
presence of a large excess of a single metal can be difficult.  Spectrometric data are
reliable only if the analyte concentrations in the digestate are 5-10 times greater than the
instrument detection limit.  When concentrations are lower than this value for ICP
analysis (as is often the case, for example, with samples containing arsenic or lead), then
GFAA should be used.  A relatively new method of detection is the use of combined
inductively coupled plasma-mass spectrometry (ICP/MS),  which not only allows for
simultaneous determination of many different metals, but can also achieve lower
detection limits, comparable to those obtained using graphite furnace techniques.
                                      C-12

-------
   APPENDIX D

 Example Standard
Operating Procedures

-------
CONTENTS
                                                         Page

     GENERAL STANDARD OPERATING PROCEDURES

        SAMPLE PACKAGING AND SHIPPING                      D-1

        EQUIPMENT DECONTAMINATION                         D-3

     SPECIFIC ANALYTICAL STANDARD OPERATING PROCEDURES

        SEMIVOLATILE ORGANIC ANALYTES IN SEDIMENT AND
        TISSUE EXTRACTS                                    D-8

        ANALYSIS OF PAHs BY GC/MS                          D-11

        ANALYSIS OF PCBs AND CHLORINATED PESTICIDES           D-14

        INSTRUMENTAL ANALYSIS OF METALS IN SEDIMENT AND
        TISSUE EXTRACTS                                    D-18

        SEDIMENT EXTRACTION OF SEMIVOLATILE ORGANIC
        ANALYSES                                         D-22

        DIGESTION OF MARINE ORGANISM SAMPLES FOR METALS
        ANALYSIS                                          D-25

        TOTAL DIGESTION OF SEDIMENT SAMPLES                 D-30

        TISSUE EXTRACTION OF SEMIVOLATILE ORGANIC ANALYTES   D-34
                                D-iii

-------
General Standard Operating Procedures

-------
           STANDARD OPERATING  PROCEDURE
            SAMPLE PACKAGING AND SHIPPING
       This procedure applies to samples collected during field operations that will be classified
       as "environmental."  Specific sample packaging and shipping requirements are described below.
ENVIRONMENTAL SAMPLES

      All samples identified as Environmental Samples should be packaged and/or shipped
      utilizing the following procedures.
Packaging

      1.   Place samples into a strong container, such as a lined cooler or a U.S. Department
          of Transportation (DOT)-approved fiberboard box.  The inside of the container
          should be lined with a polyethylene bag.  Wrap glass jars with bubble-pack and
          surround the  samples with noncombustible, absorbent, cushioning material for
          stability during transport.

      2.   Seal the large polyethylene bag with two chain-of-custody seals.

      3.   Place the laboratory/sampling (including chain-of-custody) paperwork in a large
          envelope and tape it to the inside lid of the shipping container (see Shipping Papers).

      4.   Close and seal the outside container with several chain-of-custody seals. Tape it shut
          using fiberglass tape.

                                        D-1

-------
Marking/Labeling

        1.   Use abbreviations only where specified.

       2.   Place the following information, either hand-printed or in label form, on the outside
           container:

           •   Laboratory name and address

           •   Return name and address.

       3.   Print "Environmental Samples" and "This End Up" clearly on top of the shipping
           container. Put upward pointing arrows  on all four sides of the container. No other
           marking or labeling is required.


Shipping Papers

       No DOT  shipping papers are required.  The  following sample custody  and analytical
       laboratory request forms should accompany the sample shipment.  These  documents
       should be taped to the inside lid of the outside sample container:

            •   Chain-of-custody form

           •   Sample analytical request form

            •   Sample packing list.

       See the quality assurance project plan for procedures in filling out these forms.
                                            D-2

-------
      STANDARD OPERATING PROCEDURE
         EQUIPMENT DECONTAMINATION
The purpose of this standard operating procedure (SOP) is to define decontamination
procedures for field equipment used for collecting soil, sediment, and water samples.
Techniques for ridding equipment of both metals and organic contaminants are discussed.
Sampling equipment is  decontaminated between each  sampling event to  avoid cross
contamination of samples and to help maintain a healthy working environment. Protective
clothing is worn by  all field technicians  during sampling and  decontamination as
described in the health and safety plan.

It is the responsibility of the field sampling coordinator to assure that proper decontami-
nation procedures are followed and that all waste materials produced by decontamination
are properly managed.  It is the responsibility of the project safety officer to draft and
enforce safety measures  that provide the best protection for all persons involved directly
with  sampling or decontamination.   All subcontractors  (e.g., drilling contractors) are
required to follow the decontamination procedures specified in the contract, the health and
safety plan, and this SOP. Individuals involved in sampling and/or decontamination are
responsible for maintaining a clean working environment and ensuring that contaminants
are not introduced to the environment.

All equipment will be decontaminated using a series  of washes and rinses  designed to
remove materials of interest without leaving residues that will in any way interfere with
analysis of the samples taken with that equipment. In addition, the decontamination site
will be set up at a location separate from the sampling area in order to isolate these two
activities.
                                    D-3

-------
Field equipment blanks will be taken at a frequency of 5 percent of samples and sent to
the laboratory(s) for analysis along with the regular samples.  These blanks will serve as
a quality assurance indicator of possible cross contamination of samples.  When feasible,
samples to be taken with the same equipment will be taken in order from  lowest to
highest suspected  contaminant levels to minimize the chances of cross contamination.

The following is a list of materials that are required on site to support decontamination.
The quantity and actual use of each item will be dependent on the overall size and nature
of the sampling effort.

     •   Cleaning liquids and  dispensers: soap and/or phosphate free detergent
         solutions, tap water, methanol,  10  percent nitric acid, distilled/deionized
         water

     •   Personal safety gear as defined in the project health and safety plan

     •   Chemical-free paper towels and/or tissues

     •   Powder-free disposable latex gloves

     •   Waste storage containers: drums, boxes, plastic bags

     •   Plastic ground cloth  on which to lay clean equipment

     •   Cleaning containers: plastic and/or galvanized steel tubs and buckets

     •   Cleaning brushes with non-contaminating  stiff bristles

     •   Steam cleaning apparatus (supplied by drilling contractor).

The  materials used in  decontamination activities are located a minimum of 15-30 feet
downwind  of the sampling site as  designated by  the task leader.  Decontamination will
be carried out before moving to the next sampling site to avoid transporting contaminants.
                                        D-4

-------
PROCEDURES
       Regardless of the type of contamination that requires removal, the basic steps involved
       are  the  same.  Procedures unique  to organic, metal,  and organic/metal  combined
       contamination are discussed in their respective sections that follow.
Step 1:  Gross Removal of Material

       Steam Cleaning

       Depending on the availability of apparatus (e.g., drilling operations), steam cleaning
       combined with brushing is the preferred method of initial material removal. Using steam
       alone introduces little further contamination, and is a very efficient way of removing
       materials.  Equipment such as spatulas, split spoons, and drill flights are placed in and/or
       suspended over tubs that catch contaminated wash waters for proper disposal.


       Detergent Wash

       In cases where steam apparatus is not available, a phosphate free detergent wash and tap
       water rinse may be used. A detergent bath is  formulated in a tub large enough to hold
       the equipment to be  washed leaving  enough volume to  hold the tap water rinses.   All
       material is brushed from the equipment into the tub.  The equipment is rinsed with tap
       water while suspended over the wash tub. Because detergents can contain low levels of
       interfering contaminants for both organic and metals analysis, the thoroughness of the
       final rinse in this step is of utmost importance. When the analyte levels in the samples
       to  be  taken  by the  decontaminated equipment  are  suspected to be very low (e.g.,
       background level), it is recommended that the  detergent  wash be replaced by a distilled
       water wash or steam cleaning when available, followed by a decontamination equipment
       blank as described below.
                                            D-5

-------
Step 2:  Specific Contaminant Removal

       Organic Contaminants

       For removal of general organic contaminants, the solvent of choice is methanol because
       a) it dissolves all contaminants of concern and b) it is miscible with water which means
       it can be removed with a water rinse. The equipment is suspended over a tub and rinsed
       from the top down with high purity methanol delivered by peristaltic pump  for large
       pieces, or a squirt bottle for smaller pieces.  Rinse wastes are disposed of according to
       the project health and safety plan.


       Metal Contaminants

       Metals require acid solvents for efficient removal. Nitric acid is the acid of choice
       because  of  its ability to  dissolve all of the metals of concern.   The equipment is
       suspended over a tub and rinsed from the top down with 10 percent nitric acid delivered
       by peristaltic pump for large pieces, or a squirt bottle for smaller pieces.  Rinse wastes
       are disposed of according  to the project health and safety plan.


       Combined Organic/Metals Contaminants

       When equipment will be used to take samples that will be analyzed for both metal and
       organic constituents, the acid rinse is performed followed by the methanol rinse, each as
       described above.  Due to the difficulty in obtaining organics free acids, and the ease of
       obtaining metals free methanol, the order of the two rinses must not be reversed.
                                            D-6

-------
Step 3:  Final Distilled/Deionized Water Rinse

       A final rinse with distilled/deionized water is carried out last to remove the contaminant
       specific solvents (i.e., nitric acid and/or methanol).  Because these solvents may
       themselves interfere with sample analyses, this step is very important and must be carried
       out thoroughly. The equipment is suspended over a waste  tub, and rinsed from the top
       down with distilled/deionized water delivered  by pump or squirt bottle, depending on
       equipment size.   In  the case of  metals decontamination, a simple  pH monitoring
       technique (e.g., pH paper)  may be used to monitor rinse water in determining rinse
       completion.
Step 4:  Air Dry

       Before an equipment blank is taken, the equipment is laid out on a clean plastic ground
       cloth and allowed to dry.  The equipment should be protected from gross contamination
       during the drying process.
Equipment Blanks

       Equipment blanks are taken between selected samplings as described in the Sampling and
       Analysis Plan. Equipment is rinsed with distilled water that is subsequently collected in
       a sample container.  The rinsate sample is then labeled and shipped as a blind sample to
       the laboratory(s) with regular  samples.  One blank is created in this way for each
       analysis to be performed on samples taken  with this equipment unless otherwise stated
       in the quality assurance plan. The equipment should be protected from contamination
       between the time the blank is taken and the time the next sample is collected.
                                            D-7

-------
      Specific Analytical
Standard Operating Procedures

-------
                          ERLN CHEMISTRY GROUP
   STANDARD OPERATING PROCEDURE FOR COLUMN CHROMATOGRAPHY
                    OF SEMIVOLATILE ORGANIC ANALYTES
                    IN SEDIMENT AND TISSUE EXTRACTS
                         (REVISED FEBRUARY 1993)
1.0 OBJECTIVES

       The objective of this document is to define the standard operating procedure for the
       preparation of columns for the cleanup and chemical class separation of semi-volatile
       organic compounds from marine samples.  The extract fractions will be analyzed by gas
       chromatography (GC) or gas chromatography/mass spectrometry (GC/MS).

2.0 MATERIALS AND EQUIPMENT

      9.5-mm ID X 45-cm glass chromatography column with 200 ml reservoir

      Apparatus for determining weight
            Top-loading balance capable of weighing to 0.01 g

      Turbo-Vap (Zymark) apparatus, with heated water bath maintained at 25-35° C
            Glass Turbo-Vap flasks, 200 ml
            Nitrogen gas, compressed, 99.9% pure

      Tumbler, ball-mill

      Glass graduated cylinders, 100- and 500-ml

      Glass beakers, 50-ml

      Borosilicate glass vials with Teflon-lined screw caps, 2-ml

      Micropipets, solvent rinsed or muffled at 400°C

      Reagents
            Pentane, pesticide grade or equivalent
             Methylene chloride (CH2Cl2), pesticide grade or
                   equivalent
            Hexane,  pesticide grade or equivalent
            Heptane, pesticide grade or equivalent
            Deionized water, pentane-extracted

                                        D-8

-------
            BioSil A silicic acid, 100-200 mesh
             Glass wool
3.0  METHODS

      3.1 Silica gel preparation

            3.1.1 Approximately 150 grams of fully activated silica gel is accurately weighed
            and transferred to a glass jar.

            3.1.2  The silica gel is deactivated by adding 7.5% (weight basis) of pentane-
            extracted deionized water.  The water is weighed accurately and an appropriate
             amount is added dropwise, approximately 1 ml at a time, to the silica gel.  After each water
            addition, the jar is hand-shaken vigorously.

             3.1.3  The glass jar is then placed on a ball-mill tumbler and allowed to tumble
             overnight.

             3.1.4  After tumbling, the jar is removed from the tumbler.  The silica gel is
             stored tightly sealed in the jar at room temperature until use.

      3.2 Column preparation

            3.2.1  The glass columns are set up in ring stands in a fume hood.

             3.2.2  Glass wool, sufficient to create a 1 cm thick plug in the column, is placed
             into the reservoir of the column.  A glass rod is used to push the glass wool to
             the bottom of the column.

             3.2.3  11.5 g of the 7.5% deactivated silica gel is weighed out in a beaker.
             Approximately 30 ml of CH2Cl2 is added to the beaker to form a slurry.  The
             slurry is then carefully poured into the column.  The beaker is rinsed with
             additional CH2Cl2, as are the inner walls of the reservoir, to ensure all silica is
             introduced to the column.  The total volume of CH2Cl2 should be approximately
             50 ml.

             3.2.4  The column is allowed to drip, and the eluate is collected and discarded.
             When the level of the CH2Cl2 just reaches the top of the silica gel, 50 ml of
             pentane is slowly added to the column.  This eluate is also collected and
             discarded.

      3.3 Chemical class separations

            3.3.1  The sample extract is introduced to the column just as the pentane rinse

                                             D-9

-------
             level reaches the silica gel.  The vial is then rinsed with an additional 1 ml of
             pentane which is also introduced to the column just before the silica gel is
             exposed. The eluate is collected in a clean round bottom flask.

             3.3.2 As the sample rinse level reaches the silica gel, 55 ml of pentane is added
             to the column. The eluate is collected as the F-l fraction in a clean Turbo-Vap
             flask.

              3.3.3  As the pentane level reaches the top of the silica, 36 ml of 70:30
              pentane:methylene chloride is introduced to the column.  The F-2 fraction is
              collected in a separate Turbo-Vap flask from the F-1 fraction.  After collection,
              the flasks are kept tightly capped with aluminum foil.  At no time should the
              column flow rate exceed 6 ml/min.

             3.3.4 After the F-2 fraction has been collected from the column, the flasks are
             placed in the Turbo-Vap.   The apparatus is turned on and Nitrogen gas is
             introduced  to the flasks.  The solvent is reduced to approximately 1  ml.  The
             samples are then solvent-exchanged to heptane and concentrated to about 1 ml.

              3.3.5  The fractions are then transferred to borosilicate glass vials fitted with
             Teflon-lined screw caps for storage until analysis.

4.0 QUALITY ASSURANCE/QUALITY CONTROL

       4.1 Silica Gel Testing
             4.1.1 Silica Gel is verified to separate compound classes using the  silica gel
             testing SOP.

       4.2 Method Blanks
             4.2.1 Method (procedural) blanks are included in each sample set to provide an
             estimate of contamination from the reagents.

       4.3 Internal Standard Recovery
             4.3.1 PCB103 is added  to final column fractions to calculate recovery of the
             internal standard.
                                            D-10

-------
                          ERLN CHEMISTRY GROUP
             STANDARD OPERATING PROCEDURE FOR ANALYSIS
                              OF PAHs BY GC/MS
                          (REVISED FEBRUARY 1993)
1.0 OBJECTIVES
      The objective of this document is to define the standard procedure for analyzing marine
       environmental samples for PAHs using GC/MS in electron impact/positive ion mode.

2.0 EQUIPMENT

       HP Model 5890 Series II Gas Chromatograph
      HP Model 5971A Mass Selective Detector
      HP Model 7673 Autosampler
      HP MS Chemstation (DOS Series) Software
      IBM Compatible Personal Computer

3.0 OPERATION

      A. Instrument Parameters

      Column: 60 m x 0.25 mm ID x 0.25 urn DB-5 (J&W Scientific)
      Carrier: Helium at 25 psi; 0.8-1.0 ml/min
      Injector: 270 degrees C; splitless mode, purge on at 0.8 min
      Interface: 300 degrees  C; direct, source 200 degrees C
      Temperature Program: 1 min, 40 deg; 20 deg/min to 120 deg; 10 deg/min to 310 deg
            and hold 16 min. This is suitable for Polycyclic Aromatic Hydrocarbons.
      MS Parameters: Set by Autotune using PFTBA as the calibration compound; Manual
            Tune is then used to force the 131 and 219 abundances to 20 to 40 percent of the
             69 base peak; the electron multiplier is then set to meet the requirements of the
             particular method.  This procedure is done in a series of loops, as new parameter
             settings for a specific lens will affect the behavior of the others.

      B. Daily Performance Checks

       1) Adequate DFTPP spectrum (see attached criteria), based on a 50 ng injection.
       2) Calibration Check - results for a mid-level standard must be within 25 percent of the
             true value for any single target compound; the average error for all compounds in
             the method must be less than 15 percent.
                                          D-11

-------
       C. Calibration

       The calibration method is a 5 point, internal standard, least squares fit, forced through
       the origin. The levels are chosen to cover a range from 4 to 10 times the instrument
       detection limit for the lowest point, up to the point at which saturation and/or non-linear
       behavior is observed. For PAHs in marine sediment or tissue, the current levels are 1.0,
       5.0, 10.0, 15.0, and 20.0 ng/ul. Acceptance criteria for each level are the same as listed
       for the daily check.
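
        As an illustration, the sketch below (Python) fits a forced-through-origin, internal-
        standard calibration to five hypothetical response ratios and then evaluates a mid-level
        check standard against the 25 percent criterion given in the daily performance checks
        above.  The area ratios, internal standard concentration, and response factor are
        illustrative assumptions, not values from this SOP.

            # Five-point internal-standard calibration, least squares forced through the origin:
            # (analyte area / IS area) = RF x (analyte concentration / IS concentration).

            levels = [1.0, 5.0, 10.0, 15.0, 20.0]         # standard concentrations, ng/uL
            is_conc = 10.0                                # assumed internal standard conc., ng/uL
            area_ratios = [0.11, 0.52, 1.05, 1.49, 2.02]  # hypothetical measured area ratios

            conc_ratios = [c / is_conc for c in levels]
            # Slope through the origin: RF = sum(x*y) / sum(x*x).
            rf = sum(x * y for x, y in zip(conc_ratios, area_ratios)) / sum(x * x for x in conc_ratios)

            # Daily check against a mid-level standard with a true value of 10.0 ng/uL.
            measured_ratio = 0.98
            calculated = measured_ratio / rf * is_conc
            error_pct = abs(calculated - 10.0) / 10.0 * 100.0
            print(rf, error_pct)   # error should be within 25 percent for each compound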

       D. Sample Analysis

        A 250 uL aliquot of the sample extract is blown down to 20-25 uL with nitrogen or
       helium. If required, an internal injection standard is added (4-chloro-p-terphenyl).  Once
       the  daily performance  checks  are satisfied, the extracts  are  queued up on the
       autosampler. Periodic solvent blanks, standards, etc. are inserted at the judgement of
      the analyst.

       E. Identification

       Compounds are identified by monitoring a characteristic ion within a 12 second retention
       time window. Additional ions may be monitored at the discretion of the analyst.
     Confirmation is obtained by inspection of the full mass spectrum.

4.0 QUALITY ASSURANCE

       A. Standard Reference Materials, Blanks, Calibration Checks

       Standard reference materials are prepared along with each batch of samples. Calibration
       standards are verified with independently prepared control standards.

       B. Method Detection Limits

       Method detection  limits are determined independently  for a  given sample matrix.
       Instrument detection limits are generally in the 6-10 pg per injection range, which usually
       corresponds to a 3-5 ng/g (ppb) method detection limit range in samples.

5.0 TROUBLESHOOTING AND MAINTENANCE

       On a daily basis, the injection port and liner are cleaned; the septum and glass wool in
       the liner are changed. It is periodically necessary to break off the first few inches of the
       column (this is done daily for  heavy workloads of  dirty samples;  compounds most
       affected are the high molecular weight compounds).
                                            D-12

-------
           DFTPP ACCEPTANCE CRITERIA (by CLP 3/90)

 Mass              Abundance

  51               30-60% of mass 198
  68               Less than 2% of mass 69
  70               Less than 2% of mass 69
 127               40-60% of mass 198
 197               Less than 1% of mass 198
 198               Base peak, 100% relative abundance
 199               5-9% of mass 198
 275               10-30% of mass 198
 365               Greater than 1% of mass 198
 441               Less than mass 443
 442               40-60% of mass 198
 443               17-23% of mass 442
                                     D-13

-------
                           ERLN CHEMISTRY GROUP
      STANDARD OPERATING PROCEDURE FOR GAS CHROMATOGRAPHIC
             ANALYSIS OF PCBs AND CHLORINATED PESTICIDES
                           (REVISED FEBRUARY 1993)
1.0   OBJECTIVES

      The objective of this document is to define the standard procedure for analyzing marine
       environmental samples for polychlorinated biphenyls (PCBs) and chlorinated hydrocarbon
      pesticides using gas chromatography and electron capture detectors.

2.0   EQUIPMENT USED

       Hewlett Packard 5890 Gas Chromatographs equipped with electron capture detectors (Ni
       63), automatic samplers, and 30 m DB-5 fused silica capillary columns (0.25 um film
       thickness, 0.25 mm i.d.).  Perkin-Elmer/Nelson software (ACCESS*CHROM) provides
       for collection and storage of raw chromatographic data, and for selection and quantitation
       of analyte peaks.  Ultra high purity helium and 95/5% argon/methane gases are used
       as the carrier and auxiliary gases, respectively.

3.0   OPERATION

      3.1 Instrument checks made prior to data collection

            3.1.1 Gas supply

                   3.1.1.1 Check gas cylinder pressures.  Replace tank if pressure is less
                   than 100 psig.

                   3.1.1.2 Check head pressure gauge on front panel of instrument. Gauge
                   should read 18 psig; adjust to correct setting if reading is high; check for
                   leaks if pressure is low.  This setting provides for a carrier gas flow of
                   approximately 1.5 ml/min.

                   3.1.1.3 Replace injection port septum.  Check septum nut and column
                   fittings for leaks with leak detector and tighten as necessary.

                   3.1.1.4 Check the auxiliary gas flow. A flow of 35 ml/min is required.

                   3.1.1.5 Check septum purge and split flows. Adjust to 1 and 35 ml/min,
                   respectively, as necessary.
                                          D-14

-------
      3.1.2 Instrument output signal

             3.1.2.1  Display the analog output signal from the detector on the
             panel of the GC.  Record the value in the instrument log book, and check
             for  consistency with previous readings.   On  instruments  with dual
             detectors, ensure the signal is correctly assigned to the detector selected
             for the analysis.

      3.1.3 Instrument operating parameters

             3.1.3.1  Temperature programs and  run times are stored as workfiles in
             each GC's integrator.  The following conditions are required  for  the
             analysis  of PCBs and pesticides:

                   Injection port temperature         275 °C
                   Detector temperature             325 °C
                   Initial column temperature        100°C
                   Initial hold time                  1 min
                   Rate 1                          5°C/min
                   Ramp 1 final temperature         140°C
                   Ramp 1 hold time               1 min
                    Rate 2                          1.5°C/min
                    Ramp 2 final temperature         230°C
                    Ramp 2 hold time               20 min
                    Rate 3                          10°C/min
                    Final column temperature         300°C
                    Final hold time                  5 min
                    Stop time                        100 min
                   Injection port purge open time     1 min

              3.1.3.2  Load an appropriate workfile into the integrator.

             3.1.3.3  Enter the autosampler parameters into the integrator via Option
             11.  Indicate which injection port is being used, the number and positions
             of the samples in the autosampler tray, the number of injections per bottle,
             and the amount injected (1 ul).

              3.1.3.4  Check the signal assignments and levels again.  If they are
             correct,  store the workfile in the integrator.

3.2   Data system setup

      3.2.1  Scheduling of standards and samples
                                     D-15

-------
                    3.2.1.1  Setting up the instrument queue is accomplished by following
                    instructions laid out in the Perkin-Elmer Nelson manual.

                    3.2.1.2  Order the samples, standards, and rinses  according  to  the
                    following guidelines:
                           -place hexane rinses before and after standards
                           -bracket groups of no more than five (5) samples with standards.
                           -arrange multiple level standards so that a high and a low standard
                                 precede as well as follow samples
                           -procedural and field  blanks should be run prior  to  samples to
                           minimize risk of carryover contamination.

                    3.2.1.3  Type in sample weight and internal standard  amounts for each
                    sample to be used in final concentration calculations.  Double check all
                    manually entered values for accuracy.

       3.3 Instrument startup and data collection

             3.3.1 After the instrument has been scheduled, arrange the samples and standards
             to be run in the autosampler trays. Check the order for accuracy against a copy
             of the queue.  Load the trays into the autosampler.

             3.3.2  Visually  recheck tank regulator gauges and  instrument  settings to ensure
             proper settings.

             3.3.3 Start GC operation and data collection by pressing 'start' on the integrator.

       3.4 Peak identification and quantitation

             3.4.1 Peak identification is accomplished by automated routines.  Identifications
             are based on comparison of retention times of actual standards to unknown peaks.
             Multilevel standards are  calibrated  to  generate a linear regression curve of
             response according to the manufacturer's instructions. After a calibration curve
             has been generated, the samples are analyzed. Analytes are quantitated based on
             the peak areas for the analytes and internal standard, the amount of the internal
             standard, and the response  factors generated from the calibration  curve.
             Chromatograms and data reports are generated for each sample and standard.
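
              As a minimal sketch of this internal-standard quantitation, the following example
              (Python) combines hypothetical peak areas, an assumed internal standard amount,
              and an assumed response factor into a sample concentration.  None of the numbers
              are taken from this SOP.

                  def analyte_amount(analyte_area, is_area, is_amount_ng, response_factor):
                      """Internal-standard quantitation: amount = (A_analyte / A_IS) x IS amount / RF."""
                      return (analyte_area / is_area) * is_amount_ng / response_factor

                  # Hypothetical values for one PCB congener in a sediment extract.
                  amount_ng = analyte_amount(analyte_area=15400, is_area=21000,
                                             is_amount_ng=100.0, response_factor=1.05)
                  sample_weight_g = 10.0                         # dry weight of sediment extracted
                  print(amount_ng, amount_ng / sample_weight_g)  # ng in extract, ng/g dry weight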

4.0    QUALITY ASSURANCE

       4.1   Chromatograms of standards  are  compared  to  posted  references.   Peak
       identifications, resolution and shapes are inspected.  Calculated standard amounts are
       checked for accuracy and documented. Other abnormalities, such as  spurious or extra
       peaks, rising or falling baselines, and negative spiking are examined.   Response factors

                                             D-16

-------
      and overall instrument response are compared to previous runs and documented.  Blanks
      are checked for the presence of interferences or analytes of interest.  Unknown samples
      are compared to standards to verify peak identifications.

5.0   TROUBLESHOOTING

      5.1 Refer to the ERLN GC Troubleshooting notebook, the manufacturer's manuals, or
       to experienced personnel for guidance in troubleshooting the GCs.
                                            D-17

-------
       ERLN CHEMISTRY GROUP STANDARD OPERATING PROCEDURE
                 FOR INSTRUMENTAL ANALYSIS OF METALS
                     IN SEDIMENT AND TISSUE EXTRACTS
1.0 OBJECTIVES

      The objective of this document is  to  outline the proper  sample preparation  and
      instrumental parameters for the analysis of trace metals in marine sediment or tissue acid
       digests.

2.0 MATERIALS AND EQUIPMENT

      Atomic Absorption  Spectrometer or Inductively Coupled Plasma  Atomic Emission
             Spectrometer
      Reagent grade Instra-Analyzed concentrated HNO3 for trace metal analysis (diluted to 2M
             concentration)

3.0 METHODS

      3.1 Standard Calibration

             3.1.1 Estimate  or determine the range of concentrations that exist within the
             sample analytes.  This may require scanning several samples prior to standard
             calibration in order to approximate the range of absorbances (AA) or emission
             intensities (ICP) produced from the samples.

             3.1.2 Prepare multiple calibration standards that bracket the  expected range of
             sample analyte concentrations.  The composition of the standard matrices (i.e., acid
             strength and salt content) should match that in the samples as closely as possible.

             3.1.3 Analyze the standards and calculate calibration equations  by regression
             (linear or polynomial) of standard concentrations against measured standard
             absorbances or intensities.

      3.2 Sample Dilutions

             3.2.1 In section 3.1 the expected range of sample concentrations is determined.
             If sample concentrations exceed the upper limit of the chosen analytical technique,
             then the sample digestates will need to be diluted to fall within the range of
             standard concentrations.  Sample diluent should be of the same acid composition
             and strength present in the sample digestates.  (Keep close record of the sample
             dilutions so that raw analytical concentrations can be dilution-corrected.)
                                          D-18

-------
4.0 ANALYSIS

       4.1 Sample Analysis (Unknown Concentrations)

              4.1.1 Analyze the samples and record the absorbances (AA) or emission intensities
             (ICP).

             4.1.2 Triplicate readings should be made for every element.

             4.1.3  After  approximately  10  (AA) or 20 (ICP) samples, several calibration
             standards should be re-analyzed to determine instrumental drift.

       4.2 Concentration Calculation

             4.2.1  Calculate  sample concentrations by applying the   calibration equation
             obtained from the standard curve to the measured sample signals (absorbances or
             intensities).  Calculate  the  mean  and  standard deviation  of the individually
             calculated sample concentrations.

       4.3 Dilution Correction

              4.3.1 Calculated analyte concentrations must be dilution-corrected to obtain the
              true metal concentration present in the sample.  The analyte concentration, in
              ug/ml, is converted to ug/g dry sample by inputting the sample preparation
              information into the following equation:

                                              Analyte conc. (ug/ml)  X  Acid volume (ml)
              Sed. Conc. (ug/g dry sed.)  =  ----------------------------------------------
                                                        dry sed. wt. (g)
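
              As a combined sketch of sections 3.1, 4.2, and 4.3, the example below (Python)
              fits a linear calibration, converts a measured absorbance to a digestate
              concentration, and applies the dry-weight correction shown above.  The standards,
              absorbances, dilution factor, and sample weights are hypothetical.

                  import statistics   # linear_regression requires Python 3.10 or later

                  std_conc = [0.0, 0.5, 1.0, 2.0, 4.0]            # calibration standards, ug/ml
                  std_abs  = [0.002, 0.101, 0.198, 0.402, 0.795]  # measured absorbances

                  # Section 3.1.3: regression of measured signal against standard concentration.
                  result = statistics.linear_regression(std_conc, std_abs)

                  # Section 4.2.1: convert a sample absorbance to a digestate concentration.
                  sample_abs = 0.250
                  digestate_conc = (sample_abs - result.intercept) / result.slope   # ug/ml
                  digestate_conc *= 2.0   # section 3.2.1: correct for an assumed 1:1 dilution

                  # Section 4.3.1: ug/g dry = ug/ml x acid volume (ml) / dry sediment weight (g).
                  acid_volume_ml, dry_weight_g = 50.0, 2.0
                  print(digestate_conc * acid_volume_ml / dry_weight_g)   # ug/g dry sediment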

5.0 QUALITY CONTROL

       5.1 Determination of Analytical Accuracy (Calibration check)

              5.1.1 Analyze several standards as unknown samples to check the accuracy of the
             standard curve regression. Recoveries should be within  10%  of the standard
             concentration.

             5.1.2 Analyze  a solution  of  known and/or certified  concentration, prepared
             independently from the calibration  standards, to determine the daily analytical
             fluctuation. Recoveries should  be within 10% of the certified concentration.

       5.2 Standard Additions (Spike Additions)

             5.2.1 Standard additions are required to investigate  instrumental interferences
             arising from differing sample solution matrices.

                                            D-19

-------
             5.2.2 Select a sample whose concentrations can be matched fairly closely with a
             dilution of a calibration standard.

             5.2.3 Prepare an acid spike (a dilution of a calibration standard) in the same acid
             matrix as the samples. Try to match spike concentrations as closely as possible
             with the sample chosen.

             5.2.4 Prepare a sample spike by removing a second sample aliquot and adding the
             same amount of calibration standard as was used in  the  acid spike. The total
             volume of sample spike should also be equal to the total volume of acid used in
             the acid spike.

             5.2.5 Analyze the sample, acid spike and sample spike as unknown samples.

             5.2.6 Calculate the spike recovery using the  following equation:
                                        C sample spike  -  C sample
              Spike recovery (%)  =  -----------------------------------  X  100
                                               C acid spike

              5.2.7 Acceptable spike recoveries fall between 80 and 120 percent.

             5.2.8 One out of every 20 samples should be chosen for a standard addition.
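
              A minimal sketch of the recovery check in sections 5.2.6 and 5.2.7 is shown
              below (Python); the concentrations are hypothetical.

                  def spike_recovery_pct(sample_spike_conc, sample_conc, acid_spike_conc):
                      """Spike recovery (%) = (C sample spike - C sample) / C acid spike x 100."""
                      return (sample_spike_conc - sample_conc) / acid_spike_conc * 100.0

                  recovery = spike_recovery_pct(sample_spike_conc=3.85, sample_conc=1.90,
                                                acid_spike_conc=2.00)
                  print(recovery, 80.0 <= recovery <= 120.0)   # about 98 percent, acceptable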

6.0 DETECTION LIMITS

        6.1 Instrument Detection Limits

             6.1.1 Instrument detection limits are determined as the concentration equivalent
             to a signal three times the standard deviation of a blank. The limits should either
             be determined previously for given instrumental conditions or as part of the
             instrumental data analysis, and should be comparable to those listed below:
                                              D-20

-------
               Element          ICP              GFAA
                               (ug/ml)           (ug/L)

               Cu              0.020             1.0
               Zn              0.005             0.1
               Cr              0.020             1.0
               Pb              0.050             3.0
               Ni              0.050             2.0
               Mn              0.010             0.5
               Fe              0.020             2.0
               Cd              0.005             0.5
               Al              0.075
               Sn              0.050             2.0
               Sb              0.100             2.0
               As              0.100             2.0
               Ag              0.020             0.5

              6.1.2 Sample detection limits, assuming a dry weight of 2 grams and a total
              volume of 50 ml (i.e., the sediment ultrasonic extraction method), are 25 times
              higher than the instrument detection limits. Method detection limits should be
              calculated following the rigorous statistical procedure detailed in 40 CFR Part 136.
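
              The detection limit conventions in 6.1.1 and 6.1.2 can be sketched as follows;
              the blank readings and function names are illustrative only, and the sketch is
              not a substitute for the 40 CFR Part 136 MDL procedure.

                  import statistics

                  def instrument_detection_limit(blank_results):
                      """IDL = 3 x standard deviation of replicate blank measurements (6.1.1)."""
                      return 3.0 * statistics.stdev(blank_results)

                  def sample_detection_limit(idl, digest_volume_ml=50.0, dry_weight_g=2.0):
                      """Scale the IDL by digest volume over dry weight (25x for 50 ml / 2 g, 6.1.2)."""
                      return idl * digest_volume_ml / dry_weight_g

                  blanks = [0.002, 0.005, 0.001, 0.004, 0.003, 0.002, 0.004]   # ug/ml, illustrative
                  idl = instrument_detection_limit(blanks)
                  print(f"IDL = {idl:.4f} ug/ml; sample DL = {sample_detection_limit(idl):.3f} ug/g dry")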
                               D-21

-------
                           ERLN CHEMISTRY GROUP
      STANDARD OPERATING PROCEDURE FOR SEDIMENT EXTRACTION
                  OF SEMIVOLATILE ORGANIC ANALYTES
                           (REVISED FEBRUARY 1993)
1.0 OBJECTIVES

      The objective of this document is to define the standard operating procedure for the
      extraction of semi-volatile organic compounds from marine sediment samples.  The
      extracts will be further cleaned up by silica gel chromatography procedures prior to
      analysis by gas chromatography (GC) or gas chromatography/mass spectrometry
      (GC/MS).

2.0 MATERIALS AND EQUIPMENT

      Apparatus for homogenizing sediment
          Wrist-action shaker
             100 ml glass centrifuge tubes
      Apparatus for determining weight and dry weight
             Top-loading balance capable of weighing to 0.01 g
             Aluminum weighing pans
             Stainless steel spatula
      Drying oven maintained at 105-120°C
      Turbo-Vap (Zymark) apparatus, with heated water bath maintained at 25-35°C
             Nitrogen gas, compressed, 99.9% pure
             Glass Turbo-Vap flasks, 200 ml
      Glass graduated cylinders, 100- and 500-ml
      Erlenmeyer flasks, 250 ml
      Microliter syringes or micropipets, solvent rinsed
      Borosilicate glass vials with Teflon-lined screw caps, 2-ml
      Reagents
             Methylene chloride, pesticide grade or equivalent
             Deionized water, pentane-extracted
             Acetone, pesticide grade or equivalent
             Sodium sulfate-anhydrous, reagent grade.  Heated to 400°C for at least 4 hours,
                  then cooled and stored in a tightly sealed glass container at room
                  temperature.
             Internal Standards, to be added to each sample prior to extraction.

3.0 METHODS

      3.1  Find the correct caps for each centrifuge tube to be used by filling them with

                                          D-22

-------
      approximately 25 ml of methylene chloride, putting the caps on, rolling the tube on a
      paper towel on the lab bench, and looking for leaks.  Once the correct tubes and caps have
      been matched, weigh approximately 10.0 g of homogenized sample into a solvent rinsed
      centrifuge tube. Homogenization is accomplished by physical mixing of the sediment with
      stainless steel or Teflon coated utensils, or by a polyethylene propeller attached to an
      electric drill.  The amount of sample may be adjusted based on expected contaminant
      concentrations or detection limits required.  Weigh approximately  2.0 grams into a
      preweighed aluminum pan for dry/wet determination.

      3.2  Add Internal Standards as required: CB198 for PCB analysis, 2,5-dichloro-m-
      terphenyl for pesticides, and d12 benzo(a)anthracene/d10 phenanthrene mix for PAHs.
      The amount of IS added is dependent on  the expected contaminant concentrations and
      should be equivalent to those concentrations.

      3.3 Add 30 g sodium sulfate and mix well with a Teflon-coated spatula.  Then add
      50 ml of 20:80 acetone:methylene chloride.

      3.4 Seal the centrifuge tubes with Teflon tape and caps, and shake ~15 hrs (overnight).
      Shake tubes at approximately a 60° angle, at an intensity setting of "5". Centrifuge for
      20 minutes at 1750 rpm and pour off the supernatant into an Erlenmeyer flask.

      3.5  Add 50 ml of 20:80 acetone:methylene chloride, seal and shake as above for ~6
      hrs. Centrifuge for 20 minutes at 1750 rpm and add the supernatant to the Erlenmeyer
      flask.  Add some additional sodium sulfate to the combined extracts to ensure all water
      is excluded.

      3.6 Gravity filter the extract through a pre-rinsed (methylene chloride) glass fiber filter.
      Rinse the Erlenmeyer 2 x with methylene chloride, and the filter itself once. Collect the
      filtrate in a clean rinsed 200 ml Turbo-Vap  tube. Place the flask into the Turbo-Vap
      apparatus,  and turn on the unit. Open the valve on the nitrogen tank and  adjust the
      regulator to ensure a pressure of 15 psi. Reduce the sample volume to approximately 1
      ml, with solvent exchange to pentane.

      3.9  Adjust the volume to 1 ml  with hexane.

      3.10 Fractionate the  sample following the Column Chromatography SOP.

4.0 OPTIONAL CLEANUP PROCEDURES

      Activated copper powder (activated by the addition of 8 M hydrochloric acid  and rinsed
      with the  following  solvents  in succession:   deionized water,  methanol,  methylene
      chloride, and hexane) may be added to the extract to remove any free elemental sulfur.
      The copper is added until the formation of black copper sulfide no longer occurs.
                                            D-23

-------
5.0 QUALITY ASSURANCE/QUALITY CONTROL

      5.1 Standard Reference Materials

             5.1.1 A  certified SRM  is prepared with each batch of samples to validate
             analytical recovery.   Results are compared to certified  concentrations and
             corrective action is  required  if  the  accuracy is outside  of the required
             specifications.

             5.1.2 SRMs should be prepared in the exact same manner as the unknowns.

      5.2 Analytical Reproducibility

             5.2.1 Replicate samples should be prepared to assess the reproducibility of the
             extraction procedure.

             5.2.2 For every batch of samples, one  sample should be chosen to extract and
             analyze in triplicate. Deviation between replicate samples should be <30%.
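
             One way to evaluate the replicate criterion in 5.2.2 is as a relative standard
             deviation of the triplicate results, consistent with the RSD criteria used in the
             companion digestion SOPs. The Python sketch below uses invented concentrations;
             the RSD interpretation of "deviation" is an assumption.

                 import statistics

                 def percent_rsd(results):
                     """Relative standard deviation (%) of replicate results."""
                     return statistics.stdev(results) / statistics.mean(results) * 100.0

                 triplicate = [12.4, 14.1, 13.0]   # ng/g, illustrative
                 rsd = percent_rsd(triplicate)
                 print(f"RSD = {rsd:.1f}%; meets <30% criterion: {rsd < 30.0}")
                 # RSD = 6.5%; meets <30% criterion: True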

      5.3 Procedural Blanks

             5.3.1 Procedural blanks should be carried throughout the entire extraction
             procedure to verify the absence of contamination of the method.

             5.3.2 Trace amounts of analytes in the blanks (less than three times the  method
             detection limit) may be ignored and have no effect on the subsequent sample
             analyses, but samples should be rejected if significant concentrations (greater than
             five times the MDL) are present in procedural blanks.

             5.3.3 One  blank should be prepared  for each batch  of  samples  (minimum
             frequency of 5%).
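
             The blank screening rule in 5.3.2 reduces to comparing each blank result against
             multiples of the method detection limit. A minimal sketch is given below; the
             function name and numbers are illustrative, and the intermediate "review" case is
             an assumption, since 5.3.2 does not address results between three and five times
             the MDL.

                 def assess_blank(blank_result, mdl):
                     """Classify a procedural blank against the 3x / 5x MDL thresholds of 5.3.2."""
                     if blank_result < 3.0 * mdl:
                         return "trace level; may be ignored"
                     if blank_result > 5.0 * mdl:
                         return "significant; reject associated samples"
                     return "between thresholds; review before reporting"

                 print(assess_blank(blank_result=2.0, mdl=0.5))   # 4x the MDL -> review before reporting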
                                            D-24

-------
       ERLN CHEMISTRY GROUP STANDARD OPERATING PROCEDURE
              FOR DIGESTION OF MARINE ORGANISM SAMPLES
                           FOR METALS ANALYSIS
1.0 OBJECTIVES

      The objective of this document is to establish the standard operating procedure for the
      total digestion of marine tissue samples. Sample extracts are routinely analyzed by Flame
      Atomic  Absorption  Spectrometry  (FAA),  Graphite  Furnace Atomic Absorption
      Spectrometry (GFAAS) or Inductively Coupled Plasma Atomic Emission Spectrometry
      (ICP-AES).

2.0 MATERIALS AND EQUIPMENT

      Top-loading balance (0.01 gram precision)
      Vacuum Freeze Dryer
      CEM Microwave Digestion System (Including 100 ml. Teflon vessel liners and pressure
      control capability)
      50 ml. class A volumetric flasks
      60 ml. polyethylene screw-cap bottles
      Instra-Analyzed grade concentrated HNO3 for trace metal analysis (70-71 %)
      Hydrogen Peroxide - H2O2 (30%)
      Vacuum filtering apparatus with Whatman 42 filter paper

3.0 METHODS

      3.1 Sample Preparation

            3.1.1 Organism  samples should be thawed, and handled only with plastic or
             stainless steel utensils. Where necessary, organism tissues should be
            homogenized. If chromium or nickel  is  to be analyzed in  the samples,  the
            homogenizer tip should be constructed of titanium to avoid  contamination of
            sample tissues.

            3.1.2 Obtain the tare weight of labeled, acid-washed 100 ml. Teflon microwave
            digestion vessel liners.

             3.1.3 Weigh approximately 3-5 grams wet tissue into each vessel (~0.5 grams
            dry). Obtain the wet gross weight of each tube.

             3.1.4 Freeze dry  samples and obtain  the dry gross  weight  for each sample.
            Subtract the tare weight and record the weight of dry tissue in each tube.
                                           D-25

-------
3.2 Closed Vessel Microwave Digestion (1st Stage)

       3.2.1 Add 10 ml. of concentrated HNO3 (70-71 %) to each digestion vessel.

       3.2.2 Make sure the tissue sample is fully saturated and allow  to sit  for a
       minimum of 1 hour, or until all foaming subsides.

       3.2.3 Place each liner into a microwave vessel.

       3.2.4 Insert a pressure relief membrane into each cap assembly and place on top
       of the vessels. (Use the modified cap assembly for the vessel to be used for
       pressure monitoring.)

       3.2.5 Place a top on each vessel and hand tighten.

       3.2.6 Place the vessels into the carousel.

       3.2.7 Insert a vent tube into each vessel, place the free end in  the center trap,
       then place the carousel into the oven.

       3.2.8 Connect the pressure sensing line to the modified cap assembly. (Make sure
       the valve on the side of the oven is in the "neutral" position.)

       3.2.9 Program the oven following the parameters below:

       STAGE        1        2        3        4        5

       %POWER       85       85       85       85       85
       PSI          20       40       85       150      190
       TIME         15:00    15:00    15:00    15:00    15:00
       TAP          5:00     5:00     5:00     5:00     5:00
       FAN SPEED    100      100      100      100      100

                    ** Note - Power settings are for 12 vessels. If a different number of
                    vessels is desired, subtract or add 5% power per vessel.
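
       The note above amounts to a linear adjustment of the programmed power with the
       number of vessels; a short Python sketch follows (the 85% base value is the Stage 1-5
       setting in 3.2.9, and the function name is illustrative).

           def percent_power(n_vessels, base_power=85, base_vessels=12):
               """Adjust microwave power by 5% per vessel above or below the 12-vessel setting."""
               return base_power + 5 * (n_vessels - base_vessels)

           print(percent_power(10))   # 75 (% power when only 10 vessels are digested)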

       3.2.10  After completion of the program, allow the pressure in the control vessel
       to drop below 20 PSI, then manually vent the control vessel, remove the pressure
       sensing line and place the carousel into the fume hood.

3.3    Closed Vessel Microwave Digestion (2nd Stage)

       3.3.1 Manually vent each vessel, remove the caps and add 2 ml. of 30% H2O2.

       3.3.2 Allow the reaction to subside, then reassemble the vessels as described in

                                     D-26

-------
      sections 3.2.4-3.2.6.

      3.3.3 Place the carousel into the oven and reconnect the pressure sensing line to
      the control vessel. Check to ensure the exhaust fan is operating.

      3.3.5 Program the oven following the parameters below:

              STAGE        1        2

              %POWER       85       100
              PSI          100      100
              TIME         15:00    15:00
              TAP          5:00     5:00
              FAN SPEED    100      100
                    ** Note - Power settings are for 12 vessels. If a different number of
                    vessels is desired, subtract or add 5% power per vessel.

      3.3.6 Although the oven is  automated,  individual  tissue samples will react
      differently, so all steps should be monitored in case venting  should occur. If
      venting does occur, remove the vented vessels and lower the power accordingly.

      3.3.7 After completion of the program, allow the vessels to cool in the oven until
      the pressure in the control vessel is below 20 PSI.

      3.3.8 Manually vent the control vessel, then remove the carousel and place in a
      fume hood until the liquid reaches room temperature.

      3.3.9 Remove the vent tubes and manually vent the remaining vessels.

3.4 Sample Filtration

      3.4.1 Remove the tops and rinse the lids with deionized water, catching the rinse
      in the vessel liner.

       3.4.2 Add ~15 ml. of deionized water to each vessel.

      3.4.3 Using plastic tweezers, place a sheet of Whatman  42  filter paper in a
      vacuum filtration funnel and wet the paper with 2M HNO3.

      3.4.4 Place a 60 ml. acid-cleaned polyethylene bottle and vacuum  gasket under
      the filter funnel and apply vacuum.

      3.4.5 Filter the digested sample through the paper and collect  the filtrate in the
                                     D-27

-------
             bottle.

             3.4.6 Rinse the digestion vessel with deionized water, filter and collect the filtrate
             in the bottle.

             3.4.7 Pour the combined filtrates into a 50 ml. acid-cleaned volumetric flask, and
             dilute to the mark with deionized water.

             3.4.8  Shake the solution thoroughly and transfer back to the acid-cleaned 60
             ml. polyethylene bottle. Label the bottle appropriately.

4.0 QUALITY ASSURANCE

       4.1 Standard Reference Materials (SRM)

             4.1.1 A certified SRM should be prepared with every batch of samples to validate
             analytical recovery.

             4.1.2  SRMs should be prepared in the exact manner as the unknown samples,
             including drying, even if the material is already dry.

             4.1.3 The frequency of SRM preparation should be approximately 1 for every 20
             unknown samples prepared.

             4.1.4 The outlined extraction technique should yield close to 100% recoveries for
             organism SRMs, as outlined in the ERLN QA/QC guidelines.

       4.2 Analytical Reproducibility

             4.2.1  Replicate,samples should be prepared to  assess the reproducibility of the
             digestion procedure.

             4.2.2 For every 20 samples  prepared, one sample should be chosen to digest and
             analyze in  triplicate. The relative standard deviation between replicate  analyses
             should be <20%.

       4.3 Procedural Blanks

             4.3.1  Procedural blanks should be  carried throughout  the  entire extraction
             procedure to verify that contaminants are not present in the reagents and that no
             contamination has occurred throughout the procedure.

             4.3.2  Trace amounts of metals in the blanks can be subtracted from subsequent
             sample analyses (blank subtraction), but a sample batch  should be rejected if
             concentrations in the blank are >10% of "average" sample concentrations.
                                             D-28

-------
4.3.3 One procedural blank should be prepared for every 20 samples extracted.
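
       The blank handling described in 4.3.1-4.3.2 above (blank subtraction plus the
       10-percent-of-average rejection rule) can be sketched as follows; the function name
       and the concentrations are illustrative only.

           def apply_blank_correction(sample_results, blank_result):
               """Blank-subtract sample results and check the 10%-of-average rule of 4.3.2."""
               average = sum(sample_results) / len(sample_results)
               batch_acceptable = blank_result <= 0.10 * average
               corrected = [result - blank_result for result in sample_results]
               return corrected, batch_acceptable

           corrected, ok = apply_blank_correction([4.2, 5.1, 3.8, 4.6], blank_result=0.2)
           print(corrected, "batch acceptable:", ok)   # blank is about 4.5% of the average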
                                D-29

-------
                           ERLN CHEMISTRY GROUP
        STANDARD OPERATING PROCEDURE FOR TOTAL DIGESTION
                            OF SEDIMENT SAMPLES

1.0 OBJECTIVES

      The objective of this document is to establish the standard operating procedure for the
      total digestion of bulk sediments. Sample digests are routinely analyzed by Flame Atomic
      Absorption Spectrometry (FAA), Graphite Furnace Atomic Absorption Spectrometry
      (GFAAS) or  Inductively Coupled Plasma Atomic Emission Spectrometry (ICP).

2.0 MATERIALS AND EQUIPMENT
      Top-loading balance (0.01 gram precision)
      Vacuum Freeze Dryer
      CEM Microwave Digestion System (Including 100 ml. Teflon digestion vessel liners with
      pressure control capability)
      Protective Clothing (Polyethylene apron, Neoprene gloves, Safety goggles, Face shield)
      100 ml. class A volumetric flasks
      125 ml. polyethylene screw-cap bottles
      Instra-Analyzed grade concentrated HNO3 for trace metal analysis (70-71 %)
      Reagent grade concentrated HF (49%)
      Reagent grade concentrated HCL (36.5-38%)
      Boric Acid (5%) prepared from H3BO3 crystals
      Deionized water

3.0 METHODS

      3.1 Sample Preparation

             3.1.1  Sediment  samples should be thawed and  homogenized  with plastic or
             stainless  steel utensils.

             3.1.2 Obtain the tare weight of labeled, acid-washed 100 ml. Teflon microwave
             digestion vessel liners.

             3.1.3 Weigh approximately 1.5 grams wet sediment into each vessel (~0.5 grams
             dry). Obtain the wet gross weight of each liner.

             3.1.4  Freeze  dry samples and obtain the dry gross weight for each sample.
             Subtract the tare weight and record the weight of dry sediment in each  liner.

      3.2 Microwave digestion
                   ** NOTE- Be  sure to wear proper safety clothing when working with the
                   concentrated HF.
                                          D-30

-------
             3.2.1 Add 5 ml. of concentrated HNO3 (70-71%), 4 ml. of concentrated HF (49%)
             and 1 ml. concentrated HCl (36.5-38%) to the vessel liners.

             3.2.2 Make sure the sediment is fully saturated and allow to sit for a minimum of
             1 hour.

             3.2.3 Place the liners into their corresponding vessels.

             3.2.4 Insert a rupture membrane into each lid and secure into place with a cap. Do
             not overtighten.

             3.2.5 Place the vessels into the carousel.

             3.2.6 Insert a vent tube  into each vessel and place the free end into the center
             trap.

             3.2.7 Attach the pressure sensing line to the control vessel, making sure the lever
             on the side of the oven is in the "neutral" position.

             3.2.8 Program the oven following the parameters below:

                   STAGE       1            2
                   %POWER     100           100
                   PSI           120           150
                   TIME         30:00         15:00
                   TAP          20:00         10:00
                   FAN SPEED  100           100

                    ** Note - Power settings are for 12 vessels. If a different number of
                    vessels is desired, subtract or add 5% power per vessel.

             3.2.9 Although the oven is automated, individual sediments will react differently,
             so all steps should be monitored in case venting should occur. If venting does
             occur, remove the  vented vessels and lower the power accordingly.

             3.2.10 When the program  is finished, allow the pressure in the control vessel to
             drop below 20 PSI.

             3.2.11 Manually vent the control vessel, detach the pressure sensing line and place
             the carousel in a fume hood.

             3.2.12 Remove the vent tubes  and vent the remaining vessels manually.

             3.2.13 In a fume hood, remove the caps and rinse the lids with deionized water,
             catching the rinse  in the vessel liner.

                                              D-31

-------
             3.2.14 Add 30 ml. of 5% Boric acid to each sample.

       3.3 Sample Filtration (This step may not be necessary)

             3.3.1 Add ~15 ml. of deionized water to each vessel.

             3.3.2 Using plastic tweezers, place a sheet of Whatman 42 filter paper in a
             vacuum filtration funnel and wet the paper with 2M HNO3.

             3.3.3 Place a 120 ml. acid-cleaned polyethylene bottle and vacuum gasket under
             the filter funnel and apply vacuum.

             3.3.4 Filter the digested sample through the paper and collect the filtrate in  the
             bottle.

             3.3.5 Rinse the digestion vessel with deionized water, filter and collect the filtrate
             in the bottle.

             3.3.6 Pour the combined filtrates into a 100 ml. acid-cleaned volumetric flask, and
             dilute to the mark with deionized water.

             3.3.7 Shake the solution thoroughly  and transfer back to the acid-cleaned 120
             ml. polyethylene bottle. Label  the bottle appropriately.

       3.4 Sample Dilution (Required only if filtration step was omitted)

             3.4.1 Transfer the contents of the vessel liner to a clean 100 ml. volumetric flask
             and rinse the vessel with deionized water, also adding the rinse to the flask.

             3.4.2 Dilute to the volume mark with deionized water.

             3.4.3 Shake the extracts thoroughly and  transfer into  acid-cleaned 125  ml.
             polyethylene screw-cap bottles.

             3.4.4 Label the bottles appropriately and store at room temperature until analysis.

4.0 QUALITY ASSURANCE

       4.1 Standard Reference Materials (SRMs)

             4.1.1 A certified SRM should be prepared with every batch of samples to validate
             analytical recovery.

             4.1.2 SRMs should be prepared in the exact manner  as  the unknown samples,
             including drying, even if the material is already dry.
                                             D-32

-------
       4.1.3 The frequency of SRM preparation should be approximately 1 for every 20
       unknown samples prepared.

       4.1.4 The outlined extraction technique should yield close to 100% recoveries for
       sediment SRMs.

4.2 Analytical Reproducibility

       4.2.1 Replicate samples  should be  prepared to assess the reproducibility of the
       digestion procedure.

       4.2.2 For every 20 samples prepared, one sample should be chosen to digest and
       analyze in triplicate. The relative standard deviation between replicate analyses
       should be <20%.

4.3 Procedural Blanks

       4.3.1  Procedural blanks  should be carried throughout the  entire digestion
       procedure to verify that contaminants are not present in the reagents and that
       contamination has not occurred throughout the procedure.

       4.3.2 Trace amounts of metals in the blanks can be subtracted from subsequent
       sample analyses (blank  subtraction), but a sample batch should be rejected if
       concentrations in the blank are >10% of "average" sample concentrations.

       4.3.3 One procedural blank should  be prepared for every 20 samples digested.
                                       D-33

-------
                          ERLN CHEMISTRY GROUP
       STANDARD OPERATING PROCEDURE FOR TISSUE EXTRACTION
                  OF SEMIVOLATILE ORGANIC ANALYTES
                          (REVISED FEBRUARY 1993)
1.0 OBJECTIVES

      The objective of this document is to define the standard operating procedure for the
      extraction of semi-volatile organic compounds from marine tissue samples. The extracts
      will be further cleaned up by silica gel chromatography procedures prior to analysis by
      gas chromatography (GC) or gas chromatography/mass spectrometry (GC/MS).

2.0 MATERIALS AND EQUIPMENT

      Apparatus for homogenizing tissue
            Brinkman Polytron
            100- or 150-ml glass centrifuge tubes

      Apparatus for determining weight and dry weight
            Top-loading balance capable of weighing to 0.01 g
            Aluminum weighing pans
            Stainless steel spatula

      Drying oven maintained at 105-120°C

      Turbo-Vap (Zymark) apparatus, with heated water bath maintained at 25-35° C
            Nitrogen gas, compressed, 99.9% pure
            Glass Turbo-vap flasks, 200 ml

      Glass graduated cylinders, 100- and 500-ml
      Glass separatory funnels, 1  L.
      Glass erlenmeyer flasks, 250 and 500 mL
      Borosilicate glass vials with Teflon-lined screw caps, 2-ml

      Microliter syringes or micropipets, solvent rinsed

      Reagents
            Pentane, pesticide grade or equivalent
            Acetonitrile, pesticide grade or equivalent
            Deionized water, pentane-extracted
            Sodium sulfate-anhydrous, reagent grade. Heated to 400°C for at least 4 hours,
                   then  cooled  and stored  in  a  tightly-sealed glass container at  room
                                         D-34

-------
                   temperature.
             Internal Standards, to be added to each sample prior to extraction.

3.0 METHODS

      3.1 Weigh approximately 10.0 g of sample into a solvent rinsed centrifuge tube. Weigh
      approximately 1.0 gram into a preweighed aluminum pan for dry/wet determination.

      3.2  Add Internal Standards as required: CB198 for PCB analysis, 2,5-dichloro-m-
      terphenyl for pesticides, and d12 benzo(a)anthracene and d10 phenanthrene mix for
      PAHs. The amount of IS added is dependent on the expected contaminant concentrations
      and should be equivalent to those concentrations.

      3.3 Add 50 ml acetonitrile.

      3.4 Polytron the samples for 20 seconds, at a speed setting of ~5.  Centrifuge for 10
      minutes at 1750 rpm and pour off the supernatant into a separatory funnel containing 500
      ml pentane-extracted deionized water (DI).  Repeat this step two more times.

      3.5 Back extract the DI/acetonitrile phase in the separatory funnel with 3 x 50
      ml pentane.  After each addition of pentane has been shaken, draw off the bottom layer
      into a 500 ml Erlenmeyer flask. Decant the pentane layer into a 250 ml Erlenmeyer flask
      by pouring it out the top of the separatory funnel. In this way, transfer of water into
      the pentane extract is avoided.

      3.6 Transfer the water layer from the 500 ml Erlenmeyer flask back into the separatory
      funnel for every addition of pentane.  Rinse the 500 ml flask 3 x with pentane and add
      the rinses to the separatory  funnel.

      3.7 Combine the pentane extracts and dry over Sodium Sulfate.

      3.8 Transfer the sample to a 200 ml Turbo-Vap flask. Rinse the flask 3 x with pentane
      and add the rinses to the flask. Place the flask into the Turbo-Vap apparatus, and turn
      on the unit.  Open the valve  on the nitrogen tank and set the regulator to ensure a
      pressure of 15 psig is reaching the Turbo-Vap unit.  Reduce this volume of sample to
      approximately 1 ml.

      3.9  Adjust the volume to  1.0 ml with pentane.  Remove 0.1 ml of sample into a
      preweighed aluminum pan for lipid weight determination.  Allow it to dry at room
      temperature for at least 24 hours.  Record the weight of the pan plus the sample.
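
      The SOP does not spell out the arithmetic for converting the dried 0.1 ml aliquot to a
      lipid content. One common approach, sketched below, assumes the 0.1 ml aliquot represents
      one tenth of the 1.0 ml extract and expresses the result against the wet tissue weight
      from 3.1; the function name, default weights, and pan weights are illustrative
      assumptions, not requirements of this procedure.

          def percent_lipid(pan_plus_residue_g, pan_tare_g, aliquot_ml=0.1,
                            extract_volume_ml=1.0, tissue_wet_weight_g=10.0):
              """Scale the dried-aliquot residue to the whole extract; report as % of wet tissue."""
              residue_g = pan_plus_residue_g - pan_tare_g
              total_lipid_g = residue_g * (extract_volume_ml / aliquot_ml)
              return total_lipid_g / tissue_wet_weight_g * 100.0

          print(round(percent_lipid(pan_plus_residue_g=1.2635, pan_tare_g=1.2515), 2))   # 1.2 (% lipid)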

      3.10  Fractionate the sample following the Column Chromatography  SOP.

4.0 QUALITY ASSURANCE/QUALITY CONTROL


                                          D-35

-------
4.1 Standard Reference Materials

       4.1.1 A certified SRM  is prepared with each batch of samples to validate
       analytical recovery.  Analytical results should then be compared to the certified
       concentrations.  Corrective action is required if the required accuracy goals are
       not met.

       4.1.2 SRMs should be prepared in the exact same manner as the unknowns.
4.2 Analytical Reproducibility

       4.2.1 Replicate samples should be prepared to assess the reproducibility of the
       extraction procedure.

       4.2.2 For every batch of samples, one sample should be chosen to extract and
       analyze in triplicate. Deviation between replicate samples should be <30%.

4.3 Procedural Blanks

       4.3.1  Procedural blanks should  be carried throughout the entire extraction
       procedure to verify the absence of contamination of the method.

       4.3.2 Trace amounts of analytes in the blanks (less than three times the method
       detection limit) may be ignored and have no effect on the subsequent sample
       analyses, but samples should be rejected if significant concentrations (greater than
       five times the MDL) are present in procedural blanks.

       4.3.3  One blank should be  prepared for each  batch  of samples  (minimum
       frequency of 5%).
                                       D-36

-------
      APPENDIX E

EPA Priority Pollutants and
   Additional Hazardous
Substance List Compounds

-------
            CHEMICAL STRUCTURES AND MOLECULAR WEIGHTS OF U.S. EPA
     PRIORITY POLLUTANT AND ADDITIONAL HAZARDOUS SUBSTANCE LIST COMPOUNDS

(Structure diagrams are not reproduced.)

      EPA #   Compound                                  mw

PHENOLS

a     65      phenol                                    94
b     HSL     2-methylphenol                            108
c     HSL     4-methylphenol                            108
d     34      2,4-dimethylphenol                        122

SUBSTITUTED PHENOLS

a     24      2-chlorophenol                            128
b     31      2,4-dichlorophenol                        163
c     22      4-chloro-3-methylphenol                   143
d     21      2,4,6-trichlorophenol                     198
e     HSL     2,4,5-trichlorophenol                     198
f     64      pentachlorophenol                         266
g     57      2-nitrophenol                             139
h     59      2,4-dinitrophenol                         184

LOW MOLECULAR WEIGHT AROMATICS

a     55      naphthalene                               128
b     77      acenaphthylene                            152
c     1       acenaphthene                              154
d     80      fluorene                                  166
e     81      phenanthrene                              178
f     78      anthracene                                178

EPA # - EPA priority pollutant number defined for toxic pollutants in 40 CFR 401.15 that are a
subset of the hazardous substances listed in Appendix VIII of 40 CFR 261.

mw - molecular weight of an organic compound.

HSL - hazardous substance list.
                                            E-1

-------
      EPA #   Compound                                  mw

HIGH MOLECULAR WEIGHT PAH

a     39      fluoranthene                              202
b     84      pyrene                                    202
c     72      benzo(a)anthracene                        228
d     76      chrysene                                  228
e     74      benzo(b)fluoranthene                      252
f     75      benzo(k)fluoranthene                      252
g     73      benzo(a)pyrene                            252
h     83      indeno(1,2,3-c,d)pyrene                   276
i     82      dibenzo(a,h)anthracene                    278
j     79      benzo(g,h,i)perylene                      276

CHLORINATED AROMATIC HYDROCARBONS

a     26      1,3-dichlorobenzene                       147
b     27      1,4-dichlorobenzene                       147
c     25      1,2-dichlorobenzene                       147
d     8       1,2,4-trichlorobenzene                    181
e     20      2-chloronaphthalene                       163
f     9       hexachlorobenzene                         285
E-2

-------
      EPA #   Compound                                  mw

CHLORINATED ALIPHATIC HYDROCARBONS

a     12      hexachloroethane                          168
b     XX      trichlorobutadiene isomers                158
c     XX      tetrachlorobutadiene isomers              192
d     XX      pentachlorobutadiene isomers              226
e     52      hexachlorobutadiene                       261
f     53      hexachlorocyclopentadiene                 273

HALOGENATED ETHERS

a     18      bis(2-chloroethyl)ether                   143
b     42      bis(2-chloroisopropyl)ether
c     43      bis(2-chloroethoxy)methane                173
d     40      4-chlorophenyl phenyl ether               204
e     41      4-bromophenyl phenyl ether                249

PHTHALATES

a     71      dimethyl phthalate                        194
b     70      diethyl phthalate
c     68      di-n-butyl phthalate                      278
d     67      butylbenzyl phthalate                     312
e     66      bis(2-ethylhexyl) phthalate               391
f     69      di-n-octyl phthalate

                                        E-3

-------
      EPA #   Compound

MISCELLANEOUS OXYGENATED COMPOUNDS

a     54      isophorone
b     HSL     benzyl alcohol
c     HSL     benzoic acid
d     129     2,3,7,8-tetrachlorodibenzo-p-dioxin
e     HSL     dibenzofuran
-------

      EPA #   Compound

PESTICIDES

a     93      p,p'-DDE
b     94      p,p'-DDD
c     92      p,p'-DDT
d     89      aldrin
e     90      dieldrin
f     91      chlordane
g     95      alpha-endosulfan
h     96      beta-endosulfan
i     97      endosulfan sulfate
j     98      endrin
k     99      endrin aldehyde
l     100     heptachlor
m     101     heptachlor epoxide
n     102     alpha-HCH
o     103     beta-HCH
p     104     delta-HCH
q     105     gamma-HCH
r     113     toxaphene

PCBs

a     106     Aroclor 1242
b     110     Aroclor 1248
c     107     Aroclor 1254
d     111     Aroclor 1260

                                        E-5

-------
      EPA #   Compound                                  mw

VOLATILE HALOGENATED ALKANES

a     45      chloromethane                             50.6
b     46      bromomethane                              109
c     16      chloroethane                              64.5
d     44      methylene chloride                        85
e     13      1,1-dichloroethane                        99
f     23      chloroform                                119
g     10      1,2-dichloroethane                        99
h     11      1,1,1-trichloroethane                     133
i     6       carbon tetrachloride                      154
j     48      bromodichloromethane                      164
k     32      1,2-dichloropropane                       113
l     51      chlorodibromomethane                      208
m     14      1,1,2-trichloroethane                     133
n     47      bromoform                                 253
o     15      1,1,2,2-tetrachloroethane                 168

VOLATILE HALOGENATED ALKENES

a     88      vinyl chloride                            62.5
b     29      1,1-dichloroethene                        97
c     30      trans-1,2-dichloroethene                  97
d     33      cis- and trans-1,3-dichloropropene        111
E-6

-------
      EPA #   Compound                                  mw

VOLATILE AROMATIC HYDROCARBONS

a     4       benzene                                   78
b     86      toluene                                   92
c     38      ethylbenzene                              106
d     HSL     styrene                                   104
e     HSL     total xylenes                             106

VOLATILE CHLORINATED AROMATIC HYDROCARBONS

a     7       chlorobenzene                             112

VOLATILE UNSATURATED CARBONYL COMPOUNDS

a     2       acrolein                                  56
b     3       acrylonitrile                             53

VOLATILE ETHERS

a     19      2-chloroethyl vinyl ether                 106

VOLATILE KETONES

a     HSL     acetone
b     HSL     2-butanone
c     HSL     2-hexanone
d     HSL     4-methyl-2-pentanone

MISCELLANEOUS VOLATILE COMPOUNDS

a     HSL     carbon disulfide                          76
b     HSL     vinyl acetate                             86

                                          E-7

-------
      APPENDIX F
Example Quality Assurance
        Reports

-------
PREFACE
      The following examples of detailed quality assurance (QA) reviews for a metals data
      package and a polychlorinated biphenyl (PCB) data package demonstrate the kind of
      information provided by QA specialists.  The sections of these example reports address
      each of the components of a QA review discussed in Section 2.16 in the main text of this
      document.

      These reviews were conducted in accordance with EPA Contract Laboratory Program pro-
      cedures.  QA reviews for other programs may use alternative criteria for evaluation  and
      different detection limits. For example, the target detection limits discussed for dredging
      programs differ from the detection limits described in this QA review.
                                           F-iii

-------
CONTENTS
                                                             Page

     PREFACE                                                  F-iii

     QUALITY ASSURANCE REVIEW OF METALS IN WATER SAMPLES     F-1

        INTRODUCTION                                          F-1

        QUALITY ASSURANCE REVIEW                             F-1

           Overall Case Assessment                                F-1
           Completeness                                         F-3
           Holding Times                                         F-5
           Analytical Methods                                     F-5
           Accuracy                                             F-8
           Precision                                            F-11
           Blanks                                             F-11

        REFERENCES                                           F-13

     QUALITY ASSURANCE REVIEW OF POLYCHLORINATED
     BIPHENYLS IN SEDIMENT                                    F-14

        INTRODUCTION                                         F-14

        OVERALL CASE ASSESSMENT                             F-15

           Summary of Completeness                              F-15
           Summary of Data Qualifications                           F-15

        HOLDING TIMES                                         F-16

        ANALYTICAL METHODS                                   F-16

        CALIBRATION                                           F-17

           Initial Calibration                                      F-17
           Continuing Calibration                                  F-18
                                  F-v

-------
                                                    Page

METHOD BLANK ANALYSIS                               F-18

ACCURACY                                            F-18

   Surrogate Compound Recoveries                         F-19
   Matrix Spike Recoveries                                F-19

PRECISION                                            F-19

IDENTIFICATION OF COMPOUNDS                          F-19

COMPOUND QUANTIFICATION AND REPORTED
DETECTION LIMITS                                      F-20

REFERENCES                                          F-20
                         F-vi

-------
QUALITY ASSURANCE REVIEW OF METALS  IN
WATER  SAMPLES
INTRODUCTION

      This report documents the results of a quality assurance review of analytical data for
      metals in water samples from Project X.  This quality assurance report is provided in
      support of the quality assurance project plan for this project.

      All laboratory analyses were performed by Analysis Laboratory in City, State.  All
      samples were analyzed in accordance with the U.S. Environmental Protection Agency
      (EPA) Contract Laboratory Program Statement of Work for Inorganic Analyses (U.S. EPA
      1987). Data validation was performed according to EPA's Laboratory Data  Validation:
      Functional Guidelines for Evaluating Inorganics Analyses (U.S. EPA  1988).

      The quality  assurance review included examination and validation  of the following
      laboratory data:

          •   Sample digestion and extraction logs

          •   All instrument printouts, except for mercury (the instrument printout was
              not available from the laboratory)

          •   Instrument calibration and calibration verification procedures and results

          •   Sample holding times and custody records

          •   Manual data transcriptions and computer algorithms.

      Data qualifiers were assigned as necessary during this review. Following the validation
      procedures, data quality was assessed with respect to accuracy, precision, and complete-
      ness. All qualifier codes used in this report are defined in Table F-1.
QUALITY ASSURANCE REVIEW
Overall Case Assessment

      All data for metals in the five water samples are acceptable as qualified in this review for
      the uses specified in the quality assurance project plan except for the matrix spike result
      for silver, which was rejected. Data for all samples analyzed for cadmium, calcium, lead,
      mercury, silver, and zinc are acceptable as estimates.  Data qualified as J (estimated) are

                                         F-1

-------
                          TABLE F-1.  DATA QUALIFIER CODES
Qualifiers Applied During Quality Assurance Review

U  The analyte was not present above the level of the associated value. The associated numerical value
    indicates the approximate concentration necessary to detect the analyte in this sample.

J   The analyte was positively identified, but the associated numerical value may not be consistent with
    the amount actually present in the field sample. The data should be seriously considered for decision-
    making and are usable for many purposes.

UJ The analyte was not present above the level of the associated numerical value.  The associated
    numerical value may not accurately or precisely represent the concentration necessary to detect the
    analyte in this sample.

R   The data are unusable for all purposes.  The presence or absence of the analyte has not been
    verified. Resampling and reanalysis are necessary to confirm or deny the presence of the analyte.


Qualifiers Applied During Laboratory Validation*

E  The reported value is estimated because of the presence of interference. This qualifier is commonly
    used  when the serial dilution result for analyses by inductively coupled plasma-atomic emission
    spectrometry (ICP) does not meet control  limits.

M  Duplicate injection precision was not met.

N  Predigestion matrix spike recovery was not within control limits.

S  The reported value was determined by the method of standard additions (MSA).  The associated
    value is as reliable as unqualified results.

W The postdigestion spike recovery for GFAAb analysis was not within control limits (85-115 percent),
    and the sample absorbance was less than 50 percent of the spike absorbance.

*   Duplicate analysis was  not within control limits.

+  The reported value was determined by MSA.  The correlation coefficient for MSA is < 0.995.


a Adapted from U.S. EPA (1987).

b Graphite furnace atomic absorption spectrometry.
                                                F-2

-------
       acceptable, but a greater degree of uncertainty is associated with these values than with
       unqualified data.

       The matrix spike result for silver was rejected because the postdigestion spike recovery
       (58 percent) was well below the EPA Contract Laboratory Program (CLP) control limit
       (85- to 115-percent recovery).   Analysis of the  sample by the  method  of standard
       additions  (MSA) is required in this case, but was not performed.

        Calcium values received J qualifiers because the CLP control limit (U.S. EPA 1987) was
       exceeded  slightly for the serial dilution sample analyzed by inductively coupled plasma-
       atomic emission spectrometry (ICP). Reported results may be underestimated by approxi-
       mately 10 percent.

       Cadmium and lead results received J qualifiers  because CLP control limits for matrix
       spike recoveries and for duplicate analyses were exceeded. In addition, the result for lead
       in Sample 2  was restated as undetected (U) at the reported  concentration  because the
       associated digestion blank was  contaminated.   Cadmium  and  lead  data should be
       considered order-of-magnitude estimates.

       Mercury results were qualified J because the matrix spike recovery was below the CLP
       control limit.  These results may be 100-200 percent higher  than reported.

       A J qualifier was applied to silver results because recovery of silver was poor for the
       laboratory control sample (LCS).  Silver results may be approximately 100 percent higher
       than reported.   Additional individual results were qualified J because the correlation
        coefficient for the results determined by MSA did not meet the CLP control limit of 0.995.

       The overall data quality achieved by  the laboratory  for analyses completed by  ICP
       (Table F-2) is typical for metals analyses in water samples.  The overall data quality for
       analyses by graphite furnace atomic absorption (GFAA) is typical for arsenic, chromium,
        and silver. Data quality for cadmium, lead, and mercury is less than typical.
-------
              TABLE F-2.  ANALYTICAL METHODS AND INSTRUMENT
                            DETECTION LIMITS

                                                        Instrument
                                                        Detection Limit
              Analyte          Method of Analysis       (µg/L)

              Aluminum         ICPa                     55
              Arsenic          GFAAb                    5
              Cadmium          GFAA                     5
              Calcium          ICP                      28
              Chromium         GFAA                     10
              Copper           ICP                      11
              Iron             ICP                      9.6
              Lead             GFAA                     5
              Magnesium        ICP                      140
              Manganese        ICP                      1.8
              Mercury          CVAAc                    0.2
              Nickel           ICP                      18
              Silver           GFAA                     5
              Zinc             ICP                      4

              a Inductively coupled plasma-atomic emission spectrometry.
              b Graphite furnace atomic absorption spectrometry.
              c Cold vapor atomic absorption spectrometry.
              d Manual spectrophotometry.
                               F-4

-------
Holding Times

       Holding times required by EPA CLP protocols were met for all metals analyses.


Analytical Methods

       All sample digestion and analysis procedures, instrument calibration  procedures, and
       quality control checks conformed to EPA CLP requirements except as noted below.


       Sample Preparation and Analysis

       Water samples were digested according to requirements specified for  CLP (U.S. EPA
       1987).   Sample  digestates  were analyzed  by ICP, GFAA, and cold  vapor atomic
       absorption spectrometry (CVAA), as indicated in Table F-2.  Multiple digestions were
       prepared for Samples 1 and 2 and  the  duplicate and the spike of Sample 2, because
       unacceptably high levels of lead were present in the second preparation blank and because
        volumes of digestate were initially insufficient for all analyses. A preparation blank and
       a laboratory control sample were digested and analyzed with each batch. Only lead and
       arsenic results were obtained from the second and third digestion batches. Results for all
       applicable  quality control  samples, except  the method blank for lead for the third
       digestion group, were provided on the appropriate CLP forms by the laboratory or were
       added during the quality assurance review.


       Instrument Calibration

       Instrument calibration was completed according to EPA CLP protocols (U.S. EPA  1987).
       Four  calibration standards and one  blank  were used for all analyses by GFAA. The
       correlation coefficient of a least  squares linear regression met the CLP control limit of
       >0.995 in  all  cases except one.   The correlation coefficient was  0.993 for the  initial
       calibration for analysis of cadmium in Samples 3 and 5. Consequently, the cadmium
       results for  these samples were qualified J.

       ICP  instruments  were calibrated according to manufacturer instructions,  using  one
       standard and one blank.  A low-level  standard was used  to  verify accuracy of the
       calibration curve at low analyte concentrations for all metals except mercury and alumi-
       num.

       Initial (ICV) and continuing (CCV) calibration check standards and initial  (ICB)  and
       continuing (CCB)  calibration blanks  were  analyzed immediately  after instrument
       calibration, after every 10 samples or more frequently, and at  the conclusion of each
       analytical run, with the following exception:   no CCV/CCB pair was analyzed  at the
       conclusion of the ICP run.  However, only interference check samples were analyzed after
       the final CCV/CCB pair, and data quality  was not affected.   Results for all  CCVs fell
       within 90-110 percent of the expected value (80-120 percent for mercury), as  required

                                            F-5

-------
by EPA CLP.   Instrument calibration remained within control  limits  for all  samples
        throughout each sample run and for all other analytes.
Instrument-Specific Quality Control Procedures

     ICP—A serial dilution sample is required by EPA CLP protocols to check for matrix
interference in samples analyzed by ICP. All samples analyzed by ICP were diluted to
one fifth of their initial concentration to bring manganese concentrations within the linear
range of the ICP. The laboratory chose to report the results of diluted Sample 3 on CLP
Form 9, ICP Serial Dilutions. A further serial dilution was required by CLP protocols
to obtain a diluted result for manganese, but was not performed.  Results of the serial
dilution for iron, magnesium, nickel, and zinc were within the CLP control  limit of
10-percent difference from the undiluted result.  The results for aluminum and copper
were not applicable  because the undiluted concentration of these  metals was  not
sufficiently high. The result for calcium (11-percent difference) exceeded control limits,
with the diluted result (corrected for dilution) exceeding the undiluted result. All calcium
data were  qualified  E by  the laboratory  and J during the quality assurance review.
Reported calcium results may have a small negative bias of approximately 10 percent due
to matrix interference.
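
       The serial dilution check described above reduces to a percent-difference comparison
       between the undiluted result and the dilution-corrected result. The Python sketch below
       uses invented concentrations; the 5-fold dilution and the 10-percent control limit are
       the values cited in this review, and the function name is illustrative.

           def serial_dilution_percent_difference(undiluted, diluted, dilution_factor=5.0):
               """Percent difference between the original result and the dilution-corrected result."""
               corrected = diluted * dilution_factor
               return abs(corrected - undiluted) / undiluted * 100.0

           pct_diff = serial_dilution_percent_difference(undiluted=1000.0, diluted=222.0)
           print(f"{pct_diff:.1f}% difference; within 10% limit: {pct_diff <= 10.0}")
           # 11.0% difference -> outside the 10% control limit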

Interference check samples (ICSs) were analyzed at the beginning and end of the ICP
sample run to check for interference by other metals. Results met CLP control limits in
all cases.   To extend the  linear  range of the  ICP to  accommodate the high analyte
concentrations present in the ICSs, a second calibration curve was  obtained for  some of
the ICS analytes using higher standards than were used for the sample analyses.  The
analytical wavelength and all instrument parameters remained the same. Calibration was
verified at the higher calibration curve as well.  Data relating to the higher calibration
curve were labeled "secondary lines" in the original data.
     GFAA—Quality control procedures for GFAA analyses included duplicate injection
of all samples and analysis of a postdigestion analytical spike with each sample. Results
of duplicate injections were spot-checked at a frequency of approximately 10 percent.  All
examined duplicate injection results agreed within 20-percent coefficient of variation, as
required by CLP protocols.

Recoveries of the analytical spike for numerous samples and analytes did not meet CLP
control limits of 85-115 percent.  In most cases, these data were qualified W (analytical
spike recovery did not meet control limits and sample absorbance is less than 50 percent
of spike absorbance) by the laboratory, or MSA was used to analyze the  samples as
required by CLP  protocols.  Sample  results obtained by MSA were qualified S by the
laboratory  if the  correlation coefficient obtained with the MSA results was >0.995.
Results  qualified S are reliable and are not considered to be estimates.  Sample results
obtained by MSA with correlation coefficients <0.995 were qualified + by the laboratory
and J during the quality assurance review. These results are estimates.


                                       F-6

-------
A systematic calculation error was made by the laboratory for all sample results obtained
by MSA.  The error consisted of the misassignment of axes to the sample concentration
values and to the instrument response values, resulting in an incorrect value for the slope
of the instrument response per added concentration and  consequently for the analyte
concentration in the sample.  Results obtained with a poor correlation coefficient showed
the greatest magnitude in the error.  All results were corrected during quality assurance
review.
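
       To illustrate the axis convention at issue, the following sketch fits a standard additions
       line with added concentration as the independent variable and instrument response as the
       dependent variable; the sample concentration is then the magnitude of the x-axis intercept
       (intercept divided by slope). The data points are invented for illustration only.

           import numpy as np

           # Added concentration (ug/L) on x, instrument response (absorbance) on y
           added = np.array([0.0, 10.0, 20.0, 30.0])
           response = np.array([0.052, 0.101, 0.153, 0.201])

           slope, intercept = np.polyfit(added, response, 1)    # response = slope * added + intercept
           r = np.corrcoef(added, response)[0, 1]               # correlation coefficient (CLP limit 0.995)
           sample_conc = intercept / slope                      # MSA result from the x-axis intercept

           print(f"MSA result = {sample_conc:.1f} ug/L, r = {r:.4f}")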

Several errors were made by the laboratory in following the CLP sample analysis
sequence for analyses by GFAA.  The analytical spike recoveries of silver and lead in the
first method blank (122- and 119-percent recovery, respectively) exceeded CLP control
limits (85-115 percent).  According to U.S. EPA (1987), the problems should have been
corrected and acceptable results should have been generated for the method blank prior
to sample analysis. A qualifier (E) was applied to the silver result for Sample 5 (the only
result not obtained by MSA) by the laboratory because of the high analytical spike
recovery from the blank, but was removed during the quality assurance review because
data qualification is not automatically warranted in this case.  All sample results for lead
from the first digestion group were obtained by MSA and were not qualified by the
laboratory or during the quality assurance review.

The matrix spike samples for lead and silver should have been analyzed by MSA because
the analytical spike recoveries were low (74- and 58-percent recovery, respectively) for
these  analytes. The initial sample and duplicate (Sample 2) for silver were analyzed by
MSA. The spike results for silver and lead are estimates.

The analytical  spike recovery for lead in Sample 3 was 34 percent.  This sample  should
have  been diluted and reanalyzed (U.S.  EPA 1987); however, MSA was performed
instead.  Samples 2 (duplicate),  5, and 6 were analyzed by MSA for arsenic and had
correlation coefficients below the control  limit.  These samples should have been
reanalyzed, but were not.  The correlation coefficient for arsenic by MSA in Sample 2
(duplicate) was 0.909, well below the control limit of 0.995, and the curve generated by
the standard additions was exponential in appearance.  This result (45.5 µg/L) was
rejected during the quality assurance review because of the poor correlation coefficient,
and the initial result (26.2 µg/L) was accepted as an estimate.
Detection Limits

All reported instrument detection limits (IDLs) were below or equal to the CLP contract-
required detection limits (CRDLs) (Table F-2).  The IDL for lead by GFAA was omitted
from CLP Form 11, but was subsequently provided by the laboratory. The IDLs reported
for GFAA analytes were estimated by laboratory personnel based on their experience with
the instrument and were not determined statistically as required by CLP protocols  (U.S.
EPA 1987). Data were  not qualified for this omission.  Based on the quality assurance
                                       F-7

-------
       review of original laboratory data, in the reviewer's judgment the laboratory estimates of
       detection limits tended to be high.  Use of statistically determined detection limits may
       result in lower values than the reported IDL in many cases.
Accuracy
       The laboratory performed one LCS analysis (using a commercially available standard
       prepared  specifically for CLP analyses)  and one predigestion  matrix  spike analysis
       (Sample 1 for mercury, and Sample 2 for all other analytes). Recovery of all analytes
       except  silver from the LCS ranged  from  84 to  112 percent.   Silver recovery was
       52 percent (Table F-3). CLP control limits for metals in the LCS are 80- to 120-percent
       recovery (except for silver, which has no contractual control limit [U.S. EPA 1987]).  All
        results for silver were qualified J during the quality assurance review because of the poor
       LCS recovery (U.S. EPA 1988).

        Predigestion matrix spike recovery was within control limits (75-125 percent; U.S. EPA
        1987) for all metals except cadmium, lead, mercury, and silver (Table F-4).  Results for
        cadmium and lead (194- and 261-percent recovery, respectively) were greater than the
        control limit, and all sample results greater than the IDL were qualified J during the
        quality assurance review (U.S. EPA 1988).  Only Sample 2 was not qualified for
       cadmium because none was  detected.  The spike results for both lead and cadmium are
       questionable because the matrix  duplicate results for Sample 2 exceeded control limits,
       so a reliable sample concentration  is not available.  The spike sample result for lead is
       also questionable because the sample should have been analyzed by MSA, but was not.
       In addition, at least one method blank for lead was contaminated (as discussed in the
       Blanks section); nonsystematic lead contamination may also have contributed to the poor
       replicability  of the duplicates and the  high spike recovery  for lead.   All data were
       qualified  as  estimated despite the uncertainty in the matrix  spike results because the
       magnitude of the control limit exceedance was large for both analytes.
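
        The recovery calculation and control-limit screen discussed above (Table F-4,
        footnote a; 75-125 percent limits) can be written out explicitly.  The sketch below uses
        the cadmium spiking level from Table F-4; the spiked result shown is a hypothetical raw
        value chosen to reproduce the 194-percent recovery.

    # Predigestion matrix spike recovery check -- cadmium spiking level from Table F-4.
    def spike_recovery(spiked_result, unspiked_result, spike_added):
        """Percent recovery = (spiked result - unspiked result) / spike added x 100."""
        return (spiked_result - unspiked_result) / spike_added * 100.0

    unspiked = 0.0       # cadmium not detected (5 U); treated as zero here (assumption)
    spike_added = 5.0    # ug/L
    spiked_result = 9.7  # ug/L, hypothetical raw value giving ~194-percent recovery

    recovery = spike_recovery(spiked_result, unspiked, spike_added)
    if not 75.0 <= recovery <= 125.0:
        print(f"Recovery {recovery:.0f}% outside 75-125% -- qualify associated results J")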

        All mercury data were qualified J during the quality assurance review because predigestion
        spike recoveries (40 and 39 percent, respectively) were much lower than the control
       limits.  Recovery for  a postdigestion mercury spike analyzed for Sample 1 was 38 per-
       cent, similar to the predigestion spike result.  This result indicates that a matrix interfer-
       ence at the spectrophotometer was probably responsible  for poor recovery.  Reported
       results for mercury  may be lower than the actual sample concentrations.

       The matrix spike result reported for silver was  lower  than the result reported for the
       unspiked  sample.  The  analytical spike result of the matrix spike sample was low
       (58-percent recovery), and therefore the matrix spike sample should have been analyzed
       by MSA, but was not.  The original and duplicate Sample 2 were both analyzed by MSA.
       The matrix spike result for silver was rejected during the quality assurance review. The
       matrix spike result  for chromium was not applicable because the sample concentration
       exceeded  4 times the  spike  concentration.  The magnitude of the precision error (the
        control limit is ≤20 relative percent difference [RPD]) may be significant with respect to
                                             F-8

-------
          TABLE F-3.  PERCENT RECOVERY FOR METALS
              IN LABORATORY CONTROL SAMPLE

                                Percent
          Analyte              Recovery(a)
          Aluminum                  98
          Arsenic                  105
          Cadmium                  112
          Calcium                   99
          Chromium                 109
          Copper                   101
          Iron                      99
          Lead                      98
          Magnesium            99, 84, 93
          Manganese                100
          Mercury                  111
          Nickel                    97
          Silver                    52
          Zinc                      98

          (a) Percent recovery = (measured value / true value) x 100.
                    F-9

-------
          TABLE F-4.  MATRIX SPIKE RECOVERY FOR METALS
                           IN SAMPLE 2

                       Sample Result   Spike Added     Percent
          Analyte         (µg/L)          (µg/L)      Recovery(a)
          Aluminum          310            2,000          97
          Arsenic            25               40          89
          Cadmium           5 U(b)             5         194
          Calcium            --               --          NR(c)
          Chromium           69               10          NA(d)
          Copper             27              250         103
          Iron            7,090            1,000          77
          Lead               29               20         261
          Magnesium          --               --          NR
          Manganese       6,560              500          76
          Mercury(e)        0.2 U             1.0         40
          Nickel            180              500         106
          Silver             28               10           R(f)
          Zinc              180              500          95

          (a) Percent recovery = (spiked result - unspiked result) / spike added x 100.

          (b) U - the analyte was not detected at the indicated concentration.

          (c) A matrix spike was not required for this analyte (U.S. EPA 1987).

          (d) The result is not applicable because the sample concentration is greater
          than 4 times the spike concentration.

          (e) Sample 1 was spiked for mercury only.

          (f) R - the spike sample result was rejected; the result is not meaningful.
                            F-10

-------
       the spike concentration in this situation, and spike recovery results cannot be clearly
       interpreted. Assessment of analytical accuracy was based on the LCS for both silver and
       chromium.
Precision
       Duplicate subsamples of Sample 2 for all metals and Sample 1 for mercury only were
       analyzed by the laboratory.  Results are summarized in Table F-5.  All results except
       cadmium and lead were within the control limit of 25 RPD (for sample results >5 times
       the CRDL) or ± the CRDL (for results <5 times the CRDL) specified by the EPA.  A
       qualifier (*) was applied to all cadmium and lead values by the laboratory or during the
       quality assurance review to indicate EPA CLP duplicate control limit exceedance, and all
       cadmium and lead values were qualified J during the quality assurance review.

       The result  for arsenic for Sample 2 (duplicate) as obtained by MSA and reported by the
       laboratory  was rejected during the quality assurance  review, but the result obtained
       initially  by direct  comparison to the instrument  calibration  curve  was  accepted  as
       estimated (details in the Calibration section).  The latter value was well within control
       limits, and the former value exceeded the control limit by less than 1 µg/L.  The data
       qualifier (*) applied by the laboratory to the arsenic value for Sample 2 was removed
       during the quality assurance review.  No arsenic data were qualified J.
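
       The duplicate control-limit logic summarized above (25 RPD when the sample result
       exceeds 5 times the CRDL, otherwise an absolute difference no greater than the CRDL;
       see Table F-5, footnotes a and b) reduces to a short test.  The sketch below uses the
       lead values from Table F-5; the CRDL value shown is an assumption for illustration.

    # Duplicate precision check for metals -- lead values from Table F-5.
    def rpd(sample, duplicate):
        """Relative percent difference = |sample - duplicate| / ((sample + duplicate)/2) x 100."""
        return abs(sample - duplicate) / ((sample + duplicate) / 2.0) * 100.0

    sample, duplicate, crdl = 29.0, 47.0, 3.0   # ug/L; the CRDL of 3 ug/L is an assumption

    if sample > 5 * crdl:
        exceeds = rpd(sample, duplicate) > 25.0       # 25 RPD control limit
    else:
        exceeds = abs(sample - duplicate) > crdl      # +/- CRDL control limit

    print(f"RPD = {rpd(sample, duplicate):.0f}%, control limit exceeded: {exceeds}")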
Blanks
       A method blank and several calibration blanks were analyzed with the samples for each
       metal.  No contaminant was found in any method blank with one exception:  lead was
       present (6.1 µg/L) in the method blank prepared with the second digestion batch.  Results
       for Sample 2 and the duplicate and spike samples for Sample 2 were reported from this
       digestion batch.  Sample 2 was qualified U (undetected at the reported concentration)
       during the quality assurance review because the sample result (29.4 µg/L) was <5 times
       the concentration in the method blank (U.S. EPA 1988).  According to the laboratory
       worksheets for lead, the method blank prepared with the third digestion batch also
       contained lead (105 µg/L); however, data corresponding to this result were absent from
       the instrument printout, and the result was not entered onto the appropriate CLP form.
       The entry on the worksheet was apparently a transcription error, and no result is available
       for this method blank.  The result reported for Sample 1 was obtained from this digestion
       batch and was qualified J during the quality assurance review.

       Several results for CCBs exceeded the detection limits for calcium, manganese, and zinc.
       However, all  associated  sample results exceeded 5 times the concentration  of the
       respective analyte found in any CCB, and were therefore not significant with respect to
       the expected analytical variability of sample results.  No sample results were qualified as
       a result of detected analyte concentrations in associated CCBs.
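
       The blank-contamination rule applied above (a sample result less than 5 times the
       concentration found in an associated blank is reported as undetected) can be expressed
       as a one-line test; the sketch below simply restates the Sample 2 lead comparison cited
       above.

    # 5x blank contamination rule (U.S. EPA 1988 functional guidelines).
    blank_result = 6.1    # ug/L lead in the second digestion batch method blank
    sample_result = 29.4  # ug/L lead reported for Sample 2

    if sample_result < 5 * blank_result:
        print("Qualify sample result U -- undetected at the reported concentration")
    else:
        print("Sample result exceeds 5x the blank; no blank-related qualification")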
                                             F-11

-------
          TABLE F-5.  DUPLICATE ANALYSIS RESULTS FOR METALS
                               IN SAMPLE 2

                       Sample Result   Duplicate Result    Control     Relative Percent
          Analyte         (µg/L)            (µg/L)         Limit(a)      Difference(b)
          Aluminum          310               308             200              --
          Arsenic            25                26              10              --
          Cadmium           5 U(c)             17               5*(d)          --
          Calcium       184,000           180,000              --               2
          Chromium           69                78              --              --
          Copper             27                29              15              --
          Iron            7,100             6,700              --               8
          Lead               29                47              --              46*
          Magnesium     200,000           190,000              --               3
          Manganese       6,600             6,400              --               2
          Mercury(e)        0.2 U             0.2 U            0.2             --
          Nickel            180               190              40              --
          Silver             28                31              10              --
          Zinc              180               190              --               3

          (a) For results less than 5 times the CRDL, the difference between replicate
          sample results must be < the CRDL.

          (b) RPD = |sample result - duplicate result| / [(sample result + duplicate result)/2] x 100.

          (c) U - the analyte was not detected at the indicated concentration.

          (d) Results followed by "*" exceed CLP control limits.

          (e) Sample 1 was analyzed in duplicate for mercury only.
                                   F-12

-------
REFERENCES
      U.S. EPA.  1987.   U.S. EPA Contract  Laboratory Program  statement of work for
      inorganics analysis, multi-media, multi-concentration.  SOW No. 788.  U.S. Environ-
      mental Protection Agency, Washington, DC.

      U.S. EPA.   1988.  Laboratory data validation:  functional guidelines  for evaluating
      inorganics analyses.  U.S. Environmental Protection Agency, Office of Emergency and
      Remedial Response, Washington, DC.
                                          F-13

-------
QUALITY ASSURANCE REVIEW OF
POLYCHLORINATED BIPHENYLS
IN SEDIMENT
INTRODUCTION

      This report documents the results of a quality assurance review of data for polychlorinated
      biphenyls (PCBs) in sediment samples as part of the sediment characterization of the
      Project Y site. The sampling and analysis plan (SAP) and the quality assurance project
      plan (QAPP) are described in the study proposal.

      All laboratory analyses were performed by the laboratory in accordance with procedures
      specified in the SAP.  Sample analyses were performed using modified versions of U.S.
      Environmental Protection Agency (EPA) SW-846 Method 8080 (U.S. EPA 1986); the
      modifications are detailed in the laboratory statement of work (SOW).  Data validation
      was performed in accordance  with the U.S. EPA (1988)  functional  guidelines for
      evaluating organic compound  analyses, guidelines established  in U.S. EPA (1986)
      SW-846 Method 8080, the data quality objectives specified in the SAP, and the require-
      ments specified in the laboratory SOW.

      The quality assurance review included examination and validation of the following data:

           •   Sample holding times and chain-of-custody records

          •   Initial and continuing  calibration analyses, including calculations by least
             squares linear regression

          •   Reported detection limits

          •   Method blank analyses

          •   Matrix spike and matrix spike duplicate recoveries

          •   Surrogate compound recoveries

          •   All reported  sample  results,  including  verification  of quantification,
             examination of chromatograms, and PCB identification.
                                       F-14

-------
OVERALL CASE ASSESSMENT

       The results of the quality assurance review for the analysis of PCBs in the 64 sediment
       samples are presented below in two sections. These sections address completeness of the
       data package and the qualifiers  assigned to individual measurements.
Summary of Completeness

      A complete data package was submitted by the laboratory for 64 sediment samples,
      4 method blanks, 4 matrix spikes, and 4 matrix spike duplicates.  Data completeness is
      100 percent of the total requested analyses; no results were rejected.
Summary of Data Qualifications

       The results of analyses for PCBs in the 64 sediment samples associated with this project
       are acceptable for the intended purposes specified in the SAP.  Some data were assigned
       a J qualifier to indicate that the values reported are estimates. The data are acceptable,
       but have a greater degree of uncertainty than nonqualified data.

       A summary of the technical factors resulting in the qualification of the PCB  data is as
       follows:

            •   The laboratory did not fully establish linearity for the initial calibration
                near the lower end of the standard curve.  Demonstration of linearity near
                the lower end of the curve is important for validating the limits of
                detection and practical quantification limits specified in the laboratory
                SOW.

           •   The laboratory quantified all sample results using a single-point standard
               (i.e., the continuing calibration standard). However, quantification  using
               a single-point standard is only acceptable if linearity is established through-
               out the calibration range in the initial calibration.

           •   The criterion for continuing  calibration was  not met for three of the  eight
               total standard analyses.

           •   Surrogate recoveries for 13 samples did not meet quality control limits;  the
               associated data were  qualified as estimates.

        In addition, all PCB values were recalculated because coeluting chromatographic peaks
        were used by the laboratory to identify PCBs; therefore, the peak heights used for
        quantification resulted in values biased high.  The recalculated values were typically
        one-half of the original concentrations reported by the laboratory.  In addition, the
        laboratory occasionally incorrectly identified and reported results for specific PCBs.
        During the quality assurance review, these data were corrected.
                                            F-15

-------
       A complete discussion  of the results of the data validation  and specific  problems
       identified during the quality assurance review is provided below.
HOLDING TIMES

      All storage conditions and sample holding times were properly met by the laboratory.
      The holding time requirements for PCB analyses specified in the SAP are as follows:

           •   All samples must be shipped on ice to  the laboratory and stored at -18°C
               until sample extractions are performed

           •   Sample extracts must be analyzed within 40 days

           •   Sediment samples must be kept frozen and extracted within 6 months from
               the date and time of sample collection.

      The 64 sediment samples were collected between	and	; the
      samples were received at	on	.  Samples were extracted
      between	and	, and the sample extracts were analyzed
      between	and	.
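
       A holding-time check against the three requirements listed above can be scripted.  The
       dates in the sketch below are placeholders only (the project dates are left blank above),
       and the 6-month limit is approximated as 180 days.

    # Holding-time check for PCB analyses -- placeholder dates, not project data.
    from datetime import date, timedelta

    collected = date(1995, 1, 10)   # hypothetical collection date
    extracted = date(1995, 3, 1)    # hypothetical extraction date
    analyzed  = date(1995, 4, 5)    # hypothetical analysis date of the extract

    extraction_ok = (extracted - collected) <= timedelta(days=180)  # ~6-month limit
    analysis_ok   = (analyzed - extracted) <= timedelta(days=40)    # 40-day extract limit

    print(f"Extraction within holding time: {extraction_ok}")
    print(f"Extract analysis within holding time: {analysis_ok}")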
ANALYTICAL METHODS

      Samples were analyzed for PCBs using a modified version of U.S. EPA (1986) SW-846
      Method 8080.  The modifications are specified in the SAP and the laboratory SOW and
      include the following:

           •   Larger sample  size for  extraction  (i.e., approximately 100 grams, wet
               weight)

           •   In addition to the Contract Laboratory Program (CLP) surrogate compound
               dibutylchlorendate (DBC), the use  of an additional surrogate compound
               (4,4'-dibromooctafluorobiphenyl  [DBOFB]) to monitor recovery on a
               sample-by-sample basis

           •   Sample extract cleanup procedures as required using alumina column
               chromatography by EPA Method 3610, florisil column chromatography by
               EPA Method 3620, and elemental sulfur cleanup by EPA Method 3660

            •   Megabore capillary gas chromatography/electron capture detection (GC/ECD)
                analysis to enhance resolution and reduce potential interferences

           •   Use of a multipoint calibration for all Aroclor® mixtures and analysis of a
               check  standard of 0.1 ng (on-column) for verification  of instrument
               sensitivity to assess the validity of the required detection limits.
                                           F-16

-------
       The laboratory generally performed the recommended modifications.  Florisil column
       chromatography was used for a limited number of samples. EPA Method 3660 (mercury
       cleanup) and a  sulfuric acid cleanup step were used to remove elemental  sulfur; the
       sulfuric acid cleanup step was used on all  samples associated with this project.  The use
       of sulfuric acid was approved by the project manager during sample processing.
CALIBRATION
       The results  of all initial and continuing instrument calibrations performed  by the
       laboratory are  generally acceptable.  Specific  problems identified during  this  quality
       assurance review are discussed in the section below.

       Instrument calibration is  performed to  establish and ensure that the chromatographic
       system is  capable of producing acceptable  and reliable analytical data.   An initial
       calibration is performed  prior to  sample analysis to establish the linearity  of the
       chromatographic system,  including  demonstrating  that all  target compounds  can be
       detected.  Continuing calibrations are performed to verify that instrument performance is
       stable and reproducible on a day-to-day basis. The initial and continuing calibrations are
       to be performed according to procedures established by CLP protocols and  modified  in
       the SAP and the laboratory SOW.

       A detailed description of  the results for initial and  continuing calibrations is presented
       below.
Initial Calibration

       The laboratory performed an initial three-point calibration using concentrations of 0.4, 1.0,
       and 5.0 ng (on-column) for the five Aroclor® mixtures (Aroclor® 1016, 1221, 1232, 1248,
       and 1260).  A five-point initial calibration (0.4, 1.0, 2.0, 3.0, and 5.0 ng) was performed
       for PCB 1242 and PCB 1254.

      Linearity of the initial calibration to zero concentration is assumed when the percent
      relative standard deviation (RSD) of the calibration factors is <20 percent over the entire
      calibration  range  (U.S.  EPA 1986).   Additionally, the correlation coefficients (r2)
      generated by least squares linear regression should be greater than 0.9950 to demonstrate
      linearity.

       The laboratory calculated the r2 values for the initial calibrations using the sum of all
       chromatographic peaks that were integrated (i.e., from the first peak integrated, the
       injection peak, to the last peak integrated) to perform the calculations.  Only the
       chromatographic peaks representative of a specific PCB mixture should be used for
       performing these calculations.  Therefore, all standard chromatograms were reviewed
       during the quality assurance review and the r2 values were recalculated.
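
       The two linearity criteria described above (percent RSD of the calibration factors
       ≤20 percent and r2 from least squares regression >0.9950) can be verified with a few
       lines of code.  The calibration levels below match the five-point calibration listed
       earlier; the peak responses are hypothetical.

    # Initial calibration linearity check -- peak responses are hypothetical.
    import numpy as np

    amounts   = np.array([0.4, 1.0, 2.0, 3.0, 5.0])           # ng on-column
    responses = np.array([4100, 10300, 20250, 30900, 50600])  # summed peak heights

    cal_factors = responses / amounts
    rsd = np.std(cal_factors, ddof=1) / np.mean(cal_factors) * 100.0

    r_squared = np.corrcoef(amounts, responses)[0, 1] ** 2

    print(f"%RSD of calibration factors: {rsd:.1f}%  (limit: <= 20%)")
    print(f"r-squared: {r_squared:.4f}  (limit: > 0.9950)")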
                                             F-17

-------
        The recalculated results generated using least squares linear regression indicate that
        linearity through the origin was not established.  While a lack of linearity through the
        origin is not uncommon for this type of analysis, most of the recalculated PCB
        concentrations fall in this low concentration range.  Therefore, the results for PCBs were
        assigned a J qualifier to indicate estimated values.


Continuing Calibration

        The number of continuing calibrations is acceptable; however, the frequency of the
        calibrations is not acceptable.  The data were not qualified for the unacceptable frequency
        of continuing calibration because of the numerous other problems identified and discussed
        in other sections of this report.

       The criteria  for acceptable continuing calibration require that the calibration factors for
       all target compounds have a difference of <15 percent from the average calibration factor
       calculated for the  associated  initial calibration (U.S. EPA  1986).   The 15-percent
       difference value is required  for results calculated using the chromatographic column that
       is used for quantitative purposes.  In addition, the percent difference of the calibration
       factors calculated for  the  chromatographic column  used  for confirmation  must be
        ≤20 percent (U.S. EPA 1986).  If the criteria for the percent differences are not met, then
       a new initial calibration sequence must be prepared.
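
        The percent-difference test described above compares each continuing calibration
        factor with the mean calibration factor from the initial calibration.  A minimal sketch
        with hypothetical calibration factors follows.

    # Continuing calibration percent-difference check -- factors are hypothetical.
    initial_cal_factors = [10250.0, 10300.0, 10125.0, 10300.0, 10120.0]
    continuing_cal_factor = 11950.0

    mean_cf = sum(initial_cal_factors) / len(initial_cal_factors)
    percent_diff = abs(continuing_cal_factor - mean_cf) / mean_cf * 100.0

    limit = 15.0  # quantitation column; 20 percent applies to the confirmation column
    print(f"Percent difference: {percent_diff:.1f}%  (limit: {limit:.0f}%)")
    if percent_diff > limit:
        print("Criterion not met -- a new initial calibration is required")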

        The laboratory performed eight continuing calibration analyses during the analysis of the
        64 sediment samples.  The criteria for continuing calibration were not met for three of the
        eight calibrations performed (ranging from 32- to 92-percent difference).  In addition, the
        laboratory typically performed continuing calibrations at the end of a given daily
        analytical sequence, or the calibrations were clustered together.
METHOD BLANK ANALYSIS

       Method blank analysis is performed to determine the extent of laboratory contamination
       of samples.  The four method blank analyses for this project are acceptable; PCBs were
       not  detected.
ACCURACY

       Accuracy of the analytical results  is expressed in terms  of the bias and precision of
       measurements. Bias is assessed by evaluating the recoveries of the surrogate compounds
       and the matrix spike recoveries calculated for sample analyses.  Precision is assessed by
       evaluating the differences between  duplicate matrix spike analyses.   These results are
       presented below.
                                             F-18

-------
Surrogate Compound Recoveries

       The surrogate compound recoveries reported for the 64 sediment samples analyzed are
       acceptable, except that 13 surrogate recoveries did not meet the quality control limits; the
       associated data are accepted as estimates.  The data quality objective for acceptable
       surrogate recovery is 100±50 percent.

       The recoveries for DBC ranged from 0 to 160 percent, with an average recovery of
       70 percent.  The recoveries for DBOFB ranged from 0 to 128 percent, with an average
       recovery of 71 percent.  Thirteen surrogate recoveries fell outside the quality control
       limits; four recoveries were reported at zero percent, and nine recoveries were greater
       than zero but less than 50 percent.  No data were rejected because only one
       unacceptable surrogate recovery was reported for any given sample and the other surrogate
       recovery value was acceptable.  The values for PCBs reported in these samples were
       assigned a J qualifier to indicate the values are estimates.
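
       The surrogate evaluation above reduces to a per-sample rule: each of the two surrogate
       recoveries is compared with the quality control limits, and a single failure leads to
       J qualification rather than rejection.  The sketch below is illustrative only.

    # Per-sample surrogate recovery evaluation -- recoveries are hypothetical.
    def evaluate_surrogates(dbc_recovery, dbofb_recovery, low=50.0, high=150.0):
        failures = sum(1 for r in (dbc_recovery, dbofb_recovery) if not low <= r <= high)
        if failures == 0:
            return "acceptable"
        if failures == 1:
            return "qualify associated PCB results J (estimated)"
        return "consider rejection; both surrogate recoveries unacceptable"

    print(evaluate_surrogates(dbc_recovery=0.0, dbofb_recovery=96.0))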


Matrix Spike Recoveries

       The results for the matrix spike recoveries are acceptable for the four sets of duplicate
       matrix spike analyses that were performed, except for three results that are acceptable as
       estimates.  All matrix spike analyses were performed using Aroclor® 1254, and the
       samples chosen by the laboratory for the matrix spikes had detectable amounts of PCBs.

       The criterion for acceptable matrix spike recovery is 100±50 percent.  All recoveries were
       recalculated during the quality assurance review.  The recalculated matrix spike recoveries
       ranged from 0 to 90 percent.  Only three results did not meet the quality control limits.
       No data were rejected, in accordance with procedures detailed in EPA CLP protocols
       (U.S. EPA 1988).


PRECISION

       Two of the four total relative percent difference (RPD) values did not meet the quality
      control criteria for precision.  Precision is expressed as the RPD between the recoveries
      of the matrix spike and the matrix spike duplicate analyses performed on a sample.  The
      quality control criterion for precision  is  ±50 percent.  The RPDs  calculated from the
      duplicate matrix spike recoveries ranged from 13 to 90 percent.


IDENTIFICATION OF COMPOUNDS

      All chromatograms were examined during the quality assurance review to verify that PCB
      identifications and confirmations (where applicable) are correct.  The confirmation of the
      PCB identification  during the  quality assurance  review focuses  on false positives.
      However, PCBs reported as not detected are also evaluated to investigate the possibility
      of false negatives.  Confirmation of possible  false  negatives is addressed by reviewing


                                            F-19

-------
       other factors relating to analytical sensitivity (e.g., detection limits, instrument linearity,
       and analytical recovery).

        Either Aroclor® 1254 or Aroclor® 1260, or a mixture of the two, was identified in 55 of
        64 samples associated with this study.  Absolute identification of the presence of
        Aroclor® 1254 or 1260 could not be confirmed during the quality assurance review
        because all chromatograms generated with the confirmational chromatographic column
        drifted off scale (i.e., 100-percent full-scale deflection).  Additional sample dilutions were
       not performed for these samples.  Therefore, results generated using data obtained from
       only one chromatographic column were used to perform quantification and identify the
       PCBs. As a result, all results were assigned a J qualifier to indicate the values reported
       are estimates.
COMPOUND QUANTIFICATION AND REPORTED DETECTION LIMITS

       All  quantifications performed by the  laboratory were corrected during the quality
        assurance review.  The laboratory had not accounted for coeluting peaks when Aroclors®
        1254 and 1260 were present in a given sample; the inclusion of coeluting peaks
       resulted in biased values.  Quantification of the reported data and the  reported detection
       limits were recalculated to ensure all results  are  accurate and  consistent with  the
       requirements established in U.S.  EPA (1986) SW-846 Method 8080, the SAP, and the
       laboratory SOW.

        During the quality assurance review, chromatographic peaks characteristic of each PCB
        mixture were chosen to check quantification and verify identification.  The heights of selected
       integrated peaks for a  specific  PCB mixture used for calibration  were summed to
       recalculate the r2  values,  and concentrations  of PCBs detected in the  samples  were
       recalculated using least squares linear regression. The results for PCBs quantitated in the
       samples were typically one-half of the values  originally reported by the laboratory; all
       results were assigned a J qualifier to indicate estimated values.
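
        The recalculation described above amounts to summing the heights of the characteristic
        peaks for a sample, converting that response to an on-column amount with the least
        squares calibration line, and scaling to a sample concentration.  The sketch below uses
        hypothetical responses and assumed extract and injection volumes.

    # Recalculation of a PCB concentration from summed characteristic peaks.
    # Responses, extract volume, and injection volume are hypothetical/assumed.
    import numpy as np

    cal_amounts   = np.array([0.4, 1.0, 2.0, 3.0, 5.0])           # ng on-column
    cal_responses = np.array([4100, 10300, 20250, 30900, 50600])  # summed peak heights

    slope, intercept = np.polyfit(cal_amounts, cal_responses, 1)

    sample_response = 23500.0               # summed characteristic peak heights
    ng_on_column = (sample_response - intercept) / slope

    extract_volume_ul = 10000.0             # final extract volume (assumed)
    injection_volume_ul = 2.0               # injection volume (assumed)
    sample_mass_g = 100.0                   # wet weight extracted (per the SAP)

    # ng/g of sample is numerically equal to ug/kg (wet weight).
    ug_per_kg = ng_on_column * (extract_volume_ul / injection_volume_ul) / sample_mass_g
    print(f"Recalculated concentration: {ug_per_kg:.0f} ug/kg wet weight")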

        The laboratory reported limits of detection of 5 µg/kg (wet-weight basis) for Aroclors®
        1016, 1254, and 1260 and 10 µg/kg (wet-weight basis) for Aroclors® 1221, 1232, 1242,
        and 1248 in most samples.  Overall, the laboratory reported limits of detection ranging
        from 5 to 100 µg/kg (all values are adjusted for any dilutions that were performed).
REFERENCES

      U.S. EPA.  1986. Test methods for evaluating solid waste (SW-846): physical/chemical
      methods. U.S. Environmental Protection Agency, Office of Solid Waste and Emergency
      Response, Washington, DC.

      U.S. EPA.  1988.  Laboratory data validation:   functional guidelines for evaluating
      organics analyses.  U.S. Environmental Protection Agency, Office of Emergency and
      Remedial Response, Washington, DC.

                                            F-20

-------
       APPENDIX G

    Analytical/Environmental
      Laboratory Audit
Standard Operating Procedure

-------
CONTENTS
     ANALYTICAL/ENVIRONMENTAL LABORATORY AUDIT
     STANDARD OPERATING PROCEDURE                          G-1

        1.  PURPOSE AND INTRODUCTION                         G-1

        2.  AUDITOR QUALIFICATIONS                            G-1

        3.  REQUEST FOR AUDIT                                 G-1

        4.  CLARIFICATION OF AUDIT OBJECTIVES                  G-1

        5.  ESTIMATE OF AUDIT COSTS                           G-2

        6.  PREPARATION FOR THE AUDIT                         G-2

           6.1 Identification of Laboratory Contact Person               G-2
           6.2 Initial Discussion with Laboratory Management            G-3
           6.3 Pre-Site Visit Activities                              G-3
           6.4 Schedule of the Site Visit                            G-4

        7.  PERFORMANCE OF THE SITE VISIT                      G-5

        8.  USE OF THE AUDIT CHECKLIST FORM                   G-6

        9.  USE OF THE AUDIT SCORING GUIDELINES                G-7

        10. AUDIT REPORT                                     G-7
                                 G-iii

-------
ANALYTICAL/ENVIRONMENTAL

LABORATORY AUDIT

STANDARD OPERATING  PROCEDURE	



1.    PURPOSE AND INTRODUCTION

      The purpose of this standard operating procedure (SOP) is to provide guidance to EZ
      Consultants (EZ) staff in auditing analytical or environmental testing laboratories. The
      audit  requires evaluation of information collected  during  the review  of laboratory
      documents, performance  of site  interviews,  and observation  of normal laboratory
       operations.  Basic procedures for arranging and performing a site visit are provided, as
      well as a checklist for items to be considered during the audit process, and an evaluation
      guide. Portions of the audit checklist form (Attachment 1)  are based upon laboratory
      evaluation checksheets developed by the U.S. EPA Industrial Technology Division.

       There are two typical reasons an audit is requested:  to determine the capability of a
       laboratory to perform (future) testing for EZ, or to evaluate the quality of data
       submitted, usually on behalf of a third party.  The SOP outlined below is applicable in
       both cases.


2.    AUDITOR QUALIFICATIONS

      The auditor should have the technical experience necessary to perform the audit, i.e.,
      familiarity with the analytical methods of interest, instrumentation used, standard QA
      practices, and general good laboratory practices. The auditor should also be familiar with
      this SOP.


3.    REQUEST FOR AUDIT

       A staff member desiring that a laboratory audit be performed can contact the EZ chemistry
       group and request that an auditor be assigned to the task.


4.    CLARIFICATION OF AUDIT OBJECTIVES

       The auditor should consult the staff member requesting the audit to determine the purpose
       of the audit and the rigor with which the audit must be performed.  The extent of the
       audit and the intensity of scrutiny will vary according to the type of laboratory, analyses,


                                       G-1

-------
        and type of project involved.  The auditor should get clear direction from the
        individual requesting the audit to determine the intensity of review that is desired.
        Information necessary to make this decision includes:

           •   Reason for audit

           •   Rigorousness of the data requirements

           •   Type of project for which data are (to be) collected

           •   Analytical methods required.


5.     ESTIMATE OF AUDIT COSTS

       The labor costs involved for the audit will depend on the intensity of the audit, which in
       turn depends upon factors such as the following:

           •   Type and size of project involved

           •   Type of laboratory involved

           •   Rigorousness of information requirements

           •   Required analytical methods

           •   Size and organization of the  laboratory

           •   Accessibility of documents for review

           •   Type of audit report necessary.

       For a rough estimate, the audit of a small, subcontract laboratory with 10 staff members,
       producing  standard CLP  data packages for inorganics, with all  necessary documents
       available in the EZ contract files would take approximately 18 hours of the auditor's time:
       eight hours for audit preparation,  four hours for the site visit (excluding travel), and six
       hours for evaluation and report generation. Additional labor costs would include clerical,
       word processing, and editing staff time.  Other direct costs such as travel expenses and
       computer time would also need to be included.


6.     PREPARATION FOR THE AUDIT


6.1    Identification  of Laboratory Contact Person

       If a laboratory (which will be) performing analyses for EZ is  to be audited, then the
       auditor should contact the laboratory directly.  Usually the best person with whom to
       establish contact is the technical director or lab manager, if such a position exists.
                                             G-2

-------
       If the laboratory to be audited is (or will be) performing analyses for a third party, that
       party should first be contacted, and their assistance should be enlisted to establish contact
       with the laboratory.
6.2    Initial Discussion with Laboratory Management

       Initiate preliminary discussions with the laboratory contact person to:

            •   Obtain a profile of the laboratory, e.g., what types of samples and analyses are
                handled, what clients are served, what level and types of services are
                available, how the lab is managed, identification of the managerial chain,
                management's overall philosophy of quality, and the type of quality program
                in place.

           •   Identify  the primary  concerns, e.g., potential or  perceived  problems,
               perceived strengths.

           •   Identify the expectations, e.g., reason for desiring an audit; expected use of
               the outcome.

           •   Identify any problems  the laboratory may have with EZ.

           If at all possible,  do not take an adversarial attitude, but instead  try to foster a
           cooperative relationship with the laboratory. This is especially important when there
           have been previous problems or concerns regarding the quality of data produced by
           the  laboratory.   It is much easier to obtain necessary information  and to resolve
           problems if an open, cooperative relationship can be established for the audit process.


6.3    Pre-Site Visit Activities

           •   Review the audit checklist form (Attachment 1):  determine what infor-
               mation will be  necessary to complete the form and prepare for the site
               visit.   The topics generally covered during the  site visit  include organi-
               zation and personnel training, client requests, sample receipt and storage
               areas, sample preparation areas, general laboratory facilities, documents,
               standards, procedures,  instrumentation, quality control,  data review, data
               management, and report generation.

           •   Collect relevant information:   gather applicable laboratory or project
               documents  which will be helpful in filling out portions  of the audit
               checklist in advance, or aid in completing the audit report.  Such docu-
               ments could include the  laboratory  statement  of qualifications (SOQ),
               statement of work (SOW), contract or bid package, relevant analytical or
               sampling methods, EPA or state performance evaluations performed within
               the past  year, and the laboratory QA/QC manual.  If the  laboratory is
               currently under contract with EZ, or a third party for whom EZ is perform-
               ing the audit,  obtain the applicable documents from our contract files or

                                              G-3

-------
               from the third party. If the laboratory is being considered for performance
               of future work, obtain copies of the documents from the laboratory, if
               possible.

           •   Review the assembled information and begin filling out the audit checklist
               form following the instructions  in Section 8.  Make notes of additional
               questions regarding the laboratory which will need to be answered.  Note
               that the audit checklist form (Attachment 1) contains general guidelines for
                laboratories testing hazardous materials; therefore, not all of the questions
               may be applicable. The audit procedure  will proceed more quickly if those
               sections which are not applicable are marked with "N/A" in advance.
6.4    Schedule of the Site  Visit

       Remind the laboratory  contact person of the purpose of the audit when you make the
       arrangements for a site  visit. Since the most useful information can be gained while the
        laboratory is operating under typical conditions, only two to three days' advance notice
        should be given prior to the site visit.  This should allow enough time for the laboratory
        to ensure that key individuals are available for site interviews.

       It is helpful to the laboratory staff if the auditor provides the laboratory with information
       on the audit and explains how the site visit will be conducted. See Section 7 for a typical
       agenda for a site visit.  Information which should be discussed in making arrangements
       for the site  visit should include:

           •   Purpose of the audit (e.g., potential contract, resolution of problems)

           •   Estimate of time the site visit will take  (typically, three to four hours  for
               a small laboratory performing one type of analysis)

           •   Areas of the laboratory  to be audited

           •   Topics to be covered during the site visit (e.g., organization and personnel
               training, client requests, sample receipt and storage areas, sample tracking,
               sample preparation areas, general laboratory facilities,  documents, stan-
               dards, procedures, instrumentation,  quality  control, data  review, data
               management, and report generation)

           •   Staff requested to be available to the auditor during the site visit (e.g., lab
               manager  or director, QA/QC  officer,  sample management supervisor,
               sample custodian, sample processing supervisor, inorganic and/or organic
               section supervisors, bench chemists  and technicians, data management);
               there should be a specific laboratory staff member identified to provide
               information on each of the topics listed above
                                             G-4

-------
           •   Documents requested to be available to the auditor during the site visit
               (e.g., QA program documents, policies and procedures, manuals, control
               charts, corrective action reports)

           •   Proposed site visit schedule (see Section 7 for a typical schedule)

           •   Specific problems, if any.


7.     PERFORMANCE OF THE SITE VISIT

       It is important to perform the site  visit in a  professional, efficient manner,  and to
       minimize disruption  of  the  normal laboratory  activities.   Try to have a cooperative
       attitude, and emphasize that this site  visit is an  information gathering activity that may
       provide helpful information to their organization as well.  Do not make critical remarks
       or point out flaws, but include such remarks in written notes. One way to conduct a site
       visit is  as follows:

           •   Initial briefing:  meet the key personnel (managers and supervisors)  in the
               laboratory as a group and briefly explain the purpose of the audit.  Have
               one of the laboratory  staff present a general overview of the  laboratory
               organization and capabilities, and introduce personnel. Ask that a history
               be presented on a sample, beginning with the initial request for analysis,
               receipt  of the  sample from the client, through internal procedures and
               analysis, generation of data and submittal of the final data report to the
               client.  Set the format for this initial briefing with the laboratory contact
               person prior to  the site visit. Try to arrange to keep this initial briefing to
               approximately half an hour.

           •   Document review:   have arrangements  made ahead  of time  for an op-
               portunity to review the laboratory documents you  requested be available.
               This can be done at this point, during the interview, or near the end of the
               interview, just prior to the final briefing.

           •   Observation  of the various  areas of the laboratory:   make arrangements
               ahead of time  with the laboratory contact person to visit each area  of
               interest in the laboratory to make observations. Cover each of the applica-
               ble topics on the audit checklist. Follow the sample history, as presented
               earlier by the laboratory. The audit checklist is organized to facilitate this
               task.

           •   Information gathering:  collect information on the audit checklist as the site
               visit progresses.  Make checks in the  appropriate  places, or write in the
               information necessary for each question as responses  are given.  It  is
               difficult  to remember all the information provided, and is important to be
               as accurate as possible in recording responses at the time they are provided.
                                             G-5

-------
               If possible, arrange to speak with bench level technicians and  analysts
               during the observation process.  Specific instructions for filling out the
               audit checklist are provided in Section 8.

            •   Final briefing:  meet with the key personnel, or at a minimum with the
               laboratory director or QA manager, at the end of the interviews to ask any
               questions which may not have been answered. If additional information is
                necessary, ask that it be forwarded.  Because a detailed review of the
                information provided on the checklist will be required, it is not possible to
                determine at this time whether the laboratory has passed the audit; make no
                comment on the outcome during the site visit.
               However, give an  indication of when the laboratory may expect  an audit
               report, and to whom this report will be made available.  Always thank the
               laboratory staff for their time and for allowing you to disrupt their sched-
               ules.
8.     USE OF THE AUDIT CHECKLIST FORM

       The  audit checklist  form (Attachment  1)  provides general guideline  questions for
        laboratories performing hazardous materials analysis.  The auditor should consult the
        EZ chemistry group leader if a project-specific form must be generated.

       The checklist is divided into several sections:

           •   Organization and Personnel

           •   Sample Receipt and Storage Area

           •   Sample Preparation Area/Facilities

           •   Instrumentation

           •   Quality Control

           •   Data Handling and Review

           •   QC Manual Checklist

           •   Summary.

       It is assumed that appropriate staff (who have been previously identified) will be made
       available to the  auditor to answer the questions  in each of these sections.  Make checks
       in the appropriate boxes, or write in the information necessary for each question as the
       answers are provided.  Do not make critical remarks or point out flaws, but include such
       information in written notes. Either write all notes on the checklist form or attach notes
       to the form.  Ask to inspect documents, when appropriate, to verify answers.
                                             G-6

-------
9.     USE OF THE AUDIT SCORING GUIDELINES

       Once the site visit has been completed and any additional information has been provided
       to the auditor, the evaluation of the laboratory can be completed.

        Point distributions for each response which can be answered "yes" or "no" are given in
        the scoring guideline in Attachment 2.  In some cases, it may be necessary to check both,
        as not all requirements may be fulfilled.  All points are then totaled, and the percentage
        of the maximum possible points is calculated.  Questions which are not applicable
        to a particular facility are not scored and are not counted toward the maximum possible
        points, thereby neither rewarding nor penalizing the laboratory.  Responses to questions
       which have no point value will be used to determine marginal cases of pass or fail.  The
       following criteria are given for acceptability or nonacceptability:

            86-100% of maximum possible points     =  acceptable audit

            76-85% of maximum possible points     =  provisionally acceptable audit
                                                      (based  on responses to nonpoint
                                                      questions)

            below 76% of maximum possible points   =  unacceptable audit
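
       The scoring procedure above reduces to a percentage of the maximum applicable
       points; the point values in the sketch below are illustrative only and are not taken from
       Attachment 2.

    # Audit scoring sketch -- example point values, not from Attachment 2.
    def score_audit(responses):
        """responses: list of (points_earned, points_possible, applicable) tuples."""
        earned = sum(e for e, p, applicable in responses if applicable)
        possible = sum(p for e, p, applicable in responses if applicable)  # N/A items excluded
        percent = 100.0 * earned / possible
        if percent >= 86.0:
            rating = "acceptable audit"
        elif percent >= 76.0:
            rating = "provisionally acceptable audit (review nonpoint responses)"
        else:
            rating = "unacceptable audit"
        return percent, rating

    example = [(5, 5, True), (3, 5, True), (0, 2, False), (4, 4, True)]
    print(score_audit(example))   # 12 of 14 applicable points -> ~86% boundary case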
10.   AUDIT REPORT

       An internal memo summarizing the results should be provided to the EZ staff who
       requested the audit be performed.  In many cases, the third party may wish to receive
      copies of the completed audit report for their records. An example memo is provided as
      Attachment 3 of this procedure.  If it has been requested, a copy should also be provided
      to the audited laboratory.
                                            G-7

-------
Attachment 1

-------
         ANALYTICAL CHEMISTRY LABORATORY AUDIT GUIDELINES
Laboratory: 	   Date:
Address:  	   Telephone:
Auditor(s):
Laboratory Personnel Interviewed:



                Name                                       Title
Laboratory Accreditation/Certification:





                                               Expiration
Comments:
Score:

-------
LABORATORY AUDIT GUIDELINES
Page 2
                                                             Yes   No    Points         Comments
A.   Organization and Personnel

     1.    Is there an organizational chart available?

     2.    Is everyone in the organization familiar with it?

     3.    Is an up-to-date file maintained in the laboratory de-
          scribing  the educational background  and/or  related
          work experience of all laboratory personnel?

     4.    Is there a formal training program for personnel?

     5.    Are  employees  required to demonstrate proficiency
          with  analytical  instrument  operation,  methods,  or
          techniques prior to working on client samples?

     6.    Is this proficiency testing documented?

     7.    Is the organization  adequately staffed to  meet com-
          mitments in a timely manner?

     8.    Is there a designated QA/QC Officer?

     9.    To whom does the lab QA/QC Officer report?
     10.  Was the lab QA/QC Officer available  during the au-
         dit?

     11.  Was a program manager or laboratory  manager avail-
         able during the evaluation?

Comments:	
B.   Sample Receipt and Storage Area

     1.   Is a sample custodian designated?

     2.   Are the responsibilities clearly defined?
         In writing?

     3.   Is there a standard sample login procedure followed?

     4.   Does the procedure include  adequate  inspection  of
         samples and accompanying documents to verify that
         they are intact, complete, and consistent?

     5.   Is there an inspection checklist?

     6.   Does it document adequately the nature and condi-
         tion of samples and documentation?

     7.   Is the integrity of samples and shipping containers
         being documented?

-------
LABORATORY AUDIT GUIDELINES
Page 3
                                                             Yes   No    Points         Comments
     8.   Are samples logged into a bound notebook?
         a.  Computerized lab management system?
         b.  Other? (describe:	
     9.   Does the login record document:
         a.  Field and laboratory ID
         b.  Analyses requested
         c.  Storage location
         d.  Signature of custodian
         e.  Collection date
         f.   Receipt date
         g.  Analysis due date
         h.  Sample holding time
         i.   Special instructions

     10.  Is there a daily summary  of information such as sam-
         ples received, analyses requested, date sampled, or
        " date received?

     11.  To whom is this summary distributed?


     12.  Are login records filed and readily retrievable?

     13.  How far back in time can records be retrieved?
     14.  Are written SOPs developed for receipt and storage
         of samples?

     15.  Are they available to and  understood  by  laboratory
         personnel?

     16.  Is a clean area available for receiving and opening
         sample shipments?

     17.  Is this area separated from  other lab operations (con-
         sider not only spatial separations,  but  air flow, per-
         sonnel, traffic, etc.)?

     18.  Does  the custodian understand the importance  of
         preventing lab contamination?

     19.  If appropriate, are the pHs of samples measured and
         recorded to verify that they are preserved?

     20.  What percentage of samples is checked?
    21.  Are records of these checks retained?

    22.  Are facilities adequate for the storage of samples?

    23.  Are samples  stored  so  as  to  maintain their preser-
         vation?

-------
LABORATORY AUDIT GUIDELINES
Page 4
                                                              Yes    No    Points         Comments
     24.  Are volatile samples stored separately from semivola-
          tile samples?

     25.  Is the temperature of the cold storage area recorded
          daily?

          a.  Are excursions noted, along with descriptions of
             corrective action taken?

          b.  Is this being reviewed periodically by a supervisor
             or the QC unit?

     26.  Is the sample storage area secure?

     27.  How  is sample identification maintained?
     28.  Is positive sample chain-of-custody maintained within
          the lab?

     29.  How are samples tracked through the lab?
     30.  How long are samples retained?

          Sample extracts?   	
     31.   How are special instructions regarding  preparation,
          analysis, or turnaround times transmitted within the
          laboratory?
Comments:.
C.   Sample Preparation Area/Facilities

     1.    Is the laboratory maintained in a clean and organized
          manner?

     2.    Does the lab appear to have adequate  work space
          (120 ft2 per analyst)?

     3.    Are the toxic chemical handling areas either stainless
          steel benches or an impervious material covered with
          absorbent paper?

     4.    Are contamination-free  work areas provided  for the
          handling of toxic materials?

-------
LABORA TORY AUDIT GUIDELINES
Page 5
                                                               Yes   No    Points         Comments
     5.    Are adequate exhaust  hoods  available to  prevent
          contamination of personnel and the laboratory facility?

     6.    Are the flow rates and/or face  velocities of these
          hoods periodically checked and recorded?

     7.    How frequently are they checked?
     8.    Are the procedures and records adequate to  dem-
          onstrate the proper face velocity profile for each hood
          over the period of record?

     9.    Is the near-face interior of each hood clear of objects
          that might interfere with the  proper face velocity pro-
          file and thereby reduce hood efficiency?

     10.   Are chemical waste disposal policies/procedures well-
          defined and followed by the laboratory?

     11.   Are records of waste  containerization and disposal
          (lab logs, manifest, etc.) filed and retrievable?

     12.   Are voltage control devices installed on  major instru-
          mentation?

     13.   What is the laboratory's source of distilled/deionized
          water?
     14.   Is the conductivity of this water  checked daily and
          data  recorded  (acceptable  conductivity is 2.0-5.0
          µmhos/cm at 25°C)?

     15.   Is the analytical balance located away from draft and
          areas subject to rapid temperature fluctuations?

     16.   Is it protected from vibration associated with activities
          in the facility (i.e., it should be on a heavy table, on a
          floor that does not bounce when walked on, etc.)?

     17.   Is the balance maintained by a certified technician?

     18.   Is  the  balance  routinely  calibrated  with  Class  S
          weights and are the calibration data recorded?

     19.   Are the Class S  weights handled  properly to prevent
          contamination/damage?

     20.   How often are the Class S weights certified?
    21.  Are  pH and ion selective  meters properly calibrated
         and  maintained; and are these activities recorded?

     22.  Are laboratory thermometers (including mercury-in-
         glass) calibrated  at  least  yearly  against  an NIST
         traceable thermometer and documented?

-------
LABORATORY AUDIT GUIDELINES
Page 6
                                                             Yes   No   Points         Comments
    23.  Are reagents  dated  upon receipt  by labeling each
         container with the date received?

    24.  Is there a complete log of reagent and solvent supply
         giving the quantity, batch number,  receipt date, per-
         cent activity, or purity?

    25.  Are reagents and standards checked prior to  use?

    26.  Are solvent lots checked  and documented  prior  to
         use?

    27.  Are reference materials properly labeled?

    28.  Is each spiking/calibration standard completely trace-
         able to  documented  neat material  or  a documented
         purchased standard?

    29.  Is each logbook entry signed and dated by the indi-
         vidual who prepared the solution?

    30.  Are logbooks periodically reviewed and signed by a
         manager/supervisor?

    31.  Are logbooks  maintained in  a manner which allows
         complete traceability?

    32.  Are standards stored separately from samples and
         sample extracts?

    33.  Are volatile and semivolatile  standard  compounds
         properly segregated?

    34.  Are SOPs readily available to laboratory personnel?

    35.  Are glassware  cleaning procedures documented?

    36.  Are the  cleaning  procedures  consistent with EPA
         recommended  procedures?

    37.  Is the temperature of the drying ovens recorded dai-
         ly?

    38.  Is cleaned glassware properly handled and stored  to
         prevent contamination?

    39.  How do lab personnel  recognize glassware  that has
          been prepared for a specific function (e.g., organic vs.
         inorganic)?    	
    40.  Is the laboratory secured?

Comments:	
D.   Instrumentation

    1.   Are instrument operating manuals available?

-------
LABORATORY AUDIT GUIDELINES
Page 7
                                                               Yes   No   Points         Comments
     2.    Do the operators demonstrate a good familiarity with
          the manuals?
     3.    Are  there service  contracts on  the instrumentation
          (and is a record maintained of the service)?
     4.    Are in-house replacement parts available?
     5.    Have the  instruments been modified in any way?
          Describe  the modifications and discuss ramifications:
     6.    Are instruments  properly vented or are  appropriate
          traps in place?
     7.    Is a logbook maintained for each instrument?
     8.    Is a complete list of laboratory instrumentation avail-
          able?
     9.    Are all calibration data hard-copied and retained?
     10.   When calibrating  an AA:
          a.  How  many  standards are run to  generate  the
             calibration curve?	
          b.  Is a new curve generated for each run?
          c.  Is a standard blank always run?
          d.  Is calibration checked immediately after complet-
             ing as well as periodically throughout the run?
     11.   When calibrating  an ICP:
          a.  How  many  standards are run to  generate  the
             calibration curve?	
          b.  Is a new curve generated for each run?
          c.  Is a standard blank always run?
          d.  Is calibration  checked immediately after complet-
             ing as well as periodically throughout the run?
     12.   When calibrating  a GC:
          a.  How  many  standards are run to  generate  the
             calibration curve?	
          b.  Is a calibration check standard run daily?
          c.  What are  the performance criteria  for this stan-
             dard?	
          d.  Is the instrument typically  calibrated for  every
             compound of interest?

-------
LABORATORY AUDIT GUIDELINES
Page 8
                                                             Yes   No   Points         Comments
         e.  How are retention times monitored for each com-
             pound of interest, and when is corrective action
             taken?      	
     13.  When calibrating a GC/MS:

         a.  How many standards are run to generate the
             calibration curve?	

         b.  Is a calibration check standard run daily?

         c.  What are  the performance criteria for this stan-
             dard?	

         d.  Is  the instrument typically  calibrated  for every
             compound of interest?

         e.  Is the instrument tuned at least daily?

         f.   Do the tuning procedures conform to the methods
             for which the instrument is being used?

         g.  What compound  and performance  criteria are
             used?	

         h.  Are surrogates and internal standards used?

          i.   Are surrogate and internal standard recoveries
             monitored?

         j.   What are the action limits? 	
Comments:
E.   Quality Control

     1.   Are method blanks prepared and analyzed with each
         batch of samples, for each  analytical procedure, or
         some percentage?

         What percentage:

         a.  For GC/MS analyses?	
         b.  For GC analyses?	
         c.  For AA/ICP analyses?.
         d.  For wet chemistry?	
    2.   At what frequency are lab  duplicates  prepared  and
         analyzed:

         a.  For GC/MS analyses?	
         b.  For GC analyses?	;	
         c.  For AA/ICP analyses?.

-------
LABORATORY AUDIT GUIDELINES
Page 9
                                                            Yes   No   Points         Comments
         d.  For wet chemistry?
    3.   How are duplicate sample results tracked and used:
         a.  For GC/MS analyses?	
         b.  For GC analyses?	
         c.   For AA/ICP analyses?.
         d.   For wet chemistry?	
    4.   At what frequency are lab spikes (e.g., spiked deion-
         ized water or clean soil) prepared and analyzed:
         a.  For GC/MS analyses?	
         b.  For GC analyses?	
         c.   For AA/ICP analyses?.
         d.   For wet chemistry?	
     5.   At what stage of processing are samples spiked:
         a.  For GC/MS analyses?	
         b.  For GC analyses?	
         c.   For AA/ICP analyses?.
         d.   For wet chemistry?	
     6.   Are matrix spiked samples employed:
         a.  For GC/MS analyses?
         b.  For GC analyses?
         c.  For AA/ICP analyses?
         d.  For wet chemistry?
     7.   What action is  taken  when results exceed  control
         limits:
         a.  For GC/MS analyses?	
         b.  For GC analyses?	
         c.   For AA/ICP analyses?.
         d.   For wet chemistry?	
     8.   Are surrogate compounds utilized for GC/MS analy-
         ses?
     9.   When are the surrogates added to the samples?
     10.  How many surrogate compounds are introduced?

     11.  Is the percent recovery for each surrogate calculated?
     12.  Are those data reported?
     13.  Are performance criteria established for surrogates?
     14.  Are percent recoveries plotted on control charts?

-------
LABORATORY AUDIT GUIDELINES
Page 10
                                                             Yes   No    Points         Comments
     15.  What action is taken when results exceed limits?


     16.  Are surrogate compounds utilized for GC analyses?

     17.  When are the surrogates added to the samples?


     18.  How many surrogate compounds are introduced?


     19.  Is the percent recovery for each surrogate calculated?

     20.  Are those data reported?

     21.  Are performance criteria established for surrogates?

     22.  Are percent recoveries plotted on control  charts?

     23.  What action is taken when results exceed limits?


F.   Data Handling and Review

     1.    Are computer programs validated prior to use?

     2.    Are records of the validation maintained?

     3.    Are user instructions  complete and  available  to all
          users?

     4.    Do analysts/technicians  record data in  a neat and
          accurate manner?

     5.    Are all handwritten data recorded in nonerasable ink?

     6.    Have entries been obliterated (e.g., through cross-
          outs or "whiteout")?

     7.    Are data calculations spot-checked by a second per-
          son?

          What percentage?	
    8.   Are these checks documented on the hard-copy data
         record, and dated and initialed by the reviewer?

    9.   Are raw data being identified with client name, project
         number,  date, and other pertinent tracking informa-
         tion?

    10.  Are raw data (notebooks, data sheets, computer files,
         strip chart recordings) being retained for 5 years?

    11.  Is there a system for report, record, or data retrieval?

    12.  Do  supervisory personnel  review  the data or QC
         results?

         What percentage?	

-------
LABORATORY AUDIT GUIDELINES
Page 11
                                                             Yes   No    Points         Comments
     13.  Are these reviews documented?
     14.  Are in-house QC charts maintained and available for
         onsite inspection for:
         a.  Matrix spikes?
         b.  Laboratory duplicates?
         c.  Surrogate recoveries?
         d.  Calibration check standards?
     15.  Have method detection  limit studies been performed
         for each method in use?
         a.  How recently?
          b.  Any procedural or configurational changes since
              then?
     16.   Do records indicate that appropriate corrective action
          has been taken when analytical results fail  to meet
          the QC criteria?
Comments:
G.   QC Manual Checklist
     1.    Does the laboratory have a QC manual?
     2.    Does the manual address the following:
          a.  Personnel?
          b.  Facilities or equipment?
          c.  Operation of instruments?
          d.  Method validation
          e.  Calibration frequency
          f.   Standards preparation
          g.  Documentation of procedures
          h.  Preventive maintenance
          i.   Reliability of data
          j.   Data validation
          k.  Feedback and corrective action
           l.   Record-keeping
          m.  Internal audits
Comments:	

-------
LABORATORY AUDIT GUIDELINES
Page 12
                                                          Yes   No    Points        Comments
H.  Summary

    1.   Do responses to the evaluation indicate that labora-
         tory personnel are aware of QA/QC and its  potential
         impact on the data?

    2.   Is a positive emphasis placed on QA/QC  by labora-
         tory management?

    3.   Have the responses been open and direct?

    4.   Has the attitude been cooperative?

    5.   Is the proper emphasis placed on quality assurance?

Comments:	

-------
Attachment 2

-------
               ANALYTICAL AUDIT SCORING GUIDELINES
Point distributions for each response that can be answered "yes" or "no" are given in the
following guideline.  In cases of incomplete fulfillment of requirements, both responses
may be checked.  All points are then totaled, and the percentage of the maximum possible
points is calculated; a short illustrative sketch of this arithmetic follows the criteria below.
Questions that are not applicable to a particular facility are not scored and are not counted
toward the maximum possible points, thereby neither rewarding nor penalizing the
laboratory.  Responses to questions that have no point value are used to resolve marginal
cases of pass or fail.  The following criteria are given for acceptability or nonacceptability:

     86-100% of maximum possible points   =   acceptable audit

     76-85% of maximum possible points   =   provisionally acceptable audit
                                             (based on responses to nonpoint
                                             questions)

     below 76% of maximum possible points =   unacceptable audit
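
The scoring arithmetic can be illustrated with the short Python sketch below.  The sketch is
illustrative only and is not part of the audit guidelines; the data structure, function name, and
sample answers are hypothetical, while the handling of not-applicable questions, double-checked
responses, and the 86 percent and 76 percent thresholds follow the text above.

    # Illustrative sketch (not part of the guidelines) of the audit scoring arithmetic.
    # Each scored question carries a "yes" point value and a "no" point value; "na"
    # questions are neither scored nor counted toward the maximum, and checking both
    # boxes (incomplete fulfillment) totals both values, per the guideline above.
    def score_audit(questions):
        earned = 0
        max_possible = 0
        for q in questions:                      # e.g., {"yes": 5, "no": -2, "answer": "yes"}
            if q["answer"] == "na":
                continue                         # not applicable: skip entirely
            max_possible += q["yes"]             # maximum assumes a full "yes"
            if q["answer"] == "yes":
                earned += q["yes"]
            elif q["answer"] == "no":
                earned += q["no"]
            elif q["answer"] == "both":          # incomplete fulfillment: both boxes checked
                earned += q["yes"] + q["no"]
        pct = 100.0 * earned / max_possible
        if pct >= 86:
            rating = "acceptable audit"
        elif pct >= 76:
            rating = "provisionally acceptable audit (review nonpoint responses)"
        else:
            rating = "unacceptable audit"
        return pct, rating

    # Example with three hypothetical questions answered "yes", "no", and "na".
    print(score_audit([{"yes": 5, "no": -2, "answer": "yes"},
                       {"yes": 2, "no": -1, "answer": "no"},
                       {"yes": 1, "no": -1, "answer": "na"}]))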

-------
AUDIT SCORING GUIDELINES
Page 2

                                                              Yes     No             Comments
A.   Organization and Personnel
     1.   Is there an organizational chart available?                 1     -1
     2.   Is everyone in the organization familiar with it?            1     -1
     3.   Is an up-to-date file maintained in the laboratory de-      1     -1
          scribing the educational background and/or related
          work experience of all laboratory personnel?
     4.   Is there a formal training program for personnel?           1     -2
     5.   Are employees required to demonstrate proficiency      2     -2
          with analytical instrument operation, methods, or
          techniques prior to working on client samples?
     6.   Is this proficiency testing documented?                     2     -1
     7.   Is the organization adequately staffed to meet com-      5     -1
          mitments in a timely manner?
     8.   Is there a designated QA/QC Officer?                       2     -1
     9.   To whom does the lab QA/QC Officer report?
     10.  Was the lab QA/QC Officer available during the au-      1     -1
          dit?
     11.  Was a program manager or laboratory manager avail-      1     -1
          able during the evaluation?
Comments:

B.   Sample Receipt and Storage Area
     1.   Is a sample custodian designated?                          2     -1
     2.   Are the responsibilities clearly defined?                    1     -1
          In writing?                                                1     -1
     3.   Is there a standard sample login procedure followed?        1     -1
     4.   Does the procedure include adequate inspection of      2     -1
          samples and accompanying documents to verify that
          they are intact, complete, and consistent?
     5.   Is there an inspection checklist?                           1     -1
     6.   Does it document adequately the nature and condi-      1     -1
          tion of samples and documentation?
     7.   Is the integrity of samples and shipping containers      1     -1
          being documented?
     8.   Are samples logged into a bound notebook?                 5     -2
          a.  Computerized lab management system?                5     -2
          b.  Other? (describe:           )                          5     -2

-------
AUDIT SCORING GUIDELINES
Page 3
                                                              Yes     No             Comments
     9.   Does the login record document:
          a.  Field and laboratory ID                               2     -2
          b.  Analyses requested                                    2     -2
          c.  Storage location                                      2     -2
          d.  Signature of custodian                                2     -2
          e.  Collection date                                       2     -2
          f.   Receipt date                                          2     -2
          g.  Analysis due date                                     2     -2
          h.  Sample holding time                                   2     -2
          i.   Special instructions                                   2     -2
     10.  Is there a daily summary of information such as sam-      2     -2
          ples received, analyses requested, date sampled, or
          date received?
     11.  To whom is this summary distributed?

     12.   Are login records filed and readily retrievable?             2      -2

     13.   How far back in time can records be retrieved?
     14.   Are written  SOPs developed for receipt and storage      2     -1
          of samples?

     15.   Are they available to  and understood by laboratory      1      -1
          personnel?

     16.   Is a clean area available for receiving and opening      1      -1
          sample shipments?

     17.   Is this area  separated from other lab operations (con-      1      -1
          sider not only  spatial  separations, but air flow, per-
          sonnel, traffic, etc.)?

     18.   Does  the custodian  understand  the importance  of      1      -1
          preventing lab contamination?

     19.   If appropriate, are the pHs of samples measured and      1      -1
          recorded to  verify that they are preserved?

     20.   What percentage of samples is checked?
     21.   Are records of these checks retained?                     1      -1

     22.   Are facilities adequate for the storage of samples?          1      -1

     23.   Are samples stored so as to maintain their preser-      2      -1
          vation?

     24.   Are volatile samples stored separately from semivola-       5      -2
          tile samples?

     25.   Is the temperature of the cold storage area recorded       2      -1
          daily?

-------
AUDIT SCORING GUIDELINES
Page 4
                                                              Yes     No            Comments
          a.  Are  excursions noted,  along with descriptions of       2     -1
             corrective action taken?
          b.  Is this being reviewed periodically by a supervisor       2     -1
             or the QC unit?
     26.  Is the sample storage area secure?                        1      -1
     27.  How  is sample identification maintained?                   1      -1
     28.   Is positive sample chain-of-custody maintained within       1      -1
          the lab?
     29.   How are samples tracked through the lab?
     30.   How long are samples retained?
          Sample extracts?     	
     31.   How are special  instructions regarding  preparation,
          analysis,  or turnaround times transmitted within the
          laboratory?
Comments:
C.   Sample Preparation Area/Facilities
     1.    Is the laboratory maintained in a clean and organized      2     -2
          manner?
     2.    Does the lab appear to have adequate  work space      1     -1
          (120 ft² per analyst)?
     3.    Are the toxic chemical handling areas either stainless      1     -1
          steel  benches or an impervious material covered with
          absorbent paper?
     4.    Are contamination-free  work areas provided  for the      1     -1
          handling of toxic materials?
     5.    Are adequate  exhaust hoods available  to  prevent      2     -1
          contamination of personnel and the laboratory  facility?
     6.    Are the flow rates and/or face  velocities of these      1     -1
          hoods periodically checked and recorded?

-------
AUDIT SCORING GUIDELINES
Page 5
                                                                Yes     No             Comments
     7.   How frequently are they checked?
     8.    Are the procedures  and records adequate to dem-      1      -1
          onstrate the proper face velocity profile for each hood
          over the period of record?

     9.    Is the near-face interior of each hood clear of objects      1      -1
          that might interfere with the proper  face velocity pro-
          file and thereby reduce hood efficiency?

     10.  Are chemical waste disposal policies/procedures well-      1      -1
          defined and followed by the laboratory?

     11.  Are records  of waste containerization and disposal      1      -1
          (lab logs, manifest, etc.) filed and retrievable?

     12.  Are voltage control devices installed on major instru-      1      -1
          mentation?

     13.  What  is the laboratory's  source of  distilled/deionized
          water?
     14.  Is  the  conductivity  of this water checked daily and      2      -2
          data recorded  (acceptable conductivity  is  2.0-5.0
          µmhos/cm at 25°C)?

     15.  Is the analytical balance located away from draft and      1      -1
          areas subject to rapid temperature fluctuations?

     16.  Is it protected from vibration associated with activities      1      -1
          in the facility (i.e., it should be  on a heavy table, on a
          floor that does not bounce when walked on, etc.)?

     17.  Is the balance maintained by a certified technician?         2      -2

     18.  Is  the   balance   routinely  calibrated  with Class   S      2      -2
          weights and are the calibration data recorded?

     19.  Are the Class S weights handled properly to prevent      2      -2
          contamination/damage?

     20.  How often are the Class S weights  certified?
     21.  Are pH and ion selective meters properly calibrated      1     -1
          and maintained; and are these activities recorded?

     22.  Are laboratory  thermometers (including  mercury-in-      1     -1
          glass)  calibrated  at least yearly  against an  NIST
          traceable thermometer and documented?

     23.  Are reagents dated  upon receipt by  labeling each      1     -1
          container with the date received?

-------
AUDIT SCORING GUIDELINES
Page 6
                                                              Yes     No             Comments
     24.  Is there a complete log of reagent and solvent supply      1     -1
          giving the  quantity, batch number, receipt date, per-
          cent activity, or purity?

     25.  Are reagents and standards checked prior to use?          1     -1

     26.  Are solvent lots checked and documented prior to      1     -1
          use?

     27.  Are reference materials properly labeled?                  1     -1

     28.  Is each spiking/calibration standard completely trace-      2     -1
          able to documented neat material or a documented
          purchased standard?

     29.  Is each logbook entry signed and dated by the indi-      1     -1
          vidual who prepared the solution?

     30.  Are logbooks periodically reviewed and signed by a      1     -1
          manager/supervisor?

     31.  Are logbooks  maintained in a manner which allows      2     -2
          complete traceability?

     32.  Are standards  stored separately  from  samples and      1     -1
          sample extracts?

     33.  Are volatile  and  semivolatile standard  compounds      1     -1
          properly segregated?

     34.  Are SOPs readily available to laboratory personnel?        1     -1

     35.  Are glassware cleaning procedures documented?           2     -2

     36.  Are the  cleaning procedures consistent  with  EPA      5     -2
          recommended procedures?

     37.  Is the temperature of the drying ovens recorded dai-      1     -1
          ly?

     38.  Is cleaned glassware properly  handled  and stored to      2     -2
          prevent contamination?

     39.  How do lab personnel recognize glassware that has
          been prepared  for specific function (e.g., organic vs.
          inorganic)?     	
     40.   Is the laboratory secured?                               1     -1

Comments:  	
D.   Instrumentation

     1.   Are instrument operating manuals available?               1      -1

     2.   Do the operators demonstrate a good familiarity with       1      -1
         the manuals?

-------
AUDIT SCORING GUIDELINES
Page 7
                                                               Yes     No             Comments
     3.    Are there service  contracts on  the instrumentation      1     -1
          (and is a record maintained of the service)?
     4.    Are in-house replacement parts available?                 1     -1
     5.    Have the  instruments been modified in any way?          -1      1
          Describe  the modifications and discuss ramifications:
     6.    Are instruments  properly vented or are appropriate      1     -1
          traps in place?
     7.    Is a logbook maintained for each instrument?               1     -1
     8.    Is a complete list of laboratory instrumentation avail-      1     -1
          able?
     9.    Are all calibration data hard-copied and retained?           5     -2
     10.   When calibrating an AA:
          a.  How  many  standards are run to  generate  the
             calibration curve? 	
          b.  Is a new curve generated for each run?                5     -2
          c.  Is a standard blank always run?                       5     -2
          d.  Is calibration  checked immediately after complet-      5     -2
             ing as well as periodically throughout the run?
     11.   When calibrating an ICP:
          a.  How  many  standards are run to  generate  the
             calibration curve? 	
          b.  Is a new curve generated for each run?                5     -2
          c.  Is a standard blank always run?                       5     -2
          d.  Is calibration  checked immediately after complet-      5     -2
             ing as well as periodically throughout the run?
     12.   When calibrating a GC:
          a.  How  many  standards are run to  generate  the
             calibration curve? 	
          b.  Is a calibration check standard run daily?               5     -2
          c.  What are  the performance criteria  for  this stan-
             dard? 	
          d.  Is the instrument typically  calibrated  for  every      5     -2
             compound of interest?

-------
AUDIT SCORING GUIDELINES
Page 8
                                                             Yes     No             Comments
         e.  How are retention times monitored for each com-
             pound of interest, and when is corrective action
             taken?        	
     13.  When calibrating a GC/MS:

         a.  How many standards are run to generate the
             calibration curve?  	

         b.  Is a calibration check standard run daily?              5     -2

         c.  What are  the  performance criteria for  this stan-
             dard? 	:	

         d.  Is  the  instrument typically  calibrated  for every      5     -2
             compound of interest?

         e.  Is the instrument tuned at least daily?                 5     -2

         f.   Do the tuning procedures conform to the methods      5     -2
             for which the instrument is being used?

         g.  What  compound  and performance  criteria are
             used? 	

         h.  Are surrogates and internal standards used?           5     -2

         i.   Are surrogate and  internal  standard recoveries      5     -2
             monitored?

         j.   What are the action limits?   	
Comments:
E.   Quality Control

     1.   Are method blanks prepared and analyzed with each      5     -2
         batch of samples, for each  analytical procedure, or
         some percentage?

         What percentage:

         a.  For GC/MS analyses?  	
         b.  For GC analyses? 	
         c.  For AA/ICP analyses?
         d.  For wet chemistry?   _
    2.   At what frequency are lab duplicates  prepared  and
         analyzed:

         a.  For GC/MS analyses?  	
         b.  For GC analyses? 	
         c.  For AA/ICP analyses?

-------
AUDIT SCORING GUIDELINES
Page 9
                                                            Yes     No             Comments
         d.  For wet chemistry?
     3.   How are duplicate sample results tracked and used:
         a.  For GC/MS analyses? 	
         b.  For GC analyses?  	
         c.  For AA/ICP analyses?
         d.  For wet chemistry?   _
     4.   At what frequency are lab spikes (e.g., spiked deion-
         ized water or clean soil) prepared and analyzed:
         a.  For GC/MS analyses? 	
         b.  For GC analyses?  	
         c.  For AA/ICP analyses?
         d.  For wet chemistry?   _
     5.   At what stage of processing are samples spiked:
         a.  For GC/MS analyses? 	
         b.  For GC analyses?  	
         c.  For AA/ICP analyses?
         d.  For wet chemistry?   _
     6.   Are matrix spiked samples employed:
         a.  For GC/MS analyses?                               1     -1
         b.  For GC analyses?                                   1     -1
         c.  For AA/ICP analyses?                               1     -1
         d.  For wet chemistry?                                  1     -1
     7.   What action  is taken when results exceed  control
         limits:
         a.  For GC/MS analyses? 	
         b.  For GC analyses?  	
         c.  For AA/ICP analyses?
         d.  For wet chemistry?   _
     8.   Are surrogate  compounds  utilized for GC/MS analy-
         ses?
     9.   When are the surrogates added to the samples?
     10.  How many surrogate compounds are introduced?

     11.  Is the percent recovery for each surrogate calculated?      5     -2
     12.  Are those data reported?                               2     -2
     13.  Are performance criteria established for surrogates?        2     -2
     14.  Are percent recoveries plotted on control charts?           2     -2

-------
AUDIT SCORING GUIDELINES
Page 10
                                                              Yes     No             Comments
     15.  What action is taken when results exceed limits?

     16.  Are surrogate compounds utilized for GC analyses?        1     -1
     17.  When are the surrogates added to the samples?

     18.  How many surrogate compounds are introduced?

     19.  Is the percent recovery for each surrogate calculated?      1     -1
     20.  Are those data reported?                                1     -1
     21.  Are performance criteria established for surrogates?        1     -1
     22.  Are percent recoveries plotted on control charts?           1     -1
     23.  What action is taken when results exceed limits?

F.   Data Handling and Review
     1.    Are computer programs validated prior to use?             2     -1
     2.    Are records of the validation maintained?                  2     -1
     3.    Are user instructions complete and available to all      2     -1
          users?
     4.    Do analysts/technicians  record data in a neat  and      2     -1
          accurate manner?
     5.    Are all handwritten data recorded in nonerasable ink?       2     -2
     6.    Have entries been obliterated  (e.g., through cross-     -2      2
          outs or "whiteout")?
     7.    Are data calculations spot-checked by a second per-      2     -2
          son?
          What percentage?   	
     8.   Are these checks documented on the hard-copy data      2     -2
         record, and dated and initialed by the reviewer?
     9.   Are raw data being identified with client name, project      2     -2
         number,  date, and  other pertinent tracking informa-
         tion?
     10.  Are raw data (notebooks, data sheets, computer files,      2     -2
         strip chart recordings) being retained for 5 years?
     11.  Is there a system for report, record, or data retrieval?       2     -1
     12.  Do  supervisory  personnel  review the  data  or  QC      2     -1
         results?
         What percentage?  	
    13.  Are these reviews documented?                          2     -1

-------
AUDIT SCORING GUIDELINES
Page 11

                                                              Yes     No             Comments
     14.  Are in-house QC charts maintained and available for
          onsite inspection for:
          a.  Matrix spikes?                                         2     -2
          b.  Laboratory duplicates?                                 2     -2
          c.  Surrogate recoveries?                                  2     -2
          d.  Calibration check standards?                           2     -2
     15.  Have method detection limit studies been performed      5     -2
          for each method in use?
          a.  How recently?
          b.  Any procedural or configurational changes since
              then?
     16.  Do records indicate that appropriate corrective action      2     -2
          has been taken when analytical results fail to meet
          the QC criteria?
Comments:
G.   QC Manual Checklist
     1.   Does the laboratory have a QC manual?                    10    -10
     2.   Does the manual address the following:
          a.  Personnel?                                            1     -1
          b.  Facilities or equipment?                              1     -1
          c.  Operation of instruments?                             1     -1
          d.  Method validation                                     1     -1
          e.  Calibration frequency                                 1     -2
          f.   Standards preparation                                 1     -1
          g.  Documentation of procedures                           1     -1
          h.  Preventive maintenance                                1     -1
          i.   Reliability of data                                    1     -2
          j.   Data validation                                       1     -2
          k.  Feedback and corrective action                        1     -2
          l.   Record-keeping                                        1     -2
          m.  Internal audits                                        1     -1
Comments:

-------
AUDIT SCORING GUIDELINES
Page 12

                                                              Yes     No             Comments
H.   Summary
     1.   Do responses to the evaluation indicate that labora-      2     -2
          tory personnel are aware of QA/QC and its potential
          impact on the data?
     2.   Is a positive emphasis placed on QA/QC by labora-      2     -2
          tory management?
     3.   Have the responses been open and direct?                  2     -2
     4.   Has the attitude been cooperative?                         2     -2
     5.   Is the proper emphasis placed on quality assurance?        2     -5
Comments:

-------
                 Attachment 3
This is an example memorandum for a specific laboratory for
which there were very few negative remarks.  Naturally, not
all laboratories will be of this quality.

-------
                           (from an actual laboratory audit)
TO:        [Audit Requestor]

FROM:     [Auditor]

DATE:     [Day/Month/Year]

SUBJECT: Laboratory Audit Visit to [Laboratory Name]
            [Street Address]
            [City, State, Zip Code]
            [Phone Number]

An analytical chemistry laboratory observation visit was conducted on [date] at the [laboratory
name and location]. The observation visit was performed by [auditor name] as part of the general
QA/QC observations being conducted on behalf of [client name]. Samples were collected in the
field by [source testing or field sampling company], and analyzed at the [laboratory name]. The
following areas were included as a part of the observation process at [laboratory name]:

    •   Personnel and organization

    •   Sample receipt and storage

    •   Sample preparation facilities

    •   Instrumentation and equipment

    •   Quality control

    •   Data handling and review.

The attached Analytical Chemistry Laboratory Audit Guidelines were followed during the visit.
Participating [laboratory name] staff included:

    •   [Names and titles].

The purpose of the observation visit was to determine whether [laboratory name] has the facilities,
equipment, trained personnel, and QA/QC program in place to be capable of routinely producing
data of known quality for site characterization programs. The completed checklist is appended.
AUDIT FINDINGS

Generally, the [laboratory name] was found to be capable of producing traceable data of known
quality.  There appeared to be an adequate understanding of QA/QC procedures within the
laboratory.  The employees interviewed displayed a positive attitude and an appreciation for the
importance of quality assurance,  and understood the potential impact of QA/QC upon data.

No major deficiencies were noted during the audit. The following recommendations are intended
to improve a basically sound program:

    •    More formal in-house QA/QC and training programs should be instituted for
         analysts and technicians; currently, training depends upon the more experi-
         enced analysts

-------
    •   An inspection checklist should be generated for incoming samples, which includes
         the nature and condition of samples and documentation

    •   Internal chain-of-custody procedures should be initiated

    •   As part of the SOPs, a specific policy should be instituted for the rejection of
         incoming compromised samples

    •   Control charts should be maintained for all types of QC samples that are run.

The [laboratory name] staff were very helpful and cooperative.  There appears to be a positive
emphasis placed on QA/QC by laboratory management, and the responses appeared to be open
and direct.

-------
    APPENDIX H

Format for the Sediment
    Testing Report

-------
                      SEDIMENT TESTING REPORT FORMAT

    The sediment testing report, including physical, chemical, bioassay, and bioaccumulation
data, should be prepared using the format guidelines below.

A.  INTRODUCTION

    The project description should include the following information:

    1.  Location of the proposed dredging project and the disposal site.

    2.  A plan view map showing project design depth, side-slopes, and allowable overdepth.

    3.  Proposed dredging and disposal quantities.

B.  MATERIALS AND METHODS

    1.  Field sediment sampling and sediment sample handling procedures should be
        described or referenced.

    2.  References for laboratory protocols for physical, chemical, bioassay, and
        bioaccumulation analyses should be included, such as:

        a.   EPA method numbers and other EPA-approved methods that do not have a
             specific EPA number.

        b.   Target detection limits and references used for physical, chemical and tissue
             analyses.

        c.   Test species used in each test, the supplier or collection site for each test
             species, and QA/QC procedures for maintaining the test species.

        d.   Locations of reference and control sediment samples.

        e.   Source of water used in all biological tests and documentation that the water is
             free of contaminants.

        f.   Bioassay  and bioaccumulation testing procedures and QA/QC information.

        g.   Statistical analysis procedures.

C.  LOCATION OF SAMPLING AREAS

    1.  The exact position of the dredging site sampling areas and each core taken within
        each sampling area should be mapped.
                                              H-1

-------
     2.   A table should be prepared with the coordinates for each station in latitude and
         longitude (North America Datum 1983).

     3.   A table should be included showing the required sampling depth at each sampling
         location compared to the actual core depth achieved during field sampling. Any
         problems in collecting  sediment from the required depth should be discussed.

     4.   The type of positioning equipment to be used for the sampling program should be
         specified.

     5.   Charts should be provided to show the location of the reference site, the control
         site(s) and the disposal site, including the coordinates of each site.

D.   DESCRIPTION OF TESTING APPROACH

    The rationale for performing specific types of tests (e.g., chemical analysis of elutriate for
comparison to water quality standards, tissue analysis, etc.) should be presented in writing.

E.   FINAL RESULTS

     1.   Summary data tables should be furnished. All data tables should be typed or
         produced as a computer printout.

     2.   Copies of the final raw data sheets should be included. These tables should be
         certified to be accurate by the analytical laboratory manager.

F.   DISCUSSION AND ANALYSIS OF DATA

     1.   An evaluation of historical  data from the proposed dredging site should be concisely
         discussed. References  to previous sediment testing should also be included.

     2.   Statistical comparisons between the dredging site sediments and the reference
          sediment should be made; one illustrative approach is sketched below.
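
     This guidance does not prescribe a particular statistical test; the appropriate
comparison depends on the study design and the distribution of the data.  As one
illustration only, a two-sample (Welch's) t-test comparing a single analyte concentration in
dredging site replicates against reference sediment replicates could be run as sketched
below.  The concentrations, the choice of test, and the 0.05 significance level are
assumptions for illustration, not requirements of this guidance.

    # Illustrative only: one common way to compare dredging site results with the
    # reference sediment for a single analyte.  The numbers are made up, and the
    # choice of Welch's t-test and alpha = 0.05 are assumptions, not requirements.
    from scipy import stats

    dredging_site = [12.1, 14.3, 13.8, 15.0, 12.9]   # hypothetical replicates (mg/kg dry weight)
    reference     = [10.2,  9.8, 11.1, 10.5, 10.0]   # hypothetical replicates (mg/kg dry weight)

    t_stat, p_value = stats.ttest_ind(dredging_site, reference, equal_var=False)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
    if p_value < 0.05:
        print("Dredging site concentrations differ significantly from the reference sediment.")
    else:
        print("No significant difference detected at the 0.05 level.")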

G.   REFERENCES

     This list should include all references used in the field sampling program, laboratory and
statistical data analyses, and historical data used to compare the dredging site to the reference
site.

H.   DETAILED QA/QC PLANS AND INFORMATION

     The following topics should be addressed in the QA Plan:

     •  Introductory material, including title and signature pages, table of contents, and
        project description.
                                         H-2

-------
    •   QA organization and responsibilities (the QA organization should be designed to
        operate with a degree of independence from the technical project organization to
        ensure appropriate oversight)

    •   QA objectives

    •   Standard Operating Procedures

    •   Sampling strategy and procedures

    •   Sample custody

    •   Calibration procedures and frequency

    •   Analytical procedures

    •   Data validation, reduction, and reporting

    •   Internal QC checks

    •   Performance and system audits

    •   Facilities

    •   Preventive maintenance

    •   Calculation  of data quality indicators

    •   Corrective actions

    •   QA reports  to management

    •   References.

I.   PERTINENT CORRESPONDENCE WITH SCOPING COMMENTS AND
    COORDINATION

    The report should contain copies of the correspondence related to coordination on the
testing  activities for the proposed project.
                                         H-3

-------