SR-120
United States
Environmental Protection
Agency
Research and Development
Office of Research
and Development
Washington, DC 20460
August 1992
EMAP-Estuaries
1992 Louisianian Province
Quality Assurance
Project Plan
Environmental Monitoring
and Assessment Program
-------
DRAFT 8/92
ENVIRONMENTAL MONITORING AND ASSESSMENT PROGRAM
EMAP-ESTUARIES LOUISIANIAN PROVINCE
1992 QUALITY ASSURANCE PROJECT PLAN
by
Tom Heitmuller
Technical Resources, Inc.
1 Sabine Island
Gulf Breeze, Florida 32561
and
Raymond M. Valente
Science Applications International Corporation
27 Tarzwell Drive
Narragansett, Rhode Island 02882
U.S. ENVIRONMENTAL PROTECTION AGENCY
OFFICE OF RESEARCH AND DEVELOPMENT
ENVIRONMENTAL RESEARCH LABORATORY
GULF BREEZE, FLORIDA 32561
-------
DISCLAIMER
The research described in this document has not been subject to U.S. Environmental Protection Agency
review and is intended for internal Agency distribution. It should be considered a draft document and
should not be cited, quoted or distributed without official approval. Mention of trade names does not
constitute endorsement or recommendation for use.
-------
PREFACE
This document outlines the integrated quality assurance plan for the Environmental Monitoring and
Assessment Program's Estuarine Monitoring in the Louisianian Province. The quality assurance plan is
prepared following the guidelines and specifications provided by the Quality Assurance Management Staff
of the U.S. Environmental Protection Agency Office of Research and Development.
Objectives for five data quality indicators (completeness, representativeness, comparability,
precision, and accuracy) are established for the Estuarine Monitoring in the Louisianian Province. The
primary purpose of the integrated quality assurance plan is to maximize the probability that data collected
over the duration of the project will meet or exceed these objectives, and thus provide scientifically sound
interpretations of the data in support of the project goals. Various procedures are specified in the quality
assurance plan to: (1) ensure that collection and measurement procedures are standardized among all
participants; (2) monitor performance of the measurement systems being used in the program to maintain
statistical control and to provide rapid feedback so that corrective measures can be taken before data quality
is compromised; (3) allow for the periodic assessment of the performance of these measurement systems
and their components; and (4) verify and validate that reported data are sufficiently representative,
unbiased, and precise so as to be suitable for their intended use. These activities will provide users with
information regarding the degree of uncertainty associated with the various components of the EMAP-
Estuaries data base.
-------
TABLE OF CONTENTS
Section Page
Disclaimer i
Preface ii
Table of Contents iii
Figures vi
Tables vii
Acknowledgments viii
1 INTRODUCTION 1 of 4
1.1 OVERVIEW 1 of 4
1.2 QUALITY ASSURANCE PROJECT PLAN SPECIFICATIONS 2 of 4
2 PROJECT ORGANIZATION 1 of 3
2.1 MANAGEMENT STRUCTURE 1 of 3
3 PROJECT DESCRIPTION 1 of 1
3.1 PURPOSE 1 of 1
4 QUALITY ASSURANCE OBJECTIVES 1 of 7
4.1 DATA QUALITY OBJECTIVES 1 of 7
4.2 REPRESENTATIVENESS 3 of 7
4.3 COMPLETENESS 4 of 7
4.4 COMPARABILITY 4 of 7
4.5 ACCURACY (BIAS), PRECISION, AND TOTAL ERROR 5 of 7
5 QUALITY ASSURANCE/QUALITY CONTROL PROTOCOLS,
CRITERIA, AND CORRECTIVE ACTION 1 of 35
5.1 CHEMICAL ANALYSIS OF SEDIMENT AND TISSUE SAMPLES 1 of 35
5.1.1 General QA/QC Requirements 4 of 35
5.1.2 Initial Calibration 9 of 35
5.1.3 Initial Documentation of Detection Limits 9 of 35
5.1.4 Initial Blind Analysis of Representative Sample 10 of 35
5.1.5 Laboratory Participation in Intercomparison Exercises 10 of 35
5.1.6 Routine Analysis of Certified Reference Materials
or Laboratory Control Materials 12 of 35
5.1.7 Continuing Calibration Check 14 of 35
5.1.8 Laboratory Reagent Blank 14 of 35
5.1.9 Internal Standards 15 of 35
5.1.10 Injection Internal Standards 15 of 35
5.1.11 Matrix Spike and Matrix Spike Duplicate 16 of 35
5.1.12 Field Duplicates and Field Splits 17 of 35
5.1.13 Analytical Chemistry Data Reporting Requirements 18 of 35
iii
-------
Contents (Continued)
Section Page
5.2 OTHER SEDIMENT MEASUREMENTS 19 of 35
5.2.1 Total organic carbon 19 of 35
5.2.2 Acid volatile sulfide 20 of 35
5.2.3 Butyltins 21 of 35
5.2.4 Sediment grain size 22 of 35
5.2.5 Apparent RPD depth 23 of 35
5.3 SEDIMENT TOXICITY TESTING 23 of 35
5.3.1 Facilities and Equipment 24 of 35
5.3.2 Initial Demonstration of Capability 24 of 35
5.3.3 Sample Handling and Storage 25 of 35
5.3.4 Quality of Test Organisms 25 of 35
5.3.5 Test Conditions 26 of 35
5.3.6 Test Acceptability 27 of 35
5.3.7 Record Keeping and Reporting 27 of 35
5.4 MACROBENTHIC COMMUNITY ASSESSMENT 27 of 35
5.4.1 Sorting 28 of 35
5.4.2 Species Identification and Enumeration 29 of 35
5.4.3 Biomass Measurements 30 of 35
5.5 FISH SAMPLING 31 of 35
5.5.1 Species ID, Enumeration, and Length Measurements 31 of 35
5.5.2 Fish Gross Pathology and Histopathology 32 of 35
5.6 WATER COLUMN MEASUREMENTS 33 of 35
5.6.1 Hydrolab Surveyor II 33 of 35
5.6.2 Hydrolab DataSonde 3 34 of 35
5.7 NAVIGATION 35 of 35
6 FIELD OPERATIONS AND PREVENTIVE MAINTENANCE 1 of 3
6.1 TRAINING AND SAFETY 1 of 3
6.2 FIELD QUALITY CONTROL AND AUDITS 2 of 3
6.3 DATA RECORDING 3 of 3
6.4 PREVENTIVE MAINTENANCE 3 of 3
7 LABORATORY OPERATIONS 1 of 3
7.1 DATA RECORDING 1 of 3
7.2 LABORATORY PERSONNEL, TRAINING, AND SAFETY 1 of 3
7.3 QUALITY CONTROL DOCUMENTATION 2 of 3
7.4 ANALYTICAL PROCEDURES 2 of 3
7.5 LABORATORY PERFORMANCE AUDITS 2 of 3
iv
-------
Contents (Continued)
Section Page
8 QUALITY ASSURANCE AND QUALITY CONTROL FOR MANAGEMENT
OF DATA AND INFORMATION 1 of 7
8.1 SYSTEM DESCRIPTION 1 of 7
8.2 QUALITY ASSURANCE/QUALITY CONTROL 1 of 7
8.2.1 Standardization 1 of 7
8.2.2 Prelabeling of Equipment and Sample Containers 2 of 7
8.2.3 Data Entry and Transfer 2 of 7
8.2.4 Automated Data Verification 3 of 7
8.2.5 Sample Tracking 4 of 7
8.2.6 Reporting 4 of 7
8.2.7 Redundancy (Backups) 4 of 7
8.2.8 Human Review 5 of 7
8.3 DOCUMENTATION AND RELEASE OF DATA 6 of 7
9 QUALITY ASSURANCE REPORTS TO MANAGEMENT 1 of 1
10 REFERENCES 1 of 2
-------
FIGURES
Figure Page
2.1 Management Structure for the 1992 EMAP-Estuaries Louisianian
Province 2 of 3
vi
-------
TABLES
Table Page
1.1 Sections in this report that address the 15 subjects required
in a quality assurance project plan 4 of 4
2.1 List of key personnel, affiliations, and responsibilities for
EMAP-Estuaries 1992 Louisianian Province monitoring 3 of 3
4.1 Measurement quality objectives for EMAP-Estuaries indicators and
associated data 5 of 7
4.2 Quality assurance sample types, frequency of use, and types of data
generated for EMAP-Estuaries Louisianian Province monitoring (see
Table 5.3 for chemical analysis QA/QC sample types) 6 of 7
5.1 Chemicals to be measured in sediments by EMAP-Estuaries Louisianian
Province 2 of 35
5.2 Chemicals to be measured in edible fish and shellfish tissue by
EMAP-Estuaries Louisianian Province 3 of 35
5.3 Key elements for quality control of EMAP-Estuaries chemical analyses
(see text for detailed explanations) 6 of 35
5.4 Target method detection limits for EMAP-Estuaries analyses 11 of 35
5.5 Codes for denoting QA/QC samples in submitted data packages 18 of 35
5.6 Maximum acceptable differences for instrument field
calibration checks 34 of 35
vii
-------
ACKNOWLEDGMENTS
The following individuals contributed to the development of this document: J. Pollard, K. Peres and
I. Chiang, Lockheed Engineering and Sciences Company, Las Vegas, Nevada; J. Schoenherr, C. Eller, and
D. Cobb, Science Applications International Corporation, Narragansett, Rhode Island; D. Bender and L.
Johnson, Technology Applications Inc., Cincinnati, Ohio; R. Graves, U.S. Environmental Protection Agency,
Environmental Monitoring Systems Laboratory, Cincinnati, Ohio; C.A. Manen, National Oceanic and
Atmospheric Administration, Rockville, Maryland; K. Summers, U.S. Environmental Protection Agency,
Environmental Research Laboratory, Gulf Breeze, Florida; R. Pruell, U.S. Environmental Protection Agency,
Environmental Research Laboratory, Narragansett, Rhode Island; F. Holland and S. Weisberg, Versar, Inc.,
Columbia, Maryland. The assistance provided by R. Graves in the development of measurement quality
objectives for analytical chemistry is especially appreciated.
viii
-------
Section 1
Revision 0
Date 8/92
DRAFT 1
Page 1 of 4
SECTION 1
INTRODUCTION
1.1 OVERVIEW
The U.S. Environmental Protection Agency (EPA), in cooperation with other Federal agencies and
state organizations, has designed the Environmental Monitoring and Assessment Program (EMAP) to monitor
indicators of the condition and health of the Nation's ecological resources. Specifically, EMAP is intended
to respond to the growing demand for information characterizing the condition of our environment and the
type and location of changes in our environment. Simultaneous monitoring of pollutants and environmental
indicators will allow for the identification of the likely causes of adverse changes. When EMAP has been fully
implemented, the program will answer the following critical questions:
o What is the status, extent and geographic distribution of the nation's important ecological
resources?
o What proportion of these resources is declining or improving? Where, and at what rate?
o What are the factors that are likely to be contributing to declining condition?
o Are control and mitigation programs achieving overall improvement in ecological
conditions?
o Which resources are at greatest risk to pollution impacts?
To answer these types of questions, the Near Coastal component of EMAP (EMAP-NC) has
set four major objectives:
o Provide a quantitative assessment of the regional extent of coastal environmental problems
by measuring pollution exposure and ecological condition.
-------
Section 1
Revision 0
Date 8/92
DRAFT 1
Page 2 of 4
o Measure changes in the regional extent of environmental problems for the nation's estuarine
and coastal ecosystems.
o Identify and evaluate associations between the ecological condition of the nation's estuarine
and coastal ecosystems and pollutant exposure, as well as other factors known to affect
ecological condition (e.g., climatic conditions, land use patterns).
o Assess the effectiveness of pollution control actions and environmental policies on a
regional scale (i.e., large estuaries like Chesapeake Bay, major coastal regions like the mid-
Atlantic and Gulf Coasts) and nationally.
The Near Coastal component of EMAP will monitor the status and trends in environmental quality
of the coastal waters of the United States. This program will complement and eventually may merge with
the National Oceanic and Atmospheric Administration's (NOAA) existing National Status and Trends Program
for Marine Environmental Quality to produce a single, cooperative, coastal and estuarine monitoring
program. To more efficiently manage Near Coastal activities, the Program has been further divided to study
the Great Lakes, the offshore (shelf) environment, and the Nation's estuaries, bays, tidal rivers, and sounds.
This last component (EMAP-Estuaries or EMAP-E) began in 1990 with the Virginian Province Demonstration
Project.
The strategy for implementation of the EMAP-E project is a regional, phased approach which started
with the 1990 Demonstration Project in the Virginian Province. This biogeographical province covers an area
from Cape Cod, Massachusetts to Cape Henry, Virginia (Holland 1990). In 1991, monitoring continued in
the Virginian Province and began in the Louisianian Province (Gulf of Mexico from near Tampa Bay, Florida
to the Texas-Mexico border at the Rio Grande). Additional provinces will be added in future years,
eventually resulting in full national implementation of EMAP-Estuaries.
1.2 QUALITY ASSURANCE PROJECT PLAN SPECIFICATIONS
The quality assurance policy of the EPA requires every monitoring and measurement project to have
a written and approved quality assurance plan (Stanley and Verner 1983). This requirement applies to all
environmental monitoring and measurement efforts authorized or supported by the EPA through regulations,
-------
Section 1
Revision 0
Date 8/92
DRAFT 1
Page 3 of 4
grants, contracts, or other means. The quality assurance plan for the project specifies the policies,
organization, objectives, and functional activities for the project. The plan also describes the quality
assurance and quality control activities and measures that will be implemented to ensure that the data will
meet all criteria for data quality established for the project. All project personnel must be familiar with the
policies and objectives outlined in this quality assurance plan to assure proper interactions among the
various data acquisition and management components of the project. EPA guidance (Stanley and Verner,
1983) states that the 15 items shown in Table 1.1 should be addressed in the QA Project Plan. Some of
these items are extensively addressed in other documents for this project and therefore are only summarized
or referenced in this document.
-------
Section 1
Revision 0
Date 8/92
DRAFT 1
Page 4 of 4
TABLE 1.1. Sections in this report that address the 15 subjects required in a Quality Assurance Project
Plan.
Quality Assurance Subject This Report
Title page Title page
Table of contents Table of contents
Project description Section 3
Project organization and responsibility Section 2
QA objectives Section 4
Sampling procedures Section 6
Sample custody Section 8
Calibration procedures Sections 5, 6, 7
Analytical procedures Section 7
Data reduction, validation, and reporting Sections 8, 9
Internal QC checks Section 5
Performance and system audits Sections 5, 6, 7
Preventive maintenance Section 6
Corrective action Section 5
QA reports to management Section 9
-------
SECTION 2
PROJECT ORGANIZATION
2.1 MANAGEMENT STRUCTURE
For the EMAP-Estuaries monitoring in the Louisianian Province, expertise in specific research and
monitoring areas will be provided by several EPA laboratories and their contracting organizations. The
Environmental Research Laboratory in Gulf Breeze, Florida (ERL-GB) has been designated as the principal
laboratory for EMAP-E monitoring in the Louisianian Province, and therefore will provide direction and
support for all activities. Technical support is provided to ERL-GB by Technical Resources, Inc. (TRI) and
Computer Sciences Corporation (CSC). ERL-GB has been designated as the principal laboratory for the
statistical design of the Estuarine monitoring effort. Figure 2.1 illustrates the management structure for the
EMAP-E 1992 Louisianian Province monitoring. All key personnel involved in the 1992 Louisianian Province
monitoring are listed in Table 2.1.
-------
Section 2
Revision 0
Date 8/92
DRAFT 1
Page 2 of 3
[Figure 2.1 is an organization chart containing the following positions:]
Associate Director, Near Coastal: John Paul
Acting Technical Director, Estuaries: Richard Latimer
Province Manager: Kevin Summers
EMAP-E QA Coordinator: Ray Valente
Province QA Coordinator: Tom Heitmuller
EMAP-E Information Manager: Jeff Rosen
Province Information Manager: Matt Adams
Field Activities Coordinator: John Macauley
Field Operations Center Support Staff
Processing Laboratories
Team Leaders 1, 2, and 3; Field Crews 1, 2, and 3
Figure 2.1. Management structure for the 1992 EMAP-E Louisianian Province monitoring.
-------
Section 2
Revision 0
Date 8/92
DRAFT 1
Page 3 of 3
TABLE 2.1. List of key personnel, affiliations, and responsibilities for the EMAP-Estuaries 1992
Louisianian Province monitoring.
NAME           ORGANIZATION               RESPONSIBILITY
E. Martinko    U.S. EPA-DC                EMAP Director
F. Kutz        U.S. EPA-DC                Deputy Director
J. Paul        U.S. EPA-Narragansett      NC Associate Director
R. Latimer     U.S. EPA-Narragansett      EMAP-E Acting Technical Director
K. Summers     U.S. EPA-Gulf Breeze       EMAP-E Design Lead and Louisianian Province Manager
J. Macauley    U.S. EPA-Gulf Breeze       Louisianian Province Field Coordinator
L. Kirkland    U.S. EPA-Cincinnati        Acting EMAP QA Coordinator
R. Valente     SAIC-Narragansett          EMAP-E QA Coordinator
T. Heitmuller  TRI-Gulf Breeze            Louisianian Province QA Coordinator
W. Benson      U. Mississippi             Contaminant Analyses-Tissue
J. Brooks      Texas A&M Univ.            Contaminant Analyses-Sediments and Field Sampling
B. Albrecht    TRI-Gulf Breeze            Toxicity Testing
J. Fournie     U.S. EPA-Gulf Breeze       Fish Histopathology
W. Walker      Gulf Coast Research Lab.   Sediment Physical Analyses and Field Sampling
R. Heard       Gulf Coast Research Lab.   Benthic Analyses
D. Heggam      U.S. EPA-Las Vegas         Logistics Support
J. Rosen       CSC-Narragansett           Information Management
M. Adams       CSC-Gulf Breeze            Information Management
-------
Section 3
Revision 0
Date 8/92
DRAFT 1
Page 1 of 1
SECTION 3
PROJECT DESCRIPTION
3.1 PURPOSE
Complete descriptions of the EMAP-E monitoring approach and rationale, sampling design, indicator
strategy, logistics, and data assessment plan are provided in the Near Coastal Program Plan for 1990:
Estuaries (Holland 1990). Briefly, the objectives of the 1992 Near Coastal Louisianian Province monitoring
are to:
o Obtain estimates of the variability associated with EMAP-E indicators which will allow
establishment of program level data quality objectives (DQOs).
o Evaluate the utility, sensitivity, and applicability of the EMAP-Estuaries indicators on a
regional scale.
o Determine the effectiveness of the EMAP network design for quantifying the extent and
magnitude of pollution problems in the Louisianian Province.
o Demonstrate the usefulness of results for the purposes of planning, prioritization, and
determining the effectiveness of existing pollutant control actions.
o Develop methods for indicators that can be transferred to EMAP-E user groups.
o Identify and resolve logistical issues associated with implementing the network design in the
Louisianian Province.
The strategy for accomplishing the above objectives will be to continue to field test the sensitivity
of the proposed indicators and network design through a second year of sampling in the Louisianian
Province estuaries.
-------
Section 4
Revision 0
Date 8/92
DRAFT 1
Page 1 of 7
SECTION 4
QUALITY ASSURANCE OBJECTIVES
4.1 DATA QUALITY OBJECTIVES
EMAP-Estuaries personnel are making a variety of measurements to monitor a defined set of
parameters (i.e., indicators of estuarine and coastal environmental quality). Complete descriptions of the
program's objectives and indicator strategy are presented in the Near Coastal Program Plan (Holland 1990)
and will not be repeated here. To successfully meet the objectives, the program's assessments of
ecosystem health must be based on scientifically sound interpretations of the data. To achieve this end,
and as required by EPA for all monitoring and measurement programs, objectives must be established for
data quality based on the proposed uses of the data (Stanley and Verner 1985). The primary purpose of
the quality assurance program is to maximize the probability that the resulting data will meet or exceed the
data quality objectives (DQOs) specified for the project. Data quality objectives established for the EMAP-
Estuaries project, however, are based on control of the measurement system because error bounds cannot,
at present, be established for end use of indicator response data. As a consequence, management
decisions balancing the cost of higher quality data against program objectives are not presently possible.
As data are accumulated on indicators and the error rates associated with them are established, end use
DQOs can be established and quality assurance systems implemented to assure acceptable data quality
to meet pre-established program objectives.
Data quality objectives for the various measurements being made in EMAP-Estuaries can be
expressed in terms of accuracy, precision, and completeness goals (Table 4.1). These data quality
objectives more accurately can be termed "measurement quality objectives" (MQOs), because they are
based solely on the likely magnitude of error generated through the measurement process. The MQOs for
the Project were established by obtaining estimates of the most likely data quality that is achievable based
on either the instrument manufacturer's specifications or historical data. Scientists familiar with each
particular data type provided estimates of likely measurement error for a given measurement process.
-------
Section 4
Revision 0
Date 8/92
DRAFT 1
Page 2 of 7
TABLE 4.1. Measurement quality objectives for EMAP-Estuaries indicators and associated data.
Maximum Maximum
Allowable Allowable
Accuracy (Bias) Precision Completeness
Indicator/Data Type Goal Goal Goal
Sediment contaminant analyses:
Organics 30% 30% 90%
Inorganics 15% 15% 90%
Fish tissue contaminant analyses:
Organics 30% 30% 90%
Inorganics 15% 15% 90%
Sediment toxicity NA NA 90%
Benthic species composition
and biomass:
Sorting 10% NA 90%
Counting 10% NA 90%
Taxonomy 10% NA 90%
Biomass NA 10% 90%
Sediment characteristics:
Grain size analyses NA 10% 90%
Total organic carbon 10% 10% 90%
Acid volatile sulfide 10% 10% 90%
Water Column Characteristics:
Dissolved oxygen     ± 1.0 mg/L    10%   90%
Salinity             ± 1.0 ppt     10%   90%
Depth                ± 0.5 m       10%   90%
pH                   ± 0.5 units   NA    90%
Temperature          ± 0.5 °C      NA    90%
(continued)
-------
Section 4
Revision 0
Date 8/92
DRAFT 1
Page 3 of 7
TABLE 4.1. (Continued)
                               Maximum          Maximum
                               Allowable        Allowable
                               Accuracy (Bias)  Precision   Completeness
Indicator/Data Type            Goal             Goal        Goal

Gross pathology of fish        NA               10%         90%
Fish community composition:
  Counting                     10%              NA          90%
  Taxonomic identification     10%              NA          90%
  Length determinations        ± 5 mm           NA          90%
Fish histopathology            NA               NA          NA
Apparent RPD depth             ± 5 mm           NA          90%
The MQOs presented in Table 4.1 are used as quality control criteria both in field and laboratory
measurement processes to set the bounds of acceptable measurement error. Generally speaking, DQOs or
MQOs are usually established for five aspects of data quality: representativeness, completeness,
comparability, accuracy, and precision (Stanley and Verner 1985). These terms are defined below with
general guidelines for establishing MQOs for each QA parameter.
4.2 REPRESENTATIVENESS
Representativeness is defined as "the degree to which the data accurately and precisely represent
a characteristic of a population parameter, variation of a property, a process characteristic, or an operational
condition" (Stanley and Verner, 1985). Representativeness applies to the location of sampling or monitoring
sites, to the collection of samples or field measurements, to the analysis of those samples, and to the types
of samples being used to evaluate various aspects of data quality. The location of sampling sites and the
-------
Section 4
Revision 0
Date 8/92
DRAFT 1
Page 4 of 7
design of the sampling program for EMAP-Estuaries monitoring in the Louisianian Province provide the
primary focus for defining representative population estimates from this region.
The proposed sampling design combines the strengths of systematic and random sampling with
an understanding of estuarine systems, to collect data that will provide unbiased estimates of the status of
the Nation's estuarine resources. Field protocols are documented in the 1992 Louisianian Province Field
Operations Manual (Macauley et al. 1992) for future reference and protocol standardization, as are laboratory
measurement protocols in the Laboratory Methods Manual (U. S. EPA, in preparation). The types of QA
documentation samples (i.e., performance evaluation material) used to assess the quality of chemical data
will be as representative as possible of the natural samples collected during the project with respect to both
composition and concentration.
4.3 COMPLETENESS
Completeness is defined as "a measure of the amount of data collected from a measurement
process compared to the amount that was expected to be obtained under the conditions of measurement"
(Stanley and Verner 1985). A criterion ranging from 75 to 90 percent valid data from a given measurement
process is suggested as being reasonable for the Project. As data are compiled for the various indicators,
more realistic criteria for completeness can be developed. The suggested criteria for each data type to be
collected are presented in Table 4.1.
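Expressed as a calculation, the completeness criterion amounts to a simple ratio of valid to expected measurements. The sketch below is illustrative only; it is not part of the Project's information management system, and the sample counts are hypothetical.

```python
def completeness(n_valid, n_expected):
    """Percent of expected measurements that yielded valid data."""
    return 100.0 * n_valid / n_expected

# Hypothetical example: 87 of 95 planned measurements yielded valid data.
pct = completeness(87, 95)   # about 91.6%
meets_goal = pct >= 90.0     # 90% completeness goal from Table 4.1
```

A batch falling below the goal would prompt review of the measurement process before the data are reported.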
4.4 COMPARABILITY
Comparability is defined as "the confidence with which one data set can be compared to another"
(Stanley and Verner 1985). Comparability of reporting units and calculations, data base management
processes, and interpretative procedures must be assured if the overall goals of EMAP are to be realized.
One goal of the EMAP-Estuaries program is to generate a high level of documentation for the above topics
to ensure that future EMAP efforts can be made comparable. For example, both field and laboratory
methods are described in full detail in manuals which will be made available to all field personnel and
analytical laboratories. Field crews will undergo intensive training in a single four-week session prior to the
start of field work. Finally, the sampling design for the Louisianian Province monitoring has been made
-------
Section 4
Revision 0
Date 8/92
DRAFT 1
Page 5 of 7
flexible enough to allow for analytical adjustments, when necessary, to ensure data comparability.
4.5 ACCURACY (BIAS), PRECISION, AND TOTAL ERROR
The term "accuracy", which is used synonymously with the term bias in this plan, is defined as the
difference between a measured value and the true or expected value, and represents an estimate of
systematic error or net bias (Kirchner 1983; Hunt and Wilson 1986; Taylor 1987). Precision is defined as
the degree of mutual agreement among individual measurements, and represents an estimate of random
error (Kirchner 1983; Hunt and Wilson 1986; Taylor 1987). Collectively, accuracy and precision can provide
an estimate of the total error or uncertainty associated with an individual measured value. Measurement
quality objectives for the various indicators are expressed separately as maximum allowable accuracy (i.e.,
bias) and precision goals (Table 4.1). Accuracy and precision goals may not be definable for all parameters
due to the nature of the measurement type. For example, accuracy measurements are not possible for
toxicity testing and fish pathology identifications because "true" or expected values do not exist for these
measurement parameters (see Table 4.1).
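The accuracy and precision definitions above reduce to two familiar statistics: percent bias of a mean measured value relative to a known or certified value (systematic error), and relative standard deviation of replicate measurements (random error). The following sketch applies the 15% inorganics goals from Table 4.1 to hypothetical replicate data; it is illustrative only and not the Program's software.

```python
import statistics

def percent_bias(measured_mean, true_value):
    # Systematic error (net bias) relative to the true or certified value
    return 100.0 * (measured_mean - true_value) / true_value

def percent_rsd(replicates):
    # Random error: relative standard deviation of replicate measurements
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Hypothetical replicate results for an inorganic analyte with a
# certified value of 50.0 ug/g (Table 4.1 goals: 15% bias, 15% precision)
reps = [47.2, 48.9, 46.5, 49.1]
bias = percent_bias(statistics.mean(reps), 50.0)   # about -4.2%
rsd = percent_rsd(reps)                            # about 2.7%
within_mqo = abs(bias) <= 15.0 and rsd <= 15.0
```

Together the two statistics give a rough estimate of the total error associated with a measured value, which is the sense in which they are used in Table 4.1.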
In order to evaluate the MQOs for accuracy and precision, various QA/QC samples will be collected
and analyzed for most data collection activities. Table 4.2 presents the types of samples to be used for
quality assurance/quality control for each of the various data acquisition activities except sediment and fish
tissue contaminant analyses. The frequency of QA/QC measurements and the types of QA data resulting
from these samples or processes are also presented in Table 4.2. Because several different types of QA/QC
samples are required for the complex analyses of chemical contaminants in sediment and tissue samples,
they are presented and discussed separately in Section 5.1 along with presentation of warning and control
limits for the various chemistry QC sample types.
-------
Section 4
Revision 0
Date 8/92
DRAFT 1
Page 6 of 7
TABLE 4.2. Quality assurance sample types, frequency of use, and types of data generated for EMAP-
Estuaries Louisianian Province monitoring (see Table 5.3 for chemical analysis QA/QC
sample types).
                          QA Sample Type or            Frequency          Data Generated for
Variable                  Measurement Procedure        of Use             Measurement Quality Definition

Sediment toxicity         Reference toxicant           Each experiment    Variance of replicated
tests                                                                     tests over time

Benthic Species Composition and Biomass:
  Sorting                 Resort of complete sample    10% of each        No. animals resorted
                          including debris             tech's work
  Counting and            Recount and ID of            10% of each        No. of count and ID
  Identification          sorted animals               tech's work        errors
  Biomass                 Duplicate weights            10% of samples     Duplicate results

Sediment grain size       Splits of a sample           10% of each        Duplicate results
                                                       tech's work

Organic carbon and        Duplicates and analysis      Each batch         Duplicate results and
acid volatile sulfide     of standards                                    standard recoveries

Dissolved oxygen conc.    Water-saturated air          Weekly             Difference between
(Surveyor II)             calibration followed by                         measurement and saturation
                          air-saturated water                             table values
                          measurement

Dissolved oxygen conc.    Side-by-side comparison      At deployment      Difference between
(DataSonde 3)             with Surveyor II             and retrieval      DataSonde 3 and
                                                       of unit            Surveyor II

(continued)
-------
Section 4
Revision 0
Date 8/92
DRAFT 1
Page 7 of 7
Table 4.2 (continued).
                          QA Sample Type or            Frequency          Data Generated for
Variable                  Measurement Procedure        of Use             Measurement Quality Definition

Salinity                  Secondary seawater           Daily              Difference between probe
                          standard                                        measurement and standard value

Temperature               Thermometer reading          Daily              Difference between probe
                                                                          and thermometer

Depth                     Check bottom depth against   Each station       Difference from depth finder
                          depth finder on boat

pH                        QC check with standard       Once each day      Difference from standard
                          buffer solutions

Fish identification       Fish preserved for           Twice/crew for     Number of misidentifications
                          verification by taxonomist   each species

Fish counts               Duplicate counts             10% of trawls      Replicated difference
                                                                          between determinations

Fish gross                Specimens preserved          Regular intervals  Number of misidentifications
pathology                 for confirmation

Fish                      Confirmation by              5% of slides       Number of confirmations
histopathology            second technician

Apparent RPD              Duplicate measurements       10% of samples     Duplicate results
depth
-------
Section 5
Revision 0
Date 8/92
DRAFT 1
Page 1 of 35
SECTION 5
QUALITY ASSURANCE/QUALITY CONTROL PROTOCOLS, CRITERIA,
AND CORRECTIVE ACTION
Complete and detailed protocols for field and laboratory measurements can be found in the 1992
Louisianian Province Field Operations Manual (Macauley 1992) and in the EMAP-Estuaries Laboratory
Methods Manual (U.S. EPA, in preparation), respectively. Specific QA/QC procedures to be followed during
the 1992 Louisianian Province monitoring are presented in the following sections.
5.1 CHEMICAL ANALYSIS OF SEDIMENT AND FISH TISSUE SAMPLES
The EMAP-E program will measure a variety of organic and inorganic contaminants in estuarine
sediment and fish tissue samples (Tables 5.1 and 5.2); these compounds are identical to those measured
in NOAA's National Status and Trends (NS&T) Program. No single analytical method has been approved
officially for low-level (i.e., low parts per billion) analysis of organic and inorganic contaminants in estuarine
sediments and fish tissue. Recommended methods for the EMAP-E program are those used in the NS&T
Program (Lauenstein in prep.), as well as those documented in the EMAP-E Laboratory Methods Manual
(U.S. EPA in prep.). EMAP-E does not require that a single, standardized analytical method be followed,
but rather that participating laboratories demonstrate proficiency and comparability through routine analysis
of Certified Reference Materials¹ (CRMs) or similar types of accuracy-based materials. Furthermore, through
an interagency agreement with NOAA's NS&T Program, all EMAP-E analytical laboratories are required
to participate in an on-going series of laboratory intercomparison exercises.
¹Certified Reference Materials are samples in which chemical concentrations have been determined
accurately using a variety of technically valid procedures; these samples are accompanied by a certificate
or other documentation issued by a certifying body (e.g., agencies such as the National Research Council
of Canada (NRCC), U.S. EPA, U.S. Geological Survey, etc.). Standard Reference Materials (SRMs) are
CRMs issued by the National Institute of Standards and Technology (NIST), formerly the National Bureau
of Standards (NBS). A useful catalogue of marine science reference materials has been compiled by Cantillo
(1990).
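To illustrate how a routine CRM result might be screened against warning and control limits, the sketch below uses hypothetical limits and concentrations; the Program's actual limits are specified in Section 5.1.6, and this code is not part of the Program's data systems.

```python
def crm_check(measured, certified, warning_pct=20.0, control_pct=30.0):
    """Classify a CRM result by its relative deviation from the certified value.

    warning_pct and control_pct are hypothetical limits for illustration only.
    """
    dev = 100.0 * abs(measured - certified) / certified
    if dev <= warning_pct:
        return "in control"
    if dev <= control_pct:
        return "warning"
    # Exceeding the control limit would trigger corrective action
    # before any associated sample data are reported.
    return "out of control"

# Hypothetical batch: certified concentration 120 ng/g, laboratory result 98 ng/g
status = crm_check(98.0, 120.0)   # about 18% deviation
```

In practice, a result in the warning band would prompt closer scrutiny of the batch, while a result beyond the control limit would halt reporting until the cause is found.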
-------
Section 5
Revision 0
Date 8/92
DRAFT 1
Page 2 of 35
TABLE 5.1. Chemicals to be measured in sediments by EMAP-Estuaries Louisianian Province.
Polynuclear Aromatic Hydrocarbons (PAHs)
Acenaphthene
Anthracene
Benz(a)anthracene
Benzo(a)pyrene
Benzo(e)pyrene
Biphenyl
Chrysene
Chrysene(Cl-C4)
Dibenz(a,h)anthracene
Dibenzothiophene
Dibenzothiophene(C1 -C3)
2,6-dimethylnaphthalene
Fluoranthene
Fluorene
Fluorene(Cl -C3)
2-methylnaphthalene
1-methyl napthalene
1 -methylphenanthrene
2,6-dimethylnaphtalene
Naphthalene
Naphtalene(Cl-C4)
Perylene
Phenanthrene
Phenanthrene(Cl -C4)
Pyrene
Benzo(b)fluoranthene
Acenaphthlylene
Benzo(k)fluoranthene
Benzo(g,h,i)perylene
ldeno(1,2,3-c,d)pyrene
2,3,5-trimethyl naphthalene
21 PCB Congeners:
PCB No.  Compound name
8        2,4'-dichlorobiphenyl
18       2,2',5-trichlorobiphenyl
28       2,4,4'-trichlorobiphenyl
44       2,2',3,5'-tetrachlorobiphenyl
52       2,2',5,5'-tetrachlorobiphenyl
66       2,3',4,4'-tetrachlorobiphenyl
101      2,2',4,5,5'-pentachlorobiphenyl
105      2,3,3',4,4'-pentachlorobiphenyl
110/77   2,3,3',4',6-pentachlorobiphenyl /
         3,3',4,4'-tetrachlorobiphenyl
118      2,3',4,4',5-pentachlorobiphenyl
126      3,3',4,4',5-pentachlorobiphenyl
128      2,2',3,3',4,4'-hexachlorobiphenyl
138      2,2',3,4,4',5'-hexachlorobiphenyl
153      2,2',4,4',5,5'-hexachlorobiphenyl
170      2,2',3,3',4,4',5-heptachlorobiphenyl
180      2,2',3,4,4',5,5'-heptachlorobiphenyl
187      2,2',3,4',5,5',6-heptachlorobiphenyl
195      2,2',3,3',4,4',5,6-octachlorobiphenyl
206      2,2',3,3',4,4',5,5',6-nonachlorobiphenyl
209      2,2',3,3',4,4',5,5',6,6'-decachlorobiphenyl
Other measurements
Acid volatile sulfide
Total organic carbon
Tributyltin, Dibutyltin, Monobutyltin
DDT and its metabolites
2,4'-DDD
4,4'-DDD
2,4'-DDE
4,4'-DDE
2,4'-DDT
4,4'-DDT
Chlorinated pesticides
other than DDT
Aldrin
Alpha-Chlordane
Dieldrin
Endosulfan
Endrin
Heptachlor
Heptachlor epoxide
Hexachlorobenzene
Lindane (gamma-BHC)
Mirex
Toxaphene
Trans-Nonachlor
Alkanes
C10-C34
Pristane
Phytane
Total alkanes
Trace Elements
Aluminum
Antimony
Arsenic
Cadmium
Chromium
Copper
Iron
Lead
Manganese
Mercury
Nickel
Selenium
Silver
Tin
Zinc
TABLE 5.2. Chemicals to be measured in fish and shellfish tissue by EMAP-Estuaries Louisianian Province.
DDT and its metabolites
2,4'-DDD
4,4'-DDD
2,4'-DDE
4,4'-DDE
2,4'-DDT
4,4'-DDT
Chlorinated pesticides
other than DDT
Aldrin
Alpha-Chlordane
Dieldrin
Endosulfan
Endrin
Heptachlor
Heptachlor epoxide
Hexachlorobenzene
Lindane (gamma-BHC)
Mirex
Toxaphene
Trans-Nonachlor
21 PCB Congeners:
Trace Elements
Aluminum
Arsenic
Cadmium
Chromium
Copper
Iron
Lead
Mercury
Nickel
Selenium
Silver
Tin
Zinc
Butyltins
Monobutyltin
Dibutyltin
Tributyltin
PCB No.  Compound name
8        2,4'-dichlorobiphenyl
18       2,2',5-trichlorobiphenyl
28       2,4,4'-trichlorobiphenyl
44       2,2',3,5'-tetrachlorobiphenyl
52       2,2',5,5'-tetrachlorobiphenyl
66       2,3',4,4'-tetrachlorobiphenyl
101      2,2',4,5,5'-pentachlorobiphenyl
105      2,3,3',4,4'-pentachlorobiphenyl
110/77   2,3,3',4',6-pentachlorobiphenyl /
         3,3',4,4'-tetrachlorobiphenyl
118      2,3',4,4',5-pentachlorobiphenyl
126      3,3',4,4',5-pentachlorobiphenyl
128      2,2',3,3',4,4'-hexachlorobiphenyl
138      2,2',3,4,4',5'-hexachlorobiphenyl
153      2,2',4,4',5,5'-hexachlorobiphenyl
170      2,2',3,3',4,4',5-heptachlorobiphenyl
180      2,2',3,4,4',5,5'-heptachlorobiphenyl
187      2,2',3,4',5,5',6-heptachlorobiphenyl
195      2,2',3,3',4,4',5,6-octachlorobiphenyl
206      2,2',3,3',4,4',5,5',6-nonachlorobiphenyl
209      2,2',3,3',4,4',5,5',6,6'-decachlorobiphenyl
(round-robins), which are conducted jointly by the National Institute of Standards and Technology (NIST)
and the National Research Council of Canada (NRCC). Laboratories must participate in these QA
intercomparison exercises both to demonstrate initial capability (i.e., prior to the analysis of actual samples)
and on a continual basis throughout the project. The EMAP-E laboratories will be required to initiate
corrective actions if their performance in these intercomparison exercises falls below certain pre-determined
minimal standards, described in later sections.
The data quality objectives for EMAP-E were developed with the understanding that the data will not
be used for litigation purposes. Therefore, legal and contracting requirements as stringent as those used
in the U.S. EPA Contract Laboratory Program, for example, need not be applied to EMAP-E. Rather, it is
the philosophy of EMAP-E that as long as required QA/QC procedures are followed and comparable
analytical performance is demonstrated through the routine analysis of Certified Reference Materials and
through the on-going QA intercomparison exercises, multiple procedures for the analysis of different
compound classes used by different laboratories should yield comparable results. This represents a
"performance-based" approach for quality assurance of low-level contaminant analyses, involving continuous
laboratory evaluation through the use of accuracy-based materials (CRMs), laboratory fortified sample
matrices, laboratory reagent blanks, calibration standards, and laboratory and field replicates. The
conceptual basis for the use of each of these types of quality control samples is presented in the following
sections.
5.1.1 General QA/QC Requirements
The guidance provided in the following sections is based largely on the protocols developed for the
Puget Sound Estuary Program (U.S. EPA 1989); it is applicable to low parts-per-billion analyses of both
sediment and tissue samples unless otherwise noted. The QA/QC requirements are intended to provide
a common foundation for each laboratory's protocols; the resultant QA/QC data will enable an assessment
of the comparability of results generated by different laboratories and different analytical procedures. It
should be noted that the QA/QC requirements specified in this plan represent the minimum requirements
for any given analytical method. Additional requirements which are method-specific should always be
followed, as long as the minimum requirements presented in this document have been met.
Data for all QA/QC variables must be submitted by the laboratory as part of the data package; the
completeness of each submitted data package will be checked by the Louisianian Province manager, quality
assurance coordinator, or their designee(s). Data validation will be conducted by qualified personnel to
ascertain that control limits for QA/QC samples have been met, or, if exceeded, that acceptable narrative
explanations have been provided by the laboratory along with the submitted data (a more detailed
description of data reporting requirements is provided in Section 5.1.13). The QA/QC data will be used
initially to assess the accuracy and precision of individual laboratory measurements, and ultimately to assess
comparability of data generated by different laboratories.
The results for the various QA/QC samples should be reviewed by laboratory personnel immediately
following the analysis of each sample batch. These results then should be used to determine when warning
and control limit criteria have not been met and corrective actions must be taken, before processing a
subsequent sample batch. When warning limit criteria have not been met, the laboratory is not obligated
to halt analyses, but the analyst(s) is advised to investigate the cause of the exceedance. When control limit
criteria are not met, specific corrective actions are required before the analyses may proceed. Warning and
control limit criteria and recommended frequency of analysis for each QA/QC element or sample type
required in the EMAP-E program are summarized in Table 5.3. Descriptions of the use, frequency of
analysis, type of information obtained, and corrective actions for each of these QA/QC sample types or
elements are provided in the following sections.
TABLE 5.3. Key elements for quality control of EMAP-Estuaries chemical analyses (see text for detailed
explanations).
1.) Initial Demonstration of Capability (Prior to Analysis of Samples):

- Instrument Calibration
  Warning Limit: NA
  Control Limit: NA
  Frequency: Initial and then prior to analyzing each batch of samples

- Calculation of Method Detection Limits
  Warning Limit: NA
  Control Limit: Must be equal to or less than target values (see Table 5.4)
  Frequency: At least once each year

- Blind Analysis of Accuracy-Based Material
  Warning Limit: NA
  Control Limit: NA
  Frequency: Initial

2.) On-going Demonstration of Capability:

- Blind Analysis of Laboratory Intercomparison Exercise Samples
  Warning Limit: NA
  Control Limit: NA
  Frequency: Regular intervals throughout the year

3.) Continuing Calibration Checks using Calibration Standard Solutions
  Warning Limit: NA
  Control Limit: Should be within ±15% of initial calibration on average for all analytes,
  not to exceed ±25% for any one analyte
  Frequency: At a minimum, middle and end of each sample batch

(continued)
TABLE 5.3 (continued).
3.) Analysis of Certified Reference Material (CRM) or Laboratory Control Material (LCM):

- Precision (see NOTE 1):
  Warning Limit: NA
  Control Limit: Value obtained for each analyte should be within 3s control chart limits
  Frequency: One with each batch of samples; value plotted on control chart after each
  analysis of the CRM

- Relative Accuracy (see NOTE 2):

  PAHs
  Warning Limit: Lab's value should be within ±25% of true value on average for all
  analytes; not to exceed ±30% of true value for more than 30% of individual analytes
  Control Limit: Lab's value should be within ±30% of true value on average for all
  analytes; not to exceed ±35% of true value for more than 30% of individual analytes

  PCBs/pesticides
  Warning Limit: same as above
  Control Limit: same as above

  Inorganic elements
  Warning Limit: Lab should be within ±15% of true value for each analyte
  Control Limit: Lab should be within ±20% of true value for each analyte

NOTE 1: The use of control charts to monitor precision for each analyte of interest should follow
generally accepted practices (e.g., Taylor 1987). Upper and lower control limits, based on three
standard deviations (3s) of the mean, should be updated at regular intervals.

NOTE 2: "True" values in CRMs may be either "certified" or "non-certified" (it is recognized that absolute
accuracy can only be assessed using certified values, hence the term relative accuracy). Relative
accuracy is computed by comparing the laboratory's value for each analyte against either end of the
range of values (i.e., 95% confidence limits) reported by the certifying agency. The laboratory's value
must be within ±35% of either the upper or lower 95% confidence interval value. Accuracy control limit
criteria only apply for analytes having CRM concentrations ≥10 times the laboratory's MDL.
(continued)
TABLE 5.3 (continued).
4.) Laboratory Reagent Blank
  Warning Limit: Analysts should use best professional judgement if analytes are detected
  at <3 times the MDL
  Control Limit: No analyte should be detected at ≥3 times the MDL
  Frequency: One with each batch of samples

5.) Laboratory Fortified Sample Matrix (Matrix Spike)
  Warning Limit: NA
  Control Limit: Recovery should be within the range 50% to 120% for at least 80% of the
  analytes
  Frequency: At least 5% of total number of samples

NOTE: Samples to be spiked should be chosen at random; matrix spike solutions should contain all the
analytes of interest. The final spiked concentration of each analyte in the sample should be at least 10
times the calculated MDL.

6.) Laboratory Fortified Sample Matrix Duplicate (Matrix Spike Duplicate)
  Warning Limit: NA
  Control Limit: RPD¹ must be ≤30 for each analyte
  Frequency: Same as matrix spike

7.) Field Duplicates (Field Splits)
  Warning Limit: NA
  Control Limit: NA
  Frequency: 5% of total number of samples

8.) Internal Standards (Surrogates)
  Warning Limit: NA
  Control Limit: Recovery must be within the range 30% to 150%
  Frequency: Each sample

9.) Injection Internal Standards
  Warning Limit: Lab develops its own
  Control Limit: NA
  Frequency: Each sample

¹ RPD = Relative percent difference between matrix spike and matrix spike duplicate results (see
Section 5.1.11 for equation)
5.1.2 Initial Calibration
Equipment should be calibrated prior to the analysis of each sample batch, after each major
equipment disruption, and whenever on-going calibration checks do not meet recommended control limit
criteria (Table 5.3). All calibration standards should be traceable to a recognized organization for the
preparation and certification of QA/QC materials (e.g., National Institute of Standards and Technology, U.S.
Environmental Protection Agency, etc.). Calibration curves must be established for each element and batch
analysis from a calibration blank and a minimum of three analytical standards of increasing concentration,
covering the range of expected sample concentrations. The calibration curve should be well-characterized
and must be established prior to the analysis of samples. Only data which result from quantification within
the demonstrated working calibration range may be reported by the laboratory (i.e., quantification based
on extrapolation is not acceptable). Samples outside the calibration range should be diluted or
concentrated, as appropriate, and reanalyzed.
5.1.3 Initial Documentation of Method Detection Limits
Analytical chemists have coined a variety of terms to define limits of detectability; definitions for
some of the more commonly-used terms are provided in Keith et al. (1983) and in Keith (1991). In the
EMAP-E program, the Method Detection Limit (MDL) will be used to define the analytical limit of detectability.
The MDL represents a quantitative estimate of low-level response detected at the maximum sensitivity of a
method. The Code of Federal Regulations (40 CFR Part 136) gives the following rigorous definition: "the
MDL is the minimum concentration of a substance that can be measured and reported with 99% confidence
that the analyte concentration is greater than zero and is determined from analysis of a sample in a given
matrix containing the analyte." Confidence in the apparent analyte concentration increases as the analyte
signal increases above the MDL.
Each EMAP-E analytical laboratory must calculate and report an MDL for each analyte of interest
in each matrix of interest (sediment or tissue) prior to the analysis of field samples for a given year. Each
laboratory is required to follow the procedure specified in 40 CFR Part 136 (Federal Register, Oct. 28, 1984)
to calculate MDLs for each analytical method employed. The matrix and the amount of sample (i.e., dry
weight of sediment or tissue) used in calculating the MDL should match as closely as possible the matrix
of the actual field samples and the amount of sample typically used. In order to ensure comparability of
results among different laboratories, MDL target values have been established for the EMAP-E program
(Table 5.4). The initial MDLs reported by each laboratory should be equal to or less than these specified
target values before the analysis of field samples may proceed. Each laboratory must periodically (i.e., at
least once each year) re-evaluate its MDLs for the analytical methods used and the sample matrices typically
encountered.
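A sketch of the 40 CFR Part 136 MDL calculation referenced above: analyze at least seven replicates of a low-level spiked sample and multiply the standard deviation of the results by the one-tailed Student's t value at the 99% confidence level. The replicate values below are hypothetical.

```python
import statistics

# One-tailed 99% Student's t values, keyed by degrees of freedom (n - 1)
T_99 = {6: 3.143, 7: 2.998, 8: 2.896, 9: 2.821}

def method_detection_limit(replicates):
    """MDL = t(n-1, 0.99) x s, per the 40 CFR Part 136 procedure."""
    n = len(replicates)
    if n < 7:
        raise ValueError("40 CFR 136 requires at least seven replicates")
    return T_99[n - 1] * statistics.stdev(replicates)

# Seven replicate analyses of a spiked sediment sample (ng/g dry weight)
mdl = method_detection_limit([1.9, 2.1, 2.0, 2.3, 1.8, 2.2, 2.1])
```

The resulting MDL would then be compared against the program target values in Table 5.4 before field samples are analyzed.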
5.1.4 Initial Blind Analysis of a Representative Sample
A representative sample matrix which is uncompromised, homogeneous and contains the analytes
of interest at concentrations of interest will be provided to each analytical laboratory new to the EMAP-E
program; this sample will be used to evaluate laboratory performance prior to the analysis of field samples.
The sample used for this initial demonstration of laboratory capability typically will be distributed blind (i.e.,
the laboratory will not know the concentrations of the analytes of interest) as part of the laboratory QA
intercomparison exercises. A laboratory's performance generally will be considered acceptable if its
submitted values are within ±30% (for organic analyses) and ±20% (for inorganic analyses) of the known
concentration of each analyte of interest in the sample. These criteria apply only for analyte concentrations
equal to or greater than 10 times the MDL established by the laboratory. If the results for the initial analysis
fail to meet these criteria, the laboratory will be required to repeat the analysis until the performance criteria
are met, prior to the analysis of real samples.
5.1.5 Laboratory Participation in Intercomparison Exercises
The laboratory QA intercomparison exercises previously referred to are sponsored jointly by the
EMAP-E and NOAA NS&T Programs to evaluate both the individual and collective performance of their
participating analytical laboratories. Following the initial demonstration of capability, each EMAP-E
laboratory is required to participate in these on-going intercomparison exercises as a continuing check on
performance and intercomparability. Usually, three or four different exercises are conducted over the course
of a year. In a typical exercise, either NIST or NRCC will distribute performance evaluation samples in
common to each laboratory, along with detailed instructions for analysis. A variety of performance
evaluation samples have been utilized in the past, including accuracy-based solutions, sample extracts, and
representative matrices (e.g., sediment or tissue samples). Laboratories are required to analyze the
TABLE 5.4. Target method detection limits for EMAP-Estuaries analytes.
INORGANICS (NOTE: concentrations in ug/g (ppm), dry weight)

Analyte       Tissue         Sediments
Aluminum      10.0           1500
Antimony      not measured   0.2
Arsenic       2.0            1.5
Cadmium       0.2            0.05
Chromium      0.1            5.0
Copper        5.0            5.0
Iron          50.0           500
Lead          0.1            1.0
Manganese     not measured   1.0
Mercury       0.01           0.01
Nickel        0.5            1.0
Selenium      1.0            0.1
Silver        0.01           0.01
Tin           0.05           0.1
Zinc          50.0           2.0

ORGANICS (NOTE: concentrations in ng/g (ppb), dry weight)

Analyte                  Tissue         Sediments
PAHs                     not measured   10
PCB congeners            2.0            1.0
Chlorinated pesticides   2.0            1.0
sample(s) "blind" and must submit their results in a timely manner both to the Louisianian Province QA
Coordinator and to either NIST or NRCC (as instructed). Laboratories which fail to maintain
acceptable performance may be required to provide an explanation and/or undertake appropriate corrective
actions. At the end of each calendar year, coordinating personnel at NIST and NRCC hold a QA workshop
to present and discuss the intercomparison exercise results. Representatives from each laboratory are
encouraged to participate in the annual QA workshops, which provide a forum for discussion of analytical
problems brought to light in the intercomparison exercises.
5.1.6 Routine Analysis of Certified Reference Materials or Laboratory Control Materials
Certified Reference Materials (CRMs) generally are considered the most useful QC samples for
assessing the accuracy of a given analysis (i.e., the closeness of a measurement to the "true" value).
Certified Reference Materials can be used to assess accuracy because they have "certified" concentrations
of the analytes of interest, as determined through replicate analyses by a reputable certifying agency using
two independent measurement techniques for verification. In addition, the certifying agency may provide
"non-certified" or "informational" values for other analytes of interest. Such values are determined using a
single measurement technique, which may introduce unrecognized bias. Therefore, non-certified values
must be used with caution in evaluating the performance of a laboratory using a method which differs from
the one used by the certifying agency.
A Laboratory Control Material (LCM) is similar to a Certified Reference Material in that it is a
homogeneous matrix which closely matches the samples being analyzed. A "true" LCM is one which is
prepared (i.e., collected, homogenized and stored in a stable condition) strictly for use in-house by a single
laboratory. Alternately, the material may be prepared by a central laboratory and distributed to others (so-
called regional or program control materials). Unlike CRMs, concentrations of the analytes of interest in
LCMs are not certified but are based upon a statistically-valid number of replicate analyses by one or several
laboratories. In practice, this material can be used to assess the precision (i.e., consistency) of a single
laboratory, as well as to determine the degree of comparability among different laboratories. If available,
LCMs may be preferred for routine (i.e., day-to-day) analysis because CRMs are relatively expensive.
However, CRMs still must be analyzed at regular intervals (e.g., monthly or quarterly) to provide a check
on accuracy.
Routine analysis of Certified Reference Materials or, when available, Laboratory Control Materials
represents a particularly vital aspect of the "performance-based" EMAP-E QA philosophy. At least one CRM
or LCM must be analyzed along with each batch of 25 or fewer samples (Table 5.3). For CRMs, both the
certified and non-certified concentrations of the target analytes should be known to the analyst(s) and should
be used to provide an immediate check on performance before proceeding with a subsequent sample batch.
Performance criteria for both precision and accuracy have been established for analysis of CRMs or LCMs
(Table 5.3); these criteria are discussed in detail in the following paragraphs. If the laboratory fails to meet
either the precision or accuracy control limit criteria for a given analysis of the CRM or LCM, the data for
the entire batch of samples is suspect. Calculations and instruments should be checked; the CRM or LCM
may have to be re-analyzed (i.e., re-injected) to confirm the results. If the values are still outside the control
limits in the repeat analysis, the laboratory is required to find and eliminate the source(s) of the problem and
repeat the analysis of that batch of samples until control limits are met, before continuing with further sample
processing. The results of the CRM or LCM analysis should never be used by the laboratory to "correct"
the data for a given sample batch.
Precision criteria: Each laboratory is expected to maintain control charts for use by analysts in
monitoring the overall precision of the CRM or LCM analyses. Upper and lower control chart limits (e.g.,
warning limits and control limits) should be updated at regular intervals; control limits based on 3 standard
deviations of the mean generally are recommended (Taylor 1987). Following the analysis of all samples in
a given year, an RSD (relative standard deviation, a.k.a. coefficient of variation) will be calculated for each
analyte of interest in the CRM. For each analyte having a CRM concentration >10 times the laboratory's
MDL, an overall RSD of less than 30% will be considered acceptable precision. Failure to meet this goal
will result in a thorough review of the laboratory's control charting procedures and analytical methodology
to determine if improvements in precision are possible.
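The 3s control chart limits and the end-of-year RSD check described above can be sketched as follows; the CRM results are hypothetical and serve only to illustrate the arithmetic.

```python
import statistics

def control_limits(crm_results):
    """Lower and upper control chart limits at three standard deviations
    of the mean (Taylor 1987)."""
    mean = statistics.mean(crm_results)
    s = statistics.stdev(crm_results)
    return mean - 3 * s, mean + 3 * s

def relative_std_dev(crm_results):
    """RSD (coefficient of variation) as a percentage; the goal for each
    analyte with a CRM concentration >10x the MDL is an RSD below 30%."""
    return 100.0 * statistics.stdev(crm_results) / statistics.mean(crm_results)

# Hypothetical CRM results for one analyte over a year (ng/g)
results = [48.0, 52.0, 50.5, 47.5, 51.0, 49.0, 53.0, 46.5]
lower, upper = control_limits(results)
rsd = relative_std_dev(results)
acceptable_precision = rsd < 30.0
```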
Accuracy criteria: The "absolute" accuracy of an analytical method can be assessed using CRMs
only when certified values are provided for the analytes of interest. However, the concentrations of many
analytes of interest to EMAP-E are provided only as non-certified values in some of the more commonly-
used CRMs. Therefore, control limit criteria are based on "relative accuracy", which is evaluated for each
analysis of the CRM or LCM by comparison of a given laboratory's values relative to the "true" or "accepted"
values in the LCM or CRM. In the case of CRMs, this includes both certified and noncertified values and
encompasses the 95% confidence interval for each value as described in Table 5.3.
Accuracy control limit criteria have been established both for individual compounds and combined
groups of compounds (Table 5.3). There are two combined groups of compounds for the purpose of
evaluating relative accuracy for organic analyses: PAHs and PCBs/pesticides. The laboratory's value should
be within ±30% of the true value on average for each combined group of organic compounds, and the
laboratory's value should be within ±35% of either the upper or lower 95% confidence limit for at least 70%
of the compounds in each group. For inorganic analyses, the laboratory's value should be within ±20% of
either the upper or lower 95% confidence limit for each analyte of interest in the CRM. Due to the inherent
variability in analyses near the method detection limit, control limit criteria for relative accuracy only apply
to analytes having CRM true values which are ≥10 times the MDL established by the laboratory.
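A sketch of the per-analyte relative accuracy check: the laboratory's value is compared against the 95% confidence interval reported by the certifying agency, with a ±35% tolerance on the nearer interval endpoint (the organic-analyte control limit), and the criterion is skipped for analytes whose CRM concentration is below 10 times the laboratory's MDL. The function name and all numeric values are hypothetical, not EMAP-E requirements.

```python
def relative_accuracy_ok(lab_value, ci_low, ci_high, mdl, tolerance=0.35):
    """Evaluate a lab's CRM result against the certifying agency's 95%
    confidence interval; returns None when the criterion does not apply."""
    if ci_high < 10 * mdl:
        return None  # criterion not applied near the detection limit
    if ci_low <= lab_value <= ci_high:
        return True  # inside the reported confidence interval itself
    # otherwise compare against the nearer interval endpoint
    nearest = ci_low if lab_value < ci_low else ci_high
    return abs(lab_value - nearest) / nearest <= tolerance

# Certified value with a 95% CI of 90-110 ng/g; laboratory MDL of 2 ng/g
ok = relative_accuracy_ok(115.0, 90.0, 110.0, 2.0)
bad = relative_accuracy_ok(160.0, 90.0, 110.0, 2.0)
skipped = relative_accuracy_ok(5.0, 90.0, 110.0, 20.0)  # CRM value < 10x MDL
```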
5.1.7 Continuing Calibration Checks
The initial instrument calibration performed prior to the analysis of each batch of samples is checked
through the analysis of calibration check samples (i.e., calibration standard solutions) inserted as part of the
sample stream. Calibration standard solutions used for the continuing calibration checks should contain
all the analytes of interest. At a minimum, analysis of the calibration check solution should occur
somewhere in the middle and at the end of each sample batch. Analysts should use best professional
judgement to determine if more frequent calibration checks are necessary or desirable.
If the control limit for analysis of the calibration check standard is not met (Table 5.3), the initial
calibration will have to be repeated. If possible, the samples analyzed before the calibration check sample
that failed the control limit criteria should be re-analyzed following the re-calibration. The laboratory should
begin by re-analyzing the last sample analyzed before the calibration standard which failed. If the relative
percent difference (RPD) between the results of this re-analysis and the original analysis exceeds 30 percent,
the instrument is assumed to have been out of control during the original analysis. If possible, re-analysis
of samples should progress in reverse order until it is determined that there is less than 30 RPD between
initial and re-analysis results. Only the re-analysis results should be reported by the laboratory. If it is not
possible or feasible to perform re-analysis of samples, all earlier data (i.e., since the last successful
calibration control check) is suspect. In this case, the laboratory should prepare a narrative explanation to
accompany the submitted data.
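The Table 5.3 control limits for the continuing calibration check can be expressed compactly: the check standard must agree with the initial calibration within ±15% on average across all analytes and within ±25% for any single analyte. The expected and measured values below are hypothetical.

```python
def calibration_check_ok(expected, measured):
    """Apply the continuing calibration control limits from Table 5.3."""
    pct_diffs = [abs(m - e) / e * 100.0 for e, m in zip(expected, measured)]
    mean_ok = sum(pct_diffs) / len(pct_diffs) <= 15.0   # average over analytes
    each_ok = all(d <= 25.0 for d in pct_diffs)         # every single analyte
    return mean_ok and each_ok

expected = [10.0, 20.0, 50.0, 100.0]  # initial calibration values (ng/g)
good = calibration_check_ok(expected, [10.5, 19.0, 52.0, 98.0])
drifted = calibration_check_ok(expected, [13.0, 26.0, 40.0, 70.0])
```

A failing check (`drifted`) would trigger re-calibration and the reverse-order re-analysis described above.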
5.1.8 Laboratory Reagent Blank
Laboratory reagent blanks (also called method blanks or procedural blanks) are used to assess
laboratory contamination during all stages of sample preparation and analysis. For both organic and
inorganic analyses, one laboratory reagent blank should be run in every sample batch. The reagent blank
should be processed through the entire analytical procedure in a manner identical to the samples. Warning
and control limits for blanks (Table 5.3) are based on the laboratory's method detection limits as
documented prior to the analysis of samples (see Section 5.1.3). A reagent blank concentration between
the MDL and 3 times the MDL for one or more of the analytes of interest should serve as a warning limit
requiring further investigation based on the best professional judgement of the analyst(s). A reagent blank
concentration equal to or greater than 3 times the MDL for one or more of the analytes of interest requires
definitive corrective action to identify and eliminate the source(s) of contamination before proceeding with
sample analysis.
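The reagent blank decision logic above reduces to a simple classification against the MDL; the blank concentrations and MDL used here are hypothetical.

```python
def blank_status(blank_conc, mdl):
    """Classify a laboratory reagent blank result per the limits above."""
    if blank_conc < mdl:
        return "ok"
    if blank_conc < 3 * mdl:
        return "warning"  # investigate using best professional judgement
    return "control"      # find and eliminate contamination before proceeding

# Three blank results for an analyte with an MDL of 1.0 ng/g
statuses = [blank_status(c, 1.0) for c in (0.2, 1.5, 4.0)]
```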
5.1.9 Internal Standards
Internal standards (commonly referred to as "surrogates", "surrogate spikes" or "surrogate
compounds") are compounds chosen to simulate the analytes of interest in organic analyses. The internal
standard represents a reference analyte against which the signal from the analytes of interest is compared
directly for the purpose of quantification. Internal standards must be added to each sample, including
QA/QC samples, prior to extraction. The reported concentration of each analyte should be adjusted to
correct for the recovery of the internal standard, as is done in the NOAA National Status and Trends
Program. The internal standard recovery data therefore should be carefully monitored; each laboratory must
report the percent recovery of the internal standard(s) along with the target analyte data for each sample.
If possible, isotopically-labeled analogs of the analytes should be used as internal standards.
Control limit criteria for internal standard recoveries are provided in Table 5.3. Each laboratory
should set its own warning limit criteria based on the experience and best professional judgement of the
analyst(s). It is the responsibility of the analyst(s) to demonstrate that the analytical process is always "in
control" (i.e., highly variable internal standard recoveries are not acceptable for repeat analyses of the same
certified reference material and for the matrix spike/matrix spike duplicate).
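The recovery correction described above can be illustrated as follows: if only 80% of the internal standard added before extraction is recovered, the measured analyte concentration is scaled up accordingly. The values are hypothetical, and the Table 5.3 control limits (30% to 150% recovery) are enforced before any correction is applied.

```python
def recovery_corrected(measured_conc, surrogate_recovery_pct):
    """Adjust a measured analyte concentration for internal standard
    (surrogate) recovery, as done in the NOAA NS&T Program."""
    if not 30.0 <= surrogate_recovery_pct <= 150.0:
        raise ValueError("surrogate recovery outside Table 5.3 control limits")
    return measured_conc * 100.0 / surrogate_recovery_pct

# 12 ng/g measured with 80% surrogate recovery -> 15 ng/g reported
corrected = recovery_corrected(12.0, 80.0)
```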
5.1.10 Injection Internal Standards
For gas chromatography (GC) analysis, injection internal standards (also referred to as "internal
standards" by some analysts) are added to each sample extract just prior to injection to enable optimal
quantification, particularly of complex extracts subject to retention time shifts relative to the analysis of
standards. Injection internal standards are essential if the actual recovery of the internal standards added
prior to extraction is to be calculated. The injection internal standards also can be used to detect and
correct for problems in the GC injection port or other parts of the instrument. The compounds used as
injection internal standards must be different from those already used as internal standards. The analyst(s)
should monitor injection internal standard retention times and recoveries to determine if instrument
maintenance or repair, or changes in analytical procedures, are indicated. Corrective action should be
initiated based on the experience of the analyst(s) and not because warning or control limits are exceeded.
Instrument problems that may have affected the data or resulted in the re-analysis of the sample should be
documented properly in logbooks and/or internal data reports and used by the laboratory personnel to take
appropriate corrective action.
5.1.11 Matrix Spike and Matrix Spike Duplicate
A laboratory fortified sample matrix (commonly called a matrix spike, or MS) and a laboratory
fortified sample matrix duplicate (commonly called a matrix spike duplicate, or MSD) will be used both to
evaluate the effect of the sample matrix on the recovery of the compound(s) of interest and to provide an
estimate of analytical precision. A minimum of 5% of the total number of samples submitted to the
laboratory in a given year should be selected at random for analysis as matrix spikes/matrix spike
duplicates. Each MS/MSD sample is first homogenized and then split into three subsamples. Two of these
subsamples are fortified with the matrix spike solution and the third subsample is analyzed as is to provide
a background concentration for each analyte of interest. The matrix spike solution should contain all the
analytes of interest. The final spiked concentration of each analyte in the sample should be at least 10 times
the MDL for that analyte, as previously calculated by the laboratory (see Section 5.1.3).
Recovery data for the fortified compounds ultimately will provide a basis for determining the
prevalence of matrix effects in the sediment samples analyzed during the project. If the percent recovery
for any analyte in the MS or MSD is less than the recommended warning limit of 50 percent, the
chromatograms and raw data quantitation reports should be reviewed. If an explanation for a low percent
recovery value is not discovered, the instrument response may be checked using a calibration standard.
Low matrix spike recoveries may be a result of matrix interferences and further instrument response checks
may not be warranted, especially if the low recovery occurs in both the MS and MSD and the other QC
samples in the batch indicate that the analysis was "in control". An explanation for low percent recovery
values for MS/MSD results should be discussed in a cover letter accompanying the data package.
Corrective actions taken and verification of acceptable instrument response must be included.
-------
Section 5
Revision 0
Date 8/92
DRAFT 1
Page 17 of 35
Analysis of the MS/MSD also is useful for assessing laboratory precision. The relative percent
difference (RPD) between the MS and MSD results should be less than 30 for each analyte of interest (see
Table 5.3). The RPD is calculated as follows:
RPD = [(C1 - C2) / ((C1 + C2)/2)] x 100%
where: C1 is the larger of the duplicate results for a given analyte
C2 is the smaller of the duplicate results for a given analyte
If results for any analytes do not meet the RPD ≤ 30% control limit criterion, calculations and instruments
should be checked. A repeat analysis may be required to confirm the results. Results which repeatedly fail
to meet the control limit criteria indicate poor laboratory precision. In this case, the laboratory is obligated
to halt the analysis of samples and eliminate the source of the imprecision before proceeding.
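As an illustration only (not part of the protocol), the RPD calculation and control-limit check described above can be sketched in Python; the function name is hypothetical:

```python
def relative_percent_difference(c1, c2):
    """RPD between MS/MSD duplicate results; arguments may be given in either order."""
    larger, smaller = max(c1, c2), min(c1, c2)
    # Difference divided by the mean of the two results, expressed as a percent
    return (larger - smaller) / ((larger + smaller) / 2.0) * 100.0

# Example: one analyte's MS and MSD results, checked against the 30% limit (Table 5.3)
rpd = relative_percent_difference(12.0, 10.0)
in_control = rpd < 30.0
```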
5.1.12 Field Duplicates and Field Splits
For the EMAP-E program, sediment will be collected at each station using a grab sampler. Each
time the sampler is retrieved, the top 2 cm of sediment will be scraped off, placed in a large mixing container
and homogenized, until a sufficient amount of material has been obtained. At approximately 5% of the
stations, the homogenized material will be placed in four separate sample containers for subsequent
chemical analysis. Two of the sample containers will be submitted as blind field duplicates to the primary
analytical laboratory. The other two containers, also called field duplicates, will be sent blind to a second,
reference laboratory. Together, the two pairs of duplicates are called field splits. The analysis of the field
duplicates will provide an assessment of single laboratory precision. The analysis of the field duplicates and
field splits will provide an assessment of both inter- and intra-laboratory precision, as well as an assessment
of the efficacy of the field homogenization technique.
5.1.13 Analytical Chemistry Data Reporting Requirements
As previously indicated, data for all QA/QC samples (e.g., CRMs, calibration check samples, blanks,
matrix spike/matrix spike duplicates, etc.) must be submitted by the laboratory as part of the data package
for each batch of samples analyzed. The laboratory should denote QA/QC samples using the
recommended codes (abbreviations) provided in Table 5.5. The QA/QC results and associated data will
be subject to review by the Province Manager, QA Coordinator, or their designee(s).
EMAP-E laboratories are responsible for assigning only two data qualifier codes or "flags" to the
submitted data. If an analyte is not detected, the laboratory should report the result as "ND", followed by
the letter "a". The "a" code will have the following meaning: "The analyte was not detected. The method
detection limit for this analyte has been supplied by the laboratory and can be found in an accompanying
dataset." If a quantifiable signal is observed, the laboratory should report a concentration for the analyte;
the data qualifier code "b" should be used to flag any reported values which are below the laboratory's MDL.
The "b" code will have the following meaning: "The analyte was detected at a concentration less than or
equal to the method detection limit. This reported concentration is an estimate which may not accurately
reflect the actual concentration of this analyte in the sample." All other results above the laboratory's MDL
should be reported without any additional qualification codes.
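The qualifier logic described above can be sketched as a hypothetical Python helper; the function name and return format are illustrative only, not part of the EMAP-E specification:

```python
def qualify_result(value, mdl):
    """Assign EMAP-E 'a'/'b' data qualifier codes to a reported result.

    value=None means no quantifiable signal was observed: report "ND" with
    code "a" (the MDL is supplied separately). A value at or below the MDL
    is reported with code "b" (an estimate). Values above the MDL carry no
    qualifier code.
    """
    if value is None:
        return ("ND", "a")
    if value <= mdl:
        return (value, "b")
    return (value, None)
```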
TABLE 5.5. Codes for denoting QA/QC samples in submitted data packages.
Code Description Unit of measure
CLE Continuing Calibration Evaluation Percent recovery
CRM Certified Reference Material µg/g or ng/g (dry weight)
CRMPR Percent Recovery for CRM values Percent recovery
LCM Laboratory Control Material µg/g or ng/g (dry weight)
LCMPR Percent Recovery for LCM values Percent recovery
LRB Laboratory Reagent Blank µg/g or ng/g (dry weight)
LF1 Laboratory fortified sample matrix µg/g or ng/g (dry weight)
LF1PR Percent recovery for the LF1 Percent recovery
LF2 Laboratory fortified sample matrix duplicate µg/g or ng/g (dry weight)
LF2PR Percent recovery for the LF2 Percent recovery
MSDRPD Relative percent difference between LF1 and LF2 Percent
Only data which have met QA requirements should be submitted by the laboratory. When QA
requirements have not been met, the samples should be re-analyzed and only the results of the re-analysis
should be submitted, provided they are acceptable. There may be a limited number of situations where
sample re-analysis is not possible or practical (e.g., minor exceedance of a single control limit criterion). The
laboratory is expected to provide a detailed explanation of any factors affecting data quality or interpretation;
this explanation should be in the form of a cover letter accompanying each submitted data package. This
narrative explanation is in lieu of additional data qualifier codes supplied by the laboratory (other than the
"a" and "b" codes). Over time, depending on the nature of these narrative explanations, the EMAP-E
program expects to develop a limited list of codes for qualifying data in the database (in addition to the "a"
and "b" codes).
5.2 OTHER SEDIMENT MEASUREMENTS
5.2.1 Total organic carbon
As a check on precision, each laboratory should analyze at least one TOC sample in duplicate for
each batch of 25 or fewer samples. The relative percent difference (RPD) between the two duplicate
measurements should be less than 20%. If this control limit is exceeded, analysis of subsequent sample
batches should stop until the source of the discrepancy is determined and the system corrected.
At least one certified reference material (CRM) or, if available, one laboratory control material (LCM)
should be analyzed along with each batch of 25 or fewer TOC samples. Any one of several marine sediment
CRMs distributed by the National Research Council of Canada's Marine Analytical Chemistry Standards
Program (e.g., the CRMs named "BCSS-1", "MESS-1" and "PACS-1") have certified concentrations of total
carbon and are recommended for this use. Prior to analysis of actual samples, it is recommended that each
laboratory perform several total organic carbon analyses using a laboratory control material or one of the
aforementioned CRMs to establish a control chart (the values obtained by the laboratory for total organic
carbon should be slightly less than the certified value for total carbon in the CRM). The control chart then
should be used to assess the laboratory's precision for subsequent analyses of the LCM or CRM with each
sample batch. In addition, a method blank should be analyzed with each sample batch. Total organic
carbon concentrations should be reported as µg/g (ppm) dry weight of the unacidified sediment sample.
Data reported for each sample batch should include QA/QC sample results (duplicates, CRMs or LCMs, and
method blanks). Any factors that may have influenced data quality should be discussed in a cover letter
accompanying the submitted data.
5.2.2 Acid volatile sulfide
Quality control of acid volatile sulfide (AVS) measurements is achieved through the routine analysis
of a variety of QA/QC samples. These are outlined in the following section and described in full detail in
the EMAP-E Laboratory Methods Manual (U.S. EPA, in preparation). Prior to the analysis of samples, the
laboratory must establish a calibration curve and determine a limit of reliable detection for sulfide for the
analytical method being employed. Following this, laboratory performance will be assessed through routine
analysis of laboratory duplicates, calibration check standards, laboratory fortified blanks (i.e., spiked blanks),
and laboratory fortified sample matrices (i.e., matrix spikes).
One sample in every batch of 25 or fewer samples should be analyzed in duplicate as a check on
laboratory precision. The relative percent difference (as calculated by the formula given in section 5.1.11)
between the two analyses should be less than 20%. If the RPD exceeds 20%, a third analysis should be
performed. If the relative standard deviation of the three determined concentrations exceeds 20%, the
individual analyses should be examined to determine if non-random errors may have occurred.
Due to the instability of acid volatile sulfides to drying and handling in air, CRMs have not been
developed for assessing overall measurement accuracy. Therefore, each laboratory must analyze at least
one calibration check standard, one laboratory fortified blank and one laboratory fortified sample matrix in
each batch of 25 or fewer samples as a way of determining the accuracy of each step entailed in performing
the analysis. The concentration of sulfide in each of these three types of accuracy check samples will be
known to the analyst; the calculated concentration of sulfide in each sample should be within ± 15% of the
known concentration.
If the laboratory is not within ±15% of the known concentration for the calibration check solution,
instruments used for AVS measurement must be recalibrated and/or the stock solutions redetermined by
titration. If the laboratory fails to achieve the same accuracy (within ±15% of the true value) for AVS in the
laboratory fortified blank, sources of error (e.g., leaks, excessive gas flows, poor sample-acid slurry agitation)
should be determined for the analytical system prior to continuing. If AVS recovery falls outside the 85%
to 115% range for the matrix spike, the system should be evaluated for sources of error and the analysis
should be repeated. If recovery remains unacceptable, it is possible that matrix interferences are occurring.
If possible, the analysis should be repeated using smaller amounts of sample to reduce the interferent
effects. Results for all QA/QC samples (duplicates, calibration check standards, spiked blanks and matrix
spikes) should be submitted by the laboratory as part of the data package for each batch of samples, along
with a narrative explanation for results outside control limits.
As discussed in Section 5.1.12 for chemistry samples, field duplicates and splits will also be
collected for AVS determination to assess both inter- and intralaboratory precision.
5.2.3 Butyltins
Assessment of the distribution and environmental impact of butyltin species of interest to the EMAP-
E program (tributyltin, dibutyltin and monobutyltin) requires their measurement in marine sediment and tissue
samples at trace levels (parts per billion to parts per trillion). Quality control of these measurements consists
of checks on laboratory precision and accuracy. One laboratory reagent blank must be run with each batch
of 25 or fewer samples. A reagent blank concentration between the MDL and 3 times the MDL should serve
as a warning limit requiring further investigation based on the best professional judgement of the analyst(s).
A reagent blank concentration equal to or greater than 3 times the MDL requires corrective action to identify
and eliminate the source(s) of contamination, followed by re-analysis of the samples in the associated batch.
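The MDL-based warning and control limits for the reagent blank can be expressed as a small decision helper; this is an illustrative sketch, and the function name and return labels are hypothetical:

```python
def blank_assessment(blank_conc, mdl):
    """Classify a laboratory reagent blank result against MDL-based limits.

    A blank at or above 3x the MDL requires corrective action (find the
    contamination source, re-analyze the batch); a blank between the MDL
    and 3x the MDL is a warning requiring further investigation; anything
    at or below the MDL is in control.
    """
    if blank_conc >= 3 * mdl:
        return "corrective action"
    if blank_conc > mdl:
        return "warning"
    return "in control"
```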
One laboratory fortified sample matrix (commonly called a matrix spike) or laboratory fortified blank
(i.e., spiked blank) should be analyzed along with each batch of 25 or fewer samples to evaluate the
recovery of the butyltin species of interest. The butyltins should be added at 5 to 10 times their MDLs as
previously calculated by the laboratory (see Section 5.1.3). If the percent recovery for any of the butyltins
in the matrix spike or spiked blank is outside the range 70 to 130 percent, analysis of subsequent sample
batches should stop until the source of the discrepancy is determined and the system corrected.
The NRCC sediment reference material "PACS-1", which has certified concentrations of the three
butyltin species of interest, also should be analyzed along with each batch of 25 or fewer sediment samples
as a check on accuracy and reproducibility (i.e., batch-to-batch precision). If values obtained by the
laboratory for butyltins in "PACS-1" are not within ±30% of the certified values, the data for the entire batch
of samples is suspect. Calculations and instruments should be checked; the CRM may have to be re-
analyzed to confirm the results. If the values are still outside the control limits in the repeat analysis, the
laboratory is required to determine the source(s) of the problem and repeat the analysis of that batch of
samples until control limits are met, before continuing with further sample processing.
5.2.4 Sediment grain size
Quality control of sediment grain size analyses is accomplished by strict adherence to protocol and
documentation of quality control checks. Several procedures are critical to the collection of high quality
particle size data. Most important to the dry sieve analysis is that the screens are clean before conducting
the analysis, and that all of the sample is retrieved from them. To clean a screen, it should be inverted and
tapped on a table, while making sure that the rim hits the table evenly. Further cleaning of brass screens
may be performed by gentle scrubbing with a stiff bristle nylon brush. Stainless steel screens may be
cleaned with a nylon or brass brush.
The most critical aspect of the pipet analysis is knowledge of the temperature of the silt-clay
suspension. An increase of only 1 °C will increase the settling velocity of a particle 50 µm in diameter by
2.3 percent. It is generally recommended that the pipet analysis be conducted at a constant temperature
of 20 °C. However, Plumb (1981) provides a table to correct for settling velocities at other temperatures;
this table is included in the EMAP-E Laboratory Methods Manual (U.S. EPA, in preparation). Thorough
mixing of the silt-clay suspension at the beginning of the analysis is also critical. A perforated, plexiglass
disc plunger is very effective for this purpose. If the mass of sediment used for pipet analysis exceeds 25
g, a subsample should be taken as described by Plumb (1981). Silt-clay samples in excess of 25 g may
give erroneous results because of electrostatic interactions between the particles. Silt-clay samples less than
5 g yield a large experimental error in weighing relative to the total sample weight.
The analytical balance, drying oven, sieve shaker, and temperature bath used in the analysis should
be calibrated at least monthly. Quality assurance for the sediment analysis procedures will be accomplished
primarily by re-analyzing a randomly selected subset of samples from each batch, as described in full detail
in the EMAP-E Laboratory Methods Manual (U.S. EPA, in preparation). A batch of samples is defined as
a set of samples of a single textural classification (e.g., silt/clay, sand, gravel) processed by a single
technician using a single procedure. Approximately 10% of each batch completed by the same technician
will be re-analyzed (i.e., reprocessed) in the same manner as the original sample batch. If the absolute
difference between the original value and the second value is greater than 10% (in terms of the percent of
the most abundant sediment size class), then a third analysis will be completed by a different technician.
The value closest to the third value will be entered into the database. In addition, all the other samples in
the same batch must be re-analyzed, and the laboratory protocol and/or technician's practices should be
reviewed and corrected to bring the measurement error under control. If the percent of the most abundant
sediment size class in the original sample and the re-analyzed sample differs by less than 10, the original
value will not be changed and the sediment analysis process will be considered in control.
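The re-analysis decision rule above can be sketched as a hypothetical helper (illustrative only; the edge case of a difference of exactly 10 is not specified in the text and is treated here as triggering a third analysis):

```python
def grain_size_qc(original, recheck, third=None):
    """QC decision on the percent of the most abundant sediment size class.

    If the original and re-analyzed values agree within 10 (percentage
    points), the original value stands. Otherwise a third analysis by a
    different technician is required, and whichever of the first two values
    is closest to the third is entered into the database.
    """
    if abs(original - recheck) < 10.0:
        return original
    if third is None:
        raise ValueError("third analysis by a different technician required")
    return min((original, recheck), key=lambda v: abs(v - third))
```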
5.2.5 Apparent RPD Depth
The depth of the apparent RPD (redox potential discontinuity) will be determined in the field through
visual observation of clear plastic cores inserted into undisturbed sediment grab samples at each station.
In fine-grained sediments, the apparent RPD depth is measured from the sediment surface to the point at
depth where the color changes from light to dark. As a QC check, sediment cores will be re-measured by
the QA Coordinator during field visits. The field crew's original measurement should be within ±5 mm of
the re-measurement; failure to achieve this agreement will result in re-training of the crew.
5.3 SEDIMENT TOXICITY TESTING
The toxicity of sediments collected by field crews will be determined as an integral part of the
benthic indicator suite, using 10-day acute toxicity tests with the marine amphipod Ampelisca abdita and
4-day tests with mysid shrimp Mysidopsis bahia. Complete descriptions of the methods employed for the
sediment toxicity test are provided in the Laboratory Methods Manual (U.S. EPA, in preparation). The
various aspects of the test for which quality assurance/quality control procedures are specified include the
following: the condition of facilities and equipment, sample handling and storage, the source and condition
of test organisms, test conditions, instrument calibration, use of replicates, use of reference toxicants, record
keeping, and data evaluation. In addition, any laboratory which has not previously performed the sediment
toxicity test using Ampelisca abdita or Mysidopsis bahia will be required to perform an initial demonstration
of capability, as described below.
5.3.1 Facilities and Equipment
Laboratory and bioassay temperature control equipment must be adequate to maintain
recommended test temperatures. Recommended materials must be used in the fabrication of the test
equipment in contact with the water or sediment being tested, as specified in the EMAP-E Laboratory
Methods Manual (U.S. EPA, in preparation).
5.3.2 Initial Demonstration of Capability
Laboratories which have not previously conducted sediment toxicity tests with Ampelisca abdita or
Mysidopsis bahia must demonstrate the ability to culture, collect (if applicable), hold and test the organisms
without significant loss or mortality, prior to performing tests of actual samples. There are two types of tests
which must be performed as an initial demonstration of capability; these tests will serve to indicate the
overall ability of laboratory personnel to handle the organism adequately and obtain consistent, precise
results. First, the laboratory must perform a minimum of five successive reference toxicant tests, using
sodium dodecyl sulfate (SDS) as the reference toxicant. For both Mysidopsis bahia and Ampelisca abdita,
short-term (i.e., 96-hour) tests without sediments (i.e., seawater only) can be used for this purpose.
The trimmed Spearman-Karber method of regression analysis (Hamilton et al. 1977) or the
monotonic regression analysis developed by DeGraeve et al. (1988) can be used to determine an LC50
value for each 96-hour reference toxicant test. The LC50 values should be recorded on a control chart
maintained in the laboratory (described in greater detail in section 5.3.4, to follow). Precision then can be
described by the LC50 mean, standard deviation, and percent relative standard deviation (coefficient of
variation, or CV) of the five (or more) replicate reference toxicant tests. If the laboratory fails to achieve an
acceptable level of precision in the five preliminary reference toxicant tests, the test procedure should be
examined for defects and the appropriate corrective actions should be taken. Additional tests should be
performed until acceptable precision is demonstrated.
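The precision statistics named above (mean, standard deviation, and CV of the replicate LC50 values) can be computed as follows; this is an illustrative sketch and the function name is hypothetical:

```python
import statistics

def lc50_precision(lc50s):
    """Mean, standard deviation, and CV (%) of replicate reference toxicant LC50s."""
    mean = statistics.mean(lc50s)
    sd = statistics.stdev(lc50s)  # sample standard deviation (n - 1 denominator)
    cv = sd / mean * 100.0        # percent relative standard deviation
    return mean, sd, cv

# Example: five replicate reference toxicant tests
mean, sd, cv = lc50_precision([10.0, 11.0, 9.0, 10.0, 10.0])
```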
The second series of tests which must be performed successfully prior to the testing of actual
samples are 10-day, "non-toxicant" exposures of Ampelisca abdita or 4-day exposures for Mysidopsis bahia,
in which test chambers contain the control sediment and seawater that will be used under actual testing
conditions. These "control" tests should be performed concurrent with the reference toxicant tests used to
assess single laboratory precision. At least five replicate test chambers should be used in each test. The
tests should be run in succession until two consecutive tests each have mean survival equal to or greater
than 85% and survival in the individual test chambers is not less than 80%. These are the control survival
rates which must be achieved during actual testing if a test is to be considered acceptable (see section
5.3.6); therefore, the results of this preliminary demonstration will provide evidence that facilities, water,
control sediment, and handling techniques are adequate to result in successful testing of samples.
5.3.3 Sample Handling and Storage
Techniques for sample collection, handling, and storage are described in the Field Operations
Manual (Macauley 1992). Sediment samples for toxicity testing should be chilled to 4°C when collected,
shipped on ice, and stored in the dark in a refrigerator at 4°C until used. Sediments should be stored for
no longer than four weeks before the initiation of the test, and should not be frozen or allowed to dry.
Sample containers should be made of chemically inert materials to prevent contamination, which might result
in artificial changes in toxicity.
To avoid contamination during collection, all sampling devices and any other instruments in contact
with the sediment should be cleaned with water and a mild detergent and thoroughly rinsed between
stations (see Macauley 1992). All utensils in contact with the sample should be made of either teflon or
high quality stainless steel (304 or better).
5.3.4 Quality of Test Organisms
All test organisms used in the tests should be disease-free and should be positively identified to
species. If the amphipods are collected from the field prior to testing, they should be obtained from an area
known to be free of toxicants and should be held in clean, uncontaminated water and facilities. Mysids must
be obtained from facilities that have demonstrated successful culturing from brood stocks held in
uncontaminated seawater. Test organisms held prior to testing should be checked daily, and individuals
which appear unhealthy or dead should be discarded. If greater than 5% of the organisms in holding
containers are dead or appear unhealthy during the 48 hours preceding a test, the entire group should be
discarded and not used in the test.
The sensitivity of each batch of test organisms obtained from an outside source (e.g., field collected
or obtained from an outside culture facility) must be evaluated with the reference toxicant sodium dodecyl
sulfate (SDS) in a short-term toxicity test performed concurrently with the sediment toxicity tests. The use
of the reference toxicant SDS is required as a means of standardizing test results among different
laboratories. For Ampelisca abdita and Mysidopsis bahia, a 96-hour reference toxicant test without
sediment is used to generate LC50 values, as previously described in section 5.3.2.
These LC50 values should be recorded on the same control chart used to record the results of the
five (or more) reference toxicant tests performed for the initial demonstration of capability. The control chart
represents a "running plot" of the toxicity values (LC50s) from successive reference toxicant tests. The mean
LC50 and the upper and lower control limits (±2S) are recalculated with each successive point until the
statistics stabilize. Outliers, which are values falling outside the upper and lower control limits, are readily
identified. The plotted values are used to evaluate trends in organism sensitivity, as well as the overall ability
of laboratory personnel to obtain consistent results.
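The ±2S control chart update described above can be sketched as a pair of hypothetical helpers (illustrative only, not part of the protocol):

```python
import statistics

def control_chart_limits(lc50s):
    """Recompute the mean and ±2S control limits from successive reference toxicant LC50s."""
    mean = statistics.mean(lc50s)
    s = statistics.stdev(lc50s)
    return mean - 2 * s, mean, mean + 2 * s

def is_outlier(new_lc50, lc50s):
    """True if a new LC50 falls outside the current upper or lower control limit."""
    lower, _, upper = control_chart_limits(lc50s)
    return new_lc50 < lower or new_lc50 > upper

# Example: check a new test result against the running chart
history = [10.0, 11.0, 9.0, 10.0, 10.0]
flag = is_outlier(12.0, history)
```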
Reference toxicant test results (i.e., LC50 values) which fall outside control chart limits should serve
as a warning to laboratory personnel. At the P=0.05 probability level, one in twenty tests would be expected
to fall outside control limits by chance alone. The laboratory should try to determine the cause of the outlying
LC50 value, but a re-test of the samples is not necessarily required. If the reference toxicant test results are
outside control chart limits on the next consecutive test, the sensitivity of the organisms and the overall
credibility of the test are suspect. The test procedure again should be examined for defects and additional
reference toxicant tests performed. Testing of samples should not resume until acceptable reference
toxicant results can be obtained; this may require the use of a different batch of test organisms.
5.3.5 Test Conditions
Parameters such as water temperature, salinity (conductivity), dissolved oxygen, and pH should be
checked as required for each test and maintained within the specified limits (U.S. EPA, in preparation).
Instruments used for routine measurements must be calibrated and standardized according to instrument
manufacturer's procedures. All routine chemical and physical analyses must include established quality
assurance practices as outlined in Agency methods manuals (U.S. EPA 1979a and b).
Overlying water must meet the requirements for uniform quality specified in the method (U.S. EPA,
in preparation). The minimum requirement for acceptable overlying water is that it allows acceptable control
survival without signs of organism disease or apparent stress (i.e., unusual behavior or changes in
appearance). The overlying water used in the sediment toxicity tests with test organisms may be natural
seawater, hypersaline brine (100 ‰) prepared from natural seawater, or artificial seawater prepared from sea
salts. If natural seawater is used, it should be obtained from an uncontaminated area known to support a
healthy, reproducing population of the test organism or a comparably sensitive species.
5.3.6 Test Acceptability
Survival of organisms in control treatments should be assessed during each test as an indication
of both the validity of the test and the overall health of the test organism population. The tests with
Ampelisca abdita and Mysidopsis bahia are acceptable if mean control survival is greater than or equal to
85 percent, and if survival in individual control test chambers exceeds 80 percent. Additional guidelines for
acceptability of individual sediment toxicity tests are presented in the EMAP-E Laboratory Methods Manual
(U.S. EPA, in preparation). An individual test may be conditionally acceptable if temperature, dissolved
oxygen (DO), and other specified conditions fall outside specifications, depending on the degree of the
departure and the objectives of the tests. Any deviations from test specifications must be noted and
reported to the QA Coordinator when reporting the data so that a determination can be made of test
acceptability.
5.3.7 Record Keeping and Reporting
Proper record keeping is mandatory. Bound notebooks must be used to maintain detailed records
of the test organisms such as species, source, age, date of receipt, and other pertinent information relating
to their history and health, and information on the calibration of equipment and instruments, test conditions
employed, and test results. Annotations should be made on a real time basis to prevent loss of information.
Data for all QA/QC variables, such as reference toxicant test results and copies of control charts, should
be submitted by the laboratory along with test results.
5.4 MACROBENTHIC COMMUNITY ASSESSMENT
Sediment samples for macrobenthic community assessments will be collected at each station using
a Young-modified Van Veen grab sampler. In order to be considered acceptable, each grab sample must
be obtained following the specified protocol and must meet certain pre-established quality control criteria,
as described in detail in the Field Operations and Safety Manual (Reifsteck et al. 1992). The collected
sediment will be sieved in the field through a 0.5 mm screen and the material collected on the screen
preserved and returned to the laboratory for processing. In the laboratory, QA/QC involves a series of
check systems for organism sorting, counting and taxonomic identification. These checks are described
briefly in the following sections; more complete details can be found in the EMAP-E Laboratory Methods
Manual (U.S. EPA, in preparation).
5.4.1 Sorting
The quality control check on each technician's efficiency at sorting (i.e., separating organisms from
sediment and debris) consists of an independent re-sort by a second, experienced sorter. A minimum of
10% of all samples sorted by each technician must be re-sorted to monitor performance and thus provide
feedback necessary to maintain acceptable standards. These re-sorts should be conducted on a regular
basis on at least one sample chosen at random for each batch of 10 samples processed by a given sorter.
Inexperienced sorters require a more intensive QC check system. It is recommended that experienced
sorters or taxonomists check each sample processed by inexperienced sorters until proficiency in organism
extraction is demonstrated. Once proficiency has been demonstrated, the checks may be performed at
the required frequency of one every ten samples. Logbooks must be maintained in the laboratory and used
to record the number of samples processed by each technician, as well as the results of all sample re-sorts.
For each sample that is re-sorted, sorting efficiency should be calculated using the following formula:
Sorting efficiency (%) = [# of organisms originally sorted / (# originally sorted + additional # found in re-sort)] x 100
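The sorting efficiency calculation amounts to the following; this is an illustrative sketch with a hypothetical function name:

```python
def sorting_efficiency(originally_sorted, found_in_resort):
    """Percent of organisms recovered by the original sorter.

    originally_sorted: count of organisms found in the original sort.
    found_in_resort: additional organisms found by the second, experienced sorter.
    """
    return originally_sorted / (originally_sorted + found_in_resort) * 100.0
```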
The results of sample re-sorts may require that certain actions be taken for specific technicians.
If sorting efficiency is greater than 95%, no action is required. If sorting efficiency is between 90% and 95%,
problem areas should be identified and the technician should be re-trained. Laboratory supervisors must
be particularly sensitive to systematic errors (e.g., consistent failure to extract specific taxonomic groups)
which may suggest the need for further training. Re-sort efficiencies below 90% will require re-sorting of all
samples in the associated batch and continuous monitoring of that technician to improve efficiency.
If sorting efficiency is less than 90%, organisms found in the re-sort should be added to the original
data sheet and, if possible, to the appropriate vials for biomass determination. If sorting efficiency is 90%
or greater, the QC results should be recorded in the appropriate logbook, but the animals should not be
added to the original sample or used for biomass determinations. Once all quality control criteria associated
with the sample re-sort have been met, the sample residues may be discarded.
5.4.2 Species Identification and Enumeration
Only senior taxonomists are qualified to perform re-identification quality control checks. A minimum
of 10% of all samples (i.e., one sample chosen at random out of every batch of ten samples) processed by
each taxonomic technician must be checked to verify the accuracy of species identification and
enumeration. This control check establishes the level of accuracy with which identification and counts are
performed and offers feedback to taxonomists in the laboratory so that a high standard of performance is
maintained. Samples should never be re-checked by the technician who originally processed the sample.
Ideally, each batch of ten samples processed by an individual taxonomic technician should be from
a similar habitat type (e.g., all oligohaline stations). The re-check of one out of the ten samples in a batch
should be done periodically and in a timely manner so that subsequent processing steps (e.g., biomass
determinations) and data entry may proceed. As each taxon is identified and counted during the re-check,
the results should be compared to the original data sheet. Discrepancies should be double-checked to be
sure of correct final results. Following re-identification, specimens should be returned to the original vials
and set aside for biomass determination.
When the entire sample has been re-identified and re-counted, the total number of errors should be
computed. The total number of errors will be based upon the number of misidentifications and miscounts.
Numerically, accuracy will be represented in the following manner:

Accuracy (%) = (Total # of organisms in QC re-count - Total # of errors) / (Total # of organisms in QC re-count) x 100
where the following three types of errors are included in the total # of errors:
1.) Counting errors (for example, counting 11 individuals of a given species as 10).
2.) Identification errors (for example, identifying Species X as Species Y, where both are present)
3.) Unrecorded taxa errors (for example, not identifying Species X when it is present)
Each taxonomic technician must maintain an identification and enumeration accuracy of 90% or
greater (calculated using the above formula). If results fall below this level, the entire sample batch must
be re-identified and counted. If taxonomic efficiency is between 90% and 95%, the original technician should
be advised and species identifications reviewed. All changes in species identification should be recorded
on the original data sheet (along with the date and the initials of the person making the change) and these
changes should be entered into the database. However, the numerical count for each taxonomic group
should not be corrected unless the overall accuracy for the sample is below 90%. Additional details on this
protocol are provided in the EMAP-E Laboratory Methods Manual (U.S. EPA, in preparation). The results
of all QC rechecks of species identification and enumeration should be recorded in a timely manner in a
separate logbook maintained for this purpose.
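The accuracy formula and the three error types it counts can be expressed as a short sketch (illustrative only; the function and parameter names are our own):

```python
def taxonomic_accuracy(total_in_recount: int,
                       counting_errors: int,
                       identification_errors: int,
                       unrecorded_taxa_errors: int) -> float:
    """Percent identification/enumeration accuracy from a QC re-count.

    The three error types follow the text: miscounts, misidentifications,
    and taxa present in the sample but not recorded.
    """
    total_errors = (counting_errors + identification_errors
                    + unrecorded_taxa_errors)
    return 100.0 * (total_in_recount - total_errors) / total_in_recount

# A QC re-count of 200 organisms with 2 miscounts, 1 misidentification,
# and 1 unrecorded taxon gives 98% accuracy -- above the 90% criterion,
# so the batch would not require re-identification.
acc = taxonomic_accuracy(200, 2, 1, 1)   # 98.0
```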
As organisms are identified, a voucher specimen collection (taxonomic reference collection) should
be established. This collection should consist of representative specimens of each species identified in
samples from an individual Province in a given year. For some species, it may be appropriate to include
in the reference collection individuals collected in different geographic locations within the Province. The
reference collection should be used to train new taxonomists and should be sent to outside consultants to
verify the laboratory's taxonomic identifications. Any resulting discrepancies should be resolved in
consultation with the EMAP-E Province Manager and/or the Province QA Coordinator.
5.4.3 Biomass Measurements
Performance checks of the balance used for biomass determinations should be performed routinely
using a set of standard reference weights (ASTM Class 3, NIST Class S-1, or equivalents). In addition, a
minimum of 10% of all pans and crucibles in each batch processed by a given technician must be re-
weighed by a second technician as a continuous monitor on performance. Samples to be re-weighed
should be selected randomly from the sample batch; the results of the re-weigh should be compared against
the original final weight recorded on the biomass data sheet. Weighing efficiency should be calculated using
the following formula:
Weighing efficiency (%) = (Original final weight) / (Re-weighed final weight) x 100
If weighing efficiency is between 95% and 105%, the sample has met the acceptable quality control
criteria and no action is necessary. If weighing efficiency is between 90% and 95% or between 105% and 110%,
the sample has met acceptable criteria, but the technician who completed the original weighing should be
consulted and proper measurement practices reviewed. If the weighing efficiency is less than 90% or
greater than 110%, then the sample has failed the quality control criteria and all samples in the associated
batch must be re-weighed (following technician re-training and/or troubleshooting of laboratory equipment
to determine and eliminate the source(s) of the inconsistency). Corrections to the original data sheet should
only be made in those cases where weighing efficiency is less than 90% or greater than 110%. The results
of all QC re-weighings should be recorded in a timely manner in a separate logbook or data sheet and
maintained as part of the documentation associated with the biomass data.
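Because the weighing check is two-sided, the efficiency may fall above or below 100%; the calculation and the three-tier response can be sketched as follows (names are our own, not part of the protocol):

```python
def weighing_efficiency(original_weight: float, reweighed_weight: float) -> float:
    """Percent weighing efficiency: original final weight vs. QC re-weigh."""
    return 100.0 * original_weight / reweighed_weight

def weighing_action(efficiency: float) -> str:
    """Three-tier response from the text: pass, review, or re-weigh the batch."""
    if 95.0 <= efficiency <= 105.0:
        return "pass"
    if 90.0 <= efficiency <= 110.0:
        return "pass, but review measurement practices with technician"
    # Outside 90-110%: batch fails QC; corrections to the original data
    # sheet are made only in this case.
    return "fail: re-weigh all samples in batch; correct original data sheet"

# Example: original final weight 10.2 mg vs. re-weigh 10.0 mg -> about 102% (pass).
status = weighing_action(weighing_efficiency(10.2, 10.0))
```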
5.5 FISH SAMPLING
5.5.1 Species Identification, Enumeration and Length Measurements
Fish species identification, enumeration and individual lengths will be determined in the field
following protocols presented in the Louisianian Province Field Operations Manual (Macauley 1992). The
quality of fish identifications, enumerations and length measurements will be assured principally through
rigorous training of field personnel prior to field sampling. Qualified taxonomists will provide independent
confirmation of all fish identifications, enumerations and length measurements made by crew members
during laboratory training sessions. An emphasis will be placed on correct identification of fish "target"
species to be saved by the field crews for later chemical contaminant analyses. Fish identifications,
enumerations and length measurements also will be confirmed by the QA Coordinator, Province Manager,
or their designee(s) during field visits. In addition, each field crew will be required to save two "voucher"
specimens of each species identified in the field. These voucher specimens will be preserved in fixative and
sent back to the Field Operations Center on a regular basis throughout the field season. A qualified fish
taxonomist will verify the species identifications and provide immediate feedback to the field crews whenever
errors are found. The fish sent to the ERL-GB laboratory for gross pathological and histopathological
examination also will be checked for taxonomic determination accuracy. All erroneous identifications for
a given field crew will be corrected in the database. The preserved voucher fish will be saved to provide
a reference collection for use in subsequent years' training.
The overall accuracy goal for all fish identifications, enumerations and length measurements in a
given sampling season is 90% (i.e., less than 10% errors). If this goal is not met, corrective actions will
include increased emphasis on training and more rigorous testing of field crews prior to the next year's
sampling season. During the field season, the QA Coordinator, Province Manager and/or Field Coordinator
must be informed of species misidentifications immediately so that the appropriate field crew can be
contacted and the problem corrected.
5.5.2 Fish Gross Pathology and Histopathology
The field procedures for gross pathological examination of fish are detailed in the Louisianian
Province Field Operations Manual (Macauley 1992). As with fish identification and enumeration, the quality
of gross pathology determinations will be assured principally through rigorous training of field personnel
prior to field sampling. Qualified pathologists will be responsible for planning and overseeing all crew
training and will provide independent confirmation of all pathologies noted by field personnel during the
training sessions. During the actual sample collection period, these qualified pathologists also will record
any gross external pathologies they find in examining fish which the crews send back to the laboratory for
histopathological study (which will include those fish on which crews found pathologies as well as those
without pathologies). The laboratory pathologist(s) will perform these examinations without knowledge of
the gross external pathologies noted by the field crews; this will provide a measure of the number and type
of pathologies which were either incorrectly identified or missed in the field (i.e., false positives and false
negatives). This information will also be used to "customize" crew training in future years.
A series of internal and external laboratory QC checks will be employed to provide verification of
the fish histopathology identifications. In laboratories having multiple pathologists, all cases bearing
significant lesions should be examined and verified by the senior pathologist. At least 5% of the slides read
by one pathologist should be selected at random and read by a second pathologist without knowledge of
the diagnoses made by the initial reader. For the external QC check, at least 5% of the slides should be
submitted for independent diagnosis to a pathologist not involved with the laboratory. These slides should
represent the range of pathological conditions found during the study, and the external pathologist should
not be aware of the diagnoses made by the laboratory personnel.
Each laboratory also should maintain a reference collection of slides that represent every type of
pathological condition identified in the EMAP-E fish. Each of these slides should be verified by an external
pathologist having experience with the species in question. The reference slide collection then can be used
to verify the diagnoses made in future years to ensure intralaboratory consistency. The reference slides also
can be compared with those of other laboratories to ensure interlaboratory consistency. A reference
collection of photographs also can be made, but this should not substitute for a slide collection.
5.6 WATER COLUMN MEASUREMENTS
Characterization of the water column is accomplished in two ways: point-in-time vertical profiles and
continuous, long-term, near-bottom measurements. The Hydrolab Surveyor II Datalogger is used to obtain
vertical profiles of temperature, salinity, dissolved oxygen, and pH. The Hydrolab DataSonde 3 is used to
continuously record longer-term (24 hour) time series of temperature, salinity, dissolved oxygen, and pH in
the near-bottom waters (ca. 0.5 meter off the bottom).
Quality control of the water column measurements made with these electronic instruments consists
of two aspects: calibrations and QC checks of the calibrations. The Surveyor II is calibrated daily, usually
within a few hours of being used in the field. The DataSonde 3 units are calibrated and programmed for
internal recording within 24 hours of their deployment; this calibration is then re-checked aboard the boat
immediately prior to deployment by conducting a side-by-side test against the Surveyor II. In this case, the
calibrated Surveyor II is considered the "standard" against which the performance of the DataSonde 3 is
evaluated. Upon retrieval, the DataSonde 3 is again checked side-by-side against the Surveyor II. Specific
QC procedures for each instrument are discussed in the following sections.
5.6.1 Hydrolab Surveyor II
Calibration and Calibration Checks
Each Surveyor II instrument is calibrated daily just prior to beginning the day's field activities. The
calibration of the conductivity cell, for measuring salinity, is checked and re-set, if necessary, using a
secondary seawater standard that has been standardized against IAPSO Standard Seawater using a
Guildline laboratory salinometer. The dissolved oxygen probe is calibrated using the water-saturated air
procedure recommended by the manufacturer. This calibration is verified weekly by immersing the
probe in a bucket of air-saturated water and checking the D.O. reading against the value from a saturation
table. The calibration of the pH probe is checked and re-set, if necessary, using two standard pH buffers
(7 and 10) as recommended by the manufacturer. The pressure sensor, used to measure depth, is
calibrated by setting the depth to zero meters while holding the instrument at the water's surface (i.e., sea-
level). The calibration of the temperature sensor is set at the factory and cannot be changed; this calibration
is checked by comparison with a thermometer reading while the instrument is immersed in a bucket of
ambient water.
TABLE 5.6. Maximum acceptable differences for instrument field calibration checks.

Instrument             Frequency of Check         Parameter     Checked Against       Maximum Acceptable Difference
Hydrolab Surveyor II   Daily                      Temperature   Thermometer           ± 1°C
                                                  Salinity      Standard seawater     ± 1 ppt
                                                  D.O.          Saturation chart      ± 0.5 ppm
                                                  pH            pH buffer solution    ± 0.5 pH units
Hydrolab DataSonde 3   Pre- and post-deployment   Temperature   Surveyor II           ± 1°C
                                                  Salinity      Surveyor II           ± 1 ppt
                                                  D.O.          Surveyor II           ± 0.5 ppm
                                                  pH            Surveyor II           ± 0.5 pH units
5.6.2 Hydrolab DataSonde 3
The Hydrolab DataSonde 3 is a self-contained data-logging instrument used to measure temperature,
salinity, dissolved oxygen, pH, and depth; individual units are moored approximately 0.5 meter above the
bottom inside a protective PVC housing. These instruments are programmed to record data internally at
15 minute intervals throughout their 24 hour deployments.
Calibration
Each DataSonde 3 instrument is calibrated under controlled laboratory conditions within 24 hours
of being deployed. The conductivity cell, for measuring salinity, is calibrated using a secondary seawater
standard that has been standardized against IAPSO Standard Seawater using a Guildline laboratory
salinometer. The dissolved oxygen probe is calibrated using the water-saturated air procedure
recommended by the manufacturer. The pH probe is calibrated using two standard pH buffers (7 and 10)
as recommended by the manufacturer. The pressure sensor used to measure depth is calibrated by setting
the depth to zero meters while holding the instrument at the water's surface (i.e., sea level). The calibration
of the temperature sensor is set at the factory and cannot be changed.
Calibration Checks
Calibration checks are conducted aboard the boat immediately prior to deployment and immediately
following retrieval of each unit. The DataSonde 3 is placed in a bucket of ambient water along with a
calibrated Hydrolab Surveyor II unit which in this case is considered the "standard". Measurements of D.O.,
pH, salinity, temperature and depth are recorded for each instrument and compared. If the difference
between the DataSonde 3 and Surveyor II exceeds the acceptable range for any parameter (Table 5.6), the
DataSonde 3 is recalibrated and the side-by-side check is repeated. A second failure by the DataSonde 3
is indicative of more serious problems and would necessitate the use of a back-up instrument.
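The pre- and post-deployment side-by-side comparison against the Table 5.6 limits can be sketched as follows (the tolerances come from the table; the function, dictionary names, and example readings are our own):

```python
# Maximum acceptable differences from Table 5.6 (DataSonde 3 vs. Surveyor II).
TOLERANCES = {
    "temperature_C": 1.0,   # ± 1°C
    "salinity_ppt": 1.0,    # ± 1 ppt
    "do_ppm": 0.5,          # ± 0.5 ppm
    "ph_units": 0.5,        # ± 0.5 pH units
}

def check_datasonde(surveyor: dict, datasonde: dict) -> list:
    """Return the parameters whose difference exceeds the Table 5.6 limit.

    An empty list means the side-by-side check passed; any failure triggers
    recalibration of the DataSonde 3 and a repeat of the check.
    """
    return [p for p, tol in TOLERANCES.items()
            if abs(surveyor[p] - datasonde[p]) > tol]

# Hypothetical pre-deployment readings from the two instruments.
surveyor = {"temperature_C": 28.4, "salinity_ppt": 22.1, "do_ppm": 5.6, "ph_units": 8.1}
datasonde = {"temperature_C": 28.6, "salinity_ppt": 23.4, "do_ppm": 5.5, "ph_units": 8.2}
failures = check_datasonde(surveyor, datasonde)   # salinity differs by 1.3 ppt > 1 ppt
```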
5.7 NAVIGATION
Station location information is recorded in the field on standard data forms. In addition, the field
crews are required to maintain a navigation log book and record all LORAN-C calibration information.
Furthermore, the crews must record radar ranges and hand-held compass bearings for any sampling station
where the LORAN-C cannot be used for navigation. All navigation logs will be checked for completeness
and accuracy during field audits. Following the completion of field activities, the range and bearing
information from a subset of stations visited by each crew will be reviewed at random to verify the
positioning accuracy achieved using the electronic navigation system.
Due to the complexity of navigation and potential interferences, the most important aspect of quality
assurance is thorough training. Especially important is the need for the Crew Chief to evaluate the quality
of all inputs and decide which are most appropriate at each station. Once this is decided, proper calibration
of the navigation system is critical. Calibration information will also be recorded in the navigation log. Proper
navigation must be extensively discussed during both crew chief and crew training.
SECTION 6
FIELD OPERATIONS AND PREVENTIVE MAINTENANCE
6.1 TRAINING AND SAFETY
Proper training of field personnel represents a critical aspect of quality control. Field technicians
are trained to conduct a wide variety of activities using standardized protocols to insure comparability in
data collection among crews and across regions. Each field team consists of a Team Leader and one or
two 3-member boat crews supported by a land-based 2-member mobile laboratory crew. Each boat crew
is headed by a Crew Chief (one of whom is the Team Leader), who is captain of the boat and chief scientist,
and, as such, is the ultimate on-site decision maker regarding safety, technical direction, and communication
with the Field Operations Center.
The desired qualifications for the Team Leaders and Crew Chiefs include an M.S. degree in
biological/ecological sciences and three years of experience in field data collection activities, or a B.S.
degree and five years' experience. The remaining crew members generally are required to hold B.S. degrees
and, preferably, to have at least one year's experience.
Prior to the actual sample collection period, each crew receives formal training and must undergo
a fairly elaborate check-out procedure. Both classroom and "hands-on" training are coordinated by staff
members at the Louisianian Province Field Operations Center at ERL-GB; these personnel have extensive
experience instructing field technicians in routine sampling operations (e.g., collection techniques, small boat
handling, etc.). The expertise of the on-site EMAP staff is supplemented by local experts in such specialized
areas as fish pathology, fish identification, field computer/navigation system use, and first aid (including
cardiopulmonary resuscitation (CPR) training). All the sampling equipment (e.g., boats, instruments, grabs,
nets, computers, etc.) is used extensively during the "hands-on" training sessions, and by the end of the
course, all crew members must demonstrate proficiency in all the required sampling activities.
Upon completion of the formal crew training session, each crew (boat and mobile laboratory) must
pass a graded field certification exercise (passing score > 90%, normalized to percent). The exercise, a
full-scale sampling assignment, will incorporate all elements of field sampling and the associated support
activities of the mobile laboratory crew. The performance of the crew will be graded by a member of the
Province field management team (i.e., the Province Manager, Quality Assurance Coordinator, or the Field
Coordinator). If any deficiencies within a crew are noted, they are remedied prior to field sampling. This is
accomplished by additional training or by changing the crew composition. It is the responsibility of the
Province QA Coordinator to develop and maintain on permanent file all records related to crew certifications
(e.g., examinations, field and laboratory check-out sheets, grading forms, etc.).
All aspects of field operations are detailed in the Field Operations Manual (Macauley 1992), which
is distributed to all trainees prior to the training period. The manual includes a checklist of all equipment,
instructions on equipment use, and detailed written descriptions of sample collection procedures. In
addition, the manual includes flow charts and a schedule of activities to be conducted at each sampling
location, along with a list of potential hazards associated with each sampling site.
6.2 FIELD QUALITY CONTROL
Quality control of measurements made during the actual field sampling period is accomplished
through the use of a variety of QC sample types and procedures, as described in Sections 4 and 5 of this
document. At least once during each field season, a QA evaluation of each field crew is performed by either
the QAC or the Field Coordinator to insure compliance with prescribed protocols. A standardized checklist,
which is updated annually, will be used to insure comparability and consistency in this process. Field crews
will be re-trained whenever discrepancies are noted.
6.3 DATA RECORDING
On the EMAP-E program, portable field computers are used extensively for data recording in the
field. However, it is the policy of the program that the primary means of data recording for both field and
laboratory activities will be hand-written hard copies. In the case of field activities, standardized data forms
have been prepared by the information management group; these forms are used as the primary means of
data recording on board the research vessels. The field crews are responsible for sending the data forms
to the Province Field Operations Center on a continual basis throughout the field season. At the Field
Operations Center, the forms are copied and the copies stored in a separate location as a means of
permanent, duplicative archival. In addition to the standardized data forms, each Crew Chief is required to
maintain a boat log for recording ancillary information (e.g., boat movements, weather conditions, schedule
of activities, etc.) pertinent to the daily field activities. These log books also are collected at the end of each
field season and maintained in permanent archival at the Field Operations Center.
6.4 PREVENTIVE MAINTENANCE
The importance of proper maintenance of all gear cannot be overstated. Failure of any piece of
major equipment, especially when back-up equipment is unavailable, can result in a significant loss of data.
Maintenance of equipment must be performed at regular intervals, as specified in the Field Operations
Manual (Macauley 1992). It is the responsibility of the Team Leader to maintain a logbook of equipment
usage and assure that proper maintenance is performed at the prescribed time intervals. The equipment
maintenance logbook is examined during field QA visits and at the end of the field season to insure that
proper procedures have been followed.
SECTION 7
LABORATORY OPERATIONS
7.1 DATA RECORDING
The EMAP-E program will enforce a laboratory notebook policy which requires all analytical
laboratories to maintain hard copy records of all information necessary to document the quality of the
resultant data. Examples of such hard copy records include, but are not limited to, notebooks, log books,
data forms, instrumental outputs, etc. Laboratory personnel are expected to follow Good Laboratory
Practice in the maintenance and archival of these hard copy records. Each laboratory will be required to
make all pertinent records available for review during QA audits and/or in response to requests from the
Province Manager.
7.2 LABORATORY PERSONNEL, TRAINING, AND SAFETY
This section addresses only general laboratory operations, while specific QA/QC requirements and
procedures are presented in sections 4 and 5. Personnel in any laboratory performing EMAP analyses
should be well versed in standard safety practices; it is the responsibility of the laboratory manager and/or
supervisor to ensure that safety training is mandatory for all laboratory personnel. The laboratory is
responsible for maintaining a current safety manual in compliance with the Occupational Safety and Health
Administration (OSHA) regulations, or equivalent state or local regulations. The safety manual should be
readily available to laboratory personnel. Proper procedures for safe storage, handling and disposal of
chemicals should be followed at all times; each chemical should be treated as a potential health hazard and
good laboratory practices should be implemented accordingly.
7.3 QUALITY CONTROL DOCUMENTATION
In each laboratory, the following EMAP-Estuaries documents must be current and available:
o Laboratory Methods Manual - A document containing detailed instructions about
laboratory and instrument operations (U.S. EPA, in preparation).
o Quality Assurance Project Plan - A document containing clearly defined laboratory QA/QC
protocols (this document).
In addition to the official EMAP-E documents, each laboratory should maintain the following:
o Standard Operating Procedures (SOPs) - Detailed instructions for performing routine
laboratory procedures, usually written in "cookbook" format. In contrast to the Laboratory
Methods Manual, SOPs offer step-by-step instructions describing exactly how the method
is implemented in a particular laboratory.
o Instrument performance study information - Information on instrument baseline noise,
calibration standard response, precision as a function of concentration, detection limits, etc.
This information usually is recorded in logbooks or laboratory notebooks.
7.4 ANALYTICAL PROCEDURES
Complete and detailed procedures for processing and analysis of samples in the field and laboratory
are provided in the Field Operations Manual (Macauley 1992) and the Laboratory Methods Manual (U.S.
EPA, in preparation), respectively, and will not be repeated here.
7.5 LABORATORY PERFORMANCE AUDITS
Initially, a QA assistance and performance audit will be performed by QA personnel to determine
if each laboratory effort is in compliance with the procedures outlined in the Methods Manual and QA Project
Plan and to assist the laboratory where needed. In some cases, a formal performance evaluation will be
required (see Section 5) before a laboratory will receive permission from the Province Manager to begin
processing field samples. Additionally, technical reviews may be conducted by a team composed of the
QA Coordinator and his/her technical assistants. Reviews may be conducted at any time during the course
of the study, but not necessarily every year.
SECTION 8
QUALITY ASSURANCE AND QUALITY CONTROL
FOR MANAGEMENT OF DATA AND INFORMATION
8.1 SYSTEM DESCRIPTION
The Near Coastal Information Management System (NCIMS) is designed to perform the following
functions:
o document sampling activities and standard methods,
o support program logistics, sample tracking and shipments,
o process and organize both the data collected in the field and the results generated at
analytical laboratories,
o perform range checks on selected numerical data,
o facilitate the dissemination of information, and
o provide interaction with the EMAP Central Information System.
A complete and detailed description of the NCIMS is provided in Rosen et al. (1991) and will not
be repeated here.
8.2 QUALITY ASSURANCE/QUALITY CONTROL
Two general types of problems which must be resolved in developing QA/QC protocols for
information and data management are: (1) erroneous individual values, which must be corrected or removed,
and (2) inconsistencies that damage the integrity of the data base. The following features of the NCIMS will provide
a foundation for the management and quality assurance of all data collected and reported during the life of
the project.
8.2.1 Standardization
A systematic numbering system will be developed for unique identification of individual samples,
sampling events, stations, shipments, equipment, and diskettes. The sample numbering system will contain
codes which will allow the computer system to distinguish among several different sample types (e.g.,
actual samples, quality control samples, sample replicates, etc.). This system will be flexible enough to allow
changes during the life of the project, while maintaining a structure which allows easy comprehension of the
sample type.
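As one illustration of such a numbering scheme (the actual EMAP-E identifier format is defined by the Information Management Team; the layout, field names, and codes below are purely hypothetical), a sample ID can embed a station number and a sample-type code that software can decode automatically:

```python
# Hypothetical ID layout: SSSS-TT-NNN = station, sample-type code, sequence.
SAMPLE_TYPES = {
    "RS": "routine sample",
    "QC": "quality control sample",
    "RP": "sample replicate",
}

def parse_sample_id(sample_id: str) -> dict:
    """Split a hypothetical ID like '0123-QC-007' into its component fields."""
    station, type_code, sequence = sample_id.split("-")
    return {
        "station": station,
        "sample_type": SAMPLE_TYPES[type_code],  # raises KeyError on bad codes
        "sequence": int(sequence),
    }

rec = parse_sample_id("0123-QC-007")
# {'station': '0123', 'sample_type': 'quality control sample', 'sequence': 7}
```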
Clearly stated standard operating procedures will be given to the field crews with respect to the use
of the field computer systems and the entry of data in the field. Contingency plans will also be stated
explicitly in the event that the field systems fail.
8.2.2 Prelabeling of Equipment and Sample Containers
Whenever possible, sample containers, equipment, and diskettes will be prelabeled to eliminate
confusion in the field. The prelabeling will reduce the number of incorrect or poorly-affixed labels. Packages
with all the required prelabeled sample containers, sample sheets, and data diskettes will be prepared for
the field crews prior to each sampling event (an event is defined as a single visit by a crew to a sampling
site). These containers will be called "station packets". Each station packet will have the station number
affixed to it using both printed and bar code labels.
8.2.3 Data Entry, Transcription, and Transfer
To minimize the errors associated with entry and transcription of data from one medium to another,
data will be captured electronically. When manual entry is required, the data should be entered twice by
different data entry operators and then checked for non-matches to identify and correct errors. In many
instances, the use of bar code labels should eliminate the need for manual entry of routine information.
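The double-entry check described above amounts to a field-by-field comparison of the two keying passes (a minimal sketch; the record structure and values are hypothetical):

```python
def double_entry_mismatches(first_pass: list, second_pass: list) -> list:
    """Compare two independent keying passes of the same data sheet.

    Returns (row, column, value1, value2) for every non-matching cell,
    so the original hard copy can be consulted and the error corrected.
    """
    mismatches = []
    for row, (rec1, rec2) in enumerate(zip(first_pass, second_pass)):
        for col, (v1, v2) in enumerate(zip(rec1, rec2)):
            if v1 != v2:
                mismatches.append((row, col, v1, v2))
    return mismatches

# Two keying passes of the same (hypothetical) field data sheet.
pass1 = [["0123", "28.4", "22.1"], ["0124", "27.9", "21.8"]]
pass2 = [["0123", "28.4", "22.1"], ["0124", "29.7", "21.8"]]
errors = double_entry_mismatches(pass1, pass2)   # one mismatched cell in row 1
```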
Each group transmitting data to the information center will be given a separate account on the Near
Coastal VAX 3300. Standard formats for data transfer will be established by the Information Management
Team. A specific format will be developed for each file type within each discipline. If data are sent to the
Near Coastal Information Center in formats other than those specified, the files will be deleted and the
sending laboratory or agency will be asked to resubmit the data in the established format.
The communications protocols used to transfer data electronically will have mechanisms by which
the completeness and accuracy of the transfer can be checked. In addition, the group sending the
information should specify the number of bytes and file names of the transferred files. These data
characteristics should be verified upon receipt of the data. If the file cannot be verified, a new file transfer
should be requested. Whenever feasible, a hard copy of all data should be provided with transfer files.
The data files transmitted from the field will be fixed-format text files. These files will be "parsed" by
the system. The parsing process involves transferring records of similar type into files containing only those
types of records. For example, observations on fish species and size will be copied from the original log file
transmitted from the field to a "fish" data file. After the records have been parsed from the field log files, the
individual data files will be checked automatically for erroneous values, as described in the following section.
Records in the field log file which are not entered into the data base (e.g., comments in text form) will be
archived for documentation or future extraction.
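The parsing step might look like the following sketch (the two-character record-type prefix and the example records are assumptions for illustration, not the actual NCIMS file format):

```python
from collections import defaultdict

def parse_field_log(lines: list) -> dict:
    """Split a fixed-format field log into groups of same-type records.

    Assumes each record begins with a record-type code (e.g., 'FI' for a
    fish observation); records of types not destined for the data base
    (e.g., comments) remain in their own group for archival.
    """
    groups = defaultdict(list)
    for line in lines:
        record_type = line[:2]
        groups[record_type].append(line)
    return groups

log = [
    "FIspot      210",          # fish record: species, length (mm)
    "FIcroaker   145",
    "CMweather: calm, clear",   # free-text comment -> archived, not loaded
]
groups = parse_field_log(log)
# groups["FI"] holds the two fish records; groups["CM"] holds the comment
```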
8.2.4 Automated Data Verification
Erroneous numeric data will be identified using automatic range checks and filtering algorithms.
When data fall outside of an acceptable range, they will be flagged in a report for the Quality Assurance
Coordinator (QAC), or his designee. This type of report will be generated routinely and should detail the
files processed and the status of the QA checks. The report will be generated both on disk and in hard
copy for permanent filing. The QAC will review the report and release data which have passed the QA check
for addition to the data base. All identified errors must be corrected before flagged files can be added to
a data base. If the QAC finds that the data check ranges are not reasonable, the values can be changed
by written request. The written request should include a justification for changing the established ranges.
If the QAC finds the need for additional codes, they can be entered by the senior data librarian. After such
changes are made, the files may be passed through the QA procedure again. In the event that the QA
check identifies incorrect data, the QAC will archive the erroneous file and request that the originator
correct the error and re-transmit the data.
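A minimal sketch of such an automated range check follows. The parameter names and limits shown are illustrative assumptions, not the established EMAP-E acceptance ranges.

```python
def range_check(records, ranges):
    """Flag records whose numeric values fall outside acceptable ranges.
    `ranges` maps a parameter name to its (low, high) limits, as would
    be established for each data type; flagged entries go into a report
    for the QAC rather than directly into the data base."""
    flagged = []
    for rec in records:
        for param, value in rec.items():
            if param in ranges:
                low, high = ranges[param]
                if not (low <= value <= high):
                    flagged.append((rec, param, value))
    return flagged

# Hypothetical limits, for illustration only
RANGES = {"salinity_ppt": (0.0, 40.0), "temp_c": (-2.0, 40.0)}
```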
Data base entries which are in the form of codes should be compared to lists of valid values (e.g.,
look up tables) established by experts for specific data types. These lists of valid codes will be stored in
a central data base for easy access by data base users. When a code cannot be verified in the appropriate
look up table, the observation should be flagged in the QAC report for appropriate corrective action (e.g.,
update of the look up table or removal of the erroneous code).
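The code-verification step can be sketched as below. The species codes shown in the hypothetical lookup table are illustrative only; the actual valid-value lists are maintained by the senior data librarian.

```python
def check_codes(records, lookup_tables):
    """Verify coded data base entries against lists of valid values.
    `lookup_tables` maps a coded field to its set of valid codes (the
    expert-established look up tables); observations whose codes cannot
    be verified are flagged for the QAC report, leading either to an
    update of the look up table or removal of the erroneous code."""
    flagged = []
    for i, rec in enumerate(records):
        for field, code in rec.items():
            valid = lookup_tables.get(field)
            if valid is not None and code not in valid:
                flagged.append((i, field, code))
    return flagged

# Hypothetical species-code table, for illustration only
LOOKUPS = {"species_code": {"MICUND", "CYNARE", "LEIXAN"}}
```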
8.2.5 Sample Tracking
Samples collected in the field will be shipped to multiple analytical laboratories. All shipping
information required to adequately track the samples (sample numbers, number of containers, shipment
numbers, dates, courier ID numbers, etc.) will be transmitted electronically by modem to the Information
Center at the end of each sampling day. Once the field crew has transmitted the data, it will be the
responsibility of the data management team to confirm that the samples arrive at their destination. Each
receiving laboratory will be required, upon receipt of the samples, to record and similarly transmit all
tracking information (e.g., sample identification numbers, shipment numbers, and the status of the samples)
to the Information Center, using either microcomputers or the VAX. The use of barcode labels and readers
will facilitate this process. The information management team will generate special programs to create fixed
format records containing this information.
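One such fixed-format tracking record might be built as in the sketch below. The field names, widths, and status codes are assumptions for illustration, not the program's actual record layout.

```python
def tracking_record(sample_id, shipment_no, n_containers, date, status):
    """Format one sample-tracking entry as a fixed-format text record
    of the kind transmitted to the Information Center. Field widths
    here are illustrative, not the NCIMS layout."""
    return (f"{sample_id:<12}"     # sample identification number
            f"{shipment_no:<8}"    # shipment number
            f"{n_containers:>3d}"  # number of containers
            f"{date:<10}"          # date, e.g. YYYY-MM-DD
            f"{status:<10}")       # sample status, e.g. RECEIVED
```

Fixed column positions let the receiving system parse every record the same way, which is what makes the automated checks in Section 8.2.4 possible.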
8.2.6 Reporting
Following analysis of the samples, the summary data packages transmitted from the laboratories
will include sample tracking information, results, quality assurance and quality control information, and
accompanying text. If the laboratory has assigned internal identification numbers to the samples, the results
should include the original sample number and the internal number used by the laboratory. The analytical
laboratories will be responsible for permanent archiving of all raw data used in generating the results.
8.2.7 Redundancy (Backups)
All files in the NCIMS will be backed up regularly. At least one copy of the entire system will be
maintained off-site to enable the information management team to reconstruct the data base in the event
that one system is destroyed or incapacitated. In the field, all information will be recorded both on paper
data sheets and in the computer. All information saved to the hard drive will also be automatically
copied to a diskette immediately. In addition, at the end of each day the field computers will be "equalized"
to assure that the information contained on both is identical. At this point all data will be contained on the
hard drives of both field computers and on a diskette. At the Louisianian Province Field Operations Center
at ERL-GB, incremental backups to removable disk will be performed daily on all files which have changed.
In addition, backups of all EMAP directories and intermediate files will be performed on a
weekly basis to provide a backup in the event of a complete loss of the Field Operations Center facility.
All original data files will be saved on-line for at least two years, after which the files will be
permanently archived on floppy diskette. All original files, especially those containing the raw field data, will
be protected so that they can only be read (i.e., write and delete privileges will be removed from these files).
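On a POSIX-style system, removing write privileges from an original file can be sketched as below; this is a minimal illustration of the read-only protection described above, not the actual NCIMS/VAX protection mechanism, which would use that system's own facilities.

```python
import os
import stat

def protect_original(path):
    """Remove write permission from an original data file so that it
    can only be read (a sketch of the write/delete protection described
    above; delete protection additionally depends on directory permissions)."""
    mode = os.stat(path).st_mode
    os.chmod(path, mode & ~(stat.S_IWUSR | stat.S_IWGRP | stat.S_IWOTH))
```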
8.2.8 Human Review
All discrepancies which are identified by the computer will be documented in hard copy. These
discrepancy logs will be saved as part of the EMAP archive. All identified discrepancies should be brought
to the attention of the QAC or his/her designee, who will determine the appropriate corrective action to be
taken. Data will not be transferred to the data base until all discrepancies have been resolved by the QAC.
Once data have been entered into the data base, changes will not be made without the written consent of
the QAC, who will be responsible for justifying and documenting the change. A record of all additions will
be entered into a data set index and kept in hard copy.
Field data require additional review to assure the absence of transcription errors. Following the
entry of data into the field computer, it will be the responsibility of the individual crew chiefs to review the
data files and assure that they are error-free. Once this review has occurred, the crew chief will "lock" the
file, preventing further editing in the field. Upon return of the data sheets to the Information Center, a 100%
check of the files will be performed (i.e., all files will be compared to the paper data sheets to identify
transcription errors that may not have been detected by the crew chiefs). Corrections will be made as
necessary and a report generated for the QAC and Information Manager.
8.3 DOCUMENTATION AND RELEASE OF DATA
Comprehensive documentation of information relevant to users of the NCIMS will be maintained and
updated as necessary. Most of this documentation will be accessible on-line in data bases which describe
and interact with the system. The documentation will include a data base dictionary, access control, and
data base directories (including directory structures), code tables, and continuously-updated information on
field sampling events, sample tracking, and data availability.
A limited number of personnel will be authorized to make changes to the EMAP-E data base. All
changes will be carefully documented and controlled by the senior data librarian. Data bases which are
accessible to outside authorized users will be available in "read only" form. Access to data by unauthorized
users will be limited through the use of standard DEC VAX security procedures. Information on access
rights to all EMAP-E directories, files, and data bases will be provided to all potential users.
The release of data from the NCIMS will occur on a graduated schedule. Different classes of users
will be given access to the data only after it reaches a specified level of quality assurance. Each group will
use the data on a restricted basis, under explicit agreements with the Estuaries Task Group. The following
four groups are defined for access to data:
I. The Louisianian Province central group, including the Information management team, the
field coordinator, the Province Manager, the QA Coordinator and the field crew chiefs.
II. EMAP-Estuaries primary users - ERL-GB and their contractor personnel, EMAP-E program-
level personnel at ERL-Narragansett, NOAA EMAP-E personnel, and EMAP quality
assurance personnel.
III. EMAP data users - All other task groups within EPA, NOAA, and other federal agencies.
IV. General Public - university personnel, other EPA offices (includes regional offices), and
other federal, state, and local governments.
Prior to release at level IV (general public), all files will be checked and/or modified to assure that
values contain the appropriate number of significant figures. The purpose is to assure that the data released
do not imply greater accuracy than was realized. This will be especially important in files where data were
summarized. In such cases additional figures beyond the decimal point may have been added by the
statistical program during averaging or other manipulations. It will be the responsibility of the QAC to
determine the appropriate number of significant figures for each measurement.
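The rounding described above, which strips the spurious digits introduced by averaging, can be sketched as a significant-figures rounding routine; the number of significant figures for each measurement would be supplied by the QAC.

```python
from math import log10, floor

def round_sig(value, sig_figs):
    """Round a value to a given number of significant figures, so that
    released data do not imply greater accuracy than was realized
    (e.g., after a statistical program added digits during averaging)."""
    if value == 0:
        return 0.0
    # Position of the leading digit determines how many decimal places to keep
    return round(value, -int(floor(log10(abs(value)))) + (sig_figs - 1))
```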
Requests for premature release of data will be submitted to the Information Management Team
through the Program Manager. The senior data analyst and the QAC will determine if the data can be
released. The final authority on the release of all data is the Technical Director of EMAP-Estuaries. The
long-term goal for the Near Coastal Information Management Team will be to develop a user interface
through which all data will be accessed. This will improve control of security and monitoring of access to
the data, and help ensure that the proper data files are being accessed.
-------
Section 9
Revision 0
Date 8/92
DRAFT 1
Page 1 of 1
SECTION 9
QUALITY ASSURANCE REPORTS TO MANAGEMENT
A quality assurance technical report (or relevant section of the Annual Statistical Summary) will be
prepared by the Province QA Coordinator following each year's sampling efforts. This report will summarize
results for the various quality control samples and assess whether measurement quality objectives for the
various indicators (as described in Sections 4 and 5 of this document) have been met. The Province QA
Coordinator also will assist the EMAP-Estuaries QA Coordinator (i.e., the Resource Group-level QA
Coordinator) in planning QA activities prior to each fiscal year. The EMAP-E QA Coordinator will be
responsible for documenting this planning information and reporting on each year's QA activities in a Quality
Assurance Annual Report and Work Plan (QAARWP), which will be submitted to the EMAP-E Technical
Director and the EMAP Program QA Coordinator prior to the beginning of each fiscal year. In addition to
the annual QA technical report, which will likely be incorporated as part of the Province Annual Statistical
Summary, the Province QA Coordinator will report regularly to the Province Manager on an informal basis.
The purpose of this informal reporting, which will take the form of memos, is to provide the Province
Manager with timely information on QA issues which potentially might affect data quality.
Within 30 days of each audit (field or laboratory), the QA Coordinator will submit a draft audit report
to the Province Manager and a courtesy copy to the person in charge of the audited entity (e.g., field crew
chief, laboratory manager, laboratory director, etc.). The Province Manager and the auditee will be given
two weeks to provide comments to the QA Coordinator on the draft audit report. Following receipt of
comments, the QA Coordinator will issue a final audit report within two weeks. This report will describe the
results of the audit in full detail and note any deficiencies requiring corrective action. Upon receipt of the
final audit report, the auditee will be given two weeks to submit to the Province Manager a corrective action
plan. The corrective action plan should be responsive to the audit report and should specify in detail the
actions and the timetable that will be implemented for correcting any deficiencies. It will be the
responsibility of the QA Coordinator to monitor the implementation of corrective actions and make regular
progress reports to the Province Manager until the corrective actions are complete.
-------
Section 10
Revision 0
Date 8/92
DRAFT 1
Page 1 of 2
SECTION 10
REFERENCES
Cantillo, A.Y. 1990. Standard and Reference Materials for Marine Sciences. Intergovernmental
Oceanographic Commission Manuals and Guides 21.
DeGraeve, G. M., N. G. Reichenbach, J. D. Cooney, P. I. Feder, and D. I. Mount. 1988. New developments
in estimating endpoints for chronic toxicity tests. Abstract, Am. Soc. Test. Mater. 12th Symp. Aquat.
Toxicol. Hazard Assess., Sparks, Nev.
Federal Register, Part VIII, EPA. "Guidelines Establishing Test Procedures for the Analysis of Pollutants
Under the Clean Water Act: Final Rule and Proposed Rule." 40 CFR Part 136, Oct. 28, 1984.
Hamilton, M. A., R. C. Russo, and R. V. Thurston. 1977. Trimmed Spearman-Karber method for estimating
median lethal concentrations in toxicity bioassays. Environ. Sci. Technol. 11:714-719; Correction
12:417 (1978).
Holland, A. F., ed. 1990. Near Coastal Program Plan for 1990: Estuaries. EPA 600/4-90/033. U.S.
Environmental Protection Agency, Environmental Research Laboratory, Office of Research and
Development, Narragansett, RI.
Hunt, D. T. E., and A. L. Wilson. 1986. The Chemical Analysis of Water: General Principles and Techniques.
2nd ed. Royal Society of Chemistry, London, England. 683 pp.
Keith, L. H., W. Crummett, J. Deegan, Jr., R. A. Libby, J. K. Taylor, and G. Wentler. 1983. Principles of
environmental analysis. Anal. Chem. 55:2210-2218.
Keith, L. H. 1991. Environmental Sampling and Analysis: A Practical Guide. Lewis Publishers, Chelsea, MI,
143 pp.
Kirchner, C. J. 1983. Quality control in water analysis. Environ. Sci. Technol. 17(4):174A-181A.
Lauenstein, G. L. In preparation. A Compendium of Methods Used in the NOAA National Status and Trends
Program.
Macauley, J. 1992. EMAP-Estuaries 1992 Louisianian Province Field Operations Manual. U.S.
Environmental Protection Agency, Environmental Research Laboratory, Office of Research and
Development, Gulf Breeze, FL.
Plumb, R. H., Jr. 1981. Procedures for handling and chemical analysis of sediment and water samples.
Technical Report EPA/CE-81-1. U.S. Environmental Protection Agency/U.S. Army Corps of Engineers
Technical Committee on Criteria for Dredged and Fill Material, U.S. Army Waterways Experiment
Station, Vicksburg, MS. 471 pp.
Rosen, J. S., H. Buffum, J. Beaulieu, and M. Hughes. 1991. Information Management Plan for the EMAP-
Near Coastal Program. U.S. Environmental Protection Agency, Environmental Research Laboratory,
Office of Research and Development, Narragansett, RI.
Stanley, T. W., and S. S. Verner. 1983. Interim Guidelines and Specifications for Preparing Quality
Assurance Project Plans. EPA/600/4-83/004. U.S. Environmental Protection Agency, Washington,
D.C.
Stanley, T. W., and S. S. Verner. 1985. The U. S. Environmental Protection Agency's quality assurance
program, pp. 12-19. In: J. K. Taylor and T. W. Stanley (eds.), Quality Assurance for Environmental
Measurements, ASTM STP 867. American Society for Testing and Materials, Philadelphia,
Pennsylvania.
Taylor, J. K. 1987. Quality Assurance of Chemical Measurements. Lewis Publishers, Inc., Chelsea,
Michigan. 328 pp.
U.S. Environmental Protection Agency, in preparation. EMAP Laboratory Methods Manual: Estuaries. U.
S. Environmental Protection Agency, Environmental Monitoring Systems Laboratory, Office of
Research and Development, Cincinnati, Ohio.
U.S. Environmental Protection Agency. 1979a. Methods for chemical analysis of water and wastes.
EPA-600/4-79/020. U. S. Environmental Protection Agency, Environmental Monitoring Systems
Laboratory, Office of Research and Development, Cincinnati, Ohio (revised March 1983).
U.S. Environmental Protection Agency. 1979b. Handbook for analytical quality control in water and
wastewater laboratories. U. S. Environmental Protection Agency, Environmental Monitoring and
Support Laboratory, Cincinnati, Ohio, EPA/600/4-79/019.
U.S. Environmental Protection Agency. 1989. Recommended Protocols for Measuring Selected
Environmental Variables in Puget Sound. U.S. Environmental Protection Agency, Puget Sound
Estuary Program, Office of Puget Sound, Seattle, Washington.