United States
Environmental Protection
Agency
Office of Research
and Development
Gulf Breeze, FL 32561
ENVIRONMENTAL MONITORING
AND ASSESSMENT PROGRAM
EMAP-ESTUARIES
WEST INDIAN PROVINCE
1995 QUALITY ASSURANCE
PROJECT PLAN
Environmental Monitoring
and Assessment Program
DISCLAIMER
The research described in this document has been subjected to Agency review for internal Agency distribution.
Mention of trade names does not constitute endorsement or recommendation for use.
QUALITY ASSURANCE PROJECT PLAN APPROVAL
This quality assurance project plan was developed to assure that all environmental data generated for the
Estuaries Resource Group of the Environmental Monitoring and Assessment Program (EMAP) are scientifically valid
and of acceptable quality to achieve the program's objectives. The signatures of key technical and management
personnel indicate approval or concurrence with the procedures specified in the plan. These approvals and
concurrences also represent a commitment to disseminate the plan and the philosophy of total quality to all project
participants.
Date
John Macauley
West Indian Province Manager
Gulf Ecology Division of the National Health and Environmental Effects Research Laboratory

Date
James Moore
Quality Assurance Manager
Gulf Ecology Division of the National Health and Environmental Effects Research Laboratory

Date
Kevin Summers
EMAP-Estuaries Technical Director
Gulf Ecology Division of the National Health and Environmental Effects Research Laboratory

Date
Foster Mayer
Acting Division Director
Gulf Ecology Division of the National Health and Environmental Effects Research Laboratory
ACKNOWLEDGMENTS
The authors gratefully acknowledge the significant contributions of the following individuals toward the development of this document: foremost, Ray Valente, Science Applications International, Newport, R.I., who forged the Quality Assurance Program for EMAP-Estuaries and authored the template for this QAPP; and Kevin Summers, U.S. Environmental Protection Agency, Gulf Ecology Division, Gulf Breeze, FL, for his long-term direction and omnipresence during EMAP's five years of endeavor in the Louisianian and West Indian Provinces. Thanks also to Lois Haseltine, Johnson Controls World Services, Inc., Gulf Breeze, FL, for document production.
PREFACE
This document outlines the integrated quality assurance plan for monitoring conducted in the West Indian Province by the Estuaries Resource Group of the Environmental Monitoring and Assessment Program (EMAP). The quality assurance
plan is prepared following the general guidelines and specifications provided by the Quality Assurance Management
Staff of the U.S. Environmental Protection Agency Office of Research and Development and the guidelines provided
in the EMAP Quality Assurance Management Plan.
The primary objective of this Quality Assurance Project Plan (QAPP) is to maximize the probability that
environmental data collected by the EMAP-Estuaries program will meet or exceed the objectives established for data
quality. The QAPP presents a systematic approach that will be implemented within each major data acquisition and
data management component of the program. Basic requirements specified in the QAPP are designed to: (1) ensure
that collection and measurement procedures are standardized among all participants; (2) monitor the performance of
the various measurement systems being used in the program to maintain statistical control and to provide rapid
feedback so that corrective measures can be taken before data quality is compromised; (3) assess the performance of
these measurement systems and their components periodically; and, (4) verify that reported data are sufficiently
complete, representative, unbiased, and precise so as to be suitable for their intended use. These activities will provide
data users with information regarding the degree of uncertainty associated with the various components of the EMAP-
Estuaries data base.
The proper citation of this document is:
Heitmuller, P.T., and C. Peacher. 1995. EMAP-Estuaries West Indian Province: Quality Assurance Project
Plan for 1995. U.S. Environmental Protection Agency, Office of Research and Development, Gulf Ecology Division
of the National Health and Environmental Effects Research Laboratory, Gulf Breeze, FL.
TABLE OF CONTENTS

DISCLAIMER
APPROVALS
ACKNOWLEDGMENTS
PREFACE

1 INTRODUCTION
  1.1 OVERVIEW OF EMAP
  1.2 THE ESTUARIES COMPONENT OF EMAP
      1.2.1 Realignment of EMAP: EMAP Phase II
  1.3 QUALITY ASSURANCE PROGRAM WITHIN EMAP
  1.4 QUALITY ASSURANCE PROGRAM FOR EMAP-ESTUARIES

2 PROJECT ORGANIZATION
  2.1 MANAGEMENT STRUCTURE

3 GENERAL REQUIREMENTS FOR FIELD AND LABORATORY OPERATIONS
  3.1 FIELD OPERATIONS
      3.1.1 Training Program
      3.1.2 Field Activities in Areas of Special Conditions
      3.1.3 Location and Documentation of Sampling Site
      3.1.4 Field Quality Control and Audits
  3.2 LABORATORY OPERATIONS
      3.2.1 Laboratory Personnel, Training, and Safety
      3.2.2 Quality Assurance Documentation
      3.2.3 Analytical Procedures
      3.2.4 Laboratory Performance Audits
      3.2.5 Preparation and Use of Control Charts

4 ASSESSMENT OF DATA QUALITY
  4.1 DATA QUALITY OBJECTIVES
  4.2 REPRESENTATIVENESS
  4.3 COMPLETENESS
  4.4 COMPARABILITY
  4.5 ACCURACY (BIAS), PRECISION, AND TOTAL ERROR

5 ANALYSIS OF CHEMICAL CONTAMINANTS IN SEDIMENT, FISH TISSUE, AND WATER SAMPLES
  5.1 OVERVIEW
  5.2 QUALITY CONTROL PROCEDURES: SAMPLE COLLECTION, PRESERVATION, AND HOLDING
  5.3 QUALITY CONTROL PROCEDURES: LABORATORY OPERATIONS
      5.3.1 Overview
      5.3.2 Initial Demonstration of Capability
            Instrument Calibration
            Initial Documentation of Method Detection Limits
            Initial Blind Analysis of a Representative Sample
      5.3.3 Ongoing Demonstration of Capability
            Laboratory Participation in Intercomparison Exercises
            Routine Analysis of Certified Reference Materials or Laboratory Control Materials
            Continuing Calibration Checks
            Laboratory Reagent Blank
            Internal Standards
            Injection Internal Standards
            Matrix Spike and Matrix Spike Duplicate
            Field Duplicates and Field Splits
  5.4 OTHER SEDIMENT MEASUREMENTS
      5.4.1 Total Organic Carbon
      5.4.2 Acid Volatile Sulfide
      5.4.3 Butyltins
  5.5 QUALITY CONTROL PROCEDURES: INFORMATION MANAGEMENT
      5.5.1 Sample Tracking
      5.5.2 Data Reporting Requirements
      5.5.3 Data Evaluation Procedures
            Checking Data Completeness
            Assessing Data Quality
            Assigning Data Qualifier Codes
            Taking Final Action

6 SEDIMENT PARTICLE SIZE ANALYSIS
  6.1 OVERVIEW
  6.2 QUALITY CONTROL PROCEDURES: SAMPLE COLLECTION, PRESERVATION, AND HOLDING
  6.3 QUALITY CONTROL PROCEDURES: LABORATORY OPERATIONS
  6.4 QUALITY CONTROL PROCEDURES: INFORMATION MANAGEMENT
      6.4.1 Sample Tracking
      6.4.2 Data Reporting Requirements and Evaluation Procedures
      6.4.3 Assigning Data Qualifier Codes and Taking Final Action

7 SEDIMENT TOXICITY TESTING
  7.1 OVERVIEW
  7.2 QUALITY CONTROL PROCEDURES: SAMPLE COLLECTION, PRESERVATION, AND HOLDING
  7.3 QUALITY CONTROL PROCEDURES: LABORATORY OPERATIONS
      7.3.1 Facilities and Equipment
      7.3.2 Initial Demonstration of Capability
      7.3.3 Quality of Test Organisms
      7.3.4 Test Conditions
      7.3.5 Test Acceptability
  7.4 QUALITY CONTROL PROCEDURES: INFORMATION MANAGEMENT
      7.4.1 Sample Tracking
      7.4.2 Record Keeping and Data Reporting Requirements
      7.4.3 Data Evaluation Procedures
      7.4.4 Assigning Data Qualifier Codes
      7.4.5 Data Quality Reports

8 MACROBENTHIC COMMUNITY ASSESSMENT
  8.1 OVERVIEW
  8.2 QUALITY CONTROL PROCEDURES: SAMPLE COLLECTION, PRESERVATION, AND HOLDING
  8.3 QUALITY CONTROL PROCEDURES: LABORATORY OPERATIONS
      8.3.1 Sorting
      8.3.2 Species Identification and Enumeration
  8.4 QUALITY CONTROL PROCEDURES: INFORMATION MANAGEMENT
      8.4.1 Sample Tracking
      8.4.2 Record Keeping and Data Reporting Requirements
      8.4.3 Data Evaluation Procedures
      8.4.4 Data Quality Reports
  8.5 DEVELOPMENT AND VALIDATION OF THE BENTHIC INDEX

9 MEASUREMENTS OF FISH COMMUNITY STRUCTURE AND PATHOLOGY
  9.1 OVERVIEW
  9.2 QUALITY CONTROL PROCEDURES: FIELD OPERATIONS
      9.2.1 Trawling
      9.2.2 Alternative Fish Collection: Traps
      9.2.3 Species Identification, Enumeration, and Length Measurements
      9.2.4 Sample Preparation, Labeling, and Storage
  9.3 QUALITY CONTROL PROCEDURES: GROSS EXTERNAL PATHOLOGY AND HISTOPATHOLOGY
      9.3.1 Gross Pathological Examinations
      9.3.2 Splenic Macrophage Aggregates in Fish
  9.4 QUALITY CONTROL PROCEDURES: INFORMATION MANAGEMENT
      9.4.1 Sample Tracking
      9.4.2 Data Reporting Requirements
      9.4.3 Data Evaluation Procedures
            Checking Data Completeness
            Assessing Data Quality
            Taking Final Action

10 WATER QUALITY MEASUREMENTS
  10.1 OVERVIEW
      10.1.1 Field Measurement of Water Quality
      10.1.2 Measurements of Pelagic Eutrophication
  10.2 QUALITY CONTROL PROCEDURES: FIELD MEASUREMENTS
      10.2.1 Calibration Checks and QC Procedures
            Hydrolab H20
            DataSonde 3
            LICOR L1100 Light Meter
            Secchi Depth
  10.3 QUALITY CONTROL PROCEDURES: LABORATORY OPERATIONS
      10.3.1 Nutrient Measurements
      10.3.2 Chlorophyll Analysis
      10.3.3 CHN Analysis
  10.4 QUALITY CONTROL PROCEDURES: INFORMATION MANAGEMENT
      10.4.1 Sample/Data Tracking
      10.4.2 Data Reporting Requirements
      10.4.3 Data Evaluation Procedures
            Checking Data Completeness
            Assigning Data Qualifier Codes
            Taking Final Action

11 INFORMATION MANAGEMENT
  11.1 System Description
  11.2 Quality Assurance/Quality Control
      11.2.1 Standardization
      11.2.2 Prelabeling of Equipment and Sample Containers
      11.2.3 Data Entry, Transcription, and Transfer
      11.2.4 Automated Data Verification
      11.2.5 Sample Tracking
      11.2.6 Reporting
      11.2.7 Redundancy (Backups)
  11.3 Documentation and Release of Data

12 QUALITY ASSURANCE REPORTS TO MANAGEMENT

13 REFERENCES
SECTION 1
INTRODUCTION
1.1 OVERVIEW OF EMAP
The Environmental Monitoring and Assessment Program (EMAP), created in 1988 by the U.S. Environmental Protection Agency (EPA) in cooperation with other Federal agencies, was charged with providing basic answers to questions about the environmental problems affecting our Nation's ecological resources. By simultaneously monitoring pollutants and environmental indicators, EMAP sought to identify the potential causes of adverse changes. As a fully implemented program, EMAP planned to apply a probability-based study design on regional scales to address the following objectives:
• Estimate the geographic coverage and extent of the Nation's ecological resources with known confidence.
• Estimate the current status, trends, and changes in the selected indicators of condition of the Nation's ecological resources on a regional scale with known confidence.
• Seek associations between selected indicators of natural and anthropogenic stresses and indicators of the condition of ecological resources.
• Provide annual statistical summaries and periodic assessments of the Nation's ecological resources.
1.2 THE ESTUARIES COMPONENT OF EMAP
The Estuaries component of EMAP (EMAP-E) has monitored the status and trends in the
environmental quality of the estuarine waters of the United States since 1990. The EMAP-E Program has
four major objectives:
• Provide a quantitative assessment of the regional extent of coastal environmental problems by
measuring pollution exposure and ecological condition.
• Measure changes in the regional extent of environmental problems for the Nation's estuarine and
coastal ecosystems.
• Identify and evaluate associations between the ecological condition of the Nation's estuarine and
coastal ecosystems and pollutant exposure, as well as other factors known to affect ecological
condition (e.g., climatic conditions, land use patterns).
• Assess the effectiveness of pollution control actions and environmental policies on a regional scale
(i.e., large estuaries like Chesapeake Bay, major coastal regions like the mid-Atlantic and Gulf
Coasts, large inland bodies of water like the Great Lakes) and nationally.
The EMAP-E program complements and may eventually merge with the National Oceanic and
Atmospheric Administration's (NOAA's) existing National Status and Trends Program for Marine
Environmental Quality to produce a single, cooperative estuarine monitoring program. To more
efficiently manage estuarine activities, the EMAP-E Program has been further divided to study the Great
Lakes, the offshore (shelf) environment, and the Nation's estuaries, bays, tidal rivers, and sounds.
Complete descriptions of the EMAP-E monitoring approach and rationale, sampling design, indicator
strategy, logistics, and data assessment plan are provided in the Near Coastal Program Plan for 1990:
Estuaries (Holland 1990). The strategy for implementing the EMAP-E project is a regional, phased approach that started with the 1990 Demonstration Project in the Virginian Province. This
biogeographical province covers an area from Cape Cod, Massachusetts, to Cape Henry, Virginia (Holland
1990). In 1991, monitoring continued in the Virginian Province and began in the Louisianian Province
(Gulf of Mexico from near Tampa Bay, Florida, to the Texas-Mexico border at the Rio Grande). The
Virginian and Louisianian Province Demonstration Projects continued until each had completed its full four-year cycle, in 1993 and 1994, respectively. In 1994, a full demonstration was launched in
the Carolinian Province (South Atlantic Coast from North Carolina to near Ft. Pierce, Florida) which will
continue into 1995. Also in 1995, a full demonstration project will be conducted in the West Indian
Province (South Florida - from Tampa Bay around the southern tip of Florida, including the Florida Keys,
up the Atlantic Coast to near Ft. Pierce, FL). Several limited demonstration projects were performed in the
Californian Province, and a pilot study has been conducted in the Acadian Province.
1.2.1 Realignment of EMAP: EMAP Phase II
With the recent trend of down-scaling and cost cutting throughout the Federal government, the
EMAP Program has experienced a significant reduction in funding. For this reason, the EMAP Program, as
originally planned, is no longer fiscally realistic. For EMAP to remain a viable program, restructuring was
necessary, in scope as well as size. Based on the experience gained from the initial years of the program,
on the recommendations of external reviews, and on financial constraints, EPA has developed Phase II of
EMAP. Rather than providing actual monitoring for all ecological resources across the nation, EMAP will
now focus on demonstrating the scientific validity and practicality of ecological monitoring approaches
for selected ecological resources in specific geographical locations, and on continuing to develop, jointly with other Federal agencies and the States, a plan for integrating EMAP into a comprehensive national
ecological monitoring network.
Phase II of EMAP will be primarily focused on the biological integrity of aquatic resources. EMAP-
Estuaries most likely will undergo a name change to EMAP-Coastal Resources (EMAP-CR), but more
importantly, will direct its resources to several regional-scale studies in specific geographical areas to
further develop and demonstrate the technical tools and monitoring approaches that could be applied by
EPA and the States, in cooperation with other agencies, to monitor and assess status and trends in the
biological integrity of surface waters of the United States (streams, rivers, lakes, wetlands, estuaries, and
the Great Lakes) and to associate changes in biological integrity with stresses. The geographical areas
initially selected for the estuarine regional demonstrations are the Mid Atlantic, the Pacific Northwest, and
South Florida (West Indian Province). The study design for the 1995 summer monitoring in the West
Indian Province will adhere to the conventional EMAP-Estuaries approach established in recent years.
Future studies in the West Indian Province will be directed more to the development and testing of novel indicators of ecological condition and alternate study designs than to the rote environmental monitoring that previously characterized the Estuaries program.
This document is the Quality Assurance Project Plan for the 1995 EMAP-Estuaries monitoring in the West Indian Province.
1.3 QUALITY ASSURANCE PROGRAM WITHIN EMAP
The overall QA and management policies, organization, objectives, and functional responsibilities
associated with the EMAP program are documented in a Quality Assurance Management Plan (Kirkland
1994). The Quality Assurance Management Plan presents the guidelines and minimum requirements for
QA programs developed and implemented by each resource group within EMAP.
1.4 QUALITY ASSURANCE PROGRAM FOR EMAP-ESTUARIES
The Estuaries Resource Group, as a component of EMAP, must conform with all requirements
specified in the approved EMAP Quality Assurance Management Plan and participate in the EPA
mandatory QA program (U.S. EPA 1984). As part of this program, every environmental monitoring and
measurement project is required to have a written and approved quality assurance project plan (QAPP).
The QAPP for EMAP-E monitoring in the West Indian Province (this document) describes the quality
assurance and quality control activities and measures that will be implemented to ensure that the data will
meet all quality criteria established for the project. All project personnel must be familiar with the
policies, procedures, and objectives outlined in this quality assurance plan to assure proper interactions
among the various data acquisition and management components of the project. This document will be
revised, as appropriate, as changes are made to the existing QA program, and as additional data acquisition
activities are implemented.
EPA Requirements for Quality Assurance Project Plans for Environmental Data Operations - EPA
QA/R-5 (U.S. EPA 1984) states that the 25 items shown in Table 1-1 should be addressed in the QA
Project Plan. Some of these items are extensively addressed in other documents for this project and
therefore are only summarized or referenced in this document.
TABLE 1-1. Sections in this report that address the 25 items required in a Quality Assurance Project Plan.

Quality Assurance Subject                                     This Report

Project Management
  Title and approval sheet                                    Title page and approval page
  Table of contents                                           Table of contents
  Distribution list                                           Section 2
  Project/task organization                                   Section 2
  Problem definition/background                               Section 1
  Project/task description                                    Section 1
  Quality objectives and criteria for measurement data        Section 4
  Project narrative (ORD only)                                Sections 1-11
  Special training requirements/certification                 Sections 3, 5, 7, and 8
  Documentation and records                                   Sections 3, 5-11

Measurement/Data Acquisition
  Sampling process design (experimental design)               Sections 3, 5, and 10
  Sampling methods requirements                               Sections 5-10
  Sample handling and custody requirements                    Sections 3, 5-10
  Analytical methods requirements                             Sections 3, 5-10
  Quality control requirements                                Sections 3-10
  Instrument/equipment testing, inspection, and
    maintenance requirements                                  Sections 5, 6, 7, and 10
  Instrument calibration and frequency                        Sections 5, 6, 7, and 10
  Inspection/acceptance requirements for supplies
    and consumables                                           Sections 5 and 7
  Data acquisition requirements (non-direct measurements)     NA
  Data management                                             Sections 3, 5-11

Assessment/Oversight
  Assessment and response actions                             Sections 3, 5-10
  Reports to management                                       Section 12

Data Validation and Usability
  Data review, validation, and verification requirements      Sections 3, 5-11
  Validation and verification methods                         Sections 3, 5-11
  Reconciliation with user requirements                       Sections 4 and 11
SECTION 2
PROJECT ORGANIZATION
2.1 MANAGEMENT STRUCTURE
For the EMAP-Estuaries monitoring in the West Indian Province (WI), expertise in research and
monitoring will be provided by several Federal agencies and their associated contracting organizations.
The EPA's Gulf Ecology Division of the National Health and Environmental Effects Research Laboratory
(GED) in Gulf Breeze, FL has been designated as the principal laboratory for EMAP-E monitoring in the
West Indian Province, and therefore will provide direction and support for all activities. In addition to WI
activities, GED will also be the center for key EMAP-Estuaries functions (i.e., Office of the Technical
Director, QA Coordinator, Information Management, and Statistical Design).
In 1994, an Interagency Agreement (IA) between the EPA and the Department of the Interior,
National Biological Service's (NBS) Southern Science Center in Lafayette, LA, was initiated to augment
the Federal staff with additional positions in the areas of quality assurance, statistical design and analysis,
and Geographical Information System (GIS) management. Technical support associated with GIS
functions for the Estuaries program is provided to the NBS through a contract with Johnson Controls
World Services Incorporated; several data management positions are also included on that contract.
Field teams for the 1995 monitoring in the WI will be provided through the Avanti Corporation, the level-of-effort contractor at GED. These are temporary positions, most of which will be filled for only three months (June-August).
Analytical services and sample processing for the West Indian monitoring will be conducted
through cooperative agreements with qualified organizations. The chemical analyses of contaminants in
sediments and fish tissue will be conducted by the Skidaway Institute of Oceanography in Savannah, GA;
Skidaway is a component of the State of Georgia University System. The suite of benthic indicators will
be processed by the Gulf Coast Research Laboratory (GCRL) in Ocean Springs, MS; GCRL is associated
with the State of Mississippi University System.
Figure 2-1 illustrates the management structure for the 1995 EMAP-E monitoring in the West
Indian Province. All key personnel currently identified for the 1995 West Indian monitoring are listed in
Table 2-1.
[Figure 2-1. Management structure for the 1995 EMAP-E West Indian Province monitoring. Program level: EMAP Director Thomas Murphy; Estuaries Director Kevin Summers; GED Director Foster Mayer; GED QA Manager Jim Moore; GCE Branch Chief Michael Lewis; NBS Chief James Johnston. West Indian Province level: Province Manager John Macauley; Site Manager Pete Bourgeois; West Indian Province Statistician; West Indian Province QA Coordinator; Field Logistics Coordinator; processing laboratories; field crews; and Province support staff.]
TABLE 2-1. List of key personnel, affiliations, and responsibilities for the EMAP-Estuaries 1995
West Indian Province monitoring.
NAME            AFFILIATION                                RESPONSIBILITY

T. Murphy       U.S. EPA-Corvallis                         EMAP Program Director (acting)
F. Mayer        U.S. EPA-Gulf Breeze                       Acting Division Director
K. Summers      U.S. EPA-Gulf Breeze                       EMAP-E Technical Director
J. Macauley     U.S. EPA-Gulf Breeze                       West Indian Province Manager
J. Johnston     Nat'l Biol. Service (NBS)-Lafayette, LA    Section Chief
T. Heitmuller   NBS-Gulf Breeze                            EMAP-E QA Coordinator
J. Moore        U.S. EPA-Gulf Breeze                       GED QA Manager
C. Peacher      U.S. EPA-Gulf Breeze                       West Indian Province QA Coordinator
P. Borthwick    U.S. EPA-Gulf Breeze                       GED QA Coordinator
M. Adams        Johnson Controls World Services, Inc.      West Indian Information Manager
P. Bourgeois    NBS-Gulf Breeze                            Site Manager
Z. Malaeb       NBS-Gulf Breeze                            EMAP-E Statistical Design
V. Engle        NBS-Gulf Breeze                            West Indian Province Statistician
J. Fournie      U.S. EPA-Gulf Breeze                       Fish Pathology/Histopathology
E. O'Neill      Avanti Corporation                         Field Sampling
W. Walker       Gulf Coast Research Laboratory             Benthic Analyses
H. Windom       Skidaway Institute of Oceanography         Contaminant Analyses
J. Landsberg    Florida Dept. Envir. Protection            Eutrophication Indicators
SECTION 3
GENERAL REQUIREMENTS FOR FIELD AND
LABORATORY OPERATIONS
3.1 FIELD OPERATIONS
All field operations conducted by the EMAP-Estuaries Resource Group are planned and implemented
according to a logistics plan that is prepared and approved following guidelines established for EMAP
(Baker and Merritt 1990). Elements of the logistics plan, presented in Table 3-1, address major areas of
project implementation, including project management, site access and scheduling, safety and waste
disposal, procurement and inventory control, training and data collection, and the assessment of the
operation upon completion.
TABLE 3-1. Required Elements of EMAP Logistics Plan (from Baker and Merritt 1990).

Logistics Plan Area                  Required Elements

Project Management                   Overview of Logistics Activities
                                     Staffing and Personnel Requirements
                                     Communications

Access and Scheduling                Sampling Schedule
                                     Site Access
                                     Reconnaissance

Safety                               Safety Plan
                                     Waste Disposal Plan

Procurement and Inventory Control    Equipment, Supplies, and Services
                                     Procurement, Methods, and Scheduling

Training and Data Collection         Training Program
                                     Field and Mobile Laboratory Operations
                                     Quality Assurance
                                     Information Management

Assessment of Operations             Logistics Review and Recommendations
3.1.1 Training Program
Proper training of field personnel represents a critical aspect of quality control. Field technicians are
trained to conduct a wide variety of activities using standardized protocols to ensure comparability in data
collection among crews and across regions. Each field team typically consists of a team leader and one or
two, 3-merhber boat crews supported by a land-based, 2-member mobile laboratory crew. Each boat crew
is headed by a crew chief (one of whom is the team leader), who is captain of the boat and chief scientist,
and, as such, is the ultimate on-site decision maker regarding safety, technical direction, and
communication with province management.
For the 1995 monitoring, the field crews will complete a 4 to 5-day training session to be held at
GED. The crews will receive classroom instruction to be followed up by extensive hands-on exercises
conducted in the field under the guidance of veteran crew chiefs and other province personnel. At the
completion of the training, each crew (boat and mobile laboratory) must pass a graded field certification
exercise (passing score, normalized to percent, >90%). The exercise, a full-scale sampling assignment,
will incorporate all elements of field sampling and the associated support activities of the mobile
laboratory crew. The performance of the crew will be graded by a member of the Province field
management team (i.e., the Province Manager, Quality Assurance Manager, or the Quality Assurance
Coordinator). If any deficiencies within a crew are noted, they are remedied prior to field sampling. This
is accomplished through additional in-house training or by modifying the crew composition. It is the
responsibility of the Province QA Coordinator to develop and maintain on permanent file all records
related to crew certifications (e.g., examinations, field and laboratory check-out sheets, grading forms,
etc.).
All aspects of field operations are detailed in the West Indian Field Operations Manual (Macauley
and Summers 1995), which is distributed to all participants prior to the certification exercise. The manual
includes a checklist of all equipment, instructions on equipment use, and detailed written descriptions of
sample collection procedures. In addition, the manual includes flow charts and a schedule of activities to
be conducted at each sampling location.
3.1.2 Field Activities in Areas of Special Conditions
Certain geographical and ecological features differentiate the West Indian Province from its adjacent provinces: the presence of mangroves, profuse seagrass beds, and shallow, tropical waters encompassing environmentally sensitive elements (i.e., live coral reefs, endangered species, national parks, and other marine sanctuaries). These features preclude the use of some EMAP-Estuaries field collection and sampling methods routinely conducted in other areas. The primary activities affected are those that involve
the collection and field processing of benthic grab samples and the collection of fish. Both activities are
vital components that generate some of the key EMAP-E indicators (e.g., sediment chemistry, benthic
community assessments, and fish pathology). To retain these sample types, the field collection techniques were modified to meet the constraints imposed by physical factors and policy considerations while keeping protocols as comparable as possible to those established for the routine activities. Alternate field sampling methods and associated QA/QC criteria will be
discussed under the appropriate sections of this document.
3.1.3 Location and Documentation of Sampling Site
Approximately one year prior to the summer field monitoring season, the EMAP-Coastal Resources Study Design Group randomly generates the latitude and longitude coordinates of the base stations for all provinces. The coordinates are provided to the respective Province Managers, who routinely plot the locations on nautical charts as the first step in a reconnaissance exercise to determine any problems that may be associated with a site. Limited access and shallow water depth are the most frequently encountered complications in the WI Province. Areas with which the field crews have little or no familiarity may require that they personally reconnoiter the area (site visit) to appraise the situation; again, this should be
completed well prior to the scheduled field season. In some cases, if the site is deemed unsamplable, it
may be dropped altogether; in other circumstances, the Province Manager may elect to relocate the site;
these decisions will be left to the Province Management Team and are not to be made by the field crews.
After enacting such measures, there may still be sites at which the crews experience difficulties in locating
directly on station. In the WI, if access to a site is limited, field crews are allowed to locate within a 0.05
nautical mile (~300 ft) radius, via GPS, of the intended site. This variance is only extended to situations
that are otherwise inaccessible; in no way should this be interpreted as the routine allowable drift for siting
an unencumbered station. Crews are to make every reasonable effort at locating as close to the intended
site as possible. If the vessel ends up anchored outside of the 0.05 nm tolerance range, this will be noted
on the Station Information Data Sheet with full explanation. The Province Management Team will assess
the validity of these relocation incidences on a case-by-case basis and make the determination of its
acceptability to the overall sample design. After a relocation is approved, the new coordinates of latitude
and longitude will be the designated site location that is entered into th ' database.
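The 0.05-nm siting tolerance lends itself to a simple computational check. The sketch below (Python) is purely illustrative and is not part of the documented field procedures; the function and variable names, and the example coordinates, are hypothetical. It compares a recorded GPS fix against the intended station coordinates using a great-circle (haversine) distance:

```python
import math

EARTH_RADIUS_NM = 3440.065   # mean Earth radius in nautical miles
SITING_TOLERANCE_NM = 0.05   # allowable radius around the intended station (~300 ft)

def haversine_nm(lat1, lon1, lat2, lon2):
    """Great-circle distance in nautical miles between two points in decimal degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_NM * math.asin(math.sqrt(a))

def within_siting_tolerance(intended, actual, tolerance_nm=SITING_TOLERANCE_NM):
    """Return (ok, distance_nm); ok is True when the vessel lies inside the allowed radius."""
    d = haversine_nm(intended[0], intended[1], actual[0], actual[1])
    return d <= tolerance_nm, d

# Hypothetical example: intended station versus the position actually anchored
ok, dist = within_siting_tolerance((24.5551, -81.7800), (24.5556, -81.7806))
print(f"distance = {dist:.4f} nm; within 0.05 nm tolerance: {ok}")
```

A fix failing such a check would be documented on the Station Information Data Sheet, as described above.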
3.1.4 Field Quality Control and Audits
Quality control of measurements made during the actual field sampling period is accomplished
through the use of a variety of QC sample types and procedures, as described in later sections of this
document. In addition, at least once during each field season, a performance review of each field crew will
be performed by the Province QA Coordinator or a federal designee to ensure compliance with prescribed
protocols. A checklist has been developed to provide comparability and consistency in this process. Field
crews must be retrained whenever discrepancies are noted.
3.2 LABORATORY OPERATIONS
This section addresses only general laboratory operations, while the sections on each indicator
present specific QA/QC requirements and procedures associated with the processing of specific samples.
All laboratories providing analytical support for chemical or biological analyses must have the appropriate
facilities to store and prepare samples, and appropriate instrumentation and staff to provide data of the
required quality within the time period dictated by the project. Laboratories are expected to conduct
operations using appropriate laboratory practices, including:
• A program of scheduled maintenance of analytical balances, microscopes, laboratory equipment, and
instrumentation.
• Routine checking of analytical balances using a set of standard reference weights (ASTM Class 3, NIST Class S-1, or equivalents).
• Checking and recording the composition of fresh calibration standards against the previous lot; acceptable comparisons are within ±2% of the previous value (a minimal sketch of this check follows the list).
• Recording all analytical data in ink in bound logbooks or on printed data sheets.
• Daily monitoring and documenting the temperatures of cold storage areas and freezer units.
• Having a source of reagent water meeting American Society for Testing and Materials (ASTM) Type I specifications (ASTM 1984) available in sufficient quantity to support analytical operations. The conductivity of the reagent water should not exceed 1 µS/cm at 25 °C.
• Labeling all containers used in the laboratory with date prepared, contents, and initials of the
individual who prepared the contents.
• Dating and storing all chemicals safely upon receipt. Chemicals are disposed of properly when their expiration date has passed.
• Using a laboratory information management system to track the location and status of any sample
received for analysis.
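As a minimal sketch of the calibration-standard check noted in the list above (hypothetical names and values; not a documented laboratory procedure):

```python
def calibration_standard_acceptable(new_value, previous_value, tolerance=0.02):
    """Return True when a fresh calibration standard agrees with the previous
    lot to within +/-2%, the acceptance window given in this plan."""
    return abs(new_value - previous_value) / previous_value <= tolerance

# A 100.0 mg/L standard remeasured at 101.5 mg/L passes (+1.5%); 103.0 fails (+3.0%)
print(calibration_standard_acceptable(101.5, 100.0))  # True
print(calibration_standard_acceptable(103.0, 100.0))  # False
```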
Laboratories should be able to provide information documenting their ability to conduct the analyses
with the required level of data quality. Such information might include results from interlaboratory
comparison studies, control charts and summary data of internal QA/QC checks, and results from certified
reference material analyses. Laboratories must also be able to provide analytical data and associated
QA/QC information in a format and time frame specified by the West Indian Province Manager and/or
Information Manager.
3.2.1 Laboratory Personnel, Training, and Safety
Each laboratory providing analytical support to EMAP-E, West Indian Province should designate an
in-house QA coordinator. This individual will serve as the point of contact for the EMAP-E QA staff in
identifying and resolving issues related to data quality. To ensure that the samples are analyzed in a
consistent manner throughout the duration of the project, key laboratory personnel should participate in an
orientation session conducted during an initial site visit or via communication with EMAP-E QA staff.
The purpose of the orientation session is to familiarize key laboratory personnel with the QA program.
Laboratories may be required to demonstrate acceptable performance before analysis of samples can
proceed, as described for each indicator in subsequent sections. Laboratory operations will be evaluated
on a continuous basis through technical systems audits, performance evaluation studies, and by
participation in interlaboratory round-robin programs.
Personnel in any laboratory performing EMAP analyses should be well versed in standard laboratory
practices, including recognized safety procedures. It is the responsibility of the laboratory manager and/or
supervisor to ensure that safety training is mandatory for all laboratory personnel. The laboratory is
responsible for maintaining a current safety manual in compliance with the Occupational Safety and
Health Administration (OSHA), or equivalent state or local regulations. The safety manual should be
readily available to laboratory personnel. Proper procedures for safe storage, handling, and disposal of
chemicals should be followed at all times; each chemical should be treated as a potential health hazard and
established, approved laboratory practices should be implemented accordingly.
3.2.2 Quality Assurance Documentation
All laboratories must have the latest revisions of the EMAP-E West Indian Province QA Project Plan
(this document). In addition, the following documents and information must be current, and they must be
available to all laboratory personnel participating in the processing of EMAP-E samples:
• Laboratory QA Plan - Clearly defined policies and protocols specific to a particular laboratory
including personnel responsibilities, laboratory acceptance criteria for release of data, and
procedures for determining the acceptability of results.
• Laboratory Standard Operating Procedures (SOPs) - Detailed instructions for performing routine
laboratory procedures. In contrast to the Laboratory Methods Manual, SOPs offer step-by-step
instructions describing exactly how the method is implemented in the laboratory, specifically for the
particular equipment or instruments on hand.
• Instrument performance study information - Information on instrument baseline noise, calibration
standard response, analytical precision and bias data, detection limits, etc. This information usually
is recorded in logbooks or laboratory notebooks.
• Control charts - Control charts must be developed and maintained throughout the project for all
appropriate analyses and measurements (see section 3.2.5).
3.2.3 Analytical Procedures
Complete and detailed procedures for processing and analysis of samples in the field and laboratory
are provided in the West Indian Province Field Operations Manual (Macauley and Summers 1995) and the
EMAP-E Laboratory Methods Manual (U.S. EPA 1992, in revision) respectively, and will not be repeated
here.
3.2.4 Laboratory Performance Audits
Initially, a QA assistance and performance audit will be performed by the federal EMAP-E QA staff
to determine if each laboratory effort is in compliance with the procedures outlined in the Methods Manual
and QA Project Plan and to assist the laboratory where needed. If deficiencies are noted for a laboratory,
they will be pointed out and the laboratory must correct the particular deficiency and demonstrate overall
competency before initiating analyses of EMAP samples. Additionally, technical systems audits may be
conducted by a team composed of the QA Manager or Coordinator and his/her technical assistants.
Reviews may be conducted at any time during the scope of the study but are not required every year.
Furthermore, laboratory performance will be assessed on a continuous basis through the use of internal
and external performance evaluation samples and laboratory intercomparison studies (round robins). If
the performance of a laboratory is found to be substandard (i.e., does not meet the QA/QC requirements
detailed in this QAPP), the laboratory must cease its analyses of EMAP samples until the problems have
been identified and resolved. Any data generated during periods of questionable performance will be
closely inspected and evaluated for validity by EMAP management. If adequate sample is available, the
laboratory may be required to rerun the analysis, provided that appropriate corrective action has been implemented.
3.2.5 Preparation and Use of Control Charts
Control charts are a graphical tool to demonstrate and monitor statistical control of a measurement
process. A control chart basically is a sequential plot of some sample attribute (measured value or
statistic). The type of control chart used primarily by laboratory analysts is a "property" chart of
individual measurements (termed an X chart).
An example of an X chart is presented in Figure 3-1. Measured values are plotted in their sequence
of measurement. Three sets of limits are superimposed on the chart: the "central line" is the mean value
calculated from at least seven initial measurements and represents an estimate of the true value of the
sample being measured; upper and lower "warning limits" representing the 95% confidence limits around
the mean value, within which most (95%) of the measured values should lie when the measurement
process is in a state of statistical control; and upper and lower "control limits" representing the 99%
confidence limits around the mean, within which nearly all (99%) of the measured values should lie when
the measurement process is in a state of statistical control.
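The construction of these limits can be expressed compactly in code. The sketch below is illustrative only: it adopts the common convention of approximating the 95% warning limits as the mean plus or minus two standard deviations, and the 99% control limits as the mean plus or minus three standard deviations, of the initial measurements. That convention is an assumption here, not a prescription of this plan, and all names are hypothetical.

```python
from statistics import mean, stdev

def x_chart_limits(initial_measurements):
    """Derive X-chart limits from at least seven initial control sample measurements.

    Warning limits approximate the 95% confidence bounds (mean +/- 2s); control
    limits approximate the 99% confidence bounds (mean +/- 3s).
    """
    if len(initial_measurements) < 7:
        raise ValueError("at least seven initial measurements are required")
    center = mean(initial_measurements)
    s = stdev(initial_measurements)
    return {"central_line": center,
            "warning": (center - 2 * s, center + 2 * s),
            "control": (center - 3 * s, center + 3 * s)}

# Hypothetical initial control sample measurements
print(x_chart_limits([10.1, 9.8, 10.0, 10.3, 9.9, 10.2, 10.0]))
```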
Control charts should be updated by laboratory personnel as soon as a control sample measurement
is completed. Based on the result of an individual control sample measurement, the following course of action should be taken (Taylor 1987; a computational sketch of this decision logic follows the list):
• If the measured value of the control sample is within the warning limits, all routine sample data
since the last acceptable control sample measurement are accepted, and routine sample analyses are
continued.
• If the measured value of the control sample is outside of the control limits, the analysis is assumed
to no longer be in a state of statistical control. All routine sample data analyzed since the last
acceptable control sample measurement are suspect. Routine sample analyses are suspended until
corrective action is taken. After corrective action, statistical control must be reestablished and
demonstrated before sample analyses continue. The reestablishment of statistical control is
demonstrated by the results of three consecutive sets of control sample measurements that are in a
state of statistical control (Taylor 1987). Once statistical control has been demonstrated, all routine
samples since the last acceptable control sample measurement are reanalyzed.
[Figure 3-1. Example of a property (X) type of control chart: individual measurements plotted in their sequence of analysis against the central line, with upper and lower warning limits and upper and lower control limits superimposed.]
• If the measured value of a control sample is outside the warning limits but within the control limits,
a second control sample is analyzed. If the second control sample measurement is within the
warning limits, the analysis is assumed to be in a state of statistical control and all routine sample
data since the last acceptable control sample measurement are accepted, and routine sample analyses
are continued. If the second sample measurement is outside the warning limits, it is assumed the
analysis is no longer in a state of statistical control. All routine sample data analyzed since the last
acceptable control sample measurement are suspect. Routine sample analyses are suspended until
corrective action is taken. After corrective action, statistical control must be reestablished and
demonstrated before sample analyses continue. The reestablishment of statistical control is
demonstrated by the results of three consecutive sets of control sample measurements that are in
control (Taylor 1987). Once statistical control has been demonstrated, all routine samples since the
last acceptable control sample measurement are reanalyzed.
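The decision logic described in these three cases reduces to a small routine. The following sketch assumes symmetric limits of the form produced above; it is an illustration, not the program's data evaluation software, and the names and limit values are hypothetical.

```python
def classify_control_measurement(value, limits):
    """Classify one control sample measurement against X-chart limits.

    "accept"         - within the warning limits; continue routine analyses
    "recheck"        - between warning and control limits; analyze a second
                       control sample before deciding
    "out_of_control" - outside the control limits; suspend routine analyses
                       and take corrective action
    """
    lo_w, hi_w = limits["warning"]
    lo_c, hi_c = limits["control"]
    if lo_w <= value <= hi_w:
        return "accept"
    if value < lo_c or value > hi_c:
        return "out_of_control"
    return "recheck"

# Hypothetical limits of the form produced by x_chart_limits() above
limits = {"central_line": 10.04, "warning": (9.70, 10.39), "control": (9.53, 10.56)}
print(classify_control_measurement(10.15, limits))  # accept
print(classify_control_measurement(10.45, limits))  # recheck
print(classify_control_measurement(10.60, limits))  # out_of_control
```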
Taylor (1987) also provides additional criteria for evaluating control chart data to determine whether a measurement system is no longer in a state of statistical control. For X charts, these criteria include (a computational sketch follows the list):
• Four successive points outside a range equal to plus or minus one-half the warning limits (one standard deviation).
• Seven successive points on one side of the central line, even if all are within the warning limits.
• More than 5% of the points outside the warning limits.
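These run rules can also be checked mechanically. The sketch below implements them under the assumption that the warning limits lie two standard deviations from the central line, so that one-half the warning half-width approximates one standard deviation; it is a minimal illustration with hypothetical names and data, not software required by this plan.

```python
def taylor_criteria_flags(values, central_line, warning_halfwidth):
    """Apply Taylor's (1987) supplementary X-chart criteria to a measurement
    sequence; returns a list of triggered rules (empty list = none fired)."""
    flags = []
    one_sd = warning_halfwidth / 2.0  # assumes warning limits at +/- 2 s.d.

    # Rule 1: four successive points beyond +/- one standard deviation
    for i in range(len(values) - 3):
        if all(abs(v - central_line) > one_sd for v in values[i:i + 4]):
            flags.append("four successive points beyond +/- 1 s.d.")
            break

    # Rule 2: seven successive points on one side of the central line
    for i in range(len(values) - 6):
        window = values[i:i + 7]
        if all(v > central_line for v in window) or all(v < central_line for v in window):
            flags.append("seven successive points on one side of the central line")
            break

    # Rule 3: more than 5% of all points outside the warning limits
    outside = sum(1 for v in values if abs(v - central_line) > warning_halfwidth)
    if outside > 0.05 * len(values):
        flags.append("more than 5% of points outside the warning limits")

    return flags

# Ten hypothetical control measurements, all above the central line of 10.04
series = [10.1, 10.2, 10.25, 10.3, 10.28, 10.1, 10.15, 10.2, 10.3, 10.1]
print(taylor_criteria_flags(series, central_line=10.04, warning_halfwidth=0.35))
```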
Central line, warning limits, and control limits will be evaluated periodically by either the on-site
QA coordinator or the EMAP-E QA staff. Central lines, warning limits, and control limits for each analyte
and sample type will be redefined based on the results of quality control and quality assessment sample
measurements. Current control charts must be available for review during technical systems audits.
Copies of charts will be furnished to the Province Manager or Province QA staff upon request. Such
charts should contain both the points and their associated values.
SECTION 4
ASSESSMENT OF DATA QUALITY
4.1 DATA QUALITY OBJECTIVES
The EMAP-E program is measuring a defined set of parameters that are considered to be reliable
indicators of estuarine environmental condition. The measured parameters have been categorized as either
biotic condition, abiotic condition, or habitat indicators (Table 4-1) in accordance with the general EMAP
indicator development process described by Olsen (1992). More detailed descriptions of EMAP-E's
indicator strategy are presented in the Near Coastal Program Plan for Estuaries (Holland 1990).
TABLE 4-1. EMAP-E West Indian Province indicators by major category.

Category             Indicator

Biotic Condition     Benthic species composition
                     Fish community composition
                     Gross pathology of fish
                     Histopathology of fish

Abiotic Condition    Sediment contaminant concentrations
                     Sediment toxicity
                     Contaminant concentrations in fish flesh
                     Dissolved oxygen concentration
                     Marine debris
                     Water clarity

Habitat              Salinity
                     Temperature
                     Depth
                     Grain size
                     pH
It is the policy of the U.S. EPA that all environmental data collection activities be planned and
implemented through the development of data quality objectives (DQOs). Data quality objectives are
statements that describe in precise quantitative terms the level of uncertainty that can be associated with
environmental data without compromising their intended use. Data quality objectives provide criteria that
can be used to design a sampling strategy while balancing the cost and/or resource constraints typically
imposed upon a program.
The EMAP is unique in its stated objective of determining ecosystem condition at regional scales
using a probability-based sampling design. The relative novelty of this design, coupled with the vast
geographic expanse and inherent complexity of the natural systems being monitored, have made the task
of developing DQOs a challenging endeavor. Typically, DQOs are specified by potential users of the data.
Because EMAP Resource Groups are developing new indicators and employing them in new uses (e.g.,
regional status and trends estimates), potential users of the data have found it difficult to develop the
necessary decision and uncertainty criteria which are basic to the DQO process. In the absence of specific
decision criteria established by potential data users, the program has established a set of target DQOs,
based primarily on professional judgement, which are intended to provide a starting point for a long-term,
iterative DQO process. Consequently, these preliminary DQOs do not necessarily constitute definitive
rules for accepting or rejecting results, but rather provide guidelines for continued improvement. Several
iterations of the DQO process may be required as EMAP scientists define their capabilities and data users
define their needs.
EMAP has established target DQOs for both status and trends estimates. The target DQO for
estimates of current status in indicators of condition for EMAP is as follows:
"For each indicator of condition and resource class, on a regional scale, estimate the proportion
of the resource in degraded condition within 10% (absolute) with 90% confidence based on four
years of sampling."
The target DQO for trends in indicators of condition for EMAP is as follows:
"Over a decade, for each indicator of condition and resource class, on a regional scale, detect, at
a minimum, a linear trend of 2% (absolute) per year (i.e., a 20% change for a decade), in the percent
of the resource class in degraded condition. The test for trend will have a maximum significance
level of alpha = 0.2 and a minimum power of 0.7 (i.e., beta = 0.3)."
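The status DQO can be made concrete with a textbook sample-size calculation. The sketch below assumes simple random sampling and a normal approximation for an estimated proportion, with the conservative choice p = 0.5; the actual EMAP design is a spatially structured probability sample, so this illustrates only the magnitude of the target, not the program's design computation.

```python
import math

def samples_for_proportion(half_width, z):
    """Approximate sample size to estimate a proportion to within +/- half_width
    (absolute), assuming simple random sampling, a normal approximation, and
    the most conservative proportion p = 0.5."""
    p = 0.5
    return math.ceil(z ** 2 * p * (1 - p) / half_width ** 2)

# Status DQO: within 10% (absolute) with 90% confidence (two-sided z ~= 1.645),
# accumulated over four years of sampling
n = samples_for_proportion(half_width=0.10, z=1.645)
print(n)   # 68 stations in all, i.e., roughly 17 per year over a four-year cycle
```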
It is important to note that the target DQOs which have been established are related to the ability of
the present sampling design to characterize status or discern trends within a specified level of statistical
confidence. Based on statistical analyses of the first four-year sampling cycle in the Virginian Province,
EMAP-Estuaries demonstrated that it had met the target DQOs for status and trends. When analyses are
completed for the four-year cycle in the Louisianian Province, a more definitive assessment will be made.
During the first four years of sampling, EMAP-E actively laid the groundwork for this assessment by
gathering the data needed to identify and quantify potential sources of sampling error (Table 4-2). It is
essential to account for these potentially significant sources of uncertainty (i.e., variance) in determining
whether the current sampling design allows EMAP-E to meet the target status and/or trends DQOs.
TABLE 4-2. Potential sources of sampling error being estimated during the first four years of EMAP-E monitoring in the Virginian and Louisianian Provinces.

Source of Error                            EMAP-E Estimator

Small-scale spatial variability            Replicate stations sampled each year
within the index period                    within each resource class

Temporal variability within                Certain stations in each resource class
the index period                           are visited twice during the index period

Long-term temporal (interannual)           The same stations are visited each year
variability                                (long-term temporal sites)

Year-to-year temporal and                  All random stations sampled in each
spatial variability                        resource class in each year
The target DQOs established for the EMAP program represent statements about resource class
populations and do not, as stated, take into account potential sources of measurement error. Measurement
error is frequently emphasized in the DQO process as an important source of uncertainty. In EMAP,
measurement error may be a less significant contributor to total uncertainty than sample density.
Measurement error is, however, a potentially important variable in controlling the regional responsiveness,
and thus the acceptability, of individual indicators. In addition, external users of EMAP data may find that
measurement error is an important source of variability that must be accounted for in addressing their own
DQOs. It is therefore important for EMAP Resource Groups to control measurement error, to the extent
possible, when selecting sampling methods and to establish measurement quality objectives (MQOs) for
each sampling method and laboratory analysis procedure. MQOs essentially represent data quality
objectives that are based on control of the measurement system. They are being used to establish criteria
for data acceptability because reliable error bounds cannot, at present, be established for end use of
indicator response data. As a consequence, management decisions balancing the cost of higher quality
data against program objectives are not presently possible. As data are accumulated on indicators and the
error rates associated with their measurement at regional scales are established, it will be possible to
address the target DQOs that have been established and determine the need for modifications to the
sampling design and/or quality assurance program.
Measurement quality objectives for the various measurements being made in EMAP-Estuaries can be
expressed in terms of accuracy, precision, and completeness goals (Table 4-3). These MQOs were
established by obtaining estimates of the most likely achievable data quality, based on the instrument manufacturer's specifications, scientific experience, or historical data.
The MQOs presented in Table 4-3 are used as quality control criteria both in field and laboratory
measurement processes to set the bounds of acceptable measurement error. Generally speaking, DQOs or
MQOs are usually established for five aspects of data quality: representativeness, completeness,
comparability, accuracy, and precision (Stanley and Verner 1985). These terms are defined below with
general guidelines for establishing MQOs for each quality assurance parameter.
4.2 REPRESENTATIVENESS
Representativeness is defined as "the degree to which the data accurately and precisely represent a
characteristic of a population parameter, variation of a property, a process characteristic, or an operational
condition" (Stanley and Verner 1985). The concept of representativeness within the context of EMAP
monitoring refers to the ability of the program to accurately and precisely characterize regional
phenomena through the measurement of selected environmental indicators. The focus on regional
phenomena requires that the EMAP design strategy emphasize accommodation of a wide range of
resources. In addressing this requirement, EMAP-Estuaries has adopted a regionalization scheme to
allocate the Nation's estuarine and coastal resources into manageable sampling units for collection and
reporting of data. This regionalization, determined on the basis of major climatic zones and prevailing
oceanic currents, consists of seven provinces within the continental United States, five provinces in
Alaska, Hawaii, and the Pacific territories, and a region that comprises the Great Lakes. In addition,
EMAP-Estuaries is using a classification scheme to facilitate sampling of the ecosystems within each
province in proportion to their extent and abundance, thus ensuring a statistically-acceptable
representation of all ecosystem types within the sampling frame. In the West Indian Province, physical
dimensions (e.g., surface area and aspect ratio) are used to classify estuarine resources into two categories: large estuarine systems and small estuarine systems. Complete descriptions of the EMAP-
Estuaries regionalization and classification schemes are provided in the Near Coastal Program Plan for
1990 (Holland 1990).
The design of the EMAP-Estuaries' sampling program and the location of West Indian Province
sampling sites provide the primary focus for defining the "representativeness" of population estimates for
this region. In its initial planning stages, the EMAP-E program faced a choice between two general
sampling approaches to meet the objective of obtaining an accurate and precise representation of estuarine
resource condition at the regional scale. As described in the Near Coastal Program Plan (Holland 1990)
and restated here, these two sampling approaches were: (1) conduct a census of the nation's estuarine and coastal ecosystems and important habitats on a periodic basis (e.g., every 4 years), or (2) sample a subset of estuarine and coastal resources periodically and use the data to make inferences about unsampled areas.
The census technique is the appropriate sampling method for characterizing and assessing status and
trends for some rare resources, because minimal population densities require that most of the resource
must be sampled to characterize status and to measure trends (e.g., changes in abundance of rare and
endangered species or habitats). The census technique is not a cost-effective or appropriate sampling
approach for assessing the status and trends of broadly distributed, relatively abundant resources. EMAP-
E does not have the resources to conduct a regular census of the nation's estuarine and coastal resources.
Therefore, the decision was made that sampling a subset of the resources and using the information
obtained about the subset to make inferences about unsampled resources is the only approach that is
appropriate for EMAP-E.
The subset of resources sampled by EMAP-E could be (1) a sample which is determined, based on
available scientific knowledge, to be "representative" of the range of environmental settings that exist in
estuarine and coastal environments, or (2) a probability sample of estuarine and coastal resources.
Collection of "representative" samples is an extreme case of stratified sampling and assumes that the data
collected at the "representative" sampling locations can be extrapolated to broader spatial and temporal
scales. Available scientific information is used to identify "representative" sampling locations, as well as
to define the spatial scale and temporal periods that the samples represent. Periodic collection of
"representative" samples is a powerful technique for measuring trends, because this approach minimizes
interactions between spatial and temporal variation. Because "representative" samples can be located at
any of a number of sites, they are generally easier to collect than probability samples and frequently can be
located at a site for which existing historical data are available.
Unfortunately, the current scientific understanding of the environmental processes that control
condition and distributions of estuarine and coastal resources is inadequate to define the bias and
uncertainty associated with extrapolating environmental quality information for "representative" locations
to other sites. This is especially true for data collected over broad geographic scales and long time
periods. Therefore, EMAP-E employs a probability sampling approach that samples resources in
proportion to their abundance and distribution and obtains unbiased estimates of resource characteristics
and variability. The probability sampling approach applies systematic (e.g., grid) sampling to facilitate
characterizations of spatial patterns and to encourage broad geographic coverage.
Many of the proposed parameters that EMAP-E will measure are known to exhibit large intra-annual
variability, and EMAP-E lacks the resources to characterize this variability or to assess status for all
seasons. Therefore, sampling will be confined to a limited portion of the year (i.e., an index period), when
indicators are expected to show the greatest response to pollution stress and within-season (i.e., week-to-
week) variability is expected to be small.
For most estuarine and coastal ecosystems in the Northern Hemisphere, mid-summer (July-August) is
the period when ecological responses to pollution exposure are likely to be most severe. During this
period, dissolved oxygen concentrations are most likely to approach stressful low values. Moreover, the
cycling and adverse effects of contaminant exposure are generally greatest at the low dilution flows and
high temperatures that occur in mid-summer. Therefore, summer has been selected as the most
conservative (i.e., most ecologically stressful) index period for EMAP-E.
Once unbiased quantitative information on the kinds, extent, condition, and distribution of estuarine
and coastal resources and associated estimates of uncertainty are known, a baseline of the status of
existing conditions will be established. This baseline information will be used to develop criteria for
identifying "representative" sampling sites for future sampling (e.g., trends sites, detailed studies of
processes associated with deterioration and recovery, the magnitude of natural variation). This baseline
will also be used to determine the representativeness of historical data and sampling sites (e.g., NOAA
Status and Trends sites). Over the long term, EMAP-E seeks to develop a sampling design that includes
both "representative" and probability sampling, incorporating the advantages of both approaches.
The data quality attribute of "representativeness" applies not only to the overall sampling design, but
also to individual measurements and samples obtained as part of EMAP-E's monitoring efforts. Holding
time requirements for different types of samples ensure that analytical results are representative of
conditions at the time of sampling; these requirements are specified in the individual indicator sections of
this document. In addition, the use of QA/QC samples which are similar in composition to samples being
measured provides estimates of precision and bias that are representative of sample measurements.
Therefore, as a general program objective, the types of QA documentation samples (i.e., performance
evaluation material) used to assess the quality of analytical data will be as representative as possible of the
natural samples collected during the project with respect to both composition and concentration.
4.3 COMPLETENESS
Completeness is defined as "a measure of the amount of data collected from a measurement process
compared to the amount that was expected to be obtained under the conditions of measurement" (Stanley
and Verner 1985). EMAP-E has established a completeness goal of 100% for the various indicators being
measured (Table 4-3). Given the probability-based sampling design being employed by EMAP-E, failure
to achieve this goal will not preclude the within-year or between-year assessment of ecosystem condition.
The major consequence of having less than 100% complete data from all expected stations is a relatively
minor loss of statistical power in the areal estimate of condition, as depicted using Cumulative
Distribution Functions. The 100% completeness goal is established in an attempt to derive the maximum
statistical power from the present sampling design. Based on past years' experience, failure to achieve this
goal usually results from the field crew's inability to sample at some stations because of logistical barriers
such as insufficient depth, impenetrable substrate, or adverse weather conditions. In the limited number of
instances where these conditions may be encountered, extensive efforts will be made to relocate the station
or resample the station at a later date, always in consultation with program managers at the Province
Center. In all cases, field personnel must strive to achieve the 100% completeness goal. In
addition, established protocols for tracking samples during shipment and laboratory processing must be
followed to minimize data loss following successful sample collection.
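As an illustration only (not a requirement of this plan), the completeness calculation just described is
straightforward to automate. The Python sketch below computes percent completeness per indicator; the
station counts shown are hypothetical.

    # Hypothetical illustration: completeness as percent of expected results obtained.
    expected = {"Dissolved oxygen": 120, "Sediment toxicity": 120}   # planned results (made up)
    obtained = {"Dissolved oxygen": 118, "Sediment toxicity": 120}   # successful results (made up)

    for indicator, n_expected in expected.items():
        completeness = 100.0 * obtained[indicator] / n_expected
        flag = "meets 100% goal" if completeness == 100.0 else "below 100% goal"
        print(f"{indicator}: {completeness:.1f}% ({flag})")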
TABLE 4-3. Measurement quality objectives for EMAP-Estuaries indicators. Accuracy (bias) goals are
expressed either as absolute difference (± value) or percent deviation from the "true" value; precision goals
are expressed as relative percent difference (RPD) or relative standard deviation (RSD) between two or
more replicate measurements. Completeness goal is the percentage of expected results that are obtained
successfully.
                                           Maximum Allowable      Maximum Allowable
                                           Accuracy (Bias)        Precision              Completeness
Indicator/Data Type                        Goal                   Goal                   Goal

Sediment/tissue contaminant analyses:
  Organics                                 30%                    30%                    100%
  Inorganics                               15%                    15%                    100%
Sediment toxicity                          NA                     NA                     100%
Benthic species composition:
  Sorting                                  10%                    NA                     100%
  Counting                                 10%                    NA                     100%
  Taxonomy                                 10%                    NA                     100%
Sediment characteristics:
  Particle size (% silt-clay) analysis     NA                     10%                    100%
  Total organic carbon                     10%                    10%                    100%
  Acid volatile sulfide                    10%                    10%                    100%
Water column characteristics:
  Dissolved oxygen                         ± 0.5 mg/L             10%                    100%
  Salinity                                 ± 1.0 ppt              10%                    100%
  Depth                                    ± 0.5 m                10%                    100%
  pH                                       ± 0.3 units            NA                     100%
  Temperature                              ± 1.0°C                NA                     100%
Gross pathology of fish                    NA                     10%                    100%
Fish community composition:
  Counting                                 10%                    NA                     100%
  Taxonomic identification                 10%                    NA                     100%
  Length determinations                    ± 5 mm                 NA                     100%
Fish histopathology                        NA                     NA                     NA
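For readers implementing these checks, the two precision measures named in the caption of Table 4-3 (RPD
and RSD) can be computed as in the following Python sketch, offered only as an illustration; the standard
definitions are used, and the replicate values shown are invented.

    import statistics

    def rpd(x1, x2):
        """Relative percent difference between two duplicate measurements."""
        return abs(x1 - x2) / ((x1 + x2) / 2.0) * 100.0

    def rsd(values):
        """Relative standard deviation of two or more replicate measurements."""
        return statistics.stdev(values) / statistics.mean(values) * 100.0

    # Hypothetical duplicate TOC results (%); the precision goal in Table 4-3 is 10%.
    print(f"RPD = {rpd(2.10, 2.30):.1f}%")            # 9.1%, which meets the 10% goal
    print(f"RSD = {rsd([2.10, 2.30, 2.05]):.1f}%")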
4.4 COMPARABILITY
Comparability is defined as "the confidence with which one data set can be compared to another"
(Stanley and Verner 1985). Comparability of reporting units and calculations, data base management
processes, and interpretative procedures must be assured if the overall goals of EMAP are to be realized.
One goal of the EMAP-Estuaries program is to generate a high level of documentation for the above topics
to ensure that future EMAP efforts can be made comparable. For example, both field and laboratory
methods are described in full detail in manuals which will be made available to all field personnel and
analytical laboratories. Field crews will undergo intensive training prior to the start of field work. In
addition, the comparability of laboratory measurements is monitored through the laboratory
intercomparison exercises and the use of field split or duplicate performance evaluation samples. Finally,
the sampling design for EMAP-E monitoring has been made flexible enough to allow for analytical
adjustments, when necessary, to ensure data comparability.
4.5 ACCURACY (BIAS), PRECISION, AND TOTAL ERROR
The term "accuracy," which is used synonymously with the term "bias" in this plan, is defined as the
difference between a measured value and the true or expected value, and represents an estimate of
systematic error or net bias (Kirchner 1983; Hunt and Wilson 1986; Taylor 1987). Precision is defined as
the degree of mutual agreement among individual measurements, and represents an estimate of random
error (Kirchner 1983; Hunt and Wilson 1986; Taylor 1987). Collectively, accuracy and precision can
provide an estimate of the total error or uncertainty associated with an individual measured value.
Measurement quality objectives for the various indicators are expressed separately as maximum allowable
accuracy (i.e., bias) and precision goals (Table 4-3). Accuracy and precision goals may not be definable
for all parameters because of the nature of the measurement type. For example, accuracy measurements
are not possible for toxicity testing and fish pathology identifications because "true" or expected values do
not exist for these measurement parameters (see Table 4-3). In order to evaluate the MQOs for accuracy
and precision, various QA/QC samples will be collected and analyzed for most data collection activities.
Table 4-4 presents the types of samples to be used for quality assurance/quality control for each of the
various data acquisition activities except sediment and fish tissue contaminant analyses. The frequency of
QA/QC measurements and the types of QA data resulting from these samples or processes are also
presented in Table 4-4. Because several different types of QA/QC samples are required for the complex
analyses of chemical contaminants in sediment and tissue samples, they are presented and discussed
separately in Section 5 along with presentation of warning and control limits for the various chemistry QC
sample types.
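The plan does not prescribe a formula for combining the bias and precision estimates into a total-error
figure. The sketch below shows one common convention (combining net bias and a k-sigma random error in
quadrature), offered purely as an illustration with invented numbers, not as the program's method.

    import math

    def total_error(bias, std_dev, k=2.0):
        """Illustrative total-error estimate: combine net bias and a k-sigma
        random-error term in quadrature (one common convention; not specified
        by this plan)."""
        return math.sqrt(bias ** 2 + (k * std_dev) ** 2)

    # Hypothetical dissolved-oxygen example: 0.3 mg/L bias, 0.2 mg/L precision.
    print(f"Total error ~ {total_error(0.3, 0.2):.2f} mg/L")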
TABLE 4-4. Quality assurance sample types, frequency of use, and types of data generated for
EMAP-Estuaries monitoring (see Table 5-4 for chemical analysis QA/QC sample types).

                         QA Sample Type or                                  Data Generated for
Variable                 Measurement Procedure        Frequency of Use      Measurement Quality Definition

Sediment toxicity        Reference toxicant           Each experiment       Variance of replicated
tests                                                                       tests over time

Benthic species
composition:
  Sorting                Resort of sample             10% of each           No. animals found in resort
                                                      tech's work
  Sample counting        Recount and ID of            10% of each           No. of count and ID errors
  and identification     sorted animals               tech's work

Sediment grain size      Splits of a sample           10% of each           Duplicate results
                                                      tech's work

Organic carbon and       Duplicates and analysis      Each batch            Duplicate results and
acid volatile sulfide    of standards                                       standard recoveries

Dissolved oxygen         Water-saturated air          Daily                 Difference between measurement
conc. (H2O)              calibration                                        and percent saturation

                         Air-saturated water          Weekly                Difference between measurement
                         measurement                                        and saturation table value

Dissolved oxygen         Side-by-side comparison      At deployment and     Difference between
conc. (DataSonde 3)      with Surveyor II             retrieval of unit     DataSonde 3 and H2O

Salinity                 Secondary Seawater           Daily                 Difference between probe
                         Standard                                           measurement and standard value

Temperature              Thermometer reading          Daily                 Difference between probe
                                                                            and thermometer

Depth                    Check bottom depth against   Each station          Difference from depth finder
                         depth finder on boat

pH                       QC check with standard       Daily                 Difference from standard
                         buffer solutions

Fish identification      Fish preserved for           Each reference        Number of misidentifications
                         verification by taxonomist   pathology sample

Fish counts              Duplicate counts             10% of trawls         Replicated difference
                                                                            between determinations

Fish gross pathology     Specimens preserved          Per occurrence        Number of misidentifications
                         for confirmation

Fish histopathology      Confirmation by second       5% of slides          Number of confirmations
                         technician
SECTION 5
ANALYSIS OF CHEMICAL CONTAMINANTS IN SEDIMENT,
FISH TISSUE, AND WATER SAMPLES
5.1 OVERVIEW
Quality assurance of chemical measurements has many diverse aspects. This section presents EMAP-
Estuaries QA/QC protocols and requirements covering a range of activities, from sample collection and
laboratory analysis to final validation of the resultant data. Much of the guidance provided in this section
is based on protocols developed for EPA's Puget Sound Estuary Program (U.S. EPA 1989), as well as
those developed over many years on the National Oceanic and Atmospheric Administration's (NOAA's)
National Status and Trends (NS&T) Program. This guidance is applicable to low parts per billion analyses
of both estuarine sediment and tissue samples unless otherwise noted.
The EMAP-E program measures a variety of organic and inorganic contaminants in estuarine
sediment and fish tissue samples (Tables 5-1 and 5-2); these compounds are the same as those measured in
the NOAA NS&T Program, with a few additions. These contaminants are being measured for the purpose
of environmental monitoring, with the understanding that the data will not be used for litigation purposes.
Therefore, legal and contracting requirements as stringent as those used in the U.S. EPA Contract
Laboratory Program, for example, have not been applied to EMAP-E. Rather, EMAP-E requires its
laboratories to demonstrate comparability continuously through strict adherence to common QA/QC
procedures, routine analysis of Certified Reference Materials,1 and regular participation in an on-going
series of QA intercomparison exercises (round-robins). This is a "performance-based" approach for
quality assurance of low-level contaminant analyses, involving continuous laboratory evaluation through
the use of accuracy-based materials (e.g., CRMs), laboratory-fortified sample matrices, laboratory
reagent blanks, calibration standards, and laboratory and field replicates. The definition and use of
each of these types of quality control samples are explained in later sections.
1 Certified Reference Materials (CRMs) are samples in which chemical concentrations have been
determined accurately using a variety of technically valid procedures; these samples are accompanied by a
certificate or other documentation issued by a certifying body (e.g., agencies such as the National
Research Council of Canada (NRCC), U.S. EPA, U.S. Geological Survey, etc.). Standard Reference
Materials are CRMs issued by the National Institute of Standards and Technology, formerly the National
Bureau of Standards. A useful catalogue of marine science reference materials has been compiled by
Cantillo (1992).
TABLE 5.1 Chemicals to be measured in sediments by EMAP-Estuaries West Indian Province.

Polynuclear Aromatic Hydrocarbons (PAHs):
  Acenaphthene, Anthracene, Benz(a)anthracene, Benzo(a)pyrene, Biphenyl, Chrysene,
  Chrysene (C1-C4), Dibenz(a,h)anthracene, Dibenzothiophene, Dibenzothiophene (C1-C3),
  2,6-dimethylnaphthalene, Fluoranthene, Fluorene, Fluorene (C1-C3), 2-methylnaphthalene,
  1-methylnaphthalene, 1-methylphenanthrene, Naphthalene, Naphthalene (C1-C4), Pyrene,
  Benzo(b)fluoranthene, Acenaphthylene, Benzo(k)fluoranthene, Benzo(g,h,i)perylene,
  Indeno(1,2,3-c,d)pyrene, 2,3,5-trimethylnaphthalene

DDT and its metabolites:
  2,4'-DDD, 4,4'-DDD, 2,4'-DDE, 4,4'-DDE, 2,4'-DDT, 4,4'-DDT

Alkanes:
  C10-C34, Pristane, Phytane, Total alkanes

Organophosphorus pesticides:
  Terbufos, Diazinon, Disulfoton, Chlorpyrifos, Ethion, Carbofenothion

Chlorinated pesticides other than DDT:
  Aldrin, Alpha-Chlordane, Dieldrin, Endosulfan, Endrin, Heptachlor, Heptachlor epoxide,
  Hexachlorobenzene, Lindane (gamma-BHC), Mirex, Toxaphene, Trans-Nonachlor

21 PCB Congeners:
  PCB No.   Compound Name
  8         2,4'-dichlorobiphenyl
  18        2,2',5-trichlorobiphenyl
  28        2,4,4'-trichlorobiphenyl
  44        2,2',3,5'-tetrachlorobiphenyl
  52        2,2',5,5'-tetrachlorobiphenyl
  66        2,3',4,4'-tetrachlorobiphenyl
  101       2,2',4,5,5'-pentachlorobiphenyl
  105       2,3,3',4,4'-pentachlorobiphenyl
  110/77    2,3,3',4',6-pentachlorobiphenyl / 3,3',4,4'-tetrachlorobiphenyl
  118       2,3',4,4',5-pentachlorobiphenyl
  126       3,3',4,4',5-pentachlorobiphenyl
  128       2,2',3,3',4,4'-hexachlorobiphenyl
  138       2,2',3,4,4',5'-hexachlorobiphenyl
  153       2,2',4,4',5,5'-hexachlorobiphenyl
  170       2,2',3,3',4,4',5-heptachlorobiphenyl
  180       2,2',3,4,4',5,5'-heptachlorobiphenyl
  187       2,2',3,4',5,5',6-heptachlorobiphenyl
  195       2,2',3,3',4,4',5,6-octachlorobiphenyl
  206       2,2',3,3',4,4',5,5',6-nonachlorobiphenyl
  209       2,2',3,3',4,4',5,5',6,6'-decachlorobiphenyl

Trace Elements:
  Aluminum, Antimony, Arsenic, Cadmium, Chromium, Copper, Iron, Lead, Manganese, Mercury,
  Nickel, Selenium, Silver, Tin, Zinc

Other Measurements:
  Acid volatile sulfide, Total organic carbon, Tributyltin, Dibutyltin, Monobutyltin,
  Methylated mercury
TABLE 5.2 Chemicals to be measured in fish and shellfish tissue by EMAP-Estuaries West Indian Province.

DDT and its metabolites:
  2,4'-DDD, 4,4'-DDD, 2,4'-DDE, 4,4'-DDE, 2,4'-DDT, 4,4'-DDT

Chlorinated pesticides other than DDT:
  Aldrin, Alpha-Chlordane, Dieldrin, Endosulfan, Endrin, Heptachlor, Heptachlor epoxide,
  Hexachlorobenzene, Lindane (gamma-BHC), Mirex, Toxaphene, Trans-Nonachlor

21 PCB Congeners:
  PCB No.   Compound Name
  8         2,4'-dichlorobiphenyl
  18        2,2',5-trichlorobiphenyl
  28        2,4,4'-trichlorobiphenyl
  44        2,2',3,5'-tetrachlorobiphenyl
  52        2,2',5,5'-tetrachlorobiphenyl
  66        2,3',4,4'-tetrachlorobiphenyl
  101       2,2',4,5,5'-pentachlorobiphenyl
  105       2,3,3',4,4'-pentachlorobiphenyl
  110/77    2,3,3',4',6-pentachlorobiphenyl / 3,3',4,4'-tetrachlorobiphenyl
  118       2,3',4,4',5-pentachlorobiphenyl
  126       3,3',4,4',5-pentachlorobiphenyl
  128       2,2',3,3',4,4'-hexachlorobiphenyl
  138       2,2',3,4,4',5'-hexachlorobiphenyl
  153       2,2',4,4',5,5'-hexachlorobiphenyl
  170       2,2',3,3',4,4',5-heptachlorobiphenyl
  180       2,2',3,4,4',5,5'-heptachlorobiphenyl
  187       2,2',3,4',5,5',6-heptachlorobiphenyl
  195       2,2',3,3',4,4',5,6-octachlorobiphenyl
  206       2,2',3,3',4,4',5,5',6-nonachlorobiphenyl
  209       2,2',3,3',4,4',5,5',6,6'-decachlorobiphenyl

Trace Elements:
  Aluminum, Arsenic, Cadmium, Chromium, Copper, Iron, Lead, Mercury, Nickel, Selenium,
  Silver, Tin, Zinc

Butyltins:
  Monobutyltin, Dibutyltin, Tributyltin

Organophosphorus pesticides:
  Terbufos, Diazinon, Disulfoton, Chlorpyrifos, Ethion, Carbofenothion

Other measurements:
  Methylated mercury
No single analytical method has been approved officially for low-level (i.e., low parts per billion)
analysis of organic and inorganic contaminants in estuarine sediments and fish tissue. Recommended
methods for the EMAP-E program are those used in the NOAA NS&T Program (Lauenstein et al. 1993), as
well as those documented in the EMAP-E Laboratory Methods Manual (U.S. EPA 1992, in revision).
Under the EMAP-E performance-based chemistry QA program, laboratories are not required to use a
single, standard analytical method for each type of analysis, but rather are free to choose the best or most
feasible method within the constraints of cost and equipment. Each laboratory must, however,
continuously demonstrate proficiency and data comparability through routine analysis of accuracy-based
performance evaluation samples and reference materials representing real-life matrices.
5.2 QUALITY CONTROL PROCEDURES: SAMPLE COLLECTION,
PRESERVATION, AND HOLDING
Field personnel must strictly adhere to EMAP-E protocols to ensure the collection of representative,
uncontaminated sediment, water, and fish tissue chemistry samples. These sample collection protocols are
described in detail in the West Indian Province Field Operations Manual (Macauley and Summers 1995).
Briefly, the key aspects of quality control associated with chemistry sample collection are as follows: 1)
field personnel must be thoroughly trained in the proper use of sample collection gear and must be able to
distinguish acceptable versus unacceptable sediment grab samples or fish trawls in accordance with pre-
established criteria; 2) field personnel must be thoroughly trained to recognize and avoid potential sources
of sample contamination (e.g., engine exhaust, winch wires, deck surfaces, ice used for cooling); 3)
samplers and utensils which come in direct contact with the sample should be made of non-contaminating
materials (e.g., glass, high-quality stainless steel and/or Teflon®) and should be thoroughly cleaned
between sampling stations (e.g., Alconox® scrub followed by thorough rinse with ambient water); 4)
sample containers should be of the recommended type (Table 5-3) and must be free of contaminants (i.e.,
carefully pre-cleaned); 5) each sample container should be uniquely labeled (i.e., barcoded sample ID
label in conjunction with barcoded station label); 6) recommendations for sample collection, preservation
and holding times should be followed (Table 5-3).
TABLE 5-3. Summary of EMAP-E chemistry sample collection, preservation, and holding time requirements. (EPA
criteria recommend maximum sample holding times of 2-4 weeks at 4°C for most of the parameters listed here.
Currently, in the West Indian Province, logistical constraints prevent sample turn-around in the 2-4 week
recommended period. Therefore, unless stated otherwise, chemistry samples are held frozen for up to 1 year.)

                                                                  Sample          Max. Sample   Max. Extract
Parameter       Container              Volume       Sample Size   Preservation    Holding Time  Holding Time

Sediment        125-ml HDPE wide-      100-150 ml   75-100 g      Freeze (-18°C)  1 year        (a)
metals          mouth bottle                        (approx.)

Sediment        60-ml HDPE bottle      25-40 ml     30-50 g       Freeze (-18°C)  6 months      (a)
TOC                                                 (approx.)

Sediment        500-ml pre-cleaned     250-300 ml   300 g         Freeze (-18°C)  1 year        40 days
organics        glass wide-mouth jar                (approx.)
(including
butyltins)

Sediment        125-ml polypropylene   125 ml (b)   100 g (b)     Freeze (-18°C)  6 months      36 hours
acid volatile   wide-mouth bottle                   (approx.)
sulfide (AVS)

Fish tissue     Whole fish are         NA           NA            Freeze (-18°C)  1 year        40 days
(organics and   placed in water-
inorganics)     tight plastic bags

(a) No EPA criteria exist. Every effort should be made to analyze the sample as soon as possible following
extraction or, in the case of metals, digestion.
(b) AVS containers should be filled near the top to minimize the headspace; however, there should be a small
headspace to allow for sample expansion during freezing; containers should be capped tightly and then frozen.
Every effort should be made to minimize contact of the sediment with air and to analyze these samples as soon
as possible.
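The holding-time limits in Table 5-3 lend themselves to a simple automated check. The following Python
sketch is an illustrative aid only, not a required procedure; the sample record and the day counts used to
approximate the table's limits are invented for the example.

    from datetime import date

    # Approximate maximum sample holding times from Table 5-3 (days, frozen at -18°C).
    MAX_HOLDING_DAYS = {
        "sediment metals": 365,
        "sediment TOC": 182,
        "sediment organics": 365,
        "sediment AVS": 182,
        "fish tissue": 365,
    }

    def holding_time_ok(parameter, collected, analyzed):
        """Return True if the sample was analyzed within its maximum holding time."""
        return (analyzed - collected).days <= MAX_HOLDING_DAYS[parameter]

    # Hypothetical sample record: AVS sample collected July 10, analyzed November 1.
    print(holding_time_ok("sediment AVS", date(1995, 7, 10), date(1995, 11, 1)))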
5.3 QUALITY CONTROL PROCEDURES: LABORATORY OPERATIONS
5.3.1 Overview
The QA/QC requirements presented in the following sections are intended to provide a common
foundation for each laboratory's protocols; the resultant QA/QC data will enable an assessment of the
comparability of results generated by different laboratories and different analytical procedures. It should
be noted that the QA/QC requirements specified in this plan represent the minimum requirements for any
given analytical method. Additional requirements which are method-specific should always be followed,
as long as the minimum requirements presented in this document have been met.
The performance-based EMAP-E QA program for analytical chemistry laboratories consists of two
basic elements: 1.) initial demonstration of laboratory capability (e.g., performance evaluation) and 2.)
ongoing demonstration of capability. Prior to the analysis of samples, each laboratory must demonstrate
proficiency in several ways: written protocols for the analytical methods to be employed for sample
analysis must be submitted to EMAP for review, method detection limits for each analyte must be
calculated, an initial calibration curve must be established for all analytes, and acceptable performance
must be shown on a known or blind accuracy-based material. Following a successful first phase, the
laboratory must demonstrate its continued capabilities in several ways: participation in an ongoing series
of laboratory intercomparison exercises, repeated analysis of Certified Reference Materials, calibration
checks, and analysis of laboratory reagent blanks and fortified samples. These steps are detailed in the
following sections and summarized in Table 5-4. The sections are arranged to mirror the elements in
Table 5-4 to provide easy cross-reference for the reader.
The results for the various QA/QC samples should be reviewed by laboratory personnel immediately
following the analysis of each sample batch. These results then should be used to determine when
warning and control limit criteria have not been met and corrective actions must be taken, before
processing a subsequent sample batch. When warning limit criteria have not been met, the laboratory is
not obligated to halt analyses, but the analyst(s) is advised to investigate the cause of the exceedance.
When control limit criteria are not met, specific corrective actions are required before the analyses may
proceed. Warning and control limit criteria and recommended frequency of analysis for each QA/QC
element or sample type required in the EMAP-E program also are summarized in Table 5-4.
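As a concrete illustration of how this warning/control logic might be automated (a sketch only, not a
required procedure), the following Python fragment applies the continuing-calibration control limits given
in Table 5-4 below: within ±15% of the initial calibration on average across all analytes, and within ±25%
for every individual analyte. The analyte names and percent differences are hypothetical.

    def check_continuing_calibration(percent_diffs):
        """Apply the Table 5-4 control-limit criteria to a continuing
        calibration check: mean absolute difference from the initial
        calibration <= 15%, and no single analyte beyond 25%."""
        abs_diffs = [abs(d) for d in percent_diffs.values()]
        mean_ok = sum(abs_diffs) / len(abs_diffs) <= 15.0
        each_ok = all(d <= 25.0 for d in abs_diffs)
        if mean_ok and each_ok:
            return "in control"
        return "control limit exceeded; corrective action required"

    # Hypothetical percent differences from the initial calibration.
    print(check_continuing_calibration({"naphthalene": -8.0, "pyrene": 12.0, "chrysene": 22.0}))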
TABLE 5-4. Key elements of laboratory quality control for EMAP-Estuaries chemical analyses (see text
for detailed explanations).
1.) Initial demonstration of capability (prior to analysis of samples):

- Instrument calibration
    Warning limit criteria: NA
    Control limit criteria: NA
    Frequency: Initial, and then prior to analyzing each batch of samples

- Calculation of method detection limits (MDL)
    Warning limit criteria: NA
    Control limit criteria: Must be equal to or less than target values (see Table 5-5)
    Frequency: At least once each year

- Blind analysis of accuracy-based material
    Warning limit criteria: NA
    Control limit criteria: NA
    Frequency: Initial

2.) Ongoing demonstration of capability:

- Blind analysis of laboratory intercomparison exercise samples
    Warning limit criteria: NA
    Control limit criteria: NA
    Frequency: Regular intervals throughout the year

- Continuing calibration checks using calibration standard solutions
    Warning limit criteria: NA
    Control limit criteria: Should be within ±15% of initial calibration on average for all analytes,
        not to exceed ±25% for any one analyte
    Frequency: At a minimum, middle and end of each sample batch

- Analysis of certified reference material (CRM) or laboratory control material (LCM):
    Precision (1):
        Warning limit criteria: NA
        Control limit criteria: Value obtained for each analyte should be within 3s control chart limits
        Frequency: Value plotted on control chart after each analysis of the CRM
    Relative accuracy (2), PAHs:
        Warning limit criteria: Lab's value should be within ±25% of true value on average for all
            analytes; not to exceed ±30% of true value for more than 30% of individual analytes
        Control limit criteria: Lab's value should be within ±30% of true value on average for all
            analytes; not to exceed ±35% of true value for more than 30% of individual analytes
        Frequency: One with each batch of samples
    Relative accuracy, PCBs/pesticides:
        Warning limit criteria: same as for PAHs
        Control limit criteria: same as for PAHs
    Relative accuracy, inorganic elements:
        Warning limit criteria: Lab should be within ±15% of true value for each analyte
        Control limit criteria: Lab should be within ±20% of true value for each analyte

- Laboratory reagent blank
    Warning limit criteria: Analysts should use best professional judgement if analytes are detected
        at <3 times the MDL
    Control limit criteria: No analyte should be detected at >3 times the MDL
    Frequency: One with each batch of samples

- Laboratory fortified sample matrix (matrix spike)
    Warning limit criteria: NA
    Control limit criteria: Recovery should be within the range 50%-120% for at least 80% of the analytes
    Frequency: At least 5% of total number of samples
    NOTE: Samples to be spiked should be chosen at random; matrix spike solutions should contain all the
    analytes of interest. The final spiked concentration of each analyte in the sample should be at least
    10 times the calculated MDL.

- Laboratory fortified sample matrix duplicate (matrix spike duplicate)
    Warning limit criteria: NA
    Control limit criteria: RPD (3) must be ≤ 30 for each analyte
    Frequency: Same as matrix spike

- Field duplicates (field splits)
    Warning limit criteria: NA
    Control limit criteria: NA
    Frequency: 5% of total number of samples

- Internal standards (surrogates)
    Warning limit criteria: NA
    Control limit criteria: Recovery must be within the range 30%-150%
    Frequency: Each sample

- Injection internal standards
    Warning limit criteria: Lab develops its own
    Control limit criteria: Lab develops its own
    Frequency: Each sample

(1) The use of control charts to monitor precision for each analyte of interest should follow generally
accepted practices (e.g., Taylor 1987 and Section 3.2.5 of this document). Upper and lower control limits,
based on 99% confidence intervals around the mean, should be updated at regular intervals.
(2) "True" values in CRMs may be either "certified" or "noncertified" (it is recognized that absolute
accuracy can only be assessed using certified values, hence the term relative accuracy). Relative accuracy
is computed by comparing the laboratory's value for each analyte against either end of the range of values
(i.e., 95% confidence limits) reported by the certifying agency. The laboratory's value must be within
±35% of either the upper or lower 95% confidence interval value. Accuracy control limit criteria only
apply for analytes having CRM concentrations ≥ 10 times the laboratory's MDL.
(3) RPD = relative percent difference between matrix spike and matrix spike duplicate results (see
appropriate section for equation).
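To make the CRM accuracy criteria concrete, the Python sketch below applies the PAH control limits from
Table 5-4, evaluating only analytes whose CRM concentration is at least 10 times the laboratory's MDL (per
footnote 2). For simplicity it compares against the certified value itself rather than the ends of its 95%
confidence interval; all analyte names and numbers are hypothetical.

    def crm_accuracy_in_control(results, mdls):
        """Table 5-4 CRM control limits for PAHs: lab values within +/-30% of
        the true value on average, and no more than 30% of individual analytes
        deviating by more than 35%.  Only analytes with CRM concentrations
        >= 10x the lab's MDL are evaluated."""
        devs = [abs(lab - known) / known * 100.0
                for analyte, (lab, known) in results.items()
                if known >= 10.0 * mdls[analyte]]
        mean_ok = sum(devs) / len(devs) <= 30.0
        frac_beyond = sum(d > 35.0 for d in devs) / len(devs)
        return mean_ok and frac_beyond <= 0.30

    # Hypothetical CRM results: analyte -> (lab value, certified value), ng/g dry weight.
    results = {"fluoranthene": (95.0, 120.0), "pyrene": (100.0, 110.0)}
    mdls = {"fluoranthene": 10.0, "pyrene": 10.0}
    print(crm_accuracy_in_control(results, mdls))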
5.3.2 Initial Demonstration of Capability
Instrument Calibration
Equipment should be calibrated prior to the analysis of each sample batch, after each major
equipment disruption, and whenever ongoing calibration checks do not meet recommended control limit
criteria (Table 5-4). All calibration standards should be traceable to a recognized organization for the
preparation and certification of QA/QC materials (e.g., National Institute of Standards and Technology,
U.S. Environmental Protection Agency, etc.). Calibration curves must be established for each analyte and
each analytical batch from a calibration blank and a minimum of three analytical standards of increasing
concentration, covering the range of expected sample concentrations. The calibration curve should be
well-characterized and must be established prior to the analysis of samples. Only data that result from
quantification within the demonstrated working calibration range may be reported by the laboratory (i.e.,
quantification based on extrapolation is not acceptable). Samples outside the calibration range should be
diluted or concentrated, as appropriate, and reanalyzed.
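A minimal sketch of the calibration requirement just described is shown below: a linear curve fit through a
blank and three or more standards, with quantification refused outside the demonstrated range (since
extrapolation is not acceptable). The standard concentrations and instrument responses are invented.

    # Least-squares linear calibration from a blank plus three standards,
    # refusing to quantify outside the calibrated range.
    concentrations = [0.0, 1.0, 5.0, 10.0]        # standards (hypothetical units)
    responses      = [0.02, 1.05, 5.10, 9.90]     # instrument responses (hypothetical)

    n = len(concentrations)
    mean_x = sum(concentrations) / n
    mean_y = sum(responses) / n
    slope = (sum((x - mean_x) * (y - mean_y)
                 for x, y in zip(concentrations, responses))
             / sum((x - mean_x) ** 2 for x in concentrations))
    intercept = mean_y - slope * mean_x

    def quantify(response):
        """Convert a sample response to concentration, but only within the
        demonstrated working range; otherwise dilute or concentrate and rerun."""
        conc = (response - intercept) / slope
        if not (min(concentrations) <= conc <= max(concentrations)):
            raise ValueError("outside calibration range; dilute or concentrate and reanalyze")
        return conc

    print(f"{quantify(4.8):.2f}")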
Initial Documentation of Method Detection Limits
Analytical chemists have coined a variety of terms to define "limits" of detectability; definitions for
some of the more commonly used terms are provided in Keith et al. (1983) and in Keith (1991). In the
EMAP-E program, the Method Detection Limit (MDL) will be used to define the analytical limit of
detectability. The MDL represents a quantitative estimate of low-level response detected at the maximum
sensitivity of a method. The Code of Federal Regulations (40 CFR Part 136) gives the following rigorous
definition: "the MDL is the minimum concentration of a substance that can be measured and reported with
99% confidence that the analyte concentration is greater than zero and is determined from analysis of a
sample in a given matrix containing the analyte." Confidence in the apparent analyte concentration
increases as the analyte signal increases above the MDL.
Each EMAP-E analytical laboratory must calculate and report an MDL for each analyte of interest in
each matrix of interest (sediment or tissue) prior to the analysis of field samples for a given year. Each
laboratory is required to follow the procedure specified in 40 CFR Part 136 (Federal Register, Oct. 28,
1984) to calculate MDLs for each analytical method employed. The matrix and the amount of sample (i.e.,
dry weight of sediment or tissue) used in calculating the MDL should match as closely as possible the
matrix of the actual field samples and the amount of sample typically used. In order to ensure
comparability of results among different laboratories, MDL target values have been established for the
EMAP-E program (Table 5-5). The initial MDLs reported by each laboratory should be equal to or less
than these specified target values before the analysis of field samples may proceed. Each laboratory must
periodically (i.e., at least once each year) re-evaluate its MDLs for the analytical methods used and the
sample matrices typically encountered.
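For illustration, a minimal Python sketch of the 40 CFR Part 136 calculation is given below: the MDL is
the sample standard deviation of at least seven low-level replicate analyses multiplied by the one-sided
Student's t value at the 99% confidence level for n-1 degrees of freedom. The replicate concentrations
shown are hypothetical.

    # Sketch of the 40 CFR Part 136 MDL procedure: analyze at least seven
    # replicates of a low-level spiked matrix, then multiply the standard
    # deviation of the results by the one-sided 99% Student's t value.
    import statistics

    # One-sided 99% Student's t values for n-1 degrees of freedom (n = 7..10),
    # taken from standard t tables.
    T_99 = {6: 3.143, 7: 2.998, 8: 2.896, 9: 2.821}

    def method_detection_limit(replicates):
        """Return the MDL for a list of replicate spike results (same units)."""
        n = len(replicates)
        if n < 7:
            raise ValueError("40 CFR 136 requires at least seven replicates")
        s = statistics.stdev(replicates)   # sample standard deviation
        return T_99[n - 1] * s

    # Hypothetical: seven replicate analyses of a sediment spiked near the MDL
    replicates_ug_g = [0.92, 1.05, 0.88, 1.10, 0.97, 1.02, 0.95]
    print(f"MDL = {method_detection_limit(replicates_ug_g):.3f} ug/g dry weight")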
TABLE 5-5. Target method detection limits for EMAP-Estuaries analytes.

Inorganics (NOTE: concentrations in µg/g (ppm) dry weight)

                           Tissue           Sediments
Aluminum                   10.0             1500
Antimony                   not measured     0.2
Arsenic                    2.0              1.5
Cadmium                    0.2              0.05
Chromium                   0.1              5.0
Copper                     5.0              5.0
Iron                       50.0             500
Lead                       0.1              1.0
Manganese                  not measured     1.0
Mercury                    0.01             0.01
Nickel                     0.5              1.0
Selenium                   1.0              0.1
Silver                     0.01             0.01
Tin                        0.05             0.1
Zinc                       50.0             2.0

Organics (NOTE: concentrations in ng/g (ppb) dry weight)

                           Tissue           Sediments
PAHs                       not measured     10
PCB congeners              2.0              1.0
Chlorinated pesticides     2.0              1.0
Initial Blind Analysis of a Representative Sample
A representative sample matrix which is uncompromised, homogeneous and contains the analytes of
interest at concentrations of interest will be provided to each analytical laboratory new to the EMAP-E
program; this sample will be used to evaluate laboratory performance prior to the analysis of field
samples. The sample used for this initial demonstration of laboratory capability typically will be
distributed blind (i.e., the laboratory will not know the concentrations of the analytes of interest) as part of
the laboratory QA intercomparison exercises. A laboratory's performance generally will be considered
acceptable if its submitted values are within ±30% (for organic analyses) and ± 20% (for inorganic
analyses) of the known concentration of each analyte of interest in the sample. These criteria apply only
for analyte concentrations equal to or greater than 10 times the MDL established by the laboratory. If the
results for the initial analysis fail to meet these criteria, the laboratory will be required to repeat the
analysis until the performance criteria are met, prior to the analysis of real samples.
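The acceptance test amounts to a simple relative-error screen, sketched below in Python with hypothetical
numbers; analytes whose known concentrations fall below 10 times the MDL are simply excluded from the
evaluation.

    # Sketch of the initial capability check: a value passes if it is within
    # +/-30% (organics) or +/-20% (inorganics) of the known concentration,
    # evaluated only where the known value is >= 10x the laboratory's MDL.
    TOLERANCE = {"organic": 0.30, "inorganic": 0.20}

    def passes_initial_check(measured, known, mdl, analyte_class):
        if known < 10 * mdl:
            return None                      # criterion does not apply
        return abs(measured - known) / known <= TOLERANCE[analyte_class]

    print(passes_initial_check(measured=8.1, known=10.0, mdl=0.5,
                               analyte_class="inorganic"))   # True (19% error)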
5.3.3 Ongoing Demonstration of Capability
Laboratory Participation in Intercomparison Exercises
Through an interagency agreement, NOAA's NS&T Program and EPA's EMAP-E program jointly
sponsor an ongoing series of laboratory intercomparison exercises (round-robins). All EMAP-E analytical
laboratories are required to participate in these exercises, which are conducted jointly by the National
Institute of Standards and Technology (NIST) and the National Research Council of Canada (NRCC).
These exercises provide a tool for continuous improvement of laboratory measurements by helping
analysts identify and resolve problems in methodology and/or QA/QC. The results of these exercises also
are used to evaluate both the individual and collective performance of the participating analytical
laboratories on a continuous basis. The EMAP-E laboratories are required to initiate corrective actions if
their performance in these intercomparison exercises falls below certain predetermined minimal standards,
described in later sections.
Typically, three or four different exercises are conducted over the course of a year. In a typical
exercise, either NIST or NRCC will distribute performance evaluation samples in common to each
laboratory, along with detailed instructions for analysis. A variety of performance evaluation samples
have been utilized in the past, including accuracy-based solutions, sample extracts, and representative
matrices (e.g., sediment or tissue samples). Laboratories are required to analyze the sample(s) "blind" and
must submit their results in a timely manner both to the EMAP-E QA Coordinator, and to either NIST or
NRCC (as instructed). Laboratories that fail to maintain acceptable performance may be required to
provide an explanation and/or undertake appropriate corrective actions. At the end of each calendar year,
coordinating personnel at NIST and NRCC hold a QA workshop to present and discuss the
intercomparison exercise results. Representatives from each laboratory are expected to participate in the
annual QA workshops, which provide a forum for discussion of analytical problems brought to light in the
intercomparison exercises.
Routine Analysis of Certified Reference Materials or Laboratory Control Materials
Certified Reference Materials (CRMs) generally are considered the most useful QC samples for
assessing the accuracy of a given analysis (i.e., the closeness of a measurement to the "true" value).
Certified Reference Materials can be used to assess accuracy because they have "certified" concentrations
of the analytes of interest, as determined through replicate analyses by a reputable certifying agency using
two independent measurement techniques for verification. In addition, the certifying agency may provide
"noncertified" or "informational" values for other analytes of interest. Such values are determined using a
single measurement technique, which may introduce unrecognized bias. Therefore, noncertified values
must be used with caution in evaluating the performance of a laboratory using a method which differs
from the one used by the certifying agency. A list of reference materials commonly used by EMAP-E
laboratories is presented in Table 5-6.
A Laboratory Control Material (LCM) is similar to a Certified Reference Material in that it is a
homogeneous matrix which closely matches the samples being analyzed. A "true" LCM is one which is
prepared (i.e., collected, homogenized, and stored in a stable condition) strictly for use in-house by a
single laboratory. Alternately, the material may be prepared by a central laboratory and distributed to
others (so-called regional or program control materials). Unlike CRMs, concentrations of the analytes of
interest in LCMs are not certified but are based upon a statistically valid number of replicate analyses by
one or several laboratories. In practice, this material can be used to assess the precision (i.e., consistency)
Table 5-6. Certified Reference Materials commonly used by EMAP-E laboratories. Standard reference
materials (SRMs) are available from NIST (phone 301-975-6776); all other reference materials listed are
available from NRC (phone 613-993-2359).
Calibration Solutions:
SRM 1491 Aromatic Hydrocarbons in Hexane/Toluene
SRM 1492 Chlorinated Pesticides in Hexane
SRM 1493 Chlorinated Biphenyl Congeners in 2,2,4-Trimethylpentane
SRM 2260 Aromatic Hydrocarbons in Toluene
SRM 2261 Chlorinated Pesticides in Hexane
SRM 2262 Chlorinated Biphenyl Congeners in 2,2,4-Trimethylpentane
Environmental Matrices (Organics):
SRM 1941a Organics in Marine Sediment
SRM 1974 Organics in Mussel Tissue (Mytilus edulis)
Environmental Matrices (Inorganics):
SRM 1646 Estuarine Sediment
MESS-1 Estuarine Sediment
BEST-1 Marine Sediment
DOLT-1 Dogfish Liver
BCSS-1 Marine Sediment
PACS-1 Harbor Sediment
DORM-1 Dogfish Muscle
SRM 1566a Oyster Tissue
of a single laboratory, as well as to determine the degree of comparability among different laboratories. If
available, LCMs may be preferred for routine (i.e., day to day) analysis because CRMs are relatively
expensive. However, CRMs still must be analyzed at regular intervals (e.g., monthly or quarterly) to
provide a check on accuracy.
Routine analysis of CRMs or, when available, LCMs represents a particularly vital aspect of the
"performance-based" EMAP-E QA philosophy. At least one CRM or LCM must be analyzed along with
each batch of 25 or fewer samples (Table 5-4). For CRMs, both the certified and noncertified
concentrations of the target analytes should be known to the analyst(s) and should be used to provide an
immediate check on performance before proceeding with a subsequent sample batch. Performance criteria
for both precision and accuracy have been established for analysis of CRMs or LCMs (Table 5-4); these
criteria are discussed in detail in the following paragraphs. If the laboratory fails to meet either the
precision or accuracy control limit criteria for a given analysis of the CRM or LCM, the data for the entire
batch of samples are suspect. Calculations and instruments should be checked; the CRM or LCM may have
to be reanalyzed (i.e., reinjected) to confirm the results. If the values are still outside the control limits in
the repeat analysis, the laboratory is required to find and eliminate the source(s) of the problem and repeat
the analysis of that batch of samples until control limits are met, before continuing with further sample
processing. The results of the CRM or LCM analysis should never be used by the laboratory to "correct"
the data for a given sample batch.
Precision criteria: Each laboratory is expected to maintain control charts for use by analysts in
monitoring the overall precision of the CRM or LCM analyses. Upper and lower control chart limits (e.g.,
warning limits and control limits) should be updated at regular intervals; control limits based on 3
standard deviations of the mean generally are recommended (Taylor 1987). Following the analysis of all
samples in a given year, an RSD (relative standard deviation, or coefficient of variation) will be calculated
for each analyte of interest in the CRM. For each analyte having a CRM concentration ≥ 10 times the
laboratory's MDL, an overall RSD of < 30% will be considered acceptable precision. Failure to meet this
goal will result in a thorough review of the laboratory's control charting procedures and analytical
methodology to determine if improvements in precision are possible.
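A minimal sketch of this year-end precision calculation follows; the CRM results, certified value, and
MDL shown are hypothetical.

    # Sketch: compute the RSD (coefficient of variation) of all CRM analyses
    # of one analyte for the year; an overall RSD < 30% is acceptable where
    # the CRM concentration is >= 10x the laboratory MDL.
    import statistics

    def rsd_percent(values):
        """Relative standard deviation (coefficient of variation), in percent."""
        return 100.0 * statistics.stdev(values) / statistics.mean(values)

    crm_results = [4.1, 3.8, 4.4, 4.0, 3.9, 4.2]   # e.g., ug/g dry weight
    crm_certified, mdl = 4.0, 0.2

    if crm_certified >= 10 * mdl:                  # criterion applies
        rsd = rsd_percent(crm_results)
        print(f"RSD = {rsd:.1f}% ->",
              "acceptable" if rsd < 30 else "review methods")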
Accuracy criteria: The "absolute" accuracy of an analytical method can be assessed using CRMs
only when certified values are provided for the analytes of interest. However, the concentrations of many
analytes of interest to EMAP-E are provided only as noncertified values in some of the more commonly used
CRMs. Therefore, control limit criteria are based on "relative accuracy," which is evaluated for each
analysis of the CRM or LCM by comparison of a given laboratory's values relative to the "true" or
"accepted" values in the LCM or CRM. In the case of CRMs, this includes both certified and noncertified
values and encompasses the 95% confidence interval for each value as described in Table 5-4.
Accuracy control limit criteria have been established both for individual compounds and combined
groups of compounds (Table 5-4). There are two combined groups of compounds for the purpose of
evaluating relative accuracy for organic analyses, PAHs and PCBs/pesticides. The laboratory's value
should be within ±30% of the true value on average for each combined group of organic compounds, and
the laboratory's value should be within ±35% of either the upper or lower 95% confidence limit for at least
70% of the individual compounds in each group. For inorganic analyses, the laboratory's value should be
within ±20% of either the upper or lower 95% confidence limit for each analyte of interest in the CRM.
Because of the inherent variability in analyses near the MDL, control limit criteria for relative accuracy
only apply to analytes having CRM true values which are ≥ 10 times the MDL established by the
laboratory.
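The relative-accuracy test can be pictured as the following Python sketch, in which a measured value
passes if it lies inside the 95% confidence interval or within the stated tolerance of the nearer
confidence limit; the values shown are hypothetical. For organic analyses the same test is applied
compound by compound, with at least 70% of the compounds in each group required to pass and the group
average required to fall within ±30% of the true values.

    # Illustrative relative-accuracy screen against a CRM value and its 95%
    # confidence interval. Tolerance is 0.20 for inorganic analytes and 0.35
    # for individual organic compounds.
    def within_relative_accuracy(measured, ci_low, ci_high, tolerance):
        if ci_low <= measured <= ci_high:
            return True                          # inside the interval itself
        bound = ci_low if measured < ci_low else ci_high
        return abs(measured - bound) / bound <= tolerance

    # Hypothetical inorganic analyte with a certified interval of 9.0-10.0 ug/g
    print(within_relative_accuracy(11.5, 9.0, 10.0, tolerance=0.20))   # True
    print(within_relative_accuracy(12.5, 9.0, 10.0, tolerance=0.20))   # False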
Continuing Calibration Checks
The initial instrument calibration performed prior to the analysis of each batch of samples is checked
through the analysis of calibration check samples (i.e., calibration standard solutions) inserted as part of
the sample stream. Calibration standard solutions used for the continuing calibration checks should
contain all the analytes of interest. At a minimum, analysis of the calibration check solution should occur
somewhere in the middle and at the end of each sample batch. Analysts should use best professional
judgement to determine if more frequent calibration checks are necessary or desirable.
If the control limit for analysis of the calibration check standard is not met (Table 5-4), the initial
calibration will have to be repeated. If possible, the samples analyzed before the calibration check sample
that failed the control limit criteria should be reanalyzed following the recalibration. The laboratory
should begin by reanalyzing the last sample analyzed before the calibration standard which failed. If the
relative percent difference (RPD) between the results of this reanalysis and the original analysis exceeds
30%, the instrument is assumed to have been out of control during the original analysis. If possible,
reanalysis of samples should progress in reverse order until the RPD between initial and reanalysis results
is less than 30. Only the reanalysis results should be reported by the laboratory. If
it is not possible or feasible to perform reanalysis of samples, all earlier data (i.e., since the last successful
calibration control check) are suspect. In this case, the laboratory should prepare a narrative explanation
to accompany the submitted data.
Laboratory Reagent Blank
Laboratory reagent blanks (also called method blanks or procedural blanks) are used to assess
laboratory contamination during all stages of sample preparation and analysis. For both organic and
inorganic analyses, one laboratory reagent blank should be run in every sample batch. The reagent blank
should be processed through the entire analytical procedure in a manner identical to the samples. Warning
and control limits for blanks (Table 5-4) are based on the laboratory's MDLs as documented prior to the
analysis of samples. A reagent blank concentration between the MDL and 3 times the MDL for one or
more of the analytes of interest should serve as a warning limit requiring further investigation based on the
best professional judgement of the analyst(s). A reagent blank concentration ≥ 3 times the MDL for one or
more of the analytes of interest requires definitive corrective action to identify and eliminate the source(s)
of contamination before proceeding with sample analysis.
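These thresholds reduce to a three-way classification, sketched below in Python with hypothetical values.

    # Sketch of the reagent-blank screen: between 1x and 3x the MDL triggers a
    # warning (analyst judgement); >= 3x the MDL requires corrective action
    # before sample analysis proceeds.
    def blank_status(blank_conc, mdl):
        if blank_conc >= 3 * mdl:
            return "control limit exceeded: find and eliminate contamination"
        if blank_conc > mdl:
            return "warning: investigate using best professional judgement"
        return "acceptable"

    print(blank_status(blank_conc=0.07, mdl=0.02))   # control limit exceeded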
Internal Standards
Internal standards (commonly referred to as "surrogates," "surrogate spikes" or "surrogate
compounds") are compounds chosen to simulate the analytes of interest in organic analyses. The internal
standard represents a reference analyte against which the signal from the analytes of interest is compared
directly for the purpose of quantification. Internal standards must be added to each sample, including
QA/QC samples, prior to extraction. The reported concentration of each analyte should be adjusted to
correct for the recovery of the internal standard, as is done in the NOAA National Status and Trends
Program. The internal standard recovery data therefore should be carefully monitored; each laboratory
must report the percent recovery of the internal standard(s) along with the target analyte data for each
sample. If possible, isotopically labeled analogs of the analytes should be used as internal standards.
Control limit criteria for internal standard recoveries are provided in Table 5-4. Each laboratory
should set its own warning limit criteria based on the experience and best professional judgement of the
analyst(s). It is the responsibility of the analyst(s) to demonstrate that the analytical process is always "in
control" (i.e., highly variable internal standard recoveries are not acceptable for repeat analyses of the
same certified reference material and for the matrix spike/matrix spike duplicate).
Injection Internal Standards
For gas chromatography (GC) analysis, injection internal standards (also referred to as "internal
standards" by some analysts) are added to each sample extract just prior to injection to enable optimal
quantification, particularly of complex extracts subject to retention time shifts relative to the analysis of
standards. Injection internal standards are essential if the actual recovery of the internal standards added
prior to extraction is to be calculated. The injection internal standards also can be used to detect and
correct for problems in the GC injection port or other parts of the instrument. The compounds used as
injection internal standards must be different from those already used as internal standards. The analyst(s)
should monitor injection internal standard retention times and recoveries to determine if instrument
maintenance or repair, or changes in analytical procedures, are indicated. Corrective action should be
initiated based on the experience of the analyst(s) and not because warning or control limits are exceeded.
Instrument problems that may have affected the data or resulted in the reanalysis of the sample should be
documented properly in logbooks and/or internal data reports and used by laboratory personnel to take
appropriate corrective action.
Matrix Spike and Matrix Spike Duplicate
A laboratory-fortified sample matrix (commonly called a matrix spike, or MS) and a laboratory-
fortified sample matrix duplicate (commonly called a matrix spike duplicate, or MSD) will be used both to
evaluate the effect of the sample matrix on the recovery of the compound(s) of interest and to provide an
estimate of analytical precision. A minimum of 5% of the total number of samples submitted to the
laboratory in a given year should be selected at random for analysis as matrix spikes/matrix spike
duplicates. Each MS/MSD sample is first homogenized and then split into three subsamples. Two of
these subsamples are fortified with the matrix spike solution and the third subsample is analyzed as is to
provide a background concentration for each analyte of interest. The matrix spike solution should contain
all the analytes of interest. The final spiked concentration of each analyte in the sample should be at least
10 times the MDL for that analyte, as previously calculated by the laboratory.
Recovery data for the fortified compounds ultimately will provide a basis for determining the
prevalence of matrix effects in the sediment samples analyzed during the project. If the percent recovery
for any analyte in the MS or MSD is less than the recommended warning limit of 50%, the chromatograms
and raw data quantitation reports should be reviewed. If an explanation for a low percent recovery value is
not discovered, the instrument response may be checked using a calibration standard. Low matrix spike
recoveries may be a result of matrix interferences and further instrument response checks may not be
warranted, especially if the low recovery occurs in both the MS and MSD and the other QC samples in the
batch indicate that the analysis was "in control". An explanation for low percent recovery values for
MS/MSD results should be discussed in a cover letter accompanying the data package. Corrective actions
taken and verification of acceptable instrument response must be included.
Analysis of the MS/MSD also is useful for assessing laboratory precision. The relative percent
difference (RPD) between the MS and MSD results should be less than 30 for each analyte of interest (see
Table 5-4). The RPD is calculated as follows:
RPD = [(C1 - C2) x 100] / [(C1 + C2)/2]

where:  C1 is the larger of the duplicate results for a given analyte
        C2 is the smaller of the duplicate results for a given analyte
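Expressed directly in Python (a minimal sketch with hypothetical duplicate results):

    # Direct implementation of the RPD equation above; C1 and C2 are the
    # larger and smaller of the duplicate results.
    def relative_percent_difference(c1, c2):
        c1, c2 = max(c1, c2), min(c1, c2)
        return (c1 - c2) * 100.0 / ((c1 + c2) / 2.0)

    print(relative_percent_difference(12.0, 9.0))   # 28.6, meets the <= 30 limit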
If results for any analytes do not meet the RPD ≤ 30 control limit criterion, calculations and instruments
should be checked. A repeat analysis may be required to confirm the results. Results which repeatedly
fail to meet the control limit criteria indicate poor laboratory precision. In this case, the laboratory is
obligated to halt the analysis of samples and eliminate the source of the imprecision before proceeding.
Field Duplicates and Field Splits
For the EMAP-E program, sediment will be collected at each station using a grab sampler. Each time
the sampler is retrieved, the top 2 cm of sediment will be scraped off, placed in a large mixing container,
and homogenized, until a sufficient amount of material has been obtained. At approximately 5% of the
stations, the homogenized material will be placed in four separate sample containers for subsequent
chemical analysis. Two of the sample containers will be submitted as blind field duplicates to the primary
analytical laboratory. The other two containers, also called field duplicates, will be sent blind to a second
laboratory. Together, the two pairs of duplicates are called field splits. The analysis of the field duplicates
will provide an assessment of single laboratory precision. The analysis of the field duplicates and field
splits will provide an assessment of both inter- and intralaboratory precision, as well as an assessment of
the efficacy of the field homogenization technique.
5.4 OTHER SEDIMENT MEASUREMENTS
The preceding sections presented QA/QC requirements covering laboratory analysis of sediment and
fish tissue samples for organics (i.e., PAHs, PCBs, and chlorinated pesticides) and inorganics (i.e., metals).
In addition to these "conventional" contaminants, EMAP-E laboratories are required to measure several
ancillary sediment parameters, such as total organic carbon (TOC), acid volatile sulfide (AVS), and tri-, di-
and monobutyltin (TBT, DBT, MBT) concentrations. The laboratory QA/QC requirements associated with
these "other sediment measurements" are presented in the following sections.
5.4.1 Total Organic Carbon
As a check on precision, each laboratory should analyze at least one total organic carbon (TOC)
sample in duplicate for each batch of 25 or fewer samples. The relative percent difference (RPD) between
the two duplicate measurements should be less than 20%. If this control limit is exceeded, analysis of
subsequent sample batches should stop until the source of the discrepancy is determined and the system
corrected.
At least one certified reference material (CRM) or, if available, one laboratory control material
(LCM) should be analyzed along with each batch of 25 or fewer TOC samples. Any one of several marine
sediment CRMs distributed by the National Research Council of Canada's Marine Analytical Chemistry
Standards Program (e.g., the CRMs named "BCSS-1," "MESS-1," and "PACS-1;" see Table 5-6) has
certified concentrations of total carbon and is recommended for this use. Prior to analysis of actual
samples, it is recommended that each laboratory perform several TOC analyses using a laboratory control
material or one of the aforementioned CRMs to establish a control chart (the values obtained by the
laboratory for total organic carbon should be slightly less than the certified value for total carbon in the
CRM). The control chart then should be used to assess the laboratory's precision for subsequent analyses
of the LCM or CRM with each sample batch. In addition, a method blank should be analyzed with each
sample batch. Total organic carbon concentrations should be reported as µg/g (ppm) dry weight of the
unacidified sediment sample. Data reported for each sample batch should include QA/QC sample results
(duplicates, CRMs or LCMs, and method blanks). Any factors that may have influenced data quality
should be discussed in a cover letter accompanying the submitted data.
5.4.2 Acid Volatile Sulfide
Quality control of acid volatile sulfide (AVS) measurements is achieved through the routine analysis
of a variety of QA/QC samples. These are outlined in the following section and described in full detail in
the EMAP-E Laboratory Methods Manual (U.S. EPA, in preparation). Prior to the analysis of samples, the
laboratory must establish a calibration curve and determine a limit of reliable detection for sulfide for the
analytical method being employed. Following this, laboratory performance will be assessed through
routine analysis of laboratory duplicates, calibration check standards, laboratory-fortified blanks (i.e.,
spiked blanks), and laboratory-fortified sample matrices (i.e., matrix spikes).
One sample in every batch of 25 or fewer samples should be analyzed in duplicate as a check on
laboratory precision. The relative percent difference (RPD) between the two analyses should be less than
20%. If the RPD exceeds 20%, a third analysis should be performed. If the relative standard deviation of
the three determined concentrations exceeds 20%, the individual analyses should be examined to
determine if nonrandom errors may have occurred. As previously discussed, field duplicates and splits
also will be collected for AVS determination to assess both inter- and intralaboratory precision.
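The duplicate-then-triplicate decision rule can be sketched as follows in Python; the AVS concentrations
are hypothetical.

    # Sketch of the AVS precision rule: if the duplicate RPD is >= 20%, run a
    # third analysis; if the RSD of the three results also exceeds 20%, inspect
    # the individual analyses for nonrandom error.
    import statistics

    def avs_precision_check(run1, run2, rerun=None):
        rpd = abs(run1 - run2) * 100.0 / ((run1 + run2) / 2.0)
        if rpd < 20:
            return "precision acceptable"
        if rerun is None:
            return "RPD >= 20%: perform a third analysis"
        values = [run1, run2, rerun]
        rsd = 100.0 * statistics.stdev(values) / statistics.mean(values)
        return "examine for nonrandom error" if rsd > 20 else "precision acceptable"

    print(avs_precision_check(2.4, 3.1))   # RPD ~25%, triggers a third analysis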
Due to the instability of AVS during drying and handling in air, CRMs have not been developed for
assessing overall measurement accuracy. Therefore, each laboratory must analyze at least one calibration
check standard, one laboratory-fortified blank, and one laboratory-fortified sample matrix in each batch of
25 or fewer samples as a way of determining the accuracy of each step entailed in performing the analysis.
The concentration of sulfide in each of these three types of accuracy check samples will be known to the
analyst; the calculated concentration of sulfide in each sample should be within ± 15% of the known
concentration.
If the laboratory is not within ± 15% of the known concentration for the calibration check solution,
instruments used for AVS measurement must be recalibrated and/or the stock solutions redetermined by
titration. If the laboratory fails to achieve the same accuracy (within ± 15% of the true value) for AVS in
the laboratory-fortified blank, sources of error (e.g., leaks, excessive gas flows, poor sample-acid slurry
agitation) should be determined for the analytical system prior to continuing. If AVS recovery falls
outside the 85% - 115% range for the matrix spike, the system should be evaluated for sources of error and
the analysis should be repeated. If recovery remains unacceptable, it is possible that matrix interferences
are occurring. If possible, the analysis should be repeated using smaller amounts of sample to reduce the
interferant effects. Results for all QA/QC samples (duplicates, calibration check standards, spiked blanks,
and matrix spikes) should be submitted by the laboratory as part of the data package for each batch of
samples, along with a narrative explanation for results outside control limits.
5.4.3 Butyltins
Assessment of the distribution and environmental impact of butyltin species of interest to the EMAP-
E program (tributyltin, dibutyltin and monobutyltin) requires their measurement in marine sediment and
tissue samples at trace levels. Quality control of these measurements consists of checks on laboratory
precision and accuracy. One laboratory reagent blank must be run with each batch of 25 or fewer samples.
A reagent blank concentration between the MDL and 3 times the MDL should serve as a warning limit
requiring further investigation based on the best judgement of the analyst(s). A reagent blank
concentration equal to or greater than 3 times the MDL requires corrective action to identify and eliminate
the source(s) of contamination, followed by reanalysis of the samples in the associated batch.
One laboratory-fortified sample matrix or laboratory-fortified blank should be analyzed along with
each batch of 25 or fewer samples to evaluate the recovery of the butyltin species of interest. The
butyltins should be added at 5 to 10 times their MDLs as previously calculated by the laboratory. If the
percent recovery for any of the butyltins in the matrix spike or spiked blank is outside the range 70 to
130%, analysis of subsequent sample batches should stop until the source of the discrepancy is determined
and the system corrected.
The National Research Council of Canada sediment reference material "PACS-1," which has certified
concentrations of the three butyltin species of interest, also should be analyzed along with each batch of 25
or fewer sediment samples as a check on accuracy and reproducibility (i.e., batch-to-batch precision). If
values obtained by the laboratory for butyltins in "PACS-1" are not within ±30% of the certified values,
the data for the entire batch of samples are suspect. Calculations and instruments should be checked; the
CRM may have to be reanalyzed to confirm the results. If the values are still outside the control limits in
the repeat analysis, the laboratory is required to determine the source(s) of the problem and repeat the
analysis of that batch of samples until control limits are met, before continuing with further sample
processing.
5.5 QUALITY CONTROL PROCEDURES: INFORMATION MANAGEMENT
5.5.1 Sample Tracking
EMAP-E information management personnel have developed a comprehensive system for barcode
labeling of sample containers, recording sampling information in the field and tracking sample shipments.
A complete description of this system is provided in the EMAP-E Information Management Plan (Rosen et
al. 1991) and also summarized in Section 11 of this plan. Each analytical laboratory must designate a
sample custodian who is authorized to check the condition of and sign for incoming field samples, obtain
documents of shipment, and verify sample custody records. This individual is required, upon receipt of
samples, to record and transmit all tracking information to the Province Information Management Center.
The use of barcode labels and readers provided by the Province will facilitate this process. Laboratory
personnel should be aware of the required sample holding times and conditions (see Table 5-3), and there
must be clearly defined custody procedures for sample handling, storage, and disbursement in the
laboratory.
5.5.2 Data Reporting Requirements
As previously indicated, laboratory personnel must verify that the measurement process was "in
control" (i.e., all specified QA/QC requirements were met) for each batch of samples before proceeding
with the analysis of a subsequent batch. In addition, each laboratory must establish a system for detecting
and eliminating transcription and/or calculation errors prior to reporting data. It is recommended that an
individual not involved directly in sample processing be designated as laboratory QA Officer to perform
these verification checks independent of day-to-day laboratory operations.
Only data which has met QA requirements should be submitted by the laboratory. When QA
requirements have not been met, the samples should be reanalyzed and only the results of the reanalysis
should be submitted, provided they are acceptable. Each data package should consist of the following:
• A cover letter providing a brief description of the procedures and instrumentation used (including the
procedure(s) used to calculate MDLs), as well as a narrative explanation of analytical problems (if any),
departures from protocols, or failure(s) to meet required quality control limits.
• Tabulated results in hard copy form, including sample size, wet weight, dry weight, and
concentrations of the analytes of interest (reported in units identified to three significant figures unless
otherwise justified). Concentration units should be ng/g or µg/g (dry weight) for sediment or tissue. The
results should be checked for accuracy and the report signed by the laboratory manager or designee.
• Tabulated results in computer-readable form (e.g., diskette) included in the same shipment as the hard
copy data, but packaged in a diskette mailer to prevent damage. Presently, there are three acceptable
formats for computer-readable data, descriptions of which are available upon request from the Province
Information Manager: (1) the EPA Standard Format specified in EPA Order 2180.2 ("Data Standards for
the Electronic Transmission of Laboratory Measurement Results"), (2) ASCII text files in a format
specified by the Province Information Manager, or (3) any format agreed upon by the submitting
laboratory and the Province Information Manager. If data is not delivered in one of these formats, the data
package will be considered incomplete and will not be accepted.
• Tabulated method detection limits achieved for the samples.
• Results for all QA/QC samples (e.g., CRMs, calibration check samples, blanks, matrix spike/matrix
spike duplicates, etc.) must be submitted by the laboratory as part of the data package for each batch of
samples analyzed. The laboratory must provide a "batch number" as a way to link samples from a given
batch or analytical set with their accompanying QA/QC samples. The laboratory should denote QA/QC
samples using the codes (abbreviations) and reporting units specified in Table 5-7.
Laboratories are responsible for assigning only two data qualifier codes or "flags" to the submitted
data. If an analyte is not detected, the laboratory should report the result either as "ND" or else leave the
"RESULT" field empty, followed by the letter "a" in the "QACODE" field and the method detection limit
(MDL) in the "MDL" field. The "a" code has the following meaning: "The analyte was not detected. The
detection limit (MDL) is reported as a separate variable." If a quantifiable signal is observed, the
laboratory should report a concentration for the analyte; the data qualifier code "b" then should be used to
flag any reported values that are below the laboratory's MDL. The "b" code has the following meaning:
"The reported concentration is below or equal to the detection limit. The detection limit (MDL) is reported
as a separate variable."
TABLE 5-7. Codes for denoting QA/QC samples in submitted data packages.

Code       Description                            Unit of Measure
CLC        Continuing calibration check sample    Percent recovery
LRB        Lab reagent blank                      varies
LCM        Lab control material                   µg/g or ng/g dry wt.
LCMPR      Lab control material % recovery        Percent recovery
LF1        Lab spiked sample - 1st member         µg/g or ng/g dry wt.
LF1PR      Lab spiked sample - 1st mem. % rec.    Percent recovery
LF2        Lab spiked sample - 2nd member         µg/g or ng/g dry wt.
LF2PR      Lab spiked sample - 2nd mem. % rec.    Percent recovery
MSDRPD     Rel % difference: LF1 to LF2           Percent
LFB        Lab fortified blank                    Percent recovery
LSFPR      Lab spiked sample % rec.               Percent recovery
LDRPD      Lab duplicate relative % diff.         Percent
There may be a limited number of situations where sample reanalysis is not possible or practical
(i.e., minor exceedance of a single control limit criterion). The laboratory is expected to provide a detailed
explanation of any factors affecting data quality or interpretation; this explanation should be in the form of
a cover letter accompanying each submitted data package. The narrative explanation is in lieu of
additional data qualifier codes supplied by the laboratory (other than the "a" and "b" codes). Over time,
depending on the nature of these narrative explanations, the EMAP-E program expects to develop a limited
list of codes for qualifying data in the database (in addition to the "a" and "b" codes).
5.5.3 Data Evaluation Procedures
It is the responsibility of the Province Manager to acknowledge initial receipt of the data
package(s), verify that the four data evaluation steps identified in the following paragraph are completed,
notify the analytical laboratory of any additional information or corrective actions deemed necessary as a
result of the Province's data evaluation, and, following satisfactory resolution of all "corrective action"
issues, take final action by notifying the laboratory in writing that the submitted results have been
officially accepted as a completed deliverable in fulfillment of contract requirements. It may be necessary
or desirable for a team of individuals (e.g., the Province QA Coordinator and/or analytical chemists on the
Province staff) to assist the Province Manager in technical evaluation of the submitted data packages.
While the Province Manager has ultimate responsibility for maintaining official contact with the analytical
laboratory and verifying that the data evaluation process is completed, it is the responsibility of the
Province QA Coordinator to closely monitor and formally document each step in the process as it is
completed. This documentation should be in the form of a data evaluation tracking form or checklist that
is filled in as each step is completed. This checklist should be supplemented with detailed memos to the
project file outlining any concerns with data omissions, analysis problems, or descriptions of questionable
data identified by the laboratory.
Evaluation of the data package should commence as soon as possible following its receipt, since
delays increase the chance that information may be misplaced or forgotten and (if holding times have been
exceeded) can sometimes limit options for reanalysis. The following steps are to be followed in
evaluating EMAP-E chemistry data:
1) Checking data completeness (verification)
2) Assessing data quality (validation)
3) Assigning data qualifier codes
4) Taking final actions
The specific activities required to complete each of these steps are illustrated in Figure 5-1 and described
in the following sections, which are adapted in large part from the document "A Project Manager's Guide
to Requesting and Evaluating Chemical Analyses" (USEPA 1991).
Checking Data Completeness
The first part of data evaluation is to verify that all required information has been provided in the
data package. On the EMAP-E program, this should include the following specific steps:
• Province personnel should verify that the package contains the following: narrative explanations signed
by the laboratory manager, hard copies of all results (including QA/QC results), and accompanying
computer diskettes.
• The electronic data file(s) should be parsed and entered into the EMAP Province database to verify that
the correct format has been supplied.
• Once the data has been entered into the Province database, automated checks should be run to verify
that results have been reported for all expected samples and all analytes.
The Province Manager should contact the laboratory and request any missing information as soon
as possible after receipt of the data package. If information was omitted because required analyses were
not completed, the laboratory should provide and implement a plan to correct the deficiency. This plan
may include submittal of a revised data package and possible reanalysis of samples.
[Figure 5-1. Steps to be followed in the assessment and evaluation of EMAP-E chemistry data (from U.S.
EPA 1991). Analytical data and supporting documentation are checked against evaluation criteria
(information completeness; acceptability of calibrations, blanks, bias, precision, and detection limits)
to reach a technical conclusion. Results within limits are accepted for use; results marginally outside
limits are accepted with appropriate qualifications or referred to an expert; results severely outside
limits are rejected, with reanalysis considered.]
Assessing Data Quality
Data validation, or the process of assessing data quality, can begin after Province personnel have
determined that the data package is complete. Normally, the first major part of validation involves
checking 100% of the data for any possible errors resulting from transcription of tabulated results,
misidentification or miscalculations. However, EMAP-E laboratories are expected to submit data which
already has been tabulated and checked 100% for accuracy, and the raw data reports needed by Province
personnel to perform these checks (e.g., chromatograms, original quantitation reports) are not submitted as
part of the data package. In addition, a 100% validation check is both cost-prohibitive and unnecessary in
monitoring programs, like EMAP-E, which do not involve enforcement actions. Therefore, the first-step
validation checks performed by Province personnel will be limited to the following: 1) a check to verify
that all reporting units and numbers of significant figures are correct; 2) a check to verify that all of the
laboratory's calculated percent recovery values (for calibration check samples, Laboratory Control
Materials, and matrix spikes) and relative percent difference values (for duplicates) are correct; and 3) a
check to verify that the reported concentrations for each analyte fall within "environmentally realistic"
ranges, determined from previous studies and expert judgement. In addition, past studies indicate that the
different compounds in each class of chemicals being measured for EMAP-E (e.g., PAHs, PCBs, DDTs
and other chlorinated pesticides) typically occur in the environment in somewhat fixed ratios to one
another. For example, the DDT breakdown products p,p'-DDD and p,p'-DDE typically can be expected to
occur at higher concentrations than p,p'-DDT in estuarine sediments of the gulf coast. If anomalous
departures from such expected ratios are found, it may indicate a problem in the measurement or data
reduction process requiring further investigation.
The second major aspect of data validation is to compare the QA/QC data against established
criteria for acceptable performance, as specified earlier in this plan. This will involve the following
specific steps:
1) Results for QA/QC samples should be tabulated, summarized, and evaluated. Specifically, a set of
summary tables should be prepared from the Province database showing the percent recovery
values and relative percent difference values (where applicable) for the following QA/QC
samples: continuing calibration check samples, laboratory control material(s), and matrix
spike/matrix spike duplicate samples. The tables should indicate the percent recovery values for
these samples for each individual batch of samples, as well as the average, standard deviation,
coefficient of variation, and range for all batches combined (a minimal sketch follows this list).
2) Similar summary tables should be prepared for the laboratory reagent blank QA/QC samples.
3) The summary results, particularly those for the Laboratory Control Material (i.e., Certified
Reference Material), should be evaluated by comparing them against the QA/QC warning and
control limit criteria for accuracy, precision, and blank contamination specified in Table 5-4.
4) Method detection limits reported by the laboratory for each analyte should be tabulated and
compared against the target values in Table 5-5.
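The batch summaries described in step 1 above can be produced with a few lines of Python; the record
layout (batch number, percent recovery) below is hypothetical, and only the standard library is used.

    # Sketch: per-batch and overall summary statistics for percent-recovery
    # results of one QA/QC sample type (e.g., the continuing calibration
    # check sample of a single analyte).
    import statistics
    from collections import defaultdict

    records = [("B01", 96.0), ("B01", 101.5), ("B02", 88.7), ("B02", 94.2)]

    by_batch = defaultdict(list)
    for batch, recovery in records:
        by_batch[batch].append(recovery)

    for batch, values in sorted(by_batch.items()):
        print(batch, f"mean recovery = {statistics.mean(values):.1f}%")

    overall = [r for _, r in records]
    mean, sd = statistics.mean(overall), statistics.stdev(overall)
    print(f"all batches: mean = {mean:.1f}%, SD = {sd:.2f}, "
          f"CV = {100 * sd / mean:.1f}%, range = {min(overall)}-{max(overall)}")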
There are several possible courses of action to be taken if the reported data are found to be deficient (i.e.,
warning and/or control limits exceeded) during the assessment of data quality:
1) The laboratory's cover letter (narrative explanation) should be consulted to determine if the
problems were satisfactorily addressed.
2) If only warning limits were exceeded, then it is appropriate for the laboratory to report the results.
Minor exceedances of a limited number of control limits should result in all associated data being
qualified as estimated values, as explained in the following section. Large exceedances of several
action limits should result in rejection of the data because there is ample evidence that the
analyses were out of control and unreliable. However, because EMAP-E laboratories must report
only data meeting QA/QC criteria for acceptability, this type of data rejection is not anticipated.
Assigning Data Qualifier Codes
Data qualifier codes are notations used by laboratories and data reviewers to briefly describe, or
qualify, data and the systems producing data. As previously indicated, EMAP-E laboratories are expected
to assign only two data qualifier codes ("a" and "b") to data values before submitting them to the program.
EMAP-E data reviewers, in turn, will assign an additional data qualifier code in situations where there are
minor exceedances of a limited number of control limit criteria. The most typical situation is when a
laboratory fails to meet the accuracy control limit criteria for a particular analyte in a Certified Reference
Material or matrix spike sample. In these situations, the QA reviewer should verify that the laboratory did
meet the control limit criteria for precision. If the lack of accuracy is found to be consistent (i.e., control
limit criteria for precision were met), then it is likely that the laboratory experienced a true bias for that
particular analyte. In these situations, all reported values for that particular analyte will be qualified with a
"c" code. The "c" code has the following meaning: "The reported concentration is considered an estimate
because control limits for this analyte were exceeded in one or more quality control samples."
Because some degree of expert judgement and subjectivity typically is necessary to evaluate
chemistry QA/QC results and assign data qualifier codes, data validation should be conducted only by
qualified personnel. It is the philosophy of the program that data which are qualified as estimates because
of minor exceedance of a control limit in a QA/QC sample ("c" code) are still usable for most assessment
and reporting purposes. However, it is important to note that all QA/QC data will be readily available in
the database along with the results data, so that interested data users can make their own estimation of data
quality.
Taking Final Action
Upon completion of the above steps, a report summarizing the QA review of the data package should be
prepared, samples should be properly stored or disposed of, and laboratory data should be archived both in
a storage file and in the database. Technical interpretation of the data begins after the QA review has been
completed.
Reports documenting the results of the QA review of a data package should summarize all
conclusions concerning data acceptability and should note significant quality assurance problems that
were found. These reports are useful in providing data users with a written record of data concerns and a
documented rationale for why certain data were accepted as estimates or were rejected. The following
specific items should be addressed in the QA report:
• Summary of overall data quality, including a description of data that were qualified.
• Brief descriptions of analytical methods and the method(s) used to determine detection limits.
• Description of data reporting, including any corrections made for transcription or other reporting
errors, and description of data completeness relative to objectives stated in the QA plan.
• Descriptions of initial and ongoing calibration results, blank contamination, and precision and
bias relative to QA plan objectives (including tabulated summary results for Certified Reference
Materials and matrix spike/matrix spike duplicates).
The chemistry QA results will be presented in the Program Annual Quality Assurance Report and
will also become a permanent part of the database documentation (i.e., metadata). The QA/QC data
collected by the Program will be used not only to assess the accuracy and precision of individual
laboratory measurements, but ultimately to assess the comparability of data generated by multiple
laboratories.
SECTION 6
SEDIMENT PARTICLE SIZE ANALYSIS
6.1 OVERVIEW
Particle size is used to characterize the physical properties of sediments. Because particle size
influences both chemical and biological variables, it can be used to normalize chemical concentrations
according to sediment characteristics and to account for some of the variability found in biological
assemblages. For 1995 EMAP-E monitoring in the West Indian Province, only the percent silt-clay will be
determined for the particle size samples.
6.2 QUALITY CONTROL PROCEDURES: SAMPLE COLLECTION,
PRESERVATION, AND HOLDING
EMAP-E protocols for collecting particle size samples are described in detail in the West Indian
Province Field Operations Safety Manual (Macauley and Summers 1995). Samples will be collected in
plastic containers; a minimum sample size of 100 g is recommended. Samples should be held and shipped
on ice (NOT dry ice) and stored at 4 °C for up to one year before analysis. Samples must not be frozen or
dried prior to analysis, as either process may change the particle size distribution.
6.3 QUALITY CONTROL PROCEDURES: LABORATORY OPERATIONS
Quality control of sediment particle size analysis is accomplished by strict adherence to protocol and
documentation of quality control checks. Certain procedures are critical to the collection of high quality
data. For example, it is essential that each sample be homogenized thoroughly in the laboratory before a
subsample is taken for analysis. Laboratory homogenization should be conducted even if samples were
homogenized in the field. Furthermore, all screens used for dry sieving must be clean before conducting
analysis, and all of the sample must be retrieved from them. To clean a screen, it should be inverted and
tapped on a table, while making sure that the rim hits the table evenly. Further cleaning of brass screens
may be performed by gentle scrubbing with a stiff bristle nylon brush. Stainless steel screens may be
cleaned with a nylon or brass brush.
The most critical aspect of the pipet analysis is knowledge of the temperature of the silt-clay
suspension. An increase of only 1 °C will increase the settling velocity of a particle 50 um in diameter by
2.3%. It is generally recommended that the pipet analysis be conducted at a constant temperature of 20
°C. However, Plumb (1981) provides a table to correct for settling velocities at other temperatures; this
table is included in the EMAP-E Laboratory Methods Manual (U.S. EPA, in preparation). If the mass of
sediment used for pipet analysis exceeds 25 g, a subsample should be taken as described by Plumb (1981).
Silt-clay samples in excess of 25 g may give erroneous results because of electrostatic interactions
between the particles. Silt-clay samples less than 5 g yield a large experimental error in weighing relative
to the total sample weight. Thorough mixing of the silt-clay suspension at the beginning of the analysis
also is critical. A perforated, plexiglass disc plunger is very effective for this purpose. Once the pipet
analysis begins, the settling cylinders must not be disturbed, as this will alter particle settling velocities.
Care must be taken to disturb the sample as little as possible when pipet extractions are made.
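The temperature sensitivity quoted above can be checked from Stokes' law, v = g*d^2*(rho_s - rho_w)/(18*eta).
The Python sketch below uses handbook values for water viscosity and density at 20 and 21 °C and assumes a
quartz particle density of 2.65 g/cm^3; it yields an increase of roughly 2.5%, in line with the 2.3% figure
cited above. All numeric inputs are illustrative assumptions, not part of the protocol.

    # Stokes settling velocity of a 50 um particle at 20 C versus 21 C.
    G = 9.81            # gravitational acceleration, m/s^2
    D = 50e-6           # particle diameter, m
    RHO_S = 2650.0      # assumed particle (quartz) density, kg/m^3

    def stokes_velocity(eta, rho_w):
        return G * D**2 * (RHO_S - rho_w) / (18.0 * eta)

    v20 = stokes_velocity(eta=1.002e-3, rho_w=998.2)   # water at 20 C
    v21 = stokes_velocity(eta=0.978e-3, rho_w=998.0)   # water at 21 C
    print(f"settling velocity increase: {100 * (v21 / v20 - 1):.1f}%")   # ~2.5%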
The analytical balance, drying oven, sieve shaker, and temperature bath used in the analysis should be
calibrated at least monthly. Dried samples should be cooled in a desiccator and held there until they are
weighed. If a desiccator is not used, the sediment will accumulate ambient moisture and the sample
weight will be overestimated. A color-indicating desiccant is recommended so that spent desiccant can be
detected easily. Also, the seal on the desiccator should be checked periodically, and, if necessary, the
ground glass rims should be greased or the "O" rings should be replaced.
Quality control for the sediment analysis procedures will be accomplished primarily by reanalyzing a
randomly selected subset of samples from each batch, as described in full detail in the EMAP-E
Laboratory Methods Manual (U.S. EPA, in preparation). A batch of samples is defined as a set of samples
of a single textural classification (e.g., silt/clay, sand, gravel) processed by a single technician using a
single procedure. Approximately 10% of each batch completed by the same technician should be
reanalyzed (i.e., reprocessed) in the same manner as the original sample batch. If the absolute difference
between the original value and the second value is greater than 10% (in terms of the percent of the most
abundant sediment size class), then a third analysis will be completed by a different technician. The
value closest to the third value will be entered into the database. In addition, all the other samples in the
same batch must be reanalyzed, and the laboratory protocol and/or technician's practices should be
reviewed and corrected to bring the measurement error under control. If the percent of the most abundant
sediment size class in the original sample and the reanalyzed sample differs by less than 10, the original
value will not be changed and the sediment analysis process will be considered in control.
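The decision logic for this reanalysis rule is sketched below in Python; the percent silt-clay values are
hypothetical.

    # Sketch: if the original and QC-reanalysis values of the most abundant
    # size class differ by more than 10 (absolute percentage points), a third
    # technician's value decides which earlier result is entered.
    def accepted_value(original, reanalysis, third=None):
        if abs(original - reanalysis) <= 10:
            return original                      # process in control
        if third is None:
            return "difference > 10: obtain a third analysis"
        # keep whichever earlier value lies closest to the third analysis
        return min((original, reanalysis), key=lambda v: abs(v - third))

    print(accepted_value(62.0, 75.0, third=73.5))   # 75.0 is entered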
Additional quality control for particle size analyses will be accomplished by reanalyzing samples that
fail either a range check or a recovery check. For the range check, any sample results that fall outside
expected ranges (i.e., any cumulative percentage greater than 100%) will be reanalyzed. For the recovery
check, if the total weight of the recovered sands differs by more than 10% from the starting weight of
sands, the sample must be reanalyzed.
6.4 QUALITY CONTROL PROCEDURES: INFORMATION
MANAGEMENT
6.4.1 Sample Tracking
EMAP-E information management personnel have developed a comprehensive system for barcode
labeling of sample containers, recording sampling information in the field and tracking sample shipments.
A complete description of this system is provided in the EMAP-E Information Management Plan (Rosen et
al. 1991) and also summarized in Section 11 of this plan. The laboratory responsible for processing the
sediment particle size samples must designate a sample custodian who is authorized to check the condition
of and sign for the incoming field samples, obtain documents of shipment, and verify sample custody
records. This individual is required, upon receipt of samples, to record and transmit all tracking
information to the Province Information Management Center. The use of barcode labels and readers
provided by the Province will facilitate this process. Laboratory personnel should be aware of the required
sample holding times and conditions for particle size samples, and there must be clearly defined custody
procedures for sample handling, storage, and disbursement in the laboratory.
6.4.2 Data Reporting Requirements and Evaluation Procedures
The weight of each sediment fraction should be reported to the nearest 0.0001 g dry weight. The
laboratory should report the results for all samples analyzed (including QC duplicates) both in hard copy
and in a computer-readable format specified by the Province Information Manager. In addition, the data
package should include a cover letter with a summary of all quality control checks performed and a
narrative explanation of any problems that may have influenced data quality.
It is the responsibility of the Province Manager to acknowledge initial receipt of the data package(s),
verify that the four data evaluation steps identified in the following paragraph are completed, notify the
laboratory of any additional information or corrective actions deemed necessary as a result of the
Province's data evaluation, and, following satisfactory resolution of all "corrective action" issues, take
final action by notifying the laboratory in writing that the submitted results have been officially accepted
as a completed deliverable in fulfillment of contract requirements. It may be necessary or desirable for the
Province Manager to delegate the technical evaluation of the data to the Province QA Coordinator or other
qualified staff member. It is the responsibility of the Province QA Coordinator to closely monitor and
formally document each step in the data evaluation process as it is completed. This documentation should
be in the form of a data evaluation tracking form or checklist that is filled in as each step is completed.
This checklist should be supplemented with detailed memos to the project file outlining the concerns with
data omissions, analysis problems, or descriptions of questionable data identified by the laboratory.
Evaluation of the data package should commence as soon as possible following its receipt, since
delays increase the chance that information may be misplaced or forgotten and (if holding times have been
exceeded) can sometimes limit options for reanalysis. The first part of data evaluation is to verify that all
required information has been provided in the data package. On the EMAP-E program, this should include
the following specific steps:
• Province personnel should verify that the package contains a cover letter signed by the laboratory
manager, hard copies of all results (including QA/QC results), and accompanying computer diskettes.
• The electronic data file(s) should be parsed and entered into the EMAP Province database to verify
that the correct format has been supplied.
• Once the data has been transferred to the Province database, automated checks should be run to verify
that results have been reported for all expected samples and all analytes (a minimal sketch of such a
check follows this list).
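The following Python sketch illustrates such a completeness check. It is illustrative only; the names
are ours, not from the EMAP information management system, and it assumes the reported results have been
read into a set of (sample, analyte) pairs.

    def find_missing(expected_samples, expected_analytes, reported):
        """Return the (sample, analyte) pairs expected but not reported.

        reported -- set of (sample_id, analyte) pairs present in the database
        """
        return [(s, a) for s in expected_samples
                       for a in expected_analytes
                       if (s, a) not in reported]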
The Province Manager should contact the laboratory and request any missing information as soon as
possible after receipt of the data package. If information was omitted because required analyses were not
completed, the laboratory should provide and implement a plan to correct the deficiency. This plan may
include submittal of a revised data package and possible reanalysis of samples.
Data validation, or the process of assessing data quality, should begin after Province personnel have
determined that the data package is complete. Data validation for particle size data should consist of the
following: (1) a check to verify that all reporting units and numbers of significant figures are correct;
(2) a range check to verify that the cumulative percentage of the particle size fractions never exceeds
100%; (3) a check to verify that the results for duplicate samples do not differ by more than 10%; and
(4) calculation of the relative standard deviation (RSD) of the three particle size samples obtained at
each station. For any station having an RSD greater than 20%, all raw data and calculations should be
checked by the laboratory to ascertain that the difference truly reflects natural spatial variability
among the three grab samples and not measurement error.
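The RSD calculation in step (4) is straightforward; a Python sketch, with hypothetical values, might be:

    from statistics import mean, stdev

    def station_rsd(replicates):
        """Percent relative standard deviation of replicate results,
        e.g., percent silt-clay from the three grabs at a station."""
        return 100.0 * stdev(replicates) / mean(replicates)

    # Flag stations whose replicates vary by more than 20% RSD.
    if station_rsd([42.1, 47.5, 39.8]) > 20.0:
        print("check raw data and calculations: possible measurement error")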
6.4.3 Assigning Data Qualifier Codes and Taking Final Action
Data qualifier codes are notations used by laboratories and data reviewers to briefly describe or qualify
data and the systems producing data. To date, the West Indian Province particle size data has been
accepted without qualification, and no data qualifier codes have been developed. All QA/QC data
associated with particle size analyses will be readily available in the database along with the results data,
so that interested data users can perform their own assessments of data quality.
Upon completion of all data evaluation steps, a report summarizing the QA review of the data package
should be prepared, samples should be properly stored or disposed of, and laboratory data should be
archived both in a storage file and in the database. Reports documenting the results of the QA review of
the data package should summarize all conclusions concerning data acceptability and should note
significant quality assurance problems that were found. These reports are useful in providing data users
with a written record of data concerns and a documented rationale for why certain data were accepted as
estimates or were rejected. The following specific items should be addressed in the QA report:
• Summary of overall data quality, including a description of data that were qualified.
• Brief descriptions of sample collection and analysis methods.
• Description of data reporting, including any corrections made for transcription or other reporting
errors, and description of data completeness relative to objectives stated in the QA plan.
The particle size QA results will be included in the annual Program Quality Assurance Report and will
also become a permanent part of the database documentation (i.e., metadata).
SECTION 7
SEDIMENT TOXICITY TESTING
7.1 OVERVIEW
Sediment toxicity tests will not be conducted as part of the 1995 summer monitoring in the West
Indian Province. This section is retained in the 1995 West Indian QAPP in the event that sediment toxicity
tests are resumed in future EMAP monitoring activities within the province. In the West Indian
Province, acute toxicity tests may be conducted with two species of test organisms, the marine amphipod
Ampelisca abdita and the mysid shrimp Mysidopsis bahia; amphipod tests will be 10-day exposures and
mysids, 96-hour exposures. The QA/QC procedures for sediment toxicity testing presented in this section
address the following: sample collection, preservation and holding, the condition of testing facilities and
equipment, the source and condition of test organisms, test conditions, instrument calibration, use of
reference toxicants, record keeping, data reporting requirements, and data evaluation. Any laboratory that
has not previously performed the sediment toxicity test using Ampelisca abdita will be required to perform
an initial demonstration of capability, as described below.
7.2 QUALITY CONTROL PROCEDURES: SAMPLE COLLECTION,
PRESERVATION AND HOLDING
Protocols for sample collection, preservation and holding are presented in the Field Operations
Manual (Macauley and Summers 1995). Sediment samples for toxicity testing should be chilled to 4 °C
when collected, shipped on ice, and stored in the dark at 4 °C until used. The minimum volume of
sediment required per station is 3000 ml (i.e., 3 L). The sediment should be stored for no longer than four
weeks before the initiation of the test and should not be frozen or allowed to dry. Sample containers
should be made of chemically inert materials (e.g., glass or high density polyethylene jars with Teflon®-
lined lids) to prevent contamination, which might result in artificial changes in toxicity.
Sediment for toxicity testing is taken from the same homogenate as the sediment chemistry sample;
this homogenate consists of the top 2-cm layer taken from multiple grabs at each station. As with the
sediment chemistry sample, contamination is to be avoided in obtaining the sediment toxicity sample.
This is accomplished through strict adherence to protocol during sample collection. For example, all
sampling devices and any other instruments in contact with the sediment should be cleaned with water and
a mild detergent and thoroughly rinsed between stations, and all utensils in contact with the sample should
be made of chemically inert materials, such as Teflon® or high-quality stainless steel (see Macauley and
Summers 1995).
7.3 QUALITY CONTROL PROCEDURES: LABORATORY OPERATIONS
Complete descriptions of the methods employed for the sediment toxicity test are provided in the
EMAP-E Laboratory Methods Manual (U.S. EPA 1992, in revision); these protocols are based on
American Society for Testing and Materials (ASTM) Standard Methods (ASTM 1991).
7.3.1 Facilities and Equipment
Laboratory and bioassay temperature control equipment must be adequate to maintain recommended
test temperatures. Recommended materials must be used in the fabrication of the test equipment in contact
with the water or sediment being tested, as specified in the EMAP-E Laboratory Methods Manual (U.S.
EPA, in preparation). Laboratories are strongly advised to provide for backup electrical power (i.e.,
emergency generator) adequate to supply all electrical needs associated with conducting sediment toxicity
tests (e.g., temperature control for sample storage and testing, aeration, lighting, etc.).
7.3.2 Initial Demonstration of Capability
Laboratories which have not previously conducted sediment toxicity tests with Ampelisca abdita or
Mysidopsis bahia must demonstrate the ability to collect or culture (if applicable), hold, acclimate, and test
the organisms without significant loss or mortality, prior to initiating testing of actual samples. There are
two types of tests which must be performed as an initial demonstration of capability; these tests will serve
to indicate the overall ability of laboratory personnel to handle the organism adequately and obtain
consistent, precise results. First, the laboratory must perform a minimum of five successive reference
toxicant tests, using sodium dodecyl sulfate (SDS) as the reference toxicant. For both mysids and
Ampelisca abdita, short-term (i.e., 96-hour) tests without sediments (i.e., seawater only) can be used for
this purpose.
The trimmed Spearman-Karber method of regression analysis (Hamilton et al. 1977) or the monotonic
regression analysis developed by DeGraeve et al. (1988) can be used to determine an LC50 value for each
96-h reference toxicant test. The LC50 values should be recorded on a control chart maintained in the
laboratory (as described previously in section 3.2.5 of this document). Precision then can be described by
the LC50 mean, standard deviation, and percent relative standard deviation (coefficient of variation, or
CV) of the five (or more) replicate reference toxicant tests. If the laboratory fails to achieve an acceptable
level of precision in the five preliminary reference toxicant tests, the test procedure should be examined
for defects and the appropriate corrective actions should be taken. Precision is considered acceptable
when the LC50 values for five consecutive reference toxicant tests fall within the 95% confidence interval
warning limits on the control chart. Additional tests should be performed until acceptable precision is
demonstrated.
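Single-laboratory precision over the five (or more) preliminary reference toxicant tests can be summarized
as below. This Python sketch is illustrative and assumes the LC50 values have already been computed by one
of the methods cited above; the numbers are hypothetical.

    from statistics import mean, stdev

    def lc50_precision(lc50_values):
        """Mean, standard deviation, and CV (%) of replicate LC50s."""
        m, s = mean(lc50_values), stdev(lc50_values)
        return m, s, 100.0 * s / m

    # Example: five SDS reference toxicant LC50s (hypothetical, mg/L).
    m, s, cv = lc50_precision([6.1, 5.4, 6.8, 5.9, 6.3])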
The second series of tests that must be performed successfully prior to the testing of actual samples are
"non-toxicant," 10-day exposures of Ampelisca abdita and 96-h exposures ofMysidopsis bahia, in which
test chambers contain the control sediment and seawater that will be used under actual testing conditions.
These "control" tests should be performed concurrent with the reference toxicant tests used to assess single
laboratory precision. At least five replicate test chambers should be used in each test. The tests should be
run in succession until two consecutive tests each have mean survival equal to or greater than 85% and
survival in the individual test chambers is not less than 80%. These are the control survival rates that must
be achieved during actual testing if a test is to be considered acceptable (see section 7.3.5); therefore, the
results of this preliminary demonstration will provide evidence that facilities, water, control sediment, and
handling techniques are adequate to result in successful testing of samples. The testing facility is required
to submit the results of the initial demonstration of performance to the Province Manager and receive
written approval prior to testing actual samples.
7.3.3 Quality of Test Organisms
All organisms used in the tests should be disease-free and positively identified to species by a
qualified individual. If amphipods are collected from the field prior to testing, they should be obtained
from an area known to be free of toxicants and should be held in clean, uncontaminated water and
facilities. Mysids must be obtained from facilities that have demonstrated successful culturing from brood
stocks held in uncontaminated seawater. Organisms held prior to testing should be checked daily, and
individuals which appear unhealthy or dead should be discarded. If greater than 5% of the organisms in
holding containers are dead or appear unhealthy during the 48 hours preceding a test, the entire group
should be discarded and not used in the test.
Test organisms should be as uniform as possible in age and size. For EMAP-E sediment toxicity
testing, juvenile Ampelisca abdita in the size range 2-4 mm should be used for testing. Only active,
apparently healthy individuals should be selected for testing; care should be taken not to select gravid
females or males nearing sexual maturity. To verify that the individuals used are of the appropriate size, at
least one additional group of 20-30 amphipods must be sorted at random at the beginning of each test.
This extra group should be preserved in 5% buffered formalin or 70% ethanol for later length
measurement. The length of each individual in the group should be determined using a dissecting
microscope and measuring from the base of the first antennae to the base of the telson. The mean,
standard deviation, and range of these length measurements should be used by laboratory personnel to
verify that correctly sized individuals are being used in the tests; the length measurement data also should
be reported along with the results for each test.
Mysids used for EMAP-E sediment toxicity testing must be 3-5 days old and all animals for any one
test must be from the same source (e.g., purchased organisms cannot be used to supplement laboratory-
cultured organisms when setting up a test).
The sensitivity of each batch of organisms obtained for testing must be evaluated with the reference
toxicant sodium dodecyl sulfate (SDS) in a short-term toxicity test performed concurrently with the
sediment toxicity tests. The use of SDS as the reference toxicant is required as a means of standardizing
test results among different laboratories. For both Mysidopsis bahia and Ampelisca abdita, 96-h reference
toxicant tests without sediment are used to generate LC50 values, as previously described in section 7.3.2.
These LC50 values should be recorded on the same control chart used to record the results of the five
(or more) reference toxicant tests performed for the initial demonstration of capability. The control chart
represents a "running plot" of the toxicity values (LCSOs) from successive reference toxicant tests. The
mean LC50 and the upper and lower warning and control limits (95% and 99% confidence interval around
the mean, respectively) are recalculated with each successive point until the statistics stabilize. Outliers,
which are values which fall outside the upper and lower control limits, are readily identified. The plotted
values are used both to evaluate trends in organism sensitivity and to verify the overall ability of
laboratory personnel to obtain consistent results.
Reference toxicant test LC50 values which fall outside control chart limits should serve as a warning
to laboratory personnel. At the P=0.05 probability level, one in twenty tests would be expected to fall
outside warning limits by chance alone. The laboratory should try to determine the cause of the outlying
LC50 value, but a retest of the samples is not necessarily required. If the reference toxicant test results are
outside control chart limits on the next consecutive test, the sensitivity of the organisms and the overall
credibility of the test are suspect. The test procedure again should be examined for defects and additional
reference toxicant tests performed. Testing of samples should not resume until acceptable reference
toxicant results can be obtained; this may require the use of a different batch of test organisms.
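A minimal control chart calculation is sketched below. It approximates the 95% and 99% warning and
control limits as the mean plus or minus 1.96 and 2.58 standard deviations; this normal-theory reading is
our assumption, not a prescribed EMAP algorithm, and all names are hypothetical.

    from statistics import mean, stdev

    def control_chart(lc50_history):
        """Running mean with warning (95%) and control (99%) limits."""
        m, s = mean(lc50_history), stdev(lc50_history)
        return {"mean": m,
                "warning": (m - 1.96 * s, m + 1.96 * s),
                "control": (m - 2.58 * s, m + 2.58 * s)}

    def is_outlier(new_lc50, chart):
        """An outlier falls outside the upper or lower control limits."""
        low, high = chart["control"]
        return not (low <= new_lc50 <= high)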
7.3.4 Test Conditions
Parameters such as water temperature, salinity (conductivity), dissolved oxygen, and pH should be
checked as required for each test and maintained within specified limits (U.S. EPA 1992, in revision).
Instruments used for routine measurements must be calibrated and standardized according to instrument
manufacturer's procedures. All routine chemical and physical analyses must include established quality
assurance practices as outlined in EPA methods manuals (U.S. EPA 1979a and b).
Overlying water must meet the requirements for uniform quality specified in the method (U.S. EPA
1992, in revision). The minimum requirement for acceptable overlying water is that it allows acceptable
control survival without signs of organism disease or apparent stress (i.e., unusual behavior or changes in
appearance). The overlying water used in the sediment toxicity tests with Ampelisca should have a
salinity of 30 ‰ while that used with mysids, 20 ‰. Overlying water may be natural seawater, diluted
hypersaline brine prepared from natural seawater, or artificial seawater prepared from sea salts. If natural
seawater is used, it should be obtained from an uncontaminated area known to support a healthy,
reproducing population of the test organism or a comparably sensitive species.
7.3.5 Test Acceptability
Survival of organisms in control treatments should be assessed during each test as an indication of
both the validity of the test and the overall health of the test organism population. The tests with
Ampelisca abdita and Mysidopsis bahia are acceptable if mean control survival is greater than or equal to
85%, and if survival in individual control test chambers exceeds 80%. If these control survival rates are
not achieved, the test must be repeated. Additional guidelines for acceptability of individual sediment
toxicity tests are presented in the EMAP-E Laboratory Methods Manual (U.S. EPA 1992, in revision). An
individual test may be conditionally acceptable if temperature, dissolved oxygen (DO), and other specified
conditions fall outside specifications, depending on the degree of the departure and the objectives of the
tests. Any deviations from test specifications must be noted in a cover letter to the West Indian Province
Manager when reporting the data so that a determination can be made of test acceptability.
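The control survival criteria reduce to a simple check; the sketch below is illustrative and assumes
survival is expressed as a fraction for each control test chamber.

    def test_acceptable(control_chamber_survival):
        """Mean control survival must be >= 85% and survival in every
        individual control chamber must exceed 80%."""
        mean_survival = sum(control_chamber_survival) / len(control_chamber_survival)
        return mean_survival >= 0.85 and all(s > 0.80 for s in control_chamber_survival)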
7.4 QUALITY CONTROL PROCEDURES: INFORMATION
MANAGEMENT
7.4.1 Sample Tracking
EMAP-E information management personnel have developed a comprehensive system for barcode
labeling of sample containers, recording sampling information in the field and tracking sample shipments.
A complete description of this system is provided in the EMAP-E Information Management Plan (Rosen et
al. 1991) and also is summarized in Section 11 of this plan. The laboratory responsible for performing the
sediment toxicity tests must designate a sample custodian, who is authorized to check the condition of and
sign for the incoming field samples, obtain documents of shipment, and verify sample custody records.
This individual is required, upon receipt of samples, to record and transmit all tracking information to the
Province Information Management center. The use of barcode labels and readers provided by the Province
will facilitate this process. Laboratory personnel must adhere to the required sample holding times and
conditions for sediment toxicity samples, and there must be clearly defined custody procedures for sample
handling, storage, and disbursement in the laboratory.
7.4.2 Record Keeping and Data Reporting Requirements
It is mandatory for the toxicity testing facility to maintain thorough and complete records. Bound
notebooks or standard data sheets must be used to maintain records of the test organisms, such as species,
source, age, date of collection and/or receipt, and other pertinent information relating to their history and
health, and information on the calibration of equipment and instruments, test conditions employed, size of
organisms used in the test and test results. Annotations should be made on a real-time basis to prevent
loss of information.
Laboratory personnel should verify that all specified QA/QC requirements are met for a given test, or,
if not, that specified corrective actions are implemented and problems resolved, before proceeding with
subsequent tests. In addition, each laboratory must establish a system for detecting and eliminating
transcription and/or calculation errors and assigning data qualifier codes prior to reporting data. It is
recommended that an individual not involved directly in sample processing be designated as laboratory
QA Officer to perform these verification checks independent of day-to-day laboratory operations.
The laboratory should submit only data which either has met all QA requirements or has been
qualified properly using allowable QA codes. Samples will be retested whenever QA requirements have
not been met, and only the results of the retesting (meeting QA requirements) should be submitted. The
laboratory should report the results for all successfully tested samples both in hard copy and in a
computer-readable format specified by the Province Information Manager. At a minimum, the following
information should be included:
• EMAP sample ID, laboratory sample ID (if applicable), laboratory test number (allows EMAP to
identify all field samples and associated controls comprising a single test), organism percent mortality
for each replicate, mean percent mortality for each sample, and results of the significance test for
toxicity (t-test of each sample versus the control).
• Data for all water quality measurements made during testing (i.e., dissolved oxygen, temperature,
salinity, and pH) and for all QA/QC variables (e.g., tabulated reference toxicant test results and
associated control charts and the mean, standard deviation, and range in length of the organisms).
• A cover letter with a summary of all quality control checks performed and a narrative explanation of
any problems that may have influenced data quality.
7.4.3 Data Evaluation Procedures
It is the responsibility of the Province Manager to acknowledge initial receipt of the data package(s),
verify that the data evaluation procedures identified in the following paragraphs are completed, notify the
laboratory of any additional information or corrective actions deemed necessary as a result of the
Province's data evaluation and, following satisfactory resolution of all "corrective action" issues, take final
action by notifying the laboratory in writing that the submitted results have been officially accepted as a
completed deliverable in fulfillment of contract requirements. It may be necessary or desirable for the
Province Manager to delegate the technical evaluation of the data to the QA Coordinator or other
qualified staff member. It is the responsibility of the Province QA Coordinator to monitor closely and
formally document each step in the data evaluation process as it is completed. This documentation should
be in the form of a data evaluation tracking form or checklist that is updated as each step is completed.
This checklist should be supplemented with detailed memos to the project file outlining the concerns with
data omissions, analysis problems, or descriptions of questionable data identified by the laboratory.
Evaluation of the data package should commence as soon as possible following its receipt, since
delays increase the chance that information may be misplaced or forgotten, and (if holding times have
been exceeded) can sometimes limit options for reanalysis. The first part of data evaluation is to verify
that all required information has been provided in the data package. First, Province personnel should
verify that the package contains the following: a cover letter signed by the laboratory manager, hard copies
of all results (including copies of control charts and other QA/QC results), and accompanying computer
diskettes. Second, the electronic data file(s) should be parsed and entered into the EMAP Province
database (SAS data sets) to verify that the correct format has been supplied. Third, once the data has been
transferred to the Province database, automated checks should be run to verify that results have been
reported for all expected samples and that no errors occurred in converting the data into SAS data sets.
This can be accomplished by visual comparison of SAS printouts and frequency distributions versus
printouts of the original data supplied by the laboratory. The printouts should be used to re-verify the
completeness of the data sets and to verify that values reported for all variables are correct.
The Province Manager should contact the laboratory and request any missing information as soon as
possible after receipt of the data packages. If information was omitted because required analyses were not
completed, the laboratory should provide and implement a plan to correct the deficiency. This plan may
include submittal of a revised data package and possible reanalysis of samples.
Data validation, or the process of assessing data quality, should begin after Province personnel have
determined that the data package is complete. Data validation for sediment toxicity testing data should
consist of the following:
• A random check of 20% of the reported results to verify that the statistical test of significance (t-test)
was performed without error by the laboratory (a sketch of such a recheck follows this list). If no errors
are found, it can be assumed that this test was applied correctly to all results and no further action is
necessary. If one or more errors are found, the significance tests for the entire data set must be
recalculated and a request made to the laboratory for a written explanation of the error(s) and a
corrective action plan.
• A review of the water quality data submitted as part of the data package to verify that all specified test
conditions were met.
• The QA/QC data submitted as part of the data package should be reviewed to verify that specified
limits for control survival and/or reference toxicant test LC50 values were not exceeded, or, if
exceeded, that the proper data qualifier codes were assigned by the laboratory (explained in the
following section).
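For the recheck in the first item, any standard statistical package can recompute the t-test. The sketch
below uses SciPy's two-sample t-test as one possibility; it is not EMAP code, and the per-replicate
survival values are hypothetical.

    from scipy import stats

    def sample_is_toxic(control_survival, sample_survival, alpha=0.05):
        """Recompute the significance test: is survival in the sample
        significantly lower than in the control (one-tailed t-test)?"""
        t, p = stats.ttest_ind(sample_survival, control_survival,
                               alternative="less")
        return p < alpha

    # Example with hypothetical per-replicate survival proportions.
    toxic = sample_is_toxic([0.95, 0.90, 1.00, 0.85, 0.95],
                            [0.55, 0.60, 0.50, 0.65, 0.45])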
7.4.4 Assigning Data Qualifier Codes
Data qualifier codes are notations used by laboratories and data reviewers to briefly describe, or
qualify, data and the systems producing data. To date, EMAP-E has developed a limited list of data
qualifier codes which are allowed in situations where the laboratory either experienced an exceedance of a
quality control limit or there was a minor deviation from the required test design or test conditions.
Normally, when control limits are exceeded the laboratory is required to repeat the test for the samples in
question. However, limitations on the amount of sample collected sometimes prevent retesting and data
qualifier codes are required. Data which is qualified is still usable for most assessment purposes, but data
users are alerted to the possible limitations on data use and can make their own judgements. The qualifier
codes developed for EMAP-E sediment toxicity data are listed in Table 7-1 and explained in the following
section. Personnel at the toxicity testing facility are responsible for reviewing the data and assigning all of
these qualifier codes, except for the ST-L code, prior to submitting the data package to EMAP-E.
Table 7-1. Qualifier codes for EMAP-E sediment toxicity data.
Code Description
ST-C Fewer than specified replicates were tested (5 for amphipods, 3 for mysids)
ST-D Mean control survival less than 85%
ST-E Sample held for longer than 30 days prior to testing
ST-G No reference toxicant test was run
ST-I Control survival in one replicate was less than 80%
ST-J Physical parameters out of bounds
ST-K Less than 20 animals used per replicate (Ampelisca only)
ST-L Not used in Province Assessment
The ST-C code is assigned to the results for a given sample whenever the laboratory must use fewer than
the required 5 replicates for that sample in a test. This usually occurs for a limited number of samples
where an insufficient amount of sediment has been collected for testing. At a minimum, three replicates
must be used for a sample's results to be considered valid, as this will still allow the laboratory to perform
the statistical test for significance at test completion.
The ST-D code is assigned to the results for all samples from a given test when the mean survival in
the test control was less than the required 85%. Normally, this invalidates the results for the test and a re-
test is required, but the ST-D code is assigned when re-testing cannot be performed because there is
insufficient sample remaining or sample holding times have been grossly exceeded. Results flagged with
the ST-D code typically are not used for EMAP-E assessments and are of limited value.
The ST-G code is assigned to all samples from a test in which the laboratory failed to conduct the
associated 96-hr, water-only reference toxicant test, as required. The reference toxicant test represents an
important "positive" control which is used to assess both laboratory performance and the relative
sensitivity of the test organisms. Failure to conduct this test represents an omission that does not
necessarily invalidate the test results, but will necessitate closer scrutiny of the laboratory's control charts
of the reference toxicant test results and a review of procedures.
The ST-I code is assigned to all results from a test in which survival in one of the control replicates
was less than the required 80%. The laboratory normally is required to repeat the test whenever this
occurs, but this may not always be possible. If the mean control survival in the test was greater than 85%,
then the data probably are usable for most assessment purposes, but data users should be aware that not
all QA/QC requirements for control survival were met.
The ST-J code is allowed in a limited number of situations where there was minor exceedance of a
required control limit for one of the physical parameters measured in each test (e.g., dissolved oxygen,
temperature, salinity, or pH). Minor exceedances typically do not invalidate the test results, but the
laboratory must provide a written explanation of the nature and extent of the exceedance. Based on this
explanation, the Province Manager, in consultation with the Province QA Officer or others, will make the
final decision to allow or disallow this code. The laboratory may be required to repeat the test in certain
instances.
The ST-K code is assigned to the results for any sample where the laboratory failed to use the required
20 amphipods per test chamber. This can occur when the laboratory failed to collect or receive from a
supplier an adequate number of organisms to conduct a given test. In these instances, it is preferable to
conduct the test with fewer organisms in each test chamber than to use organisms that are
unhealthy or outside the acceptable size (age) range. Results from tests in which fewer than 20 organisms
were used per replicate typically are usable for most assessment purposes. Since mysids are routinely
tested using only 10 organisms per replicate, no further reductions are permissible.
The ST-L code is assigned to results which are not acceptable for use in Province assessments (e.g.,
Annual Statistical Reports or Assessment Reports). Typically, results from tests in which mean control
survival was less than the required 85% (ST-D code) are considered invalid and are not used for
assessment purposes.
7.4.5 Data Quality Reports
All QA/QC data associated with EMAP-E sediment toxicity testing will be readily available in the
database along with the results data, so that interested data users can perform their own assessment of data
usability. Upon completion of all data evaluation steps, a report summarizing the QA review of the data
package should be prepared, samples should be properly stored or disposed of, and laboratory data should
be archived both in a storage file and in the database. Reports documenting the results of the review of
the data package should summarize all conclusions concerning data acceptability and should note
significant quality assurance problems that were found. These reports are useful in providing data users
with a written explanation of why certain data qualifier codes were assigned and/or why some data was
rejected. The following specific items should be addressed in the QA report:
• Summary of overall data quality, including a description of data that were qualified.
• Brief descriptions of sample collection and testing methods.
• Description of data reporting, including any corrections made for transcription or other reporting
errors, and description of data completeness relative to objectives stated in the QA plan.
The sediment toxicity testing QA reports will be included in the Program Quality Assurance Report
and will also become a permanent part of the database documentation (i.e., metadata).
SECTION 8
MACROBENTHIC COMMUNITY ASSESSMENT
8.1 OVERVIEW
This section presents EMAP-West Indian Province QA/QC protocols and requirements for
macrobenthic community assessment, from sample collection and laboratory analysis to validation of the
resultant data and construction of a benthic index. Replicate benthic samples are obtained at each station,
representing the contents of different individual grab samples. Each sample is processed individually in
the laboratory to obtain an accurate assessment of the number of individuals of each species present. This
information is then aggregated in various ways to construct a benthic index to discriminate between
degraded and undegraded estuarine conditions.
8.2 QUALITY CONTROL PROCEDURES: SAMPLE COLLECTION,
PRESERVATION, AND HOLDING
Sediment samples for macrobenthic community assessments will be collected at each station using a
Young-modified Van Veen grab sampler. In order to be considered acceptable, each grab sample must be
obtained following the protocols specified in the West Indian Province Field Operations Manual
(Macauley and Summers 1995). In particular, field personnel should be thoroughly trained in the proper
techniques for sieving and sample preservation (using a stained and buffered formalin solution). In
addition, each sediment sample must be inspected carefully before being accepted for benthic community
assessment. The following acceptability criteria must be satisfied (from U.S. EPA 1991):
• Sediment should not be extruded from the upper face of the sampler such that organisms may be lost.
• Overlying water should be present (indicates minimal leakage).
• The sediment surface should be relatively flat (indicates minimal disturbance or winnowing).
• The entire surface of the sample should be included in the sampler.
• The grab sampler must have penetrated the sediment to a minimum depth of 7 cm.
If a grab sample does not meet any one of these criteria, it should be rejected.
Alternative Benthic Grabs for Sediment
The Young-modified Van Veen benthic grab sampler normally used by EMAP crews to collect benthic
sediments is designed to take surface bites from bottom conditions that are typically silt or sand. Much of
the WI resource includes areas of vast seagrass beds where the vegetation may be so dense that it prevents
the Van Veen from penetrating through the mat to obtain the underlying sediment. In these situations, a
conventional set of posthole diggers (PHDs) will be used to penetrate through the grass and bring up a core-
like plug containing both seagrass and sediment. Once on deck, the plug will be processed according to the
normal scheme, as either a grab for sieving benthos or for compositing sediment for the various sediment
analyses.
The QC guidelines for the collection of sediments by posthole diggers are similar to those for the
conventionally obtained sediment grabs; basically, the field crews must strictly adhere to the procedures
detailed in the Field Operations Manual. For a grab to be acceptable, it should be brought aboard intact;
that is, the plug should represent a complete cylinder that fills the PHDs' blades without areas of excessive
erosion or voids caused during retrieval through the water column. Grabs taken for sieving can simply be
transferred from the PHDs into a plastic dishpan which in turn will be handed to the siever for processing.
The grabs taken for sediment compositing must be handled in a more sterile manner. The core will be
carefully transferred from the PHDs into a clean, high-grade stainless steel tray, and the mat of seagrass
(leaf blades and rhizomes) should be separated off, leaving only the sediment for compositing. It is
anticipated that only a limited amount of sediment will be obtained from this type of grab
(approximately 1 liter). When compositing sediments from these grabs, all of the sediment will be added to
the composited sample, not just the top 2-3 cm portion as with conventional grabs taken by the Van Veen.
Before compositing, a small aliquot (~10 cc) will be taken from the sediment core for AVS analysis; this
sample should be taken from an interior portion of the core to obtain undisrupted (as much as possible)
sediment. Of course, after sampling a station, the PHDs must be cleaned thoroughly with biodegradable
detergent (e.g., Sparkleen) and rinsed with copious amounts of water.
Modified Sieving Procedures for Areas of Dense SAV
In cases where large quantities of seagrass are brought up with the sieving grab, the entire sample (i.e.,
seagrass and sediment) will be sieved; however, the sample will be run through a set of stacked sieves.
The receiving sieve will have a large mesh (approx. 5 mm) with one additional intermediate stepdown
mesh before the final sieve of 0.5 mm. The first two sieves will retain the larger fragments of grass and
other detritus or large shell hash. Benthos will be collected only from the final 0.5-mm sieve. Epifauna
will be included with the benthos collected; this important group of invertebrates may prove to be as good
an indicator of ecological condition as the benthic infauna, if not better. Stacked sieves may also be used
to facilitate processing grabs that contain large amounts of shell hash.
QC measures related to in-the-field processing of benthic grabs taken from SAV beds include rinsing
the sample (particularly the grass blades) with copious amounts of ambient seawater to ensure the
incorporation of epifauna with the benthos collected. After thoroughly rinsing the grab, the material
remaining on the larger mesh sieves should be discarded and only the specimens collected on the 0.5mm
sieve retained for subsequent laboratory evaluations. All other field QC procedures described for the
processing of conventional grabs will also apply to the SAV grabs (e.g., avoiding forceful jets of water
during rinses; gently rinsing the organisms from the sieve into collection jars; and preservation with a
10% final solution of buffered formalin containing Rose Bengal stain).
QC associated with the capture of fish by traps includes controlling the deployment duration,
standardization of trap type and size, and baiting. The traps to be used in the WI will be
purchased from the same vendor and will be of uniform design and size (yet to be determined). Each trap
will be baited with fresh cut bait (approximately 1 lb. of 15-cm strips). Traps will be deployed during the
Day 1 site visit just after the deployment of the DataSonde. The target duration period for deployment is
24±2 hrs. Once the fish are aboard, they will be processed and documented just as for trawl-captured fish.
In the laboratory, stored samples must be easily retrieved and protected from environmental extremes.
Samples cannot be allowed to freeze and should be stored above 5 °C to prevent the formation of
paraformaldehyde. Temperatures greater than 30 °C should be avoided so as to retard evaporative losses.
Stored and archived samples should be checked once every three months for excessive evaporative losses
due to loosely-fitting or cracked container lids, or inadequately sealed jars. Exposure to direct sunlight
should be minimized since long-term exposure can degrade the Rose Bengal vital stain.
8.3 QUALITY CONTROL PROCEDURES: LABORATORY OPERATIONS
In the laboratory, QA/QC involves a series of check systems for organism sorting, counting and
taxonomic identification. These checks are described briefly in the following sections; more complete
details can be found in the EMAP-E Laboratory Methods Manual (U.S. EPA, in preparation).
8.3.1 Sorting
The quality control check on each technician's efficiency at sorting (i.e., separating organisms from
sediment and debris) consists of an independent resort by a second, experienced sorter. A minimum of
10% of all samples sorted by each technician must be re-sorts (i.e., the sediment and debris remaining after
the original sort is completely reexamined) to monitor performance and thus provide feedback necessary
to maintain acceptable standards. These re-sorts should be conducted on a regular basis on at least one
sample chosen at random from each batch of 10 samples processed by a given sorter. Inexperienced
sorters require a more intensive QC check system. It is recommended that experienced sorters or
taxonomists check each sample processed by inexperienced sorters until proficiency in organism
extraction is demonstrated. Once proficiency has been demonstrated, the checks may be performed at the
required frequency of one for every ten samples. Logbooks must be maintained in the laboratory and used
to record the number of samples processed by each technician, as well as the results of all sample re-sorts.
For each sample that is re-sorted, sorting efficiency should be calculated using the following formula:
Sorting efficiency (%) = [# of organisms originally sorted / (# of organisms originally sorted +
additional # found in re-sort)] x 100
The results of sample re-sorts may require that certain actions be taken for specific technicians. If
sorting efficiency is greater than 95%, no action is required. If sorting efficiency is between 90% and
95%, problem areas should be identified and the technician should be re-trained. Laboratory supervisors
must be particularly sensitive to systematic errors (e.g., consistent failure to extract specific taxonomic
groups), which may suggest the need for further training. Sorting efficiencies below 90% will require re-
sorts of all samples in the associated batch and continuous monitoring of that technician to improve
efficiency.
If sorting efficiency is less than 90%, organisms found in the re-sort should be added to the original
data sheet and, if possible, to the appropriate vials for biomass determination. If sorting efficiency is 90%
or greater, the QC results should be recorded in the appropriate logbook, but the animals should not be
added to the original sample or used for biomass determinations. Once all quality control criteria
associated with the sample re-sort have been met, the sample residue (e.g., sediment and debris) may be
discarded.
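The efficiency formula and the associated corrective actions can be captured in a few lines; the following
Python sketch is illustrative only, with names of our own choosing.

    def sorting_efficiency(originally_sorted, found_in_resort):
        """Percent of organisms recovered by the original sort."""
        return 100.0 * originally_sorted / (originally_sorted + found_in_resort)

    def resort_action(efficiency):
        """Action required by the re-sort QC rules."""
        if efficiency > 95.0:
            return "no action required"
        if efficiency >= 90.0:
            return "identify problem areas and re-train technician"
        return ("re-sort all samples in batch, add recovered organisms to "
                "the data sheet and biomass vials, and monitor technician")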
8.3.2 Species Identification and Enumeration
Only senior taxonomists are qualified to perform reidentification quality control checks. A minimum
of 10% of all samples (i.e., one sample chosen at random out of every batch of ten samples) processed by
each taxonomic technician must be checked by a second qualified taxonomist to verify the accuracy of
species identification and enumeration. This control check establishes the level of accuracy with which
identification and counts are performed and offers feedback to taxonomists in the laboratory so that a high
standard of performance is maintained. Samples should never be rechecked by the technician who
originally processed the sample.
Ideally, each batch of ten samples processed by an individual taxonomic technician should be from a
similar habitat type (e.g., all oligohaline stations). The recheck of one out of the ten samples in a batch
should be done periodically and in a timely manner so that subsequent processing steps (e.g., biomass
determinations) and data entry may proceed. As each taxon is identified and counted during the recheck,
the results should be compared to the original data sheet. Discrepancies should be double-checked to be
sure of correct final results. Following re-identification, specimens should be returned to the original vials
and set aside for biomass determination.
When the entire sample has been reidentified and recounted, the total number of errors should be
computed. The total number of errors will be based upon the number of misidentifications and miscounts.
Numerically, accuracy will be represented in the following manner:
Accuracy (%) = [(Total # of organisms in QC recount - Total # of errors) / Total # of organisms in QC
recount] x 100
where the following three types of errors are included in the total # of errors:
1) Counting errors (for example, counting 11 individuals of a given species as 10).
2) Identification errors (for example, identifying Species X as Species Y, where both are present)
3) Unrecorded taxa errors (for example, not identifying Species X when it is present)
Each taxonomic technician must maintain an identification and enumeration accuracy of 90% or
greater (calculated using the above formula). If results fall below this level, the entire sample batch must
be reidentified and recounted. If taxonomic accuracy is between 90% and 95%, the original technician
should be advised and species identifications reviewed. All changes in species identification should be
recorded on the original data sheet (along with the date and the initials of the person making the change)
and these changes should be entered into the database. However, the numerical count for each taxonomic
group should not be corrected unless the overall accuracy for the sample is below 90%. Additional details
on this protocol are provided in the EMAP-E Laboratory Methods Manual (U.S. EPA, in preparation). The
results of all QC rechecks of species identification and enumeration should be recorded in a timely manner
in a separate logbook maintained for this purpose.
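For illustration, the accuracy calculation and the 90% rule might be scripted as follows (names and
numbers are hypothetical; counting, identification, and unrecorded-taxa errors are tallied together, as in
the formula above):

    def taxonomic_accuracy(total_in_recount, total_errors):
        """Percent accuracy of identification and enumeration."""
        return 100.0 * (total_in_recount - total_errors) / total_in_recount

    accuracy = taxonomic_accuracy(total_in_recount=250, total_errors=12)
    if accuracy < 90.0:
        print("reidentify and recount the entire sample batch")
    elif accuracy < 95.0:
        print("advise technician and review species identifications")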
Taxonomic identifications should be consistent within a given laboratory, and with the identifications
of other regional laboratories. Consistent identifications are achieved by implementing the procedures
described above and by maintaining informal, but constant, interaction among the taxonomists working on
each major group. As organisms are identified, a voucher specimen collection should be established. This
collection should consist of representative specimens of each species identified in samples from an
individual Province in a given year. For some species, it may be appropriate to include in the voucher
specimen collection individuals sampled from different geographic locations within the Province. At the
end of the year, the voucher specimen collection should be sent to recognized experts for verification of
the laboratory's taxonomic identifications. The verified specimens should then be placed in a permanent
taxonomic reference collection. Continued collection of verified species does not require additional expert
verification, because the reference collection can be used to confirm the identification. In addition, the
reference collection should be used to train new taxonomists. Participation of the laboratory staff in a
regional taxonomic standardization program (if available) is recommended, to ensure regional consistency
and accuracy of identification.
The laboratory is required to notify the West Indian Province Manager of any taxonomic identification
errors discovered by outside experts, as this may necessitate database corrections. Such corrections will
be made only after further consultation with the laboratory personnel and the outside expert(s) and will be
supported by written documentation which clearly explains the nature of and rationale for the changes.
All specimens in the reference collection should be preserved in 70% ethanol in labeled vials that are
segregated by species and sample. More than one specimen may be in each vial. The labels placed in
these vials should be made of waterproof, 100% rag paper and filled out using a pencil. Paper
with less than a 100% rag content or that is not waterproofed will disintegrate in the 70% alcohol mixture.
It is important to complete these labels, because future workers may not be familiar with the project,
station locations, and other details of the work in progress. In addition, the reverse side of the label should
contain information about the confirmation of the identification by experts in museums or other
institutions (if appropriate). To reduce evaporation of alcohol, the lids of vials and jars can be sealed with
plastic tape wrapped in a clockwise direction. The species (and other taxonomic designation) should be
written clearly on the outside and on an internal label. Reference specimens should be archived
alphabetically within major taxonomic groups. A listing of each species name, the name and affiliation of
the person who verified the identification, the location of the individual specimen in the laboratory, the
status of the sample if it has been loaned to outside experts, and references to pertinent literature should be
maintained by the laboratory performing the identifications.
Reference collections are invaluable and should be retained at the location where the identifications
were performed; in no instance should this collection be destroyed. A single person should be identified
as curator of the reference collection and should be responsible for its integrity. Its upkeep will require
periodic checking to ensure that alcohol levels are adequate. When refilling the jars, it is advisable to use
full-strength alcohol (i.e., 95%), because the alcohol in the 70% solution will tend to evaporate more
rapidly than the water.
8.4 QUALITY CONTROL PROCEDURES: INFORMATION
MANAGEMENT
8.4.1 Sample Tracking
EMAP-E information management personnel have developed a comprehensive system for barcode
labeling of sample containers, recording sampling information in the field, and tracking sample shipments.
A complete description of this system is provided in the EMAP-E Information Management Plan (Rosen et
al. 1991) and also summarized in Section 11 of this plan. The laboratory responsible for processing the
macrobenthic community samples must designate a sample custodian, who is authorized to check the
condition of and sign for the incoming field samples, obtain documents of shipment and verify sample
custody records. This individual is required, upon receipt of samples, to record and transmit all tracking
information to the Province Information Management center. The use of barcode labels and readers
provided by the Province will facilitate this process. In addition, the laboratory must have clearly defined
custody procedures for sample handling, storage, and disbursement in the laboratory and must maintain
accurate and timely records of the location and status of all samples.
8.4.2 Record Keeping and Data Reporting Requirements
It is mandatory for the laboratory responsible for processing the macrobenthic community samples to
maintain thorough and complete records. All data generated in the laboratory should be recorded directly
onto standardized data forms, modeled after those presented in the EMAP Laboratory Methods Manual
(U.S. EPA, in preparation). These forms are prepared for each benthic sample prior to laboratory
processing and are already filled out with species names, the biomass group for each species, and an 8-
character code for each species consisting of the first four letters each of the genus and species names.
Preparation of data sheets prior to sample processing facilitates sample tracking, sample processing,
QA/QC procedures, and data entry and helps to minimize transcription and other errors. Data forms
should be designed so that all necessary information is recorded clearly and unambiguously; data should
be recorded in ink. Data forms should be linked to specific samples using the barcoded sample numbers
assigned by the Province Information Management team prior to field sampling. Completed data sheets
and QA/QC forms should be kept in bound notebooks arranged by type; these forms should be made
available to the Province Manager upon request and will be inspected for adequacy during QA audits.
Laboratory managers should verify that all specified QA/QC requirements are met for a given batch of
samples, or, if not, that specified corrective actions are implemented and problems resolved, before a
technician is permitted to proceed with sample processing. The laboratory must establish a
comprehensive information management system that allows responsible personnel to detect and eliminate
transcription and/or calculation errors prior to submission of the final data package in computer-readable
format. This might include, for example, data entry procedures that involve double entry of information
from the laboratory data sheets into separate databases and subsequent comparison to ensure a high level
of data transcription accuracy. Data transcription errors also can be minimized through the use of
computer data entry forms that closely mirror the format of the hard-copy data sheets used in the
laboratory. Manual checks should be performed on a random subset of all transcribed data sheets (at least
10% of the total) to verify transcription accuracy.
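A double-entry comparison can be as simple as the sketch below (illustrative only; it assumes both
independent transcriptions have been read into dictionaries keyed by sample and species):

    def double_entry_mismatches(entry_a, entry_b):
        """Compare two independent transcriptions of the same data sheets.

        entry_a, entry_b -- dicts mapping (sample_id, species_code) to count
        Returns keys whose counts differ or that appear in only one entry.
        """
        keys = set(entry_a) | set(entry_b)
        return sorted(k for k in keys if entry_a.get(k) != entry_b.get(k))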
The laboratory should report the results for all samples both in hard copy and in a computer-readable
format specified by the Province Information Manager. At a minimum, the following information should
be included: EMAP sample ID, laboratory sample ID (if applicable), numbers of individuals per sample
for each species (i.e., abundance), and biomass measurements for each biomass group expressed in dry
weight to the nearest 0.1 mg. Tables summarizing the results of QC checks (e.g., re-sorts, recounts,
reidentifications, and reweighings) must be included as part of the data package, as well as a cover letter
signed by the Laboratory Manager that contains a narrative explanation of any problems that may have
influenced data quality.
8.4.3 Data Evaluation Procedures
It is the responsibility of the Province Manager to acknowledge initial receipt of the data package(s),
verify that the data evaluation procedures are completed, notify the laboratory of any additional
information or corrective actions deemed necessary as a result of the Province's data evaluation, and,
following satisfactory resolution of all "corrective action" issues, take final action by notifying the
laboratory in writing that the submitted results have been officially accepted as a completed deliverable in
fulfillment of contract requirements. It may be necessary or desirable for the Province Manager to
delegate the technical evaluation of the data to the QA Coordinator or other qualified staff member. It is
the responsibility of the Province QA Coordinator to monitor closely and formally document each step in
the data evaluation process as it is completed. This documentation should be in the form of a data
evaluation tracking form or checklist that is filled in as each step is completed. This checklist should be
supplemented with detailed memos to the project file outlining the concerns with data omissions, analysis
problems, or descriptions of questionable data identified by the laboratory.
Evaluation of the data package should commence as soon as possible following its receipt, since
delays increase the chance that information may be misplaced or forgotten. The first part of data
evaluation is to verify that all required information has been provided in the data package. First, Province
personnel should verify that the package contains the following: a cover letter signed by the laboratory
manager, hard copies of all results (including tables summarizing the results of all QA/QC checks), and
accompanying computer diskettes. Second, the electronic data file(s) should be parsed into the EMAP
Province database (SAS data sets) to verify that the correct format has been supplied. Third, once the data
have been transferred to the Province database, automated checks should be run to verify that results have
been reported for all expected samples and that no errors occurred in converting the data into SAS data
sets. This can be accomplished by visual comparison of SAS printouts against printouts of the original
data supplied by the laboratory. The printouts should be used to verify the completeness of the data sets
and to verify that values reported for all variables are correct.
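A minimal sketch of such an automated completeness check follows, assuming the expected sample list and the reported records are available as plain Python collections; in practice the data reside in SAS data sets, so this is schematic only, and all IDs are hypothetical.

```python
# Minimal completeness check: one record per expected EMAP sample ID,
# flagging both gaps and unexpected extras. IDs are hypothetical.

def check_completeness(expected_ids, reported):
    """Return (missing, unexpected) sample ID lists."""
    reported_ids = {rec["emap_sample_id"] for rec in reported}
    missing = sorted(set(expected_ids) - reported_ids)
    unexpected = sorted(reported_ids - set(expected_ids))
    return missing, unexpected

expected = ["WI95-001", "WI95-002", "WI95-003"]
reported = [{"emap_sample_id": "WI95-001", "abundance": 42},
            {"emap_sample_id": "WI95-003", "abundance": 17}]
missing, unexpected = check_completeness(expected, reported)
print("missing samples:", missing)        # -> ['WI95-002']
print("unexpected samples:", unexpected)  # -> []
```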
The Province Manager should contact the laboratory and request any missing information as soon as
possible after receipt of the data packages. If information was omitted because required analyses were not
completed, the laboratory should provide and implement a plan to correct the deficiency. This plan may
include submittal of a revised data package and possible reanalysis of samples.
Data validation, or the process of assessing data quality, should begin after Province personnel have
determined that the data package is complete. Data validation for the benthic community assessment
should consist of a thorough review of the summarized QA/QC data submitted as part of the data package
to verify that specified control limits for sample re-sorts, species recounts and reidentifications, and
biomass reweighings were not exceeded, or, if exceeded, that specified corrective actions were
implemented and are explained in adequate detail in an accompanying cover letter. If all specified control
limits were met during sample processing, or any exceedances were adequately explained, the data can be accepted
for use without qualification. To date, no data qualifier codes have been needed for the West Indian
Province benthic community data sets.
8.4.4 Data Quality Reports
All QA/QC data associated with the laboratory processing of benthic samples will be presented in
West Indian Province reports and publications along with the results data, so that interested data users can
make their own assessment of data usability. Upon completion of all data evaluation steps, a report
summarizing the QA review of the data package should be prepared, samples should be properly stored or
disposed of, and laboratory data should be archived both in a storage file and in the database. Reports
documenting the results of the review of the data package should summarize all conclusions concerning
data acceptability and should note significant quality assurance problems that were found. These reports
are useful in providing data users with a written explanation of why certain data qualifier codes were
assigned and/or why some data were rejected.
The following specific items should be addressed in the QA report:
• Summary of overall data quality, including a description of data that were qualified.
• Brief descriptions of sample collection and testing methods.
• Description of data reporting, including any corrections made for transcription or other reporting
errors, and description of data completeness relative to objectives stated in the QA plan.
The benthic community assessment QA data will be presented in the Quality Assurance section of the
Province Annual Statistical Summary and will also become a permanent part of the database
documentation (i.e., metadata).
8.5 DEVELOPMENT AND VALIDATION OF THE BENTHIC INDEX
Benthic assemblages have many attributes that make them reliable and sensitive indicators of the
ecological condition of estuarine environments. Based on this supposition, the EMAP-E Program is
attempting to construct a benthic index which reliably discriminates between degraded and undegraded
estuarine conditions. Construction of a benthic index and subsequent validation of the index was initiated
in both the Virginian and Louisianian Provinces; development of the benthic index will be continued
during EMAP activities in the West Indian Province. The first attempt at construction of a benthic index
occurred in 1991 using benthic community abundance and biomass data collected as part of the 1990
Virginian Province Demonstration Project. The exercise was repeated using benthic community
abundance and biomass data from the 1991 Louisianian Province Demonstration Project. Detailed
descriptions of the methods used to construct both benthic indices and subsequently to validate the indices
are provided in the 1990 Demonstration Project Report (Weisberg et al. 1993) and in a series of Virginian
and Louisianian Province documents (Rosen 1993; U.S. EPA, in prep.) and (Engle et al. 1994),
respectively. Briefly, the following major steps are followed in constructing and validating the benthic
index (a schematic code sketch of steps 4 through 6 follows the list):
1) Degraded and undegraded (i.e., reference) stations are identified on the basis of measured near-
bottom dissolved oxygen concentrations, sediment contaminant concentrations, and sediment
toxicity.
2) A list of "candidate" parameters is developed using the species abundance data. This list includes
metrics having ecological relevance (e.g., species diversity indices, numbers of suspension-
feeding organisms, numbers of polychaetes, etc.) that potentially might be used to discriminate
between degraded and reference areas.
3) A value for each candidate parameter is calculated for each of the previously identified degraded
and reference stations.
4) A series of t-tests is performed to reduce the list of candidate parameters to a manageable number
from which it is highly probable that one or more subsets can be identified to discriminate reliably
between degraded and undegraded areas.
5) The parameters resulting from step 4 are entered into a canonical discriminant analysis to develop
a discriminant function incorporating those parameters that best discriminate degraded and
reference areas. As part of this iterative process, the frequency with which reference sites are
incorrectly classified as degraded (i.e., false positives), and the frequency with which degraded
sites are classified as reference areas (i.e., false negatives), are calculated.
6) The index is scaled so that values range between 1 and 10 (for ease of understanding). The mean
of the highest value that reliably discriminates the degraded stations and the lowest value that
reliably discriminates the reference stations is defined as the critical value. A discriminant score
is then calculated for the a priori degraded and reference stations to determine rates of correct
and incorrect classification. In addition, a cross-validation procedure is performed in which each
station in turn is removed from the calibration data set and used as a test case for validation.
7) The index is validated using an independent data set (e.g., a different set of degraded and reference
stations from the set used to construct the index) to determine rates of correct and incorrect
classification (i.e., classification efficiency). If the rate of correct classification is unacceptably
low (i.e., less than 80%), the index is reconstructed, beginning at the first step, and revalidated.
The objective is to construct a benthic index that consistently yields high rates of correct
classification (i.e., greater than 80%).
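The following Python sketch is a schematic illustration of steps 4 through 6, not the Program's actual procedure: fabricated data stand in for the station-by-parameter matrix, a t-test screen reduces the candidate list, and a linear discriminant function (standing in here for the canonical discriminant analysis named in step 5) is cross-validated with each station held out in turn.

```python
# Schematic of steps 4-6 under stated assumptions: `metrics` is a stations x
# candidate-parameter array, `label` is 1 for a priori degraded stations and
# 0 for reference stations. All data below are fabricated for illustration.
import numpy as np
from scipy.stats import ttest_ind
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
n, p = 60, 12                                  # hypothetical station count
metrics = rng.normal(size=(n, p))
label = rng.integers(0, 2, size=n)
metrics[label == 1, :3] += 1.5                 # fake signal in 3 metrics

# Step 4: retain candidate parameters that differ between groups (p < 0.05).
keep = [j for j in range(p)
        if ttest_ind(metrics[label == 1, j], metrics[label == 0, j]).pvalue < 0.05]

# Step 5: discriminant function on the retained parameters.
lda = LinearDiscriminantAnalysis()
X = metrics[:, keep]

# Step 6 (cross-validation): each station held out in turn as a test case.
scores = cross_val_score(lda, X, label, cv=LeaveOneOut())
print(f"retained parameters: {keep}")
print(f"cross-validated classification efficiency: {scores.mean():.0%}")
```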
From a quality assurance perspective, there are several important issues that must be addressed in the
development and application of the benthic index. These issues exist at several levels. At the most basic
level, construction of the benthic index can be viewed as a multistep process involving many data
manipulations (i.e., several levels of data aggregation and calculation of numerous parameters) followed
by an iterative series of statistical tests. At this level, a concomitant series of independent checks must be
performed to verify that each of the many data transformations and aggregations is performed without
error. In addition, it is important to verify that certain data aggregations and calculations which are
"generic" in nature are performed in a manner that is consistent and comparable between years and among
different provinces. The Province QA Coordinator is responsible for developing the required system of
independent checks and for confirming and documenting that they are implemented at each step in the
construction of the benthic index. As a required part of this verification procedure, the personnel directly
involved in constructing the index must provide, for review, detailed written documentation of each step,
including documentation of computer programs that are used to manipulate data and perform calculations.
It is also essential in construction of the benthic index that there is consistency between years and
among provinces in the statistical methods employed. As part of the required series of checks prescribed
above, there should be an independent review of these procedures by one or more qualified individuals
who are not directly involved in constructing the index. There are two aspects to this review. First, there
should be independent verification that the correct statistical tests are being employed. Second, there
should be verification that the chosen statistical tests are being performed correctly. Again, it is the
responsibility of the Province QA Coordinator to confirm and document that these independent reviews
are conducted.
Another potential QA/QC concern with respect to the benthic index is the classification of different
species into certain descriptive categories based on their presumed ecological niche or behavioral
characteristics (e.g., "deposit feeder," "suspension feeder," "equilibrium species," "opportunistic species,"
etc.). This categorization is accomplished using information from the scientific literature supplemented by
expert opinion. Because reliance on expert opinion introduces a certain level of subjectivity into the
process of constructing a benthic index, it is important that adequate documentation be developed to
justify the species classifications used at any given time. Personnel responsible for constructing the index
should enlist the help of one, or preferably several, qualified benthic ecologists in classifying species and
preparing this documentation.
On another level, a primary concern regarding the benthic index is how well it fulfills the objective of
discriminating between degraded and undegraded estuarine conditions. This concern will be addressed on
a continuous basis, using the cross-validation and year-to-year independent validation steps (steps 6 and 7
above) which are integral aspects of the ongoing iterative procedures involved in constructing the index.
In future development of the index, additional sites will be added to the calibration data set so that it
includes the full range of environmental habitats and stressors present. Furthermore, as more is learned
about other measures that are effective for discriminating sites of differing environmental quality, they can
be incorporated into the calibrations. The flexibility of the index development process will allow these
additional selected measures to be incorporated so that eventually a consistently high level of
classification efficiency will be achieved.
SECTION 9
MEASUREMENTS OF FISH COMMUNITY STRUCTURE
AND PATHOLOGY
9.1 OVERVIEW
This section presents EMAP-West Indian Province QA/QC protocols and requirements for fish
community structure analyses, from sample collection and laboratory analysis to final validation of the
resultant data. Collection and analysis methods are documented in the 1995 Field Operations Manual
(Macauley and Summers 1995). Data on species identification, enumeration, and length measurements are
generated by the field crews, whereas pathology data result from laboratory analyses.
Field crews are expected to conduct two replicate, "standard" 10-min trawls at all stations. The catch
is examined and all fish are identified to species, measured, and examined for gross external pathologies.
Those fish suspected of having a pathology are preserved in a fixative and shipped to a laboratory-based
pathologist for further evaluation. In addition to "pathology" fish, field crews will also collect up to 10
individuals of specified target species for analyses of chemical contaminants in fish tissue and, similarly,
up to 20 individual fish of the same target species will be retained for splenic macrophage aggregate (MA)
examination performed by laboratory-based histopathologists.
9.2 QUALITY CONTROL PROCEDURES: FIELD OPERATIONS
9.2.1 Trawling
Fish community structure data (species identification, enumeration, and length) are significantly
influenced by the collection methods. It is therefore critical that strict adherence to prescribed sampling
protocols be maintained. Factors influencing the catch are gear, trawl duration, and trawl speed. All crews
must be provided with "standard" nets to assure comparability of gear, and the importance of keeping the
trawl duration and speed within established limits should be stressed during training. During sampling,
crews must record coordinates of latitude and longitude at the initiation and termination of a trawl and the
duration of the trawl in minutes on the fish trawl data sheet.
Since the same target fishes are needed for two indicators (chemistry and MAs), the catch from two
trawls is often insufficient to meet the demand. In that event, the crew should conduct a third trawl
specifically to supplement the "body count" for subsequent analyses of the target species; because
community structure is not evaluated for that trawl, its duration may be as long or short as the crew chief
deems appropriate. Even with the third trawl, at times there may not be a full complement of fish for both
chemistry and MAs (10 and 20 fish, respectively). A minimum five-fish composite is the desirable sample
size for both chemistry and MAs to be statistically representative. Using that guideline, in most instances
the crew chief should be able to apportion the catch to satisfy the minimum requirement for both sample
types.
Adherence to collection methodology will be monitored during initial certification of the field crew
and during all subsequent audits and field inspections conducted by senior Program personnel during the
sampling season.
9.2.2 Alternative Fish Collection: Traps
The use of otter trawl nets to collect fish and shellfish has limited application in much of the WI
Province due to (1) constraints on vessel maneuverability in the mangrove swamps that permeate the region,
and (2) restrictions imposed by the stewardship groups responsible for designated sanctuary areas (e.g.,
Everglades National Park and the Florida Keys National Marine Sanctuary). In these situations, the field
crews will resort to the use of fish traps to sample fish for pathological examination and for chemical
analyses of contaminants in tissue. While this method of capture provides samples for laboratory analyses
that are comparable to those taken by trawl, it does not provide comparable data for Province-wide
diversity and abundance evaluations. However, within the confines of a particular system (e.g., Florida
Bay) in which all fish collections are by one method (i.e., all by trawl or all by trap), data collected on
diversity and abundance may provide valuable information for site specific evaluations.
9.2.3 Species Identification, Enumeration, and Length Measurements
Fish species identification, enumeration, and individual lengths must be determined in the field
following protocols presented in the West Indian Province Field Operations Manual (Macauley and
Summers 1995). The quality of fish identifications, enumerations, and length measurements is assured
principally through rigorous training and certification of field personnel prior to sampling. Qualified
taxonomists will provide independent confirmation of all fish identifications, enumerations, and length
measurements made by crew members during laboratory training sessions. Fish identifications,
enumerations, and length measurements also will be confirmed by the QA Coordinator, Province Manager,
or their designee(s) during field visits.
The accuracy of length measurements and individual counts will be checked during all QA audits and
field visits conducted by senior EMAP-E personnel. The overall accuracy goal for all fish identifications,
enumerations and length measurements in a given sampling season is 90% (i.e., less than 10% errors). If
this goal is not met, corrective actions will include increased emphasis on training and more rigorous
testing of field crews prior to the next year's sampling season.
9.2.4 Sample Preparation, Labeling, and Storage
Fish retained from the trawl as samples for later laboratory analyses must be processed expediently
once brought aboard. Normally, this activity is the last sampling event scheduled for a station visit, and it
can be one of the more tedious, especially in the case of a large fish catch. It has been noticed that, because
fish processing is the wrap-up assignment at a sampling station, some crews tend to hurry the activity,
resulting in careless errors. It is imperative to follow the protocols for the various fish sample types as
detailed in the Field Operations Manual (Macauley and Summers 1995).
Fish retained for analyses of chemical contaminants require no actual preparation other than being
placed whole in clean Ziploc bags and stored on ice to await freezing once ashore. The proper
preservation of fish for histopathological examinations is more involved and requires strict adherence to
QC guidelines. Care must be taken in splitting the gut so as not to damage the internal organs; the cut
should continue anteriorly to the throat region; the incision should be spread by "popping" the pectoral
region apart; this will enable the Dietrich's fixative to flood the body cavity ensuring good fixation of the
organs. Fish must not be crowded into a Ziploc bag; there should be plenty of room to allow them full
exposure to the Dietrich's. The plastic Ziploc bag must have abundant perforations (more holes than bag is
preferred, as long as the fish and label are retained); the bags must be completely submerged in the
solution. For a large sample size (e.g., 20 hardhead catfish) it is best not to use plastic bags; instead put
the prepared fish directly into a plastic bucket of Dietrich's. The ratio of fish volume to fixative volume
should not exceed 50:50.
Labeling is another critical QA/QC item related to the field processing of fish samples. Each sample
must be tagged with both a barcoded station label and a barcoded sample ID label. These should be stuck
back-to-back, and included in the sample bag. In cases where more than one plastic Ziploc bag is required
to contain the sample, the double-sided station/sample label should go into the first bag and station labels
only will be placed in the remaining bags; do not use another sample ID label. The common name of the
fish should be written on each bag for multiple-bag samples to aid recipients in accounting for all
constituents of a sample; the ink will remain legible if a Sharpie pen is used on the designated labeling
patch of the Ziploc.
Field processing of fish samples will be monitored during all QA audits and field visits by Program
personnel to assure crews are thorough in their inspections for pathologies and that they are adhering to
the EMAP protocols for processing fish samples for both chemistry and histopathology.
9.3 QUALITY CONTROL PROCEDURES: GROSS EXTERNAL
PATHOLOGY AND HISTOPATHOLOGY
9.3.1 Gross Pathological Examinations
All fish collected in standard trawls must be examined by the field crew for evidence of gross external
pathologies (lumps, growths, ulcers, and fin erosion) according to the protocols outlined in the West
Indian Province Field Operations Manual (Macauley and Summers 1995). As with fish identification and
enumeration, the quality of gross pathology determinations can be assured principally through rigorous
training and certification of field personnel prior to sampling. Qualified pathologists will be responsible
for planning and overseeing all crew training for this indicator. Because of the potential difficulty in the
proper identification of pathologies by inexperienced personnel, all definitive examinations will be
conducted by a qualified pathologist. Field crews will be instructed to observe all fish and preserve any
suspected of having one of the four pathologies listed above. These will be returned to the laboratory with
a sample ID tag and the suspected pathology noted.
Upon receipt of a sample at the laboratory, the pathologist will examine these fish and provide the QA
Coordinator with his/her results. When there is disagreement between the field observation and the
pathologist's interpretation, a second pathologist may be consulted to verify the results from the primary
laboratory.
A series of internal and external laboratory QC checks should be employed to provide verification of
the fish histopathology identifications. In laboratories having multiple pathologists, all cases bearing
significant lesions should be examined and verified by the senior pathologist. At least 5% of the slides
read by one pathologist should be selected at random and read by a second pathologist without knowledge
of the diagnoses made by the initial reader. For the external QC check, at least 5% of the slides should be
submitted for independent diagnosis to a pathologist not involved with the laboratory. These slides should
represent the range of pathological conditions found during the study, and the external pathologist should
not be aware of the diagnoses made by the laboratory personnel.
Each laboratory also should maintain a reference collection of slides that represent every type of
pathological condition identified in the EMAP-E fish. Each of these slides should be verified by an
external pathologist having experience with the species in question. The reference slide collection then
can be used to verify the diagnoses made in future years to ensure intralaboratory consistency. The
reference slides also can be compared with those of other laboratories to ensure interlaboratory
consistency. A reference collection of photographs also can be made, but this should not substitute for a
slide collection.
9.3.2 Splenic Macrophage Aggregates In Fish
Macrophages, phagocytic entities, occur in the spleen, kidney and liver of fish (Agius 1980), and in
advanced teleosts they form discrete aggregations called macrophage aggregates (MAs) (Wolke et al.
1985). It has been demonstrated that the occurrence of MAs may vary depending on the size, nutritional
status, or health of a particular fish species (Agius 1979, 1980; Agius and Roberts 1981; Wolke et al.
1985), with the number and size of MAs increasing with age, starvation, and/or disease. Recent studies
suggest that MAs may be sensitive histological indicators of fish health and environmental quality. By
comparing the number of MAs and the percent area occupied by MAs among fish of the same age and species
from various sites, it may be possible to determine their relative conditions at those sites.
QA issues and field QC procedures related to the collection and processing of fish samples for
evaluation of MAs are detailed in Section 9.2.
At the laboratory, established histological techniques will be followed to section and stain the spleen
tissue from fish samples for MA examination. The occurrence of MAs is evaluated by two methods. First,
a qualified histopathologist visually scans the slides and rates the relative presence or intensity of MAs on
a scale of 0 to 4, with 0 indicating no MAs present and 4 indicating heavy intensity. A second evaluation
uses computer image analysis to estimate MA numbers and individual MA areas from three random fields
per spleen. The data, identified by individual fish and area, are compiled and analyzed using SAS.
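A hypothetical compilation step for the image-analysis output is sketched below; the record layout (fish ID, field number, MA count, individual MA areas) and all values are invented, and the actual compilation and analysis are performed in SAS.

```python
# Hypothetical per-fish summary of image-analysis output; record layout and
# values are invented for illustration only.
from collections import defaultdict

fields = [
    ("fish-01", 1, 8,  [310.0, 450.5, 120.2]),   # (fish, field, count, areas)
    ("fish-01", 2, 5,  [220.1, 90.4]),
    ("fish-01", 3, 11, [510.7]),
]

per_fish = defaultdict(lambda: {"counts": [], "areas": []})
for fish_id, field_no, count, areas in fields:
    per_fish[fish_id]["counts"].append(count)
    per_fish[fish_id]["areas"].extend(areas)

for fish_id, d in per_fish.items():
    mean_count = sum(d["counts"]) / len(d["counts"])   # MAs per field
    mean_area = sum(d["areas"]) / len(d["areas"])      # mean individual MA area
    print(f"{fish_id}: mean MAs/field = {mean_count:.1f}, "
          f"mean MA area = {mean_area:.1f} um^2")
```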
The QA/QC procedures for MA evaluations are similar to those that EMAP applies to conventional
fish histopathological evaluations. A series of internal and external laboratory QC checks should be
employed to provide verification of the splenic MA estimations. In laboratories having multiple
pathologists, all cases bearing significant occurrence of MAs should be examined and verified by the
senior pathologist. At least 5% of the slides read by one pathologist should be selected at random and read
by a second pathologist without knowledge of the MA intensity estimated by the initial reader. For the
external QC check, at least 5% of the slides should be submitted for independent evaluation at an outside
laboratory, preferably one equipped with computer image analysis capabilities. These slides should
represent the range of MA intensity encountered during the evaluation, and the external laboratory should
not be aware of the intensities estimated by the primary laboratory.
Each laboratory also should maintain a reference collection of slides that represent the full scale of
MA occurrence identified in the EMAP-E fish. The reference slide collection then can be used to verify
the evaluations made in future years to ensure intralaboratory consistency. The reference slides also can
be compared with those of other laboratories to ensure interlaboratory consistency. A reference collection
of photographs also can be made, but this should not substitute for a slide collection.
9.4 QUALITY CONTROL PROCEDURES: INFORMATION MANAGEMENT
9.4.1 Sample Tracking
EMAP-E information management personnel have developed a comprehensive system for barcode
labeling of fish specimens, recording sampling information in the field, and tracking sample shipments. A
complete description of this system is provided in the EMAP-E Information Management Plan (Rosen et
al. 1991) and is also summarized in Section 11 of this plan. Field crews must carefully and thoroughly
complete all shipment datasheets and transmit this information to the Field Operations Center during the
next electronic transfer of data.
Each analytical laboratory receiving fish for verification of species identifications, gross pathology or
further histopathological examination must designate a sample custodian who is authorized to check the
condition of and sign for the incoming samples, obtain documents of shipment, and verify sample custody
records. This individual is required, upon receipt of fish samples, to record and transmit all tracking
information to the Province Information Management Center. The use of barcode labels and readers
provided by the Province will facilitate this process. There must be clearly defined custody procedures for
handling, storage, and disbursement of the fish samples in the laboratory.
9.4.2 Data Reporting Requirements
All field data must be entered into the field computer within one day of collection. Crew chiefs must
review all data prior to electronic transfer to the Field Operations Center the following evening. Hard-
copy original datasheets must be returned to the Field Operations Center no later than the end of the crew's
work shift.
Following laboratory examination of the fish, only data which have met QA/QC requirements should
be submitted to EMAP-E. Each data package submitted by the laboratory should consist of the following:
• A cover letter providing a brief description of the procedures and instrumentation used for
verification of species identifications, gross pathology or further histopathological
examination, as well as a narrative explanation of any problems encountered or failure(s)
to meet required quality control limits.
• Tabulated results in hard-copy form, including sample ID, external pathologies (only
lumps, growths, ulcers, fin erosion), and internal pathologies noted.
• Tabulated results in computer-readable form (e.g., diskette) included in the same shipment as the hard-
copy data, but packaged in a diskette mailer to prevent damage. Data must be provided in a format
acceptable to the Province Information Manager.
• All QA/QC data (e.g., results of internal and external QC checks) must be submitted by the laboratory
as part of the data package, but should be included in separate tables and files from the actual data.
9.4.3 Data Evaluation Procedures
It is the responsibility of the Province Manager to acknowledge initial receipt of the data
package(s), verify that the four data evaluation steps identified in the following paragraph are completed,
notify the analytical laboratory (or contract field coordinator) of any additional information or corrective
actions deemed necessary as a result of the Province's data evaluation, and, following satisfactory
resolution of all "corrective action" issues, take final action by notifying the laboratory or field operations
coordinator in writing that the submitted results have been officially accepted as a completed deliverable
in fulfillment of contract requirements. It may be necessary or desirable for additional personnel (e.g., the
Province QA Coordinator) to assist the Province Manager in the technical evaluation of the submitted data
packages. While the Province Manager has ultimate responsibility for maintaining official contact with
the analytical laboratory and verifying that the data evaluation process is completed, it is the responsibility
of the Province QA Coordinator to closely monitor and formally document each step in the process as it is
completed. This documentation should be in the form of a data evaluation tracking form or checklist that
is filled in as each step is completed. This checklist should be supplemented with detailed memos to the
project file outlining the concerns with data omissions, analysis problems, or descriptions of questionable
data identified by the laboratory.
Evaluation of the data package should commence as soon as possible following its receipt, since
delays increase the chance that information may be misplaced or forgotten. The following steps are to be
followed in evaluating EMAP-E data:
1) Checking data completeness (verification)
2) Assessing data quality (validation)
3) Assigning data qualifier codes
4) Taking final actions
Checking Data Completeness
The first part of data evaluation is to verify that all required information has been provided in the data
package. For field-generated data (i.e., fish identification, enumeration, and length measurements), the
crew chief must review all data files to assure they are complete and correct prior to uploading the data to
the Field Operations Center. Once the data are received, the West Indian Province data librarian should
perform a 100% comparison of the electronic files to the original hard-copy datasheets, followed by an
additional 10% check. These steps serve not only to ensure that all data contained on the datasheets are
present in the database, but also as a check against transcription errors.
EMAP-E laboratories are expected to submit data which have already been tabulated and 100%
checked for accuracy. The submitted data will be compared to the data expected based on field
observations (i.e., there should be gross external pathology data for each fish sent in for examination).
The Province Manager should contact the laboratory and request any missing information as soon as
possible after receipt of the data package. If information was omitted because required analyses were not
completed, the laboratory should provide and implement a plan to correct the deficiency. This plan may
include submittal of a revised data package and possible reanalysis of samples.
Assessing Data Quality
Data validation, or the process of assessing data quality, can begin after Province personnel have
verified that the data package is complete and free of transcription errors. For fish community data, the
first step in validation will be automated range checks. For example, all occurrences of each species are
compared to the maximum and minimum latitudes, the salinity tolerances, and the maximum length for that
species. These ranges will be determined from well-established sources. If a species is reported from a
location where it would not be expected (based on salinity and latitude), or the reported length exceeds the
maximum length expected for that species, the record will be flagged for further investigation. This can
include checking the record against the original data sheet, checking the taxonomy QA results if
applicable, or questioning the crew chief. If no explanation can be identified, the original record will
remain unchanged. An additional verification step that must be performed is a check on the trawl duration
and speed. A trawl duration between 8 and 12 minutes and a speed between 1 and 3 knots are considered
acceptable. Fish community data collected from any trawl that did not meet these acceptability criteria
will be rejected.
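The following sketch illustrates both checks, assuming a per-species reference table whose ranges are invented placeholders rather than values from the well-established sources mentioned above.

```python
# Sketch of the automated range checks and the trawl acceptability test.
# Reference ranges below are placeholders, not authoritative values.

SPECIES_RANGES = {  # species: (min_lat, max_lat, min_sal, max_sal, max_len_mm)
    "Lagodon rhomboides": (24.0, 36.0, 5.0, 38.0, 400.0),  # placeholder values
}

def flag_record(species, lat, salinity, length_mm):
    """Return reasons a record should be flagged for investigation, if any."""
    rng = SPECIES_RANGES.get(species)
    if rng is None:
        return ["species not in reference table"]
    min_lat, max_lat, min_sal, max_sal, max_len = rng
    flags = []
    if not (min_lat <= lat <= max_lat):
        flags.append("latitude outside expected range")
    if not (min_sal <= salinity <= max_sal):
        flags.append("salinity outside expected tolerance")
    if length_mm > max_len:
        flags.append("length exceeds expected maximum")
    return flags

def trawl_acceptable(duration_min, speed_knots):
    """Trawls outside 8-12 min or 1-3 knots are rejected outright."""
    return 8 <= duration_min <= 12 and 1 <= speed_knots <= 3

print(flag_record("Lagodon rhomboides", 25.1, 30.2, 150.0))  # -> []
print(trawl_acceptable(10, 2.5))                             # -> True
```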
EMAP-E field crews are trained to screen fish catches for gross, external pathologies and to report
them generically (e.g., "lumps, bumps, growths, ulcerations, etc."); however, they are not presumed to be
experts in fish pathology. All fish with suspected pathologies are preserved and submitted to the
laboratory where trained fish pathologists review and verify each reported incidence. Any pathology that
may have been missed or misidentified during the field examination will be corrected by the laboratory
pathologist. The verified pathologies are entered into the database. On occasion, the pathologist may
challenge a field crew's species identification of the fish sample; as the pathologists are well versed in fish
taxonomy, their judgement on these calls normally supersedes that of the field crew and the correction is
entered into the database. These corrections are documented and reported to the Province QAC. The
information can be used to improve training in subsequent years by increasing emphasis on those species
and/or pathologies that were consistently misidentified by field personnel. Since only corrected data
sets (laboratory confirmation of each pathology) are entered into the database, no qualifier codes are
assigned to the fish pathology data.
Taking Final Action
Upon completion of the above steps, a report summarizing the QA review of the data package should
be prepared, samples should be properly stored or disposed of, and laboratory data should be archived
both in a storage file and in the database. Technical interpretation of the data begins after the QA review
has been completed.
Reports documenting the results of the QA review of a data package should summarize all
conclusions concerning data acceptability and should note significant quality assurance problems that
were found. These reports are useful in providing data users with a written record on data concerns and a
documented rationale for why certain data were accepted as estimates or were rejected. The following
specific items should be addressed in the QA report:
• Summary of overall data quality, including a description of data that were
qualified.
• Description of data reporting, including any corrections made for transcription or
other reporting errors, and description of data completeness relative to objectives
stated in the QA Plan.
The fish community structure and pathology QA results will be included in the annual Program
Quality Assurance Report and will also become a permanent part of the database documentation (i.e., the
metadata). The QA/QC data collected by the Program will be used not only to assess the quality of
individual measurements, but ultimately to assess the comparability of data generated by multiple
laboratories and field crews.
SECTION 10
WATER QUALITY MEASUREMENTS
10.1 OVERVIEW
This section presents EMAP-West Indian Province QA/QC protocols and requirements for water
quality measurements from collection to final verification. The field collection measurement methods are
documented in the Field Operations Manual (Macauley and Summers 1995). In addition to the established
EMAP-E water quality parameters measured in the field, during the 1995 WI monitoring, samples for
eutrophication determinations will be collected by field personnel and shipped to a laboratory for
subsequent analyses.
10.1.1 Field Measurement of Water Quality
Characterization of the water column is determined by two approaches: point-in-time vertical
profiles and continuous, long-term, near-bottom measurements. Hydrolab® H2O dataloggers are used
to obtain vertical profiles of temperature, salinity, dissolved oxygen (DO), pH, and depth. The Hydrolab®
DataSonde 3 is used to continuously record long-term (24-h) series of temperature, salinity, DO, pH, and
depth in near-bottom water (approx. 0.5 m off bottom).
Ancillary measurements of light penetration and water clarity are also taken in conjunction with the
water column parameters listed above. Photosynthetically active radiation (PAR) is measured using a
LICOR LI1000 light meter, with readings taken at 1-m increments. Traditional Secchi depth is measured
using a standard 8-inch diameter black-and-white plexiglass disk.
Specific QC procedures for each of the water quality measurements are discussed in the following
sections.
10.1.2 Measurements of Pelagic Eutrophication
It is generally acknowledged that the waterways of our nation receive and transport tremendous loads
of nutrients, both organic and inorganic, resulting from the discharge of industrial waste or by-products
and from agricultural and urban runoff. The surface water systems are limited in their capacity to
assimilate nutrients; therefore, the estuarine systems into which the rivers empty are a primary depository
for the overload of these "fertilizers." EMAP-E is interested in measuring the level of this enrichment or
eutrophication in the estuarine resource and estimating the extent of the resource that is environmentally
degraded due to eutrophication. During the 1995 monitoring in the West Indian Province, at each EMAP
station, both filtered and unfiltered water samples will be collected for subsequent laboratory analyses of
nutrients (i.e., nitrates, nitrites, ammonia, phosphate, and silicate). Filtrates of site water will also be
collected from each station for laboratory analyses of chlorophyll a, and carbon and nitrogen (CHN).
The field and laboratory QA/QC requirements associated with these analyses are presented in
following sections.
10.2 QUALITY CONTROL PROCEDURES: FIELD MEASUREMENTS
During the field crew training held each year just prior to the start-up of the seasonal monitoring, all
crew members are required to demonstrate a high degree of proficiency in their performance of the
procedures related to the collection of water quality samples and data. Particular emphasis is placed on the
proper maintenance and calibration of the Hydrolab® monitoring instruments. The crew, as a whole, must
be certified (pass a graded field exercise) before they are allowed to participate in the collection of any
EMAP samples or data.
10.2.1 Calibration Checks and QC Procedures
The two models of Hydrolab® datalogging instruments used to monitor water quality parameters in
the Province are both equipped with the same array of probes or sensors; therefore, the calibration
procedures are very similar among the instruments and are detailed in the Field Operations Manual
(Macauley and Summers 1995). Calibration of the dissolved oxygen polarographic sensor is based on
using a water-saturated air environment as the standard; for pH, a two-point calibration curve is
established with standard buffer solutions of pH 7 and 10; the salinity/conductivity probe is calibrated
using a secondary seawater standard that has been standardized against IAPSO Standard Seawater using a
Wescor vapor pressure osmometer; the depth sensor, a pressure-activated transducer, is set to zero
pressure while out of the water. Temperature is a fixed function set by the manufacturer and cannot be
adjusted in the field (to date, no problems have been encountered with the temperature sensor); the instrument
reading is checked against a hand-held alcohol thermometer.
The LICOR light meter is a manufacturer-calibrated instrument; if problems arise, the meter must be
returned to LICOR for servicing. There is no calibration required for the Secchi disk.
Hydrolab® H2O
The H2O has proven to be a dependable instrument that, if properly maintained and correctly
calibrated, can be relied on to perform within the range of accuracy that EMAP-E requires for the basic
water quality parameters of temperature, salinity, DO, pH, and depth. The H2O units will be calibrated
daily, preferably at dockside on the morning of their intended use; the calibration will be documented on
the Hydrographic Profile field data sheet. For each of the water quality parameters, EMAP-E has
established a maximum allowable difference between the instrument measurement and the calibration
standard (Table 10-1). It should be noted that while these limits are acceptable for the purpose of
qualifying field measurements taken with the unit, when performing the daily QC check, crews should
"tweak" the instrument to as near the standard value as possible. This takes on importance when the H2O
is in turn used as the "standard" to verify the performance of DataSonde 3 units during side-by-side
comparisons conducted in the field (see following section). The daily QC checks should not require more
than slight adjustments to bring the instrument into agreement. If an instrument's performance becomes
erratic or requires significant adjustments to calibrate, the unit should be thoroughly checked for problems,
which generally can be diagnosed as probe-specific or related to the power source (e.g., low battery
voltage, bad connections).
TABLE 10-1. Maximum acceptable differences for instrument field calibration and QC checks.

Instrument              Frequency of Check    Parameter     Checked Against       Maximum Acceptable Difference
Hydrolab® H2O           Daily                 Temperature   Thermometer           ± 1°C
                                              Salinity      Standard seawater     ± 0.2 ppt
                                              DO            Saturation chart      ± 0.1 ppm
                                              pH            pH buffer solution    ± 0.1 pH units
Hydrolab® DataSonde 3   Pre- and post-        Temperature   H2O                   ± 1°C
                        deployment            Salinity      H2O                   ± 1 ppt
                                              DO            H2O                   ± 0.3 ppm
                                              pH            H2O                   ± 0.3 pH units
In addition to the daily QC checks described above, the performance of the H2O with respect to DO
measurements will be further verified. At the start of each crew's weekly rotation, the H2O will be
carefully calibrated, then checked for accuracy against a standard of air-saturated water (5-gal bucket of
vigorously aerated tapwater). The dissolved oxygen saturation values are taken from Standard Methods
for the Examination of Water and Wastewater (1985). Each value is in milligrams per liter (mg/L or ppm) of
zero-chlorinity water, with temperature noted in degrees Centigrade. The instruments must be within 0.2
ppm of the saturation value to be in compliance. This procedure will be documented on a DO Verification
field data sheet. Performing this exercise at the beginning of a crew's rotation ensures that the instrument
is in good operable condition. If the subsequent DO verification performed at the next crew change is not
within the guidelines, the preceding week's DO data will be closely examined to determine if the data
appear compromised.
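A minimal sketch of this weekly DO verification follows, assuming the saturation value for the measured water temperature has already been looked up; the two-entry table below is a placeholder, not the published Standard Methods table.

```python
# Sketch of the weekly DO verification against air-saturated water.
# Placeholder saturation values for zero-chlorinity water, by temperature.
SATURATION_PPM = {25.0: 8.24, 26.0: 8.09}

def do_check_passes(measured_ppm, water_temp_c, tolerance=0.2):
    """Instrument must read within 0.2 ppm of the saturation value."""
    expected = SATURATION_PPM[water_temp_c]
    return abs(measured_ppm - expected) <= tolerance

print(do_check_passes(8.10, 25.0))  # |8.10 - 8.24| = 0.14 ppm -> True
```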
DataSonde 3
The long-term datalogger, the DataSonde 3, is deployed by EMAP field crews for overnight logging
runs, typically for 20-24 hr; the monitoring period must include the window from 1800 to 0600. The
DataSonde units are initially calibrated at the mobile-lab within 24 hr of their scheduled deployment
(Macauley and Summers 1995); the calibration procedure is documented on the DataSonde Lab Sheet.
The DataSonde calibration is verified aboard ship, just prior to the actual deployment, by conducting a
simultaneous, side-by-side comparison with the H2O in a 5-gallon bucket of ambient site water; the H2O
measurements are considered the "standards." Ranges of acceptable variance between the DataSonde and
the H2O side-by-side results are listed in Table 10-1. If any of the DataSonde measurements exceed these
limits, then the unit is recalibrated to the values displayed by the H2O. In cases where the DataSonde will
not accept new calibration values, the crew will resort to their "backup" unit and repeat the same QC
checks. The above QC procedures will be documented on the DataSonde Field Sheet.
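The side-by-side acceptance test reduces to a comparison against the Table 10-1 limits, as in the following sketch; the variable names and example readings are assumptions.

```python
# Sketch of the pre-deployment side-by-side check: the H2O readings serve as
# the "standards" and the DataSonde 3 limits come from Table 10-1.
LIMITS = {"temperature": 1.0, "salinity": 1.0, "do": 0.3, "ph": 0.3}

def side_by_side(h2o, datasonde):
    """Return parameters on which the DataSonde exceeds its allowed deviation."""
    return [p for p, limit in LIMITS.items()
            if abs(datasonde[p] - h2o[p]) > limit]

h2o_reading = {"temperature": 29.4, "salinity": 35.1, "do": 5.6, "ph": 8.1}
ds_reading  = {"temperature": 29.6, "salinity": 35.3, "do": 6.1, "ph": 8.2}
print(side_by_side(h2o_reading, ds_reading))
# -> ['do']; the unit would be recalibrated to the H2O values
```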
The DataSonde 3 is not to be deployed in situations where the near-bottom DO is < 1 ppm.
Therefore, prior to deployment of the DataSonde, the field crews must always first measure bottom DO
conditions using their H2O units.
At the retrieval of a DataSonde, following its overnight logging run, the instrument's performance is
again evaluated in a side-by-side comparison against a calibrated H2O. The results are documented on the
same data sheet used on the previous day at deployment. The follow-up QC check at retrieval helps to
ascertain that the instrument was functioning properly during the logging run. If the results are not in
acceptable agreement, the field team should consult with the Province management or the field team leader
for further instructions regarding redeployment (logistical constraints may necessitate returning to the site
at a later date).
LICOR LI1000 Light Meter
No field calibration procedures are required for the LICOR light meter. However, several QC checks
are necessary to ensure accurate measurements of PAR. The "deck" sensor must be situated in full
sunlight (i.e., out of any shadows); likewise, the submerged sensor must be deployed from the sunny side
of the boat, and care should be taken to avoid positioning the sensor in the boat's shadow. For the
comparative light readings of the deck and submerged sensors (ratio of ambient vs. submerged), the time
interval between the two readings should be held to a minimum (approximately 1 sec).
Secchi Depth
No field calibration procedures are required for the Secchi disk. QC checks required by EMAP to
ensure consistency when using the Secchi disk to make water clarity measurements include designating a
specific crew member to take the Secchi depth readings and, as with the LICOR, taking the measurements
from the sunny side of the boat.
Eutrophication Samples
Field procedures for the collection of eutrophication samples primarily involve the filtration of water
samples. The processing laboratory will provide the field crews with a prepared sampling kit for each
EMAP station. Protocols and associated QA/QC requirements for eutrophication sampling will be
finalized during field training exercises prior to the summer monitoring; the field crews will strictly adhere
to the requirements.
10.3 QUALITY CONTROL PROCEDURES: LABORATORY MEASUREMENTS
10.3.1 Nutrient Measurements
Methodology for nutrient measurements is based on spectrophotometric determinations described by
Strickland and Parsons (1968). Analytical sets or batches should be limited to twenty or fewer EMAP
samples and must include appropriate QC samples uniquely indexed to the sample batch. The minimum
QC samples required for nutrient analysis on a per-batch basis include a four-point standard curve for each
nutrient of interest, reagent blanks at the start and completion of a run, one duplicated sample, and one
reference treatment for each nutrient. In addition, the EMAP performance criteria for an acceptable
analytical batch are: accuracy (the reported measurements for the reference samples must be within
90-110% of the true value for each component nutrient) and precision (a relative percent difference
between duplicate analyses of ≤ 20% for each component nutrient). Any batch not meeting the QA/QC
requirements will be re-analyzed.
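The batch acceptance criteria reduce to two simple calculations, sketched below with hypothetical nitrate results; the function names and values are illustrative only.

```python
# Sketch of the per-batch acceptance test for nutrient analyses.

def recovery_ok(measured, true_value):
    """Accuracy: reference result must fall within 90-110% of the true value."""
    return 0.90 <= measured / true_value <= 1.10

def rpd_ok(dup1, dup2):
    """Precision: relative percent difference of duplicates must be <= 20%."""
    rpd = abs(dup1 - dup2) / ((dup1 + dup2) / 2.0) * 100.0
    return rpd <= 20.0

# Hypothetical nitrate results: reference measured vs. true, and duplicates.
batch_accepted = recovery_ok(9.6, 10.0) and rpd_ok(4.8, 5.1)
print("batch accepted" if batch_accepted else "re-analyze batch")
```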
10.3.2 Chlorophyll Analysis
Chlorophyll a (Chl a) analysis will be a fluorometric determination. The fluorometer will be
calibrated periodically (at least every three months) using a Sigma chlorophyll a standard (spinach) to
generate a regression curve that relates instrument response to Chl a concentration. Prior to each day of
use, the fluorometer's sensitivity will be verified by checking against a coproporphyrin standard. If the
variance is significant, the instrument will be serviced and recalibrated with the Chl a standard. EMAP
samples will be run in duplicate.
10.3.3 CHN Analysis
On days of use, the CHN analyzer will be calibrated twice (morning and afternoon) using an
acetanilide standard. With each batch of ≤ 25 EMAP samples, a weighed acetanilide standard and a
combusted filter will be run to verify the previous calibration. If the standard run differs by ≥ 5%, the
previous batch will be recalculated based on the more recent standard run. At least one EMAP sample will
be run in duplicate per batch.
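The drift check and recalculation rule might be implemented as sketched below; the proportional correction factor is one plausible reading of "recalculated based on the more recent standard run" and is an assumption, as are all values shown.

```python
# Sketch of the CHN calibration drift check and batch recalculation rule.

def needs_recalculation(measured_std, known_std, threshold=0.05):
    """Recalculate the preceding batch if the check standard drifts >= 5%."""
    return abs(measured_std - known_std) / known_std >= threshold

def recalculate(batch_results, measured_std, known_std):
    """Rescale batch results by the factor implied by the newer standard run."""
    factor = known_std / measured_std
    return [value * factor for value in batch_results]

results = [12.1, 8.7, 15.3]                 # hypothetical carbon values
if needs_recalculation(measured_std=9.4, known_std=10.0):
    results = recalculate(results, 9.4, 10.0)
print(results)
```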
10.4 QUALITY CONTROL PROCEDURES: INFORMATION
MANAGEMENT
10.4.1 Sample/Data Tracking
EMAP-E information management personnel have developed a comprehensive system for barcode
labeling of sample containers and data sheets, recording sampling information in the field, and tracking
sample shipments or data transfers. A complete description of this system is provided in the EMAP-E
Information Management Plan (Rosen et al. 1991) and is also summarized in Section 11 of this plan. Field
crews must carefully and thoroughly complete all shipment data sheets and transmit this information to the
Field Operations Center during the electronic transfer of data.
10.4.2 Data Reporting Requirements
Data for the long-term (DataSonde) water quality measurements exist in the form of computer files
which are entered into the field computer upon retrieval of the logging unit. Crew chiefs or mobile-lab
leads must review this data prior to the electronic transfer of the files to the Field Operations Center.
Hard-copy original data sheets must be returned to the Field Operations Center no later than the end of the
crew's work shift. QA codes for water column measurements are assigned during data evaluation at the
Field Operations Center.
10.4.3 Data Evaluation Procedures
It is the responsibility of the Province Manager to acknowledge initial receipt of the data package(s),
verify that the four data evaluation steps identified in the following paragraph are completed, notify the
analytical laboratory (or contract field coordinator) of any additional information or corrective actions
deemed necessary as a result of the Province's data evaluation, and, following satisfactory resolution of all
"corrective action" issues, take final action by notifying the laboratory or field operations contractor in
writing that the submitted results have been officially accepted as a completed deliverable in fulfillment of
contract requirements. It may be necessary or desirable for additional personnel (e.g., the Province QA
Coordinator) to assist the Province Manager in the technical evaluation of the submitted data packages.
While the Province Manager has ultimate responsibility for maintaining official contact with the field
operations contractor and for verifying that the data evaluation process is completed, it is the
responsibility of the Province QA Coordinator to closely monitor and formally document each step in the
process as it is completed. This documentation should be in the form of a data evaluation tracking form or
checklist that is filled in as each step is completed. This checklist should be supplemented with detailed
memos to the project file outlining the concerns with data omissions, analysis problems, or descriptions of
questionable data identified by the laboratory.
Evaluation of the data package should commence as soon as possible following its receipt, since
delays increase the chance that information may be misplaced or forgotten. The following steps are to be
followed in evaluating EMAP-E data:
1) Checking data completeness (verification)
2) Assessing data quality (validation)
3) Assigning data qualifier codes
4) Taking final actions
Checking Data Completeness
The first part of data evaluation is to verify that all required information has been provided in the
data package. For field-generated data (i.e., water quality measurements), the crew chief or mobile-lab
lead must review all data files to assure they are complete and correct prior to uploading the data to the
Field Operations Center. Once the data are received at the Center, the West Indian Province data librarian
should perform a 100% comparison of the electronic files to the original hard-copy data sheets, followed
by an additional 10% check. These steps serve not only to ensure that all data contained on the data sheets
are present in the database, but also as a check against transcription errors.
Because the DataSonde 3 profile consists of an electronic file, without any segment of the logging
run recorded on hard-copy data sheets, an additional step is required in the verification of these data. This
step consists of a check to verify that the DataSonde file is associated with the correct station and event.
Although field checks are in place to minimize the occurrences of this error, this verification should still
be conducted. To assure the file was correctly identified, the bottom depth, DO, pH, temperature, and
salinity values in the DataSonde file should be compared to those recorded on the hard-copy field data
sheet, Hydrographic Profile, for the near-bottom conditions measured using the Surveyor II. The water
quality values should match within the range of acceptable differences for instrument field calibration
checks. Any logged data file that does not match recorded values should be flagged for investigation.
Assigning Data Qualifier Codes
After the above checks are made, a database QA code should be assigned to each set of water
quality measurements (i.e., instantaneous profile and continuous, near-bottom). A listing of these codes is
presented in Table 10-2. There are seven codes describing the acceptability of the different water quality
parameters in the two sets of measurements.
Taking Final Actions
Upon completion of the above steps, a report summarizing the QA review of the data package
should be prepared, samples should be properly stored or disposed of, and laboratory data should be
archived both in a storage file and in the database. Technical interpretation of the data begins after the QA
review has been completed.
Reports documenting the results of the QA review of a data package should summarize all
conclusions concerning data acceptability and should note significant QA problems that were found.
These reports are useful in providing data users with a written record on data concerns and a documented
rationale for why certain data were accepted as estimates or were rejected. The following specific items
should be addressed in the QA report:
• Summary of overall data quality, including a description of data that were qualified.
• Summary of all QA data (e.g., field QC checks, calibrations, calibration checks).
• Description of data reporting, including any corrections made for transcription or other reporting
errors, and description of data completeness relative to objectives stated in the QA Plan.
The water quality QA reports will be included in the annual Program Quality Assurance Report
and also will become a permanent part of the database documentation (i.e., the metadata). The QA/QC
data collected by the Program will be used not only to assess the accuracy and precision of individual
measurements, but ultimately to assess the comparability of data generated by multiple laboratories and
field crews.
Table 10-2. QA codes assigned to water quality data for the vertical water column profile (profile) and the
long-term, continuous near-bottom deployment (continuous).
Code Definition
Continuous
W-A Fully acceptable data
W-M Marginally acceptable data
W-U Unacceptable data
W-F No deployment, bottom DO < 1.0 ppm
W-Z No deployment due to physical constraints
Profile
P-A Fully acceptable data
P-X Measurement not recorded, instrument failure
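As a hypothetical illustration of how the Table 10-2 codes might be assigned programmatically (the decision thresholds below are invented for the sketch and are not the Program's acceptance criteria):

    # Hypothetical assignment of a Table 10-2 code to a continuous,
    # near-bottom data set; the thresholds are illustrative only.
    def continuous_qa_code(deployed, bottom_do_ppm, pct_checks_passed):
        if not deployed:
            # No deployment: distinguish low-DO conditions from physical
            # constraints at the station.
            if bottom_do_ppm is not None and bottom_do_ppm < 1.0:
                return "W-F"
            return "W-Z"
        if pct_checks_passed >= 95.0:   # invented threshold
            return "W-A"                # fully acceptable
        if pct_checks_passed >= 75.0:   # invented threshold
            return "W-M"                # marginally acceptable
        return "W-U"                    # unacceptable

    print(continuous_qa_code(True, 4.8, 98.0))   # -> W-A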
SECTION 11
INFORMATION MANAGEMENT
11.1 SYSTEM DESCRIPTION
The Information Management System developed for the EMAP-E Program is designed to perform the
following functions:
• Document sampling activities and standard methods;
• Support program logistics, sample tracking, and shipments;
• Process and organize both field and laboratory data;
• Perform range checks on selected numerical data;
• Facilitate the dissemination of information; and
• Provide interaction with the EMAP Central Information System.
A complete and detailed description of the EMAP-E Information Management System (IMS) is
provided in Rosen et al. (1991) and will not be repeated here.
11.2 QUALITY ASSURANCE/QUALITY CONTROL
Two general types of problems must be resolved in developing QA/QC protocols for information
and data management: erroneous individual values, which must be corrected or removed, and
inconsistencies that damage the integrity of the data base. The following features of the EMAP-E IMS
will provide a foundation for the management and quality assurance of all data collected and reported
during the life of the project.
11.2.1 Standardization
A systematic numbering system will be developed for unique identification of individual samples,
sampling events, stations, shipments, equipment, and diskettes. The sample numbering system will
contain codes which will allow the computer system to distinguish among several different sample types
(e.g., actual samples, quality control samples, sample replicates, etc.). This system will be flexible enough
to allow changes during the life of the project, while maintaining a structure that allows easy
comprehension of the sample type.
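By way of illustration, such a numbering system might encode the sample type in a fixed field of the identifier; the format and type codes below are hypothetical, not the system actually adopted by the Program.

    # Hypothetical sample identifier format PROVINCE-STATION-TYPE-SEQ,
    # e.g. "WI-0123-QC-004"; the type codes are invented examples.
    SAMPLE_TYPES = {
        "RS": "routine sample",
        "QC": "quality control sample",
        "RP": "field replicate",
    }

    def make_sample_id(station, sample_type, sequence):
        if sample_type not in SAMPLE_TYPES:
            raise ValueError("unknown sample type: %s" % sample_type)
        return "WI-%04d-%s-%03d" % (station, sample_type, sequence)

    def parse_sample_id(sample_id):
        province, station, sample_type, sequence = sample_id.split("-")
        return {"province": province, "station": int(station),
                "type": SAMPLE_TYPES[sample_type], "sequence": int(sequence)}

    sid = make_sample_id(123, "QC", 4)           # -> "WI-0123-QC-004"
    print(sid, parse_sample_id(sid))

A fixed-width, parseable format of this kind supports both the flexibility and the easy comprehension of sample type that the system requires.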
A clearly written instruction manual on the use of the field computer system will be developed for
training field personnel and to allow easy reference in the field. Contingency plans also will be stated
explicitly in the event that the field systems fail.
11.2.2 Prelabeling of Equipment and Sample Containers
Whenever possible, sample containers, equipment, and data sheets will be prelabeled to eliminate
potential confusion in the field and thereby reduce the number of incorrect or poorly affixed labels.
Sampling packages, containing all required prelabeled sample containers and sample sheets, will be
prepared for the field crews prior to each sampling event (an event is defined as a single visit by a crew to
a sampling site). Each sampling package will have the station number affixed to it using both handwritten
and bar code labels.
11.2.3 Data Entry, Transcription, and Transfer
In addition to paper data sheets, all data collected by field crews are recorded in a series of electronic
forms on a laptop computer. There is a one-to-one correspondence between the electronic forms (or
records) and the paper forms. Data entered in each field of the electronic forms can be checked
automatically by the software, which will then provide a warning when data do not fall in an expected
range. In many instances, the use of barcode labels and readers in the field will eliminate the need for
manual entry of routine sample information and help avoid transcription errors.
Following the initial entry of data into the field computer system, the data are printed onto hard copy and
checked 100% against the original paper data sheets. This check is performed by the field crew chief, who
may correct transcription errors and ultimately is responsible for assigning an acceptance code to the
entered data. Once the data have been checked and accepted by the crew chief, the field personnel no longer
have the ability to make changes.
A record of each day's computer activities is kept by the field computer software and used by the
communications program to compress data files before transmission. A 2400 baud, error-checking
modem, which checks each byte as it is sent and eliminates garbled transmissions, transmits the
compressed data files to the Data Receiver at the field operations center. Paper data sheets are mailed (or
hand carried) to the Field Operations Center after all sampling activities for a week have been completed.
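The compress-before-transmission step can be pictured with a generic sketch; the actual communications program is not reproduced here, and the file handling below (Python, using zlib) is purely illustrative.

    # Illustrative compression of a day's data file before transmission.
    # The file path and compression level are assumptions for this sketch.
    import zlib

    def compress_for_transmission(path):
        """Write a compressed copy of 'path' and report the size reduction."""
        with open(path, "rb") as f:
            raw = f.read()
        packed = zlib.compress(raw, 9)      # maximum compression
        with open(path + ".z", "wb") as f:
            f.write(packed)
        return len(raw), len(packed)

    # Example: raw_size, packed_size = compress_for_transmission("day42.dat")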
On the field operations center Data Receiver, a program that runs automatically unpacks the
compressed data files. A program is subsequently run to process the information and automatically
generate reports indicating stations visited and activities performed the previous day. This enables a
verification check to be performed in which the information received electronically is compared with what
the crews reported doing via a daily phone call. Phone logs are also computerized at the field operations
center. If there are discrepancies between the two reports, the field crews are notified. Furthermore, each
day's data can be viewed by the Province Manager, Field Coordinator, and/or members of the QA staff.
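The comparison of transmitted activity reports with the computerized phone logs amounts to a set difference on (station, activity) pairs. The sketch below is a hypothetical simplification; the record structure and values are invented for illustration.

    # Illustrative discrepancy check between one day's electronically
    # transmitted activity report and the phone-log entries.
    transmitted = {("0123", "trawl"), ("0123", "grab"), ("0124", "profile")}
    phoned_in   = {("0123", "trawl"), ("0124", "profile"), ("0124", "grab")}

    # Activities reported by phone but missing from the transmission, and
    # vice versa, both require notifying the field crew.
    for station, activity in sorted((phoned_in - transmitted) |
                                    (transmitted - phoned_in)):
        print("Notify crew: discrepancy at station %s (%s)" % (station, activity))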
After all data sheets have been received from a field team for a given time window (about 6 days), the
West Indian Province data librarian performs a 100% manual check of the data sheets against the
electronic data stored in the data base. Any erroneous data values identified in this check or in the
previously generated reports are changed to correct values, with authorization from the Province QA
Coordinator. In addition, suspicious data are flagged for further investigation. Whenever a change to the
data is required, the data librarian must enter a computerized data change form indicating the data
sheet, variable, and reason for change.
11.2.4 Automated Data Verification
Whenever possible, erroneous numeric data will be identified using automatic range checks and
filtering algorithms. When data fall outside of an acceptable range, they will be flagged in a report for
review by the Province Manager, the Province QA Coordinator, or their designee. This type of report will
be generated routinely and should detail the files processed and the status of the QA checks. The report
will be generated both on disk and in hard copy for permanent filing. The Province Manager or QA
Coordinator will review the report and release data which have passed the QA check for addition to the
data base. All identified errors must be corrected before flagged files can be added to a data base. If it is
found that the data check ranges are not reasonable, the values should be changed by a written request that
includes a justification for the change.
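A minimal sketch of such a range check follows; the variables and acceptable ranges are invented for illustration and are not the Program's data check ranges.

    # Illustrative automated range check; the acceptable ranges shown are
    # examples only.
    RANGES = {
        "temp_c": (0.0, 40.0),
        "salinity_ppt": (0.0, 40.0),
        "do_mg_l": (0.0, 15.0),
    }

    def range_check(records):
        """Yield (record_id, variable, value) for out-of-range values."""
        for record_id, variables in records:
            for var, value in variables.items():
                low, high = RANGES[var]
                if not (low <= value <= high):
                    yield (record_id, var, value)

    data = [("S0123-01", {"temp_c": 28.4, "salinity_ppt": 31.0,
                          "do_mg_l": 22.0})]
    for record_id, var, value in range_check(data):
        print("FLAGGED for review: %s %s=%s" % (record_id, var, value))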
Data base entries which are in the form of codes should be compared to lists of valid values (e.g.,
look-up tables) established by experts for specific data types. These lists of valid codes will be stored in a
central data base for easy access by users. When a code cannot be verified in the appropriate look-up
table, the observation should be flagged in a written report for appropriate corrective action (e.g., update of
the look-up table or removal of the erroneous code).
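Code verification against a look-up table reduces to a membership test, as in the sketch below; the gear codes shown are invented examples, not entries from the Program's tables.

    # Illustrative validation of coded entries against a look-up table of
    # valid values.
    VALID_GEAR_CODES = {"TRWL", "GRAB", "SOND"}

    def unverified_codes(observations, valid_codes):
        """Return observations whose code is absent from the look-up table."""
        return [obs for obs in observations if obs["code"] not in valid_codes]

    obs = [{"station": "0123", "code": "TRWL"},
           {"station": "0124", "code": "TRWI"}]   # "TRWI" is a typo
    for bad in unverified_codes(obs, VALID_GEAR_CODES):
        print("Flag for corrective action:", bad)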
11.2.5 Sample Tracking
Real-time tracking of all sample shipments will be performed at the West Indian Province Field
Operations Center. The tracking of sample shipments from the field crews to the analytical laboratories is
extremely important in order to minimize loss of samples by the field crews, shipping carrier, or receiving
laboratory or as a result of improper packaging. Shipment tracking is performed in two ways: by the
transfer of shipment and receipt information via daily telephone calls from the field crews and receiving
labs, and by the comparison of electronic shipment and receipt files transmitted to the Field Operations
Center.
All shipments sent to the analytical laboratories by the field crews will be tracked electronically
using the EMAP-WI Field Data software. Both the field crews and the receiving laboratories are issued
barcode scanners and the EMAP-WI Field Data software. The shipments are tracked using the station and
sample barcode identification numbers. Shipping reports from the field crews and receiving reports from
the laboratories are transmitted daily. These reports are compared using a relational database reporting
routine. Discrepancies in the reports are flagged and corrective action taken.
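Functionally, the relational comparison amounts to matching the shipping and receiving reports on the sample barcode identifiers; a minimal sketch, with invented identifiers, follows.

    # Illustrative reconciliation of daily shipping and receiving reports on
    # sample barcode IDs; the identifiers are invented for this sketch.
    shipped  = {"WI-0123-RS-001", "WI-0123-RS-002", "WI-0124-RS-001"}
    received = {"WI-0123-RS-001", "WI-0124-RS-001"}

    for sample_id in sorted(shipped - received):
        print("Flag: sample %s shipped but not logged as received" % sample_id)
    for sample_id in sorted(received - shipped):
        print("Flag: sample %s received without a shipping record" % sample_id)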
Hard copy shipping forms will be packaged and shipped with each set of samples, with one copy
overnighted to the Field Operations Center with the data sheets. The analytical laboratory is required to
verbally verify that all samples have been received and to immediately notify the Field Operations Center
of any discrepancies.
11.2.6 Reporting
Following analysis of the samples, the summary data packages transmitted from the laboratories will
include results, QA/QC information, and accompanying text. If the laboratory has assigned internal
identification numbers to the samples, the results should include the original station sample number and
the internal number used by the laboratory. Specific data reporting requirements associated with each
indicator are discussed in the corresponding section of this plan. Analytical laboratories are responsible
for permanent archiving of all raw data used in generating results for a minimum period of seven years.
11.2.7 Redundancy (Backups)
All files in the EMAP-E IMS will be backed up regularly. At least one copy of the entire system will
be maintained off-site to enable the information management team to reconstruct the data base in the event
that one system is destroyed or incapacitated. In the field, all information will be recorded both on paper
data sheets and in the computer. All information saved to the hard drive will also be copied to a diskette
simultaneously. In addition, at the end of each day the field computers will be "equalized" to assure that
the information contained on both is identical. At this point all data will be contained on the hard drives
of both field computers and on a diskette. At the EMAP-E West Indian Information Management Center,
incremental backups to removable disk will be performed on all files which have changed on a daily basis.
In addition, backups of all EMAP directories and intermediate files will be performed on a weekly basis to
provide a backup in the event of a complete loss of the EMAP-E Information Center facility.
All original data files will be saved on-line for at least two years, after which the files will be
permanently archived on floppy diskette. All original files, especially those containing the raw field data,
will be protected so that they can be read only (i.e., write and delete privileges will be removed from these
files).
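As a generic illustration of the daily incremental backup and the read-only protection described above (the directory paths, retention window, and file permissions are assumptions, not the Program's configuration):

    # Illustrative daily incremental backup: copy files modified within the
    # last 24 hours to the backup area and mark the copies read-only.
    # SOURCE and BACKUP are hypothetical paths.
    import os, shutil, time

    SOURCE = "/emap/data"
    BACKUP = "/backup/emap_daily"

    def incremental_backup(source, backup, max_age_seconds=86400):
        cutoff = time.time() - max_age_seconds
        for root, _dirs, files in os.walk(source):
            for name in files:
                path = os.path.join(root, name)
                if os.path.getmtime(path) >= cutoff:     # changed today
                    dest = os.path.join(backup, os.path.relpath(path, source))
                    os.makedirs(os.path.dirname(dest), exist_ok=True)
                    shutil.copy2(path, dest)             # preserve timestamps
                    os.chmod(dest, 0o444)                # read-only copy

    # incremental_backup(SOURCE, BACKUP)   # run once per day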
11.3 DOCUMENTATION AND RELEASE OF DATA
Comprehensive documentation of information relevant to users of the EMAP-E IMS will be
maintained and updated as necessary. Most of this documentation will be accessible on-line, in data bases
which describe and interact with the system. The documentation will include a data base dictionary,
access control information, data base directories (including directory structures), code tables, and
continuously updated information on field sampling events, sample tracking, and data availability.
A limited number of personnel will be authorized to make changes to the EMAP-E data base. All
changes will be carefully documented and controlled by the senior data librarian. Data bases which are
accessible to outside authorized users will be available in "read only" form. Access to data by
unauthorized users will be limited through the use of standard DEC VAX security procedures.
Information on access rights to all EMAP-E directories, files, and data bases will be provided to all
potential users.
The release of data from the EMAP-E IMS will occur on a graduated schedule. Different classes of
users will be given access to the data only after the data have passed a specified level of QA review. Each group
will use the data on a restricted basis, under explicit agreements with the Estuaries Resource Group. The
following four groups are defined for access to data:
I. The West Indian Province central group, including the information management team, the field
coordinator, the Province Manager, the QA Coordinator and the field crew chiefs.
II. EMAP-Estuaries primary users: ERL-Narragansett personnel, ERL-Gulf Breeze personnel, NOAA
EMAP-E personnel, and EMAP quality assurance personnel.
III. EMAP data users: All other task groups within EPA, NOAA, and other federal agencies.
IV. General public: university personnel, other EPA offices (including regional offices), and other
federal, state, and local governments.
Prior to release at level IV (general public), all files will be checked and/or modified to assure that
values contain the appropriate number of significant figures. The purpose is to assure that the data
released do not imply greater accuracy than was realized. This will be especially important in files where
data were summarized. In such cases additional figures beyond the decimal point may have been added by
the statistical program during averaging or other manipulations. It will be the responsibility of the Quality
Assurance Coordinator to determine the appropriate number of significant figures for each measurement.
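Rounding to a fixed number of significant figures, as opposed to a fixed number of decimal places, can be done as in the sketch below; the choice of three figures is illustrative only, since the appropriate number is determined per measurement by the Quality Assurance Coordinator.

    # Illustrative rounding to significant figures prior to public release.
    import math

    def round_sig(value, figures=3):
        """Round 'value' to the given number of significant figures."""
        if value == 0:
            return 0.0
        magnitude = int(math.floor(math.log10(abs(value))))
        return round(value, figures - 1 - magnitude)

    print(round_sig(28.4167))     # -> 28.4
    print(round_sig(0.0123456))   # -> 0.0123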
Requests for premature release of West Indian Province data will be submitted to the Information
Management Team through the Province Manager. The Province Information Manager and the Quality
Assurance Coordinator, in consultation with the Province Manager, will determine if the data can be
released. The final authority on the release of all data is the Technical Director of EMAP-Estuaries. The
long-term goal for the EMAP-E Information Management Team will be to develop a user interface through
which all data will be accessed directly on the computer. This will improve control of security and
monitoring of access to the data, and it will help ensure that only the proper data files are being accessed.
SECTION 12
QUALITY ASSURANCE REPORTS TO MANAGEMENT
A quality assurance report will be prepared by the Province QA Coordinator following each year's
sampling efforts. This report will summarize the measurement error estimates for the various data types
using the QA/QC sample data. Precision, accuracy, comparability, completeness, and representativeness
of the data will be addressed in this document.
Within 30 days of each audit (field or laboratory), the QA Coordinator will submit a report to the
Province Manager. This report will describe the results of the audit in full detail and note any deficiencies
requiring management action. The QA Coordinator will monitor the implementation of corrective actions
in response to negative findings, and will make regular reports to the Province Manager in this regard.
In addition to the formal reports described above, the Province QA Coordinator will report regularly
to the Province Manager on an informal basis, through electronic-mail, conference calls, and/or direct
contact. One of the primary responsibilities of the QA Coordinator is to keep the Province Manager
informed of any issue or problem that might have a negative effect on the data collected.
The EMAP-E Program Quality Assurance Coordinator, with assistance from the Province QA
Coordinators, will prepare a Quality Assurance Annual Report and Work Plan (QAARWP) for the
Estuaries Resource Group. The QAARWP summarizes the QA activities conducted during the previous
fiscal year, and describes activities planned for the upcoming fiscal year. This report will be prepared
following the guidelines presented in the approved Quality Assurance Management Plan for EMAP
(Kirkland, 1994). The QAARWP will be completed, approved by the EMAP-E Technical Director, and
delivered to the EMAP QA Coordinator by September 30 of each year.
SECTION 13
REFERENCES
Agius, C. 1979. The role of melano-macrophage centers in iron storage in normal and diseased fish. J. Fish
Dis. 2:337-343.
Agius, C. 1980. Phylogenetic development of melano-macrophage centers in fish. J. Zool., London
191:111-132.
Agius, C., and R.J. Roberts. 1981. Effects of starvation on the melano-macrophage centers in fish. J. Fish
Biol. 19:161-169.
ASTM. 1984. Annual Book of ASTM Standards, Vol. 11.01. Standard Specification for Reagent Water,
D1193-77 (reapproved 1983). American Society for Testing and Materials, Philadelphia, PA.
ASTM. 1991. Guide for conducting 10-day static sediment toxicity tests with marine and estuarine
amphipods. ASTM Standard Methods Vol. 11.04, Method Number E-1367-90. American Society for
Testing and Materials, Philadelphia, PA.
Baker, J.R. and G.D. Merritt. 1990. Environmental Monitoring and Assessment Program: Guidelines for
Preparing Logistics Plans. EPA 600/4-91-001. U. S. Environmental Protection Agency, Las Vegas,
Nevada.
Cantillo, A.Y. 1992. Standard and Reference Materials for Marine Sciences, 3rd. ed. NOAA Technical
Memorandum NOS ORCA 68, National Ocean Service, Office of Ocean Resources Conservation and
Assessment, Silver Spring, MD.
DeGraeve, G.M., N.G. Reichenbach, J.D. Cooney, P.I. Feder, and D.I. Mount. 1988. New developments in
estimating endpoints for chronic toxicity tests. Abstract, Am. Soc. Test. Mater. 12th Symp. Aquat.
Toxicol. Hazard Assess., Sparks, NV.
Engle, V.D., J.K. Summers, and G. Gaston. 1994. A Benthic Index of Environmental Condition of Gulf of
Mexico Estuaries. Estuaries. 17:272-284.
Federal Register, Part VIII, EPA, Oct. 28, 1984. Guidelines Establishing Test Procedures for the Analysis of
Pollutants Under the Clean Water Act: Final Rule and Proposed Rule. 40 CFR Part 136.
Hamilton, M.A., R.C. Russo, and R.V. Thurston. 1977. Trimmed Spearman-Karber method for estimating
median lethal concentrations in toxicity bioassays. Environ. Sci. Technol. 11:714-719; Correction
12:417 (1978).
-------
Section 13
Page 2 of 3
Revision 2
April 1994
Holland, A.F., ed. 1990. Near Coastal Program Plan for 1990: Estuaries. EPA 600/4-90/033. U.S.
Environmental Protection Agency, Environmental Research Laboratory, Narragansett, RI.
Hunt, D.T.E., and A.L. Wilson. 1986. The Chemical Analysis of Water: General Principles and Techniques.
2nd ed. Royal Society of Chemistry, London, England. 683 pp.
Keith, L.H., W. Crummett, J. Deegan, Jr., R.A. Libby, J.K. Taylor, and G. Wentler. 1983. Principles of
environmental analysis. Anal. Chem. 55:2210-2218.
Keith, L.H. 1991. Environmental Sampling and Analysis: A Practical Guide. Lewis Publishers, Chelsea, MI,
143 pp.
Kirchner, C.J. 1983. Quality control in water analysis. Environ. Sci. and Technol. 17(4):174A-181A.
Kirkland, L. 1994. Quality Assurance Management Plan for the Environmental Monitoring and Assessment
Program. U.S. Environmental Protection Agency, Washington, D.C.
Lauenstein, G.L., A.Y. Cantillo, and S. Dolvin, eds. 1993. A Compendium of Methods Used in the NOAA
National Status and Trends Program. National Ocean Service, Office of Ocean Resources
Conservation and Assessment, Silver Spring, MD.
Macauley, J.M., and J.K. Summers. 1995. EMAP-Estuaries 1995 West Indian Province Field Operations
Manual. ERL/GB No. SR229. U.S. Environmental Protection Agency, Environmental Research
Laboratory, Gulf Breeze, FL.
Olsen, A.R., ed. 1992. The Indicator Development Strategy for the Environmental Monitoring and
Assessment Program. U.S. Environmental Protection Agency, Environmental Research Laboratory,
Corvallis, OR.
Plumb, R.H., Jr. 1981. Procedures for handling and chemical analysis of sediment and water samples.
Technical Report EPA/CE-81-1. U.S. Environmental Protection Agency/U.S. Army Corps of Engineers
Technical Committee on Criteria for Dredged and Fill Material, U.S. Army Waterways Experiment
Station, Vicksburg, MS. 471 pp.
Rosen, J.S. 1993. Documentation of the calculation of the EMAP Estuaries Virginian Province benthic index.
Unpublished manuscript, U.S. Environmental Protection Agency, Environmental Research
Laboratory, Narragansett, RI.
Rosen, J.S., M. Adams, H. Buffum, J. Beaulieu, and M. Hughes. 1991. Information Management
Plan for the EMAP-Near Coastal Program. U.S. Environmental Protection Agency, Environmental
Research Laboratory, Narragansett, RI.
Stanley, T.W., and S.S. Verner. 1985. The U.S. Environmental Protection Agency's quality assurance
program, pp. 12-19. In: J.K. Taylor and T.W. Stanley, eds. Quality Assurance for Environmental
Measurements, ASTM STP 867. American Society for Testing and Materials, Philadelphia, PA.
Taylor, J.K. 1987. Quality Assurance of Chemical Measurements. Lewis Publishers, Inc., Chelsea, MI. 328
pp.
U.S.EPA. 1979a. Methods for chemical analysis of water and wastes. EPA-600/4-79/020. U.S.
Environmental Protection Agency, Environmental Monitoring Systems Laboratory, Cincinnati, OH
(revised March 1983).
U.S.EPA. 1979b. Handbook for analytical quality control in water and wastewater laboratories. EPA-600/4-
79/019. U.S. Environmental Protection Agency, Environmental Monitoring and Support Laboratory,
Cincinnati, OH.
U.S.EPA. 1989. Recommended Protocols for Measuring Selected Environmental Variables in Puget Sound.
U.S. Environmental Protection Agency, Puget Sound Estuary Program, Office of Puget Sound, Seattle,
WA.
U.S.EPA. 1991. A Project Manager's Guide to Requesting and Evaluating Chemical Analyses. EPA 910/9-
90-024. U.S. Environmental Protection Agency, Puget Sound Estuary Program, Office of Coastal
Waters, Region 10, Seattle, WA.
U.S.EPA. 1992. EMAP Laboratory Methods Manual: Estuaries. U.S. Environmental Protection Agency,
Environmental Monitoring Systems Laboratory, Office of Research and Development, Cincinnati, OH
(in revision).
U.S.EPA. 1994. Statistical Summary: EMAP-Estuaries Virginian Province - 1991. EPA/620/R-94/005. U.S.
Environmental Protection Agency, Environmental Research Laboratory, Narragansett, RI.
Weisberg, S.B., J.B. Frithsen, A.F. Holland, J.F. Paul, K.J. Scott, J.K. Summers, H.T. Wilson, R.M. Valente,
D.G. Heimbuch, J. Gerritsen, S.C. Schimmel, and R.W. Latimer. 1993. EMAP-Estuaries Virginian
Province 1990 Demonstration Project Report. EPA 600/R-92/100. U.S. Environmental Protection
Agency, Environmental Research Laboratory, Narragansett, RI.
Wolke, R.E., R.A. Murchelano, C.D. Dickstein, and C.J. George. 1985. Preliminary evaluation of the use of
macrophage aggregates (MA) as fish health monitors. Bull. Environ. Contam. Toxicol. 35:222-227.