ENVIRONMENTAL MONITORING AND ASSESSMENT PROGRAM
EMAP-ESTUARIES VIRGINIAN PROVINCE
1993 QUALITY ASSURANCE PROJECT PLAN

by

Raymond M. Valente and Charles J. Strobel

Science Applications International Corporation
27 Tarzwell Drive
Narragansett, Rhode Island 02882

May 3, 1993

EPA Contract Number 68-C1-0005

Project Officer
Barbara Brown

U.S. ENVIRONMENTAL PROTECTION AGENCY
OFFICE OF RESEARCH AND DEVELOPMENT
ENVIRONMENTAL RESEARCH LABORATORY
NARRAGANSETT, RHODE ISLAND 02882


-------
QUALITY ASSURANCE PROJECT PLAN APPROVAL

This quality assurance project plan was developed to assure that all environmental data generated for the
Estuaries Resource Group of the Environmental Monitoring and Assessment Program (EMAP) are scientifically valid
and of acceptable quality to achieve the program's objectives. The signatures of key technical and management
personnel indicate approval or concurrence with the procedures specified in this plan. These approvals and
concurrences also represent a commitment to disseminate the plan and the philosophy of total quality to all project
participants.

Date

Norman Rubinstein

Virginian Province Manager

Environmental Research Laboratory-Narragansett

Date

Jan Prager, Ph.D.

Quality Assurance Officer

Environmental Research Laboratory-Narragansett

Date

Norbert Jaworski, Ph.D.

Laboratory Director

Environmental Research Laboratory-Narragansett

Date

J. Kevin Summers, Ph.D.

EMAP-Estuaries Technical Director

Environmental Research Laboratory-Gulf Breeze

Date

Linda Kirkland, Ph.D.

EMAP Quality Assurance Coordinator

EMAP Program Office, Washington, D.C.

ii


-------
PREFACE

This document outlines the integrated quality assurance plan for the Environmental Monitoring and
Assessment Program, Estuaries Resource Group's Monitoring in the Virginian Province. The quality assurance plan
is prepared following the general guidelines and specifications provided by the Quality Assurance Management Staff
of the U.S. Environmental Protection Agency Office of Research and Development and the guidelines provided in the
draft EMAP Quality Assurance Management Plan.

The primary objective of this Quality Assurance Project Plan (QAPjP) is to maximize the probability that
environmental data collected by the EMAP-Estuaries program will meet or exceed the objectives established for data
quality. The QAPjP presents a systematic approach that will be implemented within each major data acquisition and
data management component of the program. Basic requirements specified in the QAPjP are designed to: (1) ensure
that collection and measurement procedures are standardized among all participants; (2) monitor the performance of
the various measurement systems being used in the program to maintain statistical control and to provide rapid
feedback so that corrective measures can be taken before data quality is compromised; (3) assess the performance of
these measurement systems and their components periodically; and, (4) verify that reported data are sufficiently
complete, comparable, representative, unbiased, and precise so as to be suitable for their intended use. These activities
will provide data users with information regarding the degree of uncertainty associated with the various components
of the EMAP-Estuaries data base.

This QAPjP has been submitted in partial fulfillment of Contract Number 68-C1-0005 to Science Applications
International Corporation under the sponsorship of the U.S. Environmental Protection Agency. Mention of trade
names and commercial products does not constitute endorsement or recommendation for use.

The proper citation of this document is:

Valente, R. M. and C. J. Strobel. 1993. EMAP-Estuaries Virginian Province: Quality Assurance Project Plan
for 1993. EPA 600/X-93/XXX. U. S. Environmental Protection Agency, Office of Research and
Development, Environmental Research Laboratory, Narragansett, RI.

iii


-------
TABLE OF CONTENTS

Section	Page

Approvals	 ii

Preface 	 iii

Acknowledgments	 viii

1	INTRODUCTION		1 of 4

1.1	OVERVIEW OF EMAP 		1 of 4

1.2	THE ESTUARIES COMPONENT OF EMAP		1 of 4

1.3	QUALITY ASSURANCE PROGRAM WITHIN EMAP 		3 of 4

1.4	QUALITY ASSURANCE PROGRAM FOR EMAP-ESTUARIES		3 of 4

2	PROJECT ORGANIZATION 		1 of 3

2.1 MANAGEMENT STRUCTURE 		1 of 3

3	GENERAL REQUIREMENTS FOR FIELD AND LABORATORY OPERATIONS		1 of 9

3.1	FIELD OPERATIONS		1 of 9

3.1.1	Training Program 		2 of 9

3.1.2	Field Quality Control and Audits		3 of 9

3.1.3	Navigation		3 of 9

3.2	LABORATORY OPERATIONS		4 of 9

3.2.1	Laboratory Personnel, Training and Safety 		5 of 9

3.2.2	Quality Assurance Documentation		6 of 9

3.2.3	Analytical Procedures 		7 of 9

3.2.4	Laboratory Performance Audits		7 of 9

3.2.5	Preparation and Use of Control Charts		7 of 9

4	QUALITY ASSURANCE OBJECTIVES 		1 of 12

4.1	DATA QUALITY OBJECTIVES 		1 of 12

4.2	REPRESENTATIVENESS		6 of 12

4.3	COMPLETENESS		9 of 12

4.4	COMPARABILITY		9 of 12

4.5	ACCURACY (BIAS), PRECISION, AND TOTAL ERROR		10 of 12

5	ANALYSIS OF CHEMICAL CONTAMINANTS IN SEDIMENT

AND FISH TISSUE SAMPLES 		1 of 31

5.1	OVERVIEW		1 of 31

5.2	QUALITY CONTROL PROCEDURES: SAMPLE COLLECTION,

PRESERVATION AND HOLDING 		4 of 31

5.3	QUALITY CONTROL PROCEDURES: LABORATORY OPERATIONS		6 of 31

5.3.1	Overview		6 of 31

5.3.2	Initial Demonstration of Capability 		10 of 31

Instrument Calibration 		10 of 31

iv


-------
Contents (Continued)

Section	Page

Initial Documentation of Method Detection Limits 		10 of 31

Initial Blind Analysis of a Representative Matrix		12 of 31

5.3.3 On-going Demonstration of Capability 		12 of 31

Participation in Interlaboratory

Comparison Exercises 		12 of 31

Routine Analysis of Certified Reference Materials

or Laboratory Control Materials		13 of 31

Continuing Calibration Checks		16 of 31

Laboratory Reagent Blank		16 of 31

Internal Standards		17 of 31

Injection Internal Standards		17 of 31

Matrix Spike and Matrix Spike Duplicate 		18 of 31

Field Duplicates and Field Splits		19 of 31

5.4	OTHER SEDIMENT MEASUREMENTS 		20 of 31

5.4.1	Total Organic Carbon 		20 of 31

5.4.2	Acid Volatile Sulfide		21 of 31

5.4.3	Butyltins 		22 of 31

5.5	QUALITY CONTROL PROCEDURES: INFORMATION MANAGEMENT		23 of 31

5.5.1	Sample Tracking		23 of 31

5.5.2	Data Reporting Requirements 		23 of 31

5.5.3	Data Evaluation Procedures		25 of 31

Checking Data Completeness 		26 of 31

Assessing Data Quality 		28 of 31

Assigning Data Qualifier Codes 		30 of 31

Taking Final Action 		31 of 31

6	SEDIMENT PARTICLE SIZE ANALYSIS	 1 of 6

6.1	OVERVIEW	 1 of 6

6.2	QUALITY CONTROL PROCEDURES: SAMPLE COLLECTION,

PRESERVATION AND HOLDING 	 1 of 6

6.3	QUALITY CONTROL PROCEDURES: LABORATORY OPERATIONS	 1 of 6

6.4	QUALITY CONTROL PROCEDURES: INFORMATION MANAGEMENT	 3 of 6

6.4.1	Sample Tracking	 3 of 6

6.4.2	Data Reporting Requirements and

Evaluation Procedures 	 3 of 6

6.4.3	Assigning Data Qualifier Codes and

Taking Final Action	 5 of 6

7	SEDIMENT TOXICITY TESTING 	 1 of 12

7.1	OVERVIEW	 1 of 12

7.2	QUALITY CONTROL PROCEDURES: SAMPLE COLLECTION,

PRESERVATION AND HOLDING 	 1 of 12

V


-------
Contents (Continued)

Section	Page

7.3	QUALITY CONTROL PROCEDURES: LABORATORY OPERATIONS		2 of 12

7.3.1	Facilities and Equipment		2 of 12

7.3.2	Initial Demonstration of Capability 		2 of 12

7.3.3	Quality of Test Organisms		3 of 12

7.3.4	Test Conditions		5 of 12

7.3.5	Test Acceptability 		5 of 12

7.4	QUALITY CONTROL PROCEDURES: INFORMATION MANAGEMENT		6 of 12

7.4.1	Sample Tracking		6 of 12

7.4.2	Record Keeping and Data Reporting Requirements		6 of 12

7.4.3	Data Evaluation Procedures		7 of 12

7.4.4	Assigning Data Qualifier Codes 		9 of 12

7.4.5	Data Quality Reports		12 of 12

8	MACROBENTHIC COMMUNITY ASSESSMENT		1 of 13

8.1	OVERVIEW		1 of 13

8.2	QUALITY CONTROL PROCEDURES: SAMPLE COLLECTION,

PRESERVATION AND HOLDING 		1 of 13

8.3	QUALITY CONTROL PROCEDURES: LABORATORY OPERATIONS		2 of 13

8.3.1	Sorting 		2 of 13

8.3.2	Species Identification and Enumeration		3 of 13

8.3.3	Biomass Measurements		6 of 13

8.4	QUALITY CONTROL PROCEDURES: INFORMATION MANAGEMENT		7 of 13

8.4.1	Sample Tracking		7 of 13

8.4.2	Record Keeping and Data Reporting Requirements		7 of 13

8.4.3	Data Evaluation Procedures		8 of 13

8.4.4	Data Quality Reports		10 of 13

8.5	DEVELOPMENT AND VALIDATION OF THE BENTHIC INDEX		10 of 13

9	MEASUREMENTS OF FISH COMMUNITY STRUCTURE AND PATHOLOGY		1 of 9

9.1	OVERVIEW		1 of 9

9.2	QUALITY CONTROL PROCEDURES: FIELD OPERATIONS 		1 of 9

9.2.1	Trawling		1 of 9

9.2.2	Species Identification, Enumeration and

Length Measurements 		2 of 9

9.3	QUALITY CONTROL PROCEDURES: GROSS EXTERNAL PATHOLOGY AND

HISTOPATHOLOGY		3 of 9

9.4	QUALITY CONTROL PROCEDURES: INFORMATION MANAGEMENT		4 of 9

9.4.1	Sample Tracking		4 of 9

9.4.2	Data Reporting Requirements 		5 of 9

9.4.3	Data Evaluation Procedures		5 of 9

Checking Data Completeness 		6 of 9

Assessing Data Quality 		7 of 9

Assigning Data Qualifier Codes 		7 of 9

Taking Final Action 		9 of 9

vi


-------
Contents (Continued)

Section	Page

10	WATER QUALITY MEASUREMENTS		1 of 15

10.1	OVERVIEW		1 of 15

10.2	QUALITY CONTROL PROCEDURES: FIELD MEASUREMENTS		1 of 15

10.2.1	Instrument Calibration		2 of 15

10.2.2	Instrument Calibration Checks		3 of 15

10.2.3	Instrument Deployment Checks		5 of 15

10.3	QUALITY CONTROL PROCEDURES: TOTAL SUSPENDED SOLIDS 		5 of 15

10.4	QUALITY CONTROL PROCEDURES: INFORMATION MANAGEMENT		6 of 15

10.4.1	Sample Tracking		6 of 15

10.4.2	Data Reporting Requirements 		7 of 15

10.4.3	Data Evaluation Procedures		8 of 15

Checking Data Completeness		8 of 15

Assessing Data Quality		9 of 15

Assigning Data Qualifier Codes		11 of 15

Taking Final Action		11 of 15

11	INFORMATION MANAGEMENT		1 of 9

11.1	SYSTEM DESCRIPTION 		1 of 9

11.2	QUALITY ASSURANCE/QUALITY CONTROL 		1 of 9

11.2.1	Standardization		1 of 9

11.2.2	Prelabeling of Equipment and Sample Containers		2 of 9

11.2.3	Data Entry, Transcription, and Transfer 		2 of 9

11.2.4	Automated Data Verification 		3 of 9

11.2.5	Sample Tracking		4 of 9

11.2.6	Reporting		6 of 9

11.2.7	Redundancy (Backups) 		7 of 9

11.3	DOCUMENTATION AND RELEASE OF DATA		7 of 9

12	QUALITY ASSURANCE REPORTS TO MANAGEMENT		1 of 1

13	REFERENCES 		1 of 3

vii


-------
ACKNOWLEDGMENTS

The authors gratefully acknowledge the contributions of the following individuals in the development of this
document: Jim Pollard, Katherine Peres and Tom Chiang, Lockheed Engineering and Sciences Company, Las Vegas,
Nevada; Jill Schoenherr, Craig Eller, and Donald Cobb, Science Applications International Corporation, Narragansett,
Rhode Island; Dan Bender, Lora Johnson, and Robert Graves, U.S. Environmental Protection Agency, Environmental
Monitoring Systems Laboratory, Cincinnati, Ohio; Carol-Ann Manen, National Oceanic and Atmospheric Adminis-
tration, Silver Spring, Maryland; Kevin Summers, U.S. Environmental Protection Agency, Environmental Research
Laboratory, Gulf Breeze, Florida; Rich Pruell and Warren Boothman, U.S. Environmental Protection Agency, Environ-
mental Research Laboratory, Narragansett, Rhode Island; Fred Holland, Marine Resources Research Institute,
Charleston, South Carolina; and Jeff Frithsen and Steve Weisberg, Versar, Inc., Columbia, Maryland. The assistance
provided by Bob Graves in the development of measurement quality objectives for analytical chemistry is especially
appreciated.

viii


-------
SECTION 1

INTRODUCTION

1.1	OVERVIEW OF EMAP

The U.S. Environmental Protection Agency (EPA), in cooperation with other Federal agencies and state
organizations, is conducting research to develop a design for the Environmental Monitoring and Assessment Program
(EMAP) to monitor indicators of the condition of the Nation's ecological resources. Specifically, EMAP is intended
to respond to the growing demand for information characterizing the condition of our environment and the type and
location of changes in our environment. Simultaneous monitoring of pollutants and environmental indicators will
allow for the identification of the potential causes of adverse changes. When EMAP has been fully implemented, it
will address the following objectives:

•	Estimate the current status, trends, and changes in the selected indicators of the condition of the Nation's
ecological resources on a regional scale with known confidence.

•	Estimate the geographic coverage and extent of the Nation's ecological resources with known confidence.

•	Seek associations between selected indicators of natural and anthropogenic stresses and indicators of the
condition of ecological resources.

•	Provide annual statistical summaries and periodic assessments of the Nation's ecological resources.

1.2	THE ESTUARIES COMPONENT OF EMAP

The Estuaries component of EMAP (EMAP-E) will monitor the status and trends in environmental quality
of the estuarine waters of the United States. The EMAP-E Program has set four major objectives:

•	Provide a quantitative assessment of the regional extent of coastal environmental problems by measuring
pollution exposure and ecological condition.


-------
•	Measure changes in the regional extent of environmental problems for the nation's estuarine and coastal
ecosystems.

•	Identify and evaluate associations between the ecological condition of the nation's estuarine and coastal
ecosystems and pollutant exposure, as well as other factors known to affect ecological condition (e.g., climatic
conditions, land use patterns).

•	Assess the effectiveness of pollution control actions and environmental policies on a regional scale (i.e., large
estuaries like Chesapeake Bay, major coastal regions like the mid-Atlantic and Gulf Coasts, large inland
bodies of water like the Great Lakes) and nationally.

The EMAP-E program will complement and may eventually merge with the National Oceanic and
Atmospheric Administration's (NOAA) existing National Status and Trends Program for Marine Environmental
Quality to produce a single, cooperative estuarine monitoring program. To more efficiently manage estuarine activities,
the EMAP-E Program has been further divided to study the Great Lakes, the offshore (shelf) environment, and the
Nation's estuaries, bays, tidal rivers, and sounds.

Complete descriptions of the EMAP-E monitoring approach and rationale, sampling design, indicator strategy,
logistics, and data assessment plan are provided in the Near Coastal Program Plan for 1990: Estuaries (Holland 1990).
The strategy for implementation of the EMAP-E project is a regional, phased approach which started with the 1990
Demonstration Project in the Virginian Province. This biogeographical province covers an area from Cape Cod,
Massachusetts to Cape Henry, Virginia (Holland et al. 1990). In 1991, monitoring continued in the Virginian Province
and began in the Louisianian Province (Gulf of Mexico from near Tampa Bay, Florida to the Texas-Mexico border at
the Rio Grande). Additional provinces will be added in future years (e.g., the Carolinian Province in 1994), eventually
resulting in full national implementation of EMAP-Estuaries. This document is the Quality Assurance Project Plan
for EMAP-Estuaries 1993 monitoring in the Virginian Province.


-------
1.3 QUALITY ASSURANCE PROGRAM WITHIN EMAP

The overall QA and management policies, organization, objectives, and functional responsibilities associated
with the EMAP program are documented in a Quality Assurance Management Plan (Kirkland, in preparation). The
Quality Assurance Management Plan presents the guidelines and minimum requirements for QA programs developed
and implemented by each resource group within EMAP.

1.4 QUALITY ASSURANCE PROGRAM FOR EMAP-ESTUARIES

The Estuaries Resource Group, as a component of EMAP, must conform with all requirements specified in
the approved EMAP Quality Assurance Management Plan and also will participate in the EPA mandatory QA program
(Stanley and Verner 1983). As part of this program, every environmental monitoring and measurement project is
required to have a written and approved quality assurance project plan (QAPjP). The QAPjP for EMAP-E monitoring
in the Virginian Province (this document) describes the quality assurance and quality control activities and measures
that will be implemented to ensure that the data will meet all quality criteria established for the project. All project
personnel must be familiar with the policies, procedures, and objectives outlined in this quality assurance plan to assure
proper interactions among the various data acquisition and management components of the project. This document
will be revised, as appropriate, as changes are made to the existing QA program, and as additional data acquisition
activities are implemented.

EPA guidance (Stanley and Verner 1983) states that the 15 items shown in Table 1-1 should be addressed
in the QA Project Plan. Some of these items are extensively addressed in other documents for this project and therefore
are only summarized or referenced in this document.


-------
TABLE 1-1.

Sections in this report that address the 15 subjects required in a Quality Assurance Project Plan.

Quality Assurance Subject                        This Report

Title page                                       Title page
Table of contents                                Table of contents
Project description                              Section 1
Project organization and responsibility          Section 2
QA objectives                                    Section 4
Sampling procedures                              Sections 3, 5-10
Sample custody                                   Sections 3, 5-10
Calibration procedures                           Sections 3, 5-10
Analytical procedures                            Sections 3, 5-10
Data reduction, validation, and reporting        Sections 3, 5-11
Internal QC checks                               Sections 3, 5-10
Performance and system audits                    Section 3
Preventive maintenance                           Sections 3, 5-10
Corrective action                                Sections 3, 5-10
QA reports to management                         Section 12


-------
Section 2
Page 1 of 3
Revision 2
May 1993

SECTION 2

PROJECT ORGANIZATION

2.1 MANAGEMENT STRUCTURE

For the EMAP-Estuaries monitoring in the Virginian Province, expertise in specific research and monitoring
areas will be provided by several EPA laboratories and their contracting organizations. The Environmental Research
Laboratory in Narragansett, Rhode Island (ERL-N) has been designated as the principal laboratory for EMAP-E
monitoring in the Virginian Province, and therefore will provide direction and support for all activities. Technical
support is provided to ERL-N through contracts with the following organizations: Science Applications International
Corporation (SAIC), Versar Incorporated, and R.O.W. Sciences Incorporated. Additional technical support will be
provided through a cooperative agreement with a consortium of universities led by the University of Rhode Island. The
Environmental Monitoring Systems Laboratory in Cincinnati, Ohio (EMSL-CIN) will be responsible for analyzing
chemical contaminants in sediment samples. The Environmental Research Laboratory in Gulf Breeze, Florida (ERL-
GB) has been designated as the principal laboratory for the statistical design of the Estuarine monitoring effort. Figure
2-1 illustrates the management structure for the EMAP-E 1993 Virginian Province monitoring. All key personnel
involved in the 1993 Virginian Province monitoring are listed in Table 2-1.


-------
Section 2

Figure 2-1. Management structure for the 1993 EMAP-E Virginian Province monitoring.


-------
Section 2
Page 3 of 3
Revision 2
May 1993

TABLE 2-1. List of key personnel, affiliations, and responsibilities for the EMAP-Estuaries 1993 Virginian
Province monitoring.

NAME            AFFILIATION                 RESPONSIBILITY

E. Martinko     U.S. EPA-DC                 EMAP Director
J. Paul         U.S. EPA-Narragansett       EMAP Associate Director
K. Summers      U.S. EPA-Gulf Breeze        EMAP-E Technical Director

R. Latimer      U.S. EPA-Narragansett       Deputy Technical Director
N. Rubinstein   U.S. EPA-Narragansett       Virginian Province Manager
D. Keith        U.S. EPA-Narragansett       Field Coordinator
D. Heggam       U.S. EPA-Las Vegas          EMAP Logistics Coordinator
D. Reifsteck    SAIC                        Logistics/Training Coordinator
S. Kelley       URI/Consortium              Field Activities Coordinator

L. Kirkland     U.S. EPA-DC                 EMAP QA Coordinator
J. Prager       U.S. EPA-Narragansett       Virginian Province QA Officer
R. Valente      SAIC                        EMAP-E QA Coordinator
C. Strobel      SAIC                        Virginian Province QA Technical Support
A. Cantillo     NOAA                        NOAA QA Liaison

M. Aikenhead    R.O.W. Sciences, Inc.       Virginian Province Information Manager
H. Buffum       R.O.W. Sciences, Inc.       Virginian Province Database Manager
E. Petrocelli   R.O.W. Sciences, Inc.       Virginian Province Data Librarian

N. Malof        U.S. EPA-Cincinnati         Contaminant Analyses-Sediments
J. Scott        SAIC                        Sediment Toxicity Testing
G. Gardner      U.S. EPA-Narragansett       Fish Pathology/Histopathology
G. Thursby      SAIC                        Sediment Physical Analyses
J. Frithsen     Versar, Inc.                Benthic Analyses


-------
Section 3

Page 1 of 9
Revision 2
May 1993

SECTION 3

GENERAL REQUIREMENTS FOR FIELD AND
LABORATORY OPERATIONS

3.1 FIELD OPERATIONS

All field operations conducted by the EMAP-Estuaries Resource Group are planned and implemented
according to a logistics plan that is prepared and approved following guidelines established for EMAP (Baker and
Merritt 1990). Elements of the logistics plan are presented in Table 3-1, and address major areas of project
implementation, including project management, site access and scheduling, safety and waste disposal, procurement
and inventory control, training and data collection, and the assessment of the operation upon completion.

TABLE 3-1. Required Elements of EMAP Logistics Plans (from Baker and Merritt 1990).

Logistics Plan Area                    Required Elements

Project Management                     Overview of Logistic Activities
                                       Staffing and Personnel Requirements
                                       Communications

Access and Scheduling                  Sampling Schedule
                                       Site Access
                                       Reconnaissance

Safety                                 Safety Plan
                                       Waste Disposal Plan

Procurement and Inventory Control      Equipment, Supplies, and Services
                                       Procurement, Methods and Scheduling

Training and Data Collection           Training Program
                                       Field and Mobile Laboratory Operations
                                       Quality Assurance
                                       Information Management

Assessment of Operations               Logistics Review and Recommendations


-------
Section 3

Page 2 of 9
Revision 2
May 1993

3.1.1 Training Program

Proper training of field personnel represents a critical aspect of quality control. Field technicians are trained
to conduct a wide variety of activities using standardized protocols to ensure comparability in data collection among
crews and across regions. Each crew consists of a boat captain, chief scientist, and a minimum of one technician.
Minimum qualifications for chief scientists should include an M.S. degree in biological/ecological sciences and three
years of experience in field data collection activities, or a B.S. degree and five years of experience. The remaining crew
members generally are required to hold B.S. degrees and, preferably, at least one year's experience. The captain must
be an experienced boat handler, preferably holding a captain's license.

Upon completion of an intensive training session, each chief scientist must pass a practical examination. This
examination is useful for assessing the effectiveness of the training session and serves to point out specific areas where
further training is warranted. Following the preliminary chief scientist training session, both chief scientists and their
crew members must participate in a second intensive training program. Both classroom and "hands-on" training will
be coordinated by EMAP-VP Field Operations Center staff members having extensive experience instructing field
technicians in routine sampling operations (e.g., collection techniques, small boat handling, etc.). The expertise of
the on-site EMAP staff will be supplemented by local experts in such specialized areas as fish pathology, fish
identification, field computer/navigation system use, boat handling and water safety, and first aid (including
cardiopulmonary resuscitation (CPR) training).

All the sampling equipment (e.g., boats, instruments, grabs, nets, computers, etc.) will be used extensively
during the "hands-on" training sessions, and by the end of the course, all crews members must demonstrate proficiency
in all the required sampling activities. Upon completion of the formal crew training session, both a written and
practical examination will be administered to all personnel. At this time all crews must be satisfactorily checked out
in all pertinent areas.

All aspects of field operations are detailed in the Field Operations and Safety Manual (Reifsteck et al. 1993),
which will be distributed to all trainees prior to the training period. The manual includes a checklist of all equipment,
instructions on equipment use, and detailed written descriptions of sample collection procedures. In addition, the


-------
Section 3

Page 3 of 9
Revision 2
May 1993

manual includes flow charts and a schedule of activities to be conducted at each sampling location, along with a list
of potential hazards associated with each sampling site.

In addition to the formal classroom training and practical examinations, all crews will be evaluated on their
field performance during "dry runs" conducted just prior to the actual sampling period. Each crew will be visited
during these dry runs by either the Quality Assurance Coordinator (QAC) or the Field Coordinator. Crews may also
be evaluated by other personnel at the Field Operations Center for their performance on specific activities, such as data
entry, communications and shipping procedures. If any deficiencies within a crew are noted during this final
"certification", they must be remedied prior to field sampling. This can be accomplished by additional training or by
changing the crew composition. It is the responsibility of the QA Coordinator to develop certification and audit
"checklists" and maintain copies of all training examinations, certification results and audit reports in a central file.

3.1.2	Field Quality Control and Audits

Quality control of measurements made during the actual field sampling period is accomplished through the
use of a variety of QC sample types and procedures, as described in later sections of this document. In addition, at least
once during each field season a QA audit of each field crew will be performed by either the QAC, or his designee, to
ensure compliance with prescribed protocols. A checklist has been developed to provide comparability and consistency
in this process. Field crews must be retrained whenever discrepancies are noted.

3.1.3	Navigation

Because of the complexity of the navigation equipment and computer system being used aboard the Virginian
Province boats, the most important aspect of quality assurance is thorough training of field personnel. Because of the
potential to have interferences in the signals (e.g., satellite or Loran) received by the navigation instruments, it is
especially important for the chief scientist to evaluate the quality of all inputs and decide which are most appropriate
at each station. Once this is decided, proper calibration of the boat's on-board computer navigation system is critical.
Calibration information is recorded automatically by the boat computer and must also be recorded in writing in a
separate navigation log. Acceptable procedures are discussed in the Field Operations and Safety Manual (Reifsteck
et al. 1993).


-------
Section 3

Page 4 of 9
Revision 2
May 1993

Station location information is logged automatically in the onboard computer through the SAIC
Environmental Data Acquisition System (EDAS), which records navigation data through the interface of the Northstar
800X LORAN and Raytheon RAYSTAR 920 Global Positioning System (GPS) units. The EDAS utilizes a Kalman
filter which allows navigation through either, or both, of the available positioning systems: GPS and calibrated
LORAN-C. The station location, LORAN-C calibration factors, and a series of waypoints can be saved in the EDAS
log files for each station. The computer navigation system must be used whenever a station fix is being recorded.
Uncalibrated LORAN can result in errors of up to 500 m, and, due to Department of Defense degradation of GPS
signals, GPS cannot be relied upon for accuracies better than 100 m. Therefore, calibration of the system is
essential, and field crews must maintain a separate bound navigation log book to record all LORAN-C calibration
information. In addition, crews must record radar ranges and hand-held compass bearings for each sampling station
on station location information log sheets, which are later sent to the Field Operations Center for review and permanent
storage. Basic navigation, as well as the completeness and accuracy of navigation logs, will be checked during QA
audits or visits by senior Program personnel.

As position data are received at the FOC, automatic range checks will be performed on station coordinates
(i.e., latitude/longitude). The reported station location will be compared to the expected coordinates and flagged for
further investigation if the positions differ by more than one-half mile. If discrepancies are found, original data sheets
will be reviewed and/or the chief scientist will be contacted to provide an explanation.
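For illustration, a minimal sketch of such an automated range check follows (the function and station values are hypothetical, and "one-half mile" is interpreted here as 0.5 nautical mile; the code is not part of the Program's information management software):

    import math

    def distance_nm(lat1, lon1, lat2, lon2):
        """Great-circle distance in nautical miles (haversine formula)."""
        r_nm = 3440.065  # mean Earth radius in nautical miles
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = (math.sin(dp / 2) ** 2
             + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
        return 2 * r_nm * math.asin(math.sqrt(a))

    def check_station(reported, expected, limit_nm=0.5):
        """Flag a station whose reported position differs from the expected
        coordinates by more than the allowed distance."""
        d = distance_nm(*reported, *expected)
        return {"distance_nm": round(d, 3), "flagged": d > limit_nm}

    # A position reported about 0.7 nm from the expected location is flagged.
    print(check_station((41.4920, -71.4190), (41.5035, -71.4190)))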

3.2 LABORATORY OPERATIONS

This section addresses only general laboratory operations, while the sections on each indicator present specific
QA/QC requirements and procedures associated with the processing of specific samples. All laboratories providing
analytical support for chemical or biological analyses must have the appropriate facilities to store and prepare samples,
and appropriate instrumentation and staff to provide data of the required quality within the time period dictated by the
project. Laboratories are expected to conduct operations using good laboratory practices, including:


-------
Section 3

Page 5 of 9
Revision 2
May 1993

•	A program of scheduled maintenance of analytical balances, microscopes, laboratory equipment and
instrumentation.

•	Routine checking of analytical balances using a set of standard reference weights (ASTM Class 3, NIST Class
S-1, or equivalents).

•	Checking and recording the composition of fresh calibration standards against the previous lot. Acceptable
comparisons are within ± 2 percent of the previous value (a sketch of this check follows this list).

•	Recording all analytical data in bound logbooks in ink.

•	Daily monitoring and documenting the temperatures of cold storage areas and freezer units.

•	Verifying the efficiency of fume hoods.

•	Having a source of reagent water meeting American Society for Testing and Materials (ASTM) Type I
specifications (ASTM 1984) available in sufficient quantity to support analytical operations. The conductivity
of the reagent water should not exceed 1 µS/cm at 25°C.

•	Labeling all containers used in the laboratory with date prepared, contents, and initials of the individual who
prepared the contents.

•	Dating and storing all chemicals safely upon receipt. Chemicals are disposed of properly once their
expiration date has passed.

•	Using a laboratory information management system to track the location and status of any sample received
for analysis.
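As an illustration of the calibration-standard check in the list above, the following sketch applies the ± 2 percent criterion as a relative deviation from the previous lot (an assumption; the names are illustrative and the code is not part of any EMAP system):

    def new_lot_acceptable(new_value, previous_value, tolerance_pct=2.0):
        """Return True if a fresh calibration standard agrees with the
        previous lot within the stated percent tolerance."""
        deviation_pct = abs(new_value - previous_value) / previous_value * 100.0
        return deviation_pct <= tolerance_pct

    # A new standard of 10.15 against a previous lot of 10.00 deviates by
    # 1.5 percent and is therefore acceptable.
    print(new_lot_acceptable(10.15, 10.00))  # True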

Laboratories should be able to provide information documenting their ability to conduct the analyses with the
required level of data quality. Such information might include results from interlaboratory comparison studies, control
charts and summary data of internal QA/QC checks, and results from certified reference material analyses.
Laboratories must also be able to provide analytical data and associated QA/QC information in a format and time frame
specified by the Virginian Province Manager and/or Information Manager.

3.2.1 Laboratory Personnel, Training and Safety

Each laboratory providing analytical support to EMAP-E should designate an on-site QA coordinator. This
individual will serve as the point of contact for the EMAP-E QA staff in identifying and resolving issues related to data
quality. To ensure that the samples are analyzed in a consistent manner throughout the duration of the project, key
laboratory personnel should participate in an orientation session conducted during an initial site visit or via


-------
Section 3

Page 6 of 9
Revision 2
May 1993

communication with EMAP-E QA staff. The purpose of the orientation session is to familiarize key laboratory
personnel with the QA program. Laboratories may be required to demonstrate acceptable performance before analysis
of samples can proceed, as described for each indicator in subsequent sections. Laboratory operations will be evaluated
on a continuous basis through technical systems audits, performance evaluation studies, and by participation in
interlaboratory round-robin programs.

Personnel in any laboratory performing EMAP analyses should be well versed in good laboratory practices,
including standard safety procedures. It is the responsibility of the laboratory manager and/or supervisor to ensure that
safety training is mandatory for all laboratory personnel. The laboratory is responsible for maintaining a current safety
manual in compliance with the Occupational Safety and Health Administration (OSHA), or equivalent state or local
regulations. The safety manual should be readily available to laboratory personnel. Proper procedures for safe storage,
handling and disposal of chemicals should be followed at all times; each chemical should be treated as a potential
health hazard and good laboratory practices should be implemented accordingly.

3.2.2 Quality Assurance Documentation

All laboratories must have the latest revisions of the EMAP-E Virginian Province QA Project Plan (this
document) and Laboratory Methods Manual (U.S. EPA 1992, in revision). In addition, the following documents and
information must be current, and they must be available to all laboratory personnel participating in the processing of
EMAP-E samples:

•	Laboratory QA Plan: Clearly defined policies and protocols specific to a particular laboratory including
personnel responsibilities, laboratory acceptance criteria for release of data, and procedures for determining
the acceptability of results.

•	Laboratory Standard Operating Procedures (SOPs) - Detailed instructions for performing routine laboratory
procedures. In contrast to the Laboratory Methods Manual, SOPs offer step-by-step instructions describing
exactly how the method is implemented in the laboratory, specific for the particular equipment or instruments
on hand.

•	Instrument performance study information - Information on instrument baseline noise, calibration standard
response, analytical precision and bias data, detection limits, etc. This information usually is recorded in
logbooks or laboratory notebooks.

•	Control charts - Control charts must be developed and maintained throughout the project for all appropriate
analyses and measurements (see section 3.2.5).


-------
Section 3

Page 7 of 9
Revision 2
May 1993

3.2.3	Analytical Procedures

Complete and detailed procedures for processing and analysis of samples in the field and laboratory are
provided in the Virginian Province Field Operations and Safety Manual (Reifsteck et al. 1993) and the EMAP-E
Laboratory Methods Manual (U.S. EPA 1992, in revision), respectively, and will not be repeated here.

3.2.4	Laboratory Performance Audits

Initially, a QA assistance and performance audit will be performed by EMAP-E QA staff to determine if each
laboratory effort is in compliance with the procedures outlined in the Methods Manual and QA Project Plan and to
assist the laboratory where needed. Additionally, technical systems audits may be conducted by a team composed of
the QA Coordinator and his/her technical assistants. Reviews may be conducted at any time during the scope of the
study, but are not required every year. Furthermore, laboratory performance will be assessed on a continuous basis
through the use of internal and external performance evaluation (PE) samples and laboratory intercomparison studies
(round robins).

3.2.5	Preparation and Use of Control Charts

Control charts are a graphical tool to demonstrate and monitor statistical control of a measurement process.
A control chart basically is a sequential plot of some sample attribute (measured value or statistic). The type of control
chart used primarily by laboratory analysts is a "property" chart of individual measurements (termed an X chart).

An example of an X chart is presented in Figure 3-1. Measured values are plotted in their sequence of
measurement. Three sets of limits are superimposed on the chart: 1.) the "central line" is the mean value calculated
from at least 7 initial measurements and represents an estimate of the true value of the sample being measured, 2.)
upper and lower "warning limits" representing the 95 percent confidence limits around the mean value, within which
most (95 percent) of the measured values should lie when the measurement process is in a state of statistical control,
and 3.) upper and lower "control limits" representing the 99 percent confidence limits around the mean, within which
nearly all (99 percent) of the measured values should lie when the measurement process is in a state of statistical
control.
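For illustration, the chart parameters can be derived as in the following sketch. It assumes the common practice of approximating the 95 and 99 percent confidence limits by two and three sample standard deviations; the plan itself specifies only the confidence levels:

    import statistics

    def x_chart_limits(initial_values):
        """Compute the central line, warning limits (~95 percent), and
        control limits (~99 percent) from at least 7 initial control
        sample measurements."""
        if len(initial_values) < 7:
            raise ValueError("at least 7 initial measurements are required")
        mean = statistics.mean(initial_values)
        s = statistics.stdev(initial_values)
        return {
            "central_line": mean,
            "warning_limits": (mean - 2 * s, mean + 2 * s),
            "control_limits": (mean - 3 * s, mean + 3 * s),
        }

    # Example with eight initial measurements of a control sample:
    print(x_chart_limits([10.1, 9.8, 10.0, 10.3, 9.9, 10.2, 10.0, 9.7]))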


-------
Section 3
Page 8 of 9
Revision 2
May 1993

Figure 3-1. Example of a property (X) type of control chart. [Figure: measured concentration plotted against the sequence of analysis, showing the central line, the upper and lower warning limits, and the upper and lower control limits.]



Control charts should be updated by laboratory personnel as soon as a control sample measurement is
completed. Based on the result of an individual control sample measurement, the following course of action should
be taken (Taylor 1987):

• If the measured value of the control sample is within the warning limits, all routine sample data since the last
acceptable control sample measurement are accepted, and routine sample analyses are continued.

• If the measured value of the control sample is outside of the control limits, the analysis is assumed to no
longer be in a state of statistical control. All routine sample data analyzed since the last acceptable control
sample measurement are suspect. Routine sample analyses are suspended until corrective action is taken.
After corrective action, statistical control must be reestablished and demonstrated before sample analyses
continue. The reestablishment of statistical control is demonstrated by the results of three consecutive sets
of control sample measurements that are in control (Taylor 1987). Once statistical control has been
demonstrated, all routine samples since the last acceptable control sample measurement are reanalyzed.


-------
Section 3

Page 9 of 9
Revision 2
May 1993

•	If the measured value of a control sample is outside the warning limits, but within the control limits, a second
control sample is analyzed. If the second control sample measurement is within the warning limits, the
analysis is assumed to be in a state of statistical control, and all routine sample data since the last acceptable
control sample measurement are accepted, and routine sample analyses are continued. If the second sample
measurement is outside the warning limits, it is assumed the analysis is no longer in a state of statistical
control. All routine sample data analyzed since the last acceptable control sample measurement are suspect.
Routine sample analyses are suspended until corrective action is taken. After corrective action, statistical
control must be reestablished and demonstrated before sample analyses continue. The reestablishment of
statistical control is demonstrated by the results of three consecutive sets of control sample measurements that
are in control (Taylor 1987). Once statistical control has been demonstrated, all routine samples since the
last acceptable control sample measurement are reanalyzed.
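The decision rules in the bullets above can be summarized in a short sketch (illustrative only; it consumes the limits dictionary produced by the earlier x_chart_limits sketch, and the status labels are not from the plan):

    def evaluate_control_sample(value, limits, second_value=None):
        """Apply the decision rules of Taylor (1987) to a single control
        sample measurement, with an optional follow-up measurement for
        the case between the warning and control limits."""
        wl_lo, wl_hi = limits["warning_limits"]
        cl_lo, cl_hi = limits["control_limits"]
        if wl_lo <= value <= wl_hi:
            return "accept"          # in control: accept data, continue
        if value < cl_lo or value > cl_hi:
            return "out_of_control"  # suspend analyses; corrective action
        # Between warning and control limits: analyze a second control sample.
        if second_value is None:
            return "analyze_second_sample"
        if wl_lo <= second_value <= wl_hi:
            return "accept"
        return "out_of_control"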

Taylor (1987) also provides additional criteria for evaluating control chart data to determine if a measurement
system is no longer in a state of statistical control. For X charts, these criteria include:

•	Four successive points outside a range equal to plus or minus one-half the warning limits.

•	Seven successive points on one side of the central line, even if all are within the warning limits.

•	More than 5 percent of the points outside the warning limits.
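These supplementary criteria might be checked programmatically as sketched below (an illustration; "one-half the warning limits" is interpreted here as half the warning-limit offset from the central line, which is an assumption):

    def x_chart_violations(values, central, warning_offset):
        """Check Taylor's (1987) supplementary X-chart criteria for loss
        of statistical control over a sequence of plotted points."""
        half = [abs(v - central) > warning_offset / 2 for v in values]
        outside = [abs(v - central) > warning_offset for v in values]
        above = [v > central for v in values]
        return {
            # Four successive points outside one-half the warning limits.
            "four_outside_half_warning": any(
                all(half[i:i + 4]) for i in range(len(values) - 3)),
            # Seven successive points on one side of the central line.
            "seven_on_one_side": any(
                len(set(above[i:i + 7])) == 1 for i in range(len(values) - 6)),
            # More than 5 percent of the points outside the warning limits.
            "over_5_percent_outside_warning":
                sum(outside) / len(values) > 0.05,
        }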

Central line, warning limits, and control limits will be evaluated periodically by either the on-site QA
coordinator or the EMAP-E QA staff. Central lines, warning limits, and control limits for each analyte and sample
type will be redefined based on the results of quality control and quality assessment sample measurements. Current
control charts must be available for review during technical systems audits. Copies of charts will be furnished to the
Province Manager or Province QA staff upon request. Such charts should contain both the points and their associated
values.


-------
Section 4
Page 1 of 12
Revision 2
May 1993

SECTION 4

ASSESSMENT OF DATA QUALITY

4.1 DATA QUALITY OBJECTIVES

The EMAP-E program is measuring a defined set of parameters that are considered to be reliable indicators
of estuarine environmental condition. The measured parameters have been categorized as either biotic condition,
abiotic condition, or habitat indicators (Table 4-1) in accordance with the general EMAP indicator development process
described by Olsen (1992). More detailed descriptions of EMAP-E's indicator strategy are presented in the Near
Coastal Program Plan for Estuaries (Holland 1990).

TABLE 4-1. EMAP-E Virginian Province indicators by major category.

Category            Indicator

Biotic Condition    Benthic species composition and biomass
                    Fish community composition
                    Contaminant concentrations in fish flesh
                    Gross pathology of fish
                    Histopathology of fish

Abiotic Condition   Sediment contaminant concentrations
                    Sediment toxicity
                    Dissolved oxygen concentration
                    Marine debris
                    Water clarity

Habitat             Salinity
                    Temperature
                    Depth
                    Grain size
                    pH


-------
Section 4
Page 2 of 12
Revision 2
May 1993

It is the policy of the U. S. EPA that all environmental data collection activities be planned and implemented
through the development of data quality objectives (DQOs). Data quality objectives are statements that describe in
precise quantitative terms the level of uncertainty that can be associated with environmental data without compromising
their intended use. Data quality objectives provide criteria that can be used to design a sampling strategy while
balancing the cost and/or resource constraints typically imposed upon a program.

The EMAP is unique in its stated objective of determining ecosystem condition at regional scales using a
probability-based sampling design. The relative novelty of this design, coupled with the vast geographic expanse and
inherent complexity of the natural systems being monitored, have made the task of developing DQOs a challenging
endeavor. Typically, DQOs are specified by potential users of the data. Because EMAP Resource Groups are
developing new indicators and employing them in new uses (e.g., regional status and trends estimates), potential users
of the data have found it difficult to develop the necessary decision and uncertainty criteria which are basic to the DQO
process. In the absence of specific decision criteria established by potential data users, the program has established
a set of target DQOs, based primarily on professional judgement, which are intended to provide a starting point for a
long-term, iterative DQO process. Consequently, these preliminary DQOs do not necessarily constitute definitive rules
for accepting or rejecting results, but rather provide guidelines for continued improvement. Several iterations of the
DQO process may be required as EMAP scientists define their capabilities and data users define their needs.

EMAP has established target DQOs for both status and trends estimates. The target DQO for estimates of
current status in indicators of condition for EMAP is as follows:

"For each indicator of condition and resource class, on a regional scale, estimate the proportion of the
resource in degraded condition within 10% (absolute) with 90% confidence based on four years of sampling."

The target DQO for trends in indicators of condition for EMAP is as follows:

"Over a decade, for each indicator of condition and resource class, on a regional scale, detect, at a
minimum, a linear trend of 2% (absolute) per year (i.e., a 20% change for a decade), in the percent of the
resource class in degraded condition. The test for trend will have a maximum significance level of alpha = 0.2
and a minimum power of 0.7 (i.e., beta = 0.3)."
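For illustration only, the standard normal-approximation formula for estimating a proportion suggests the sampling effort implied by the status DQO. This simplification ignores the clustered, multi-year structure of the actual EMAP design and is not the Program's own power analysis:

    import math

    def stations_for_status_dqo(p=0.5, half_width=0.10, confidence=0.90):
        """Approximate number of independent samples needed to estimate a
        proportion within +/- half_width at the stated confidence, using
        n >= z**2 * p * (1 - p) / half_width**2."""
        z = {0.90: 1.645, 0.95: 1.960, 0.99: 2.576}[confidence]
        return math.ceil(z ** 2 * p * (1 - p) / half_width ** 2)

    # Worst case (p = 0.5): about 68 independent samples per resource class
    # over the four-year cycle for the +/- 10 percent, 90 percent target.
    print(stations_for_status_dqo())  # 68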


-------
Section 4
Page 3 of 12
Revision 2
May 1993

It is important to note that the target DQOs which have been established are related to the ability of the present
sampling design to characterize status or discern trends within a specified level of statistical confidence. The EMAP-E
Resource Group will not be able to begin a realistic assessment of whether it can meet the target DQOs until 1994,
when the first four-year sampling cycle will be completed in the Virginian Province. During the first four years of
sampling, however, EMAP-E has been actively laying the groundwork for this assessment by gathering the data needed
to identify and quantify potential sources of sampling error (Table 4-2). It will be essential to account for these
potentially significant sources of uncertainty (i.e., variance) in determining whether the current sampling design will
allow EMAP-E to meet the target status and/or trends DQOs.

TABLE 4-2. Potential sources of sampling error being estimated during the first four years of EMAP-E
monitoring in the Virginian Province.

Source of Error                                   EMAP-E Estimator

Small-scale spatial variability                   Replicate stations sampled each year
within the index period                           within each resource class

Temporal variability within                       Certain stations in each resource class
the index period                                  are visited twice during the index period

Long-term temporal (interannual)                  The same stations are visited each year
variability                                       (long-term trend sites)

Year-to-year temporal and                         Estimated using all random stations sampled
spatial variability                               in each resource class in each year


-------
Section 4
Page 4 of 12
Revision 2
May 1993

The target DQOs established for the EMAP program represent statements about resource class populations
and do not, as stated, take into account potential sources of measurement error. Measurement error is frequently
emphasized in the DQO process as an important source of uncertainty. In EMAP, measurement error may be a less
significant contributor to total uncertainty than sample density. Measurement error is, however, a potentially important
variable in controlling the regional responsiveness, and thus the acceptability, of individual indicators. In addition,
external users of EMAP data may find that measurement error is an important source of variability that must be
accounted for in addressing their own DQOs. It is therefore important for EMAP Resource Groups to control
measurement error, to the extent possible, when selecting sampling methods, and to establish measurement quality
objectives (MQOs) for each sampling method and laboratory analysis procedure. MQOs essentially represent data
quality objectives that are based on control of the measurement system. They are being used to establish criteria for
data acceptability because reliable error bounds cannot, at present, be established for end use of indicator response data.
As a consequence, management decisions balancing the cost of higher quality data against program objectives are not
presently possible. As data are accumulated on indicators and the error rates associated with their measurement at
regional scales are established, it will be possible to address the target DQOs that have been established and determine
the need for modifications to the sampling design and/or quality assurance program.

Measurement quality objectives for the various measurements being made in EMAP-Estuaries can be
expressed in terms of accuracy, precision, and completeness requirements (Table 4-3). These MQOs were established
by obtaining estimates of the most likely data quality that is achievable based on either the instrument manufacturer's
specifications, scientific experience or historical data.

The MQOs presented in Table 4-3 are used as quality control criteria both in field and laboratory measurement
processes to set the bounds of acceptable measurement error. Usually, DQOs or MQOs are established for five aspects
of data quality: representativeness, completeness, comparability, accuracy, and precision (Stanley and Verner 1985).
These terms are described in the following sections in terms of their overall applicability to the EMAP-Estuaries
Program and the specific measurement systems being employed for each indicator.


-------
Section 4
Page 5 of 12
Revision 2
May 1993

TABLE 4-3. Measurement quality objectives for EMAP-Estuaries indicators. Accuracy (bias) requirements are
expressed as either maximum allowable percent deviation (%) or absolute difference (± value) from
the "true" value; precision requirements are expressed as maximum allowable relative percent
difference (RPD) or relative standard deviation (RSD) between two or more replicate measurements.
Completeness goals are the percentage of expected results to be obtained successfully.

Indicator/Data Type                          Accuracy (Bias)   Precision     Completeness
                                             Requirement       Requirement   Goal

Sediment/tissue contaminant analyses:
   Organics                                  30%               30%           100%
   Inorganics                                15%               15%           100%

Sediment toxicity                            NA                NA            100%

Benthic species composition and biomass:
   Sorting                                   10%               NA            100%
   Counting                                  10%               NA            100%
   Taxonomy                                  10%               NA            100%
   Biomass                                   NA                10%           100%

Sediment characteristics:
   Particle size (% silt-clay) analysis      NA                10%           100%
   Total organic carbon                      10%               10%           100%
   Acid volatile sulfide                     10%               10%           100%

Water column characteristics:
   Dissolved oxygen                          ± 0.5 mg/L        10%           100%
   Salinity                                  ± 1.0 ppt         10%           100%
   Depth                                     ± 0.5 m           10%           100%
   pH                                        ± 0.2 units       NA            100%
   Temperature                               ± 0.5 °C          NA            100%
   Total suspended solids                    NA                10%           100%

Gross pathology of fish                      NA                10%           100%

Fish community composition:
   Counting                                  10%               NA            100%
   Taxonomic identification                  10%               NA            100%
   Length determinations                     ± 5 mm            NA            100%

Fish histopathology                          NA                NA            NA
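The precision measures named in the caption of Table 4-3 follow standard definitions, sketched below for reference (illustrative code, not part of any EMAP data system):

    import statistics

    def relative_percent_difference(x1, x2):
        """RPD between two replicate measurements, in percent."""
        return abs(x1 - x2) / ((x1 + x2) / 2.0) * 100.0

    def relative_standard_deviation(values):
        """RSD (coefficient of variation) of replicates, in percent."""
        return statistics.stdev(values) / statistics.mean(values) * 100.0

    # Duplicate total organic carbon results of 1.52 and 1.60 percent give an
    # RPD of about 5.1 percent, within the 10 percent objective in Table 4-3.
    print(round(relative_percent_difference(1.52, 1.60), 1))  # 5.1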


-------
Section 4

Page 6 of 12
Revision 2
May 1993

4.2 REPRESENTATIVENESS

Representativeness is defined as "the degree to which the data accurately and precisely represent a
characteristic of a population parameter, variation of a property, a process characteristic, or an operational condition"
(Stanley and Verner, 1985). The concept of representativeness within the context of EMAP monitoring refers to the
ability of the program to accurately and precisely characterize regional phenomena through the measurement of
selected environmental indicators. The focus on regional phenomena requires that the EMAP design strategy
emphasize accommodation of a wide range of resources. In addressing this requirement, EMAP-Estuaries has adopted
a regionalization scheme to allocate the Nation's estuarine and coastal resources into manageable sampling units for
collection and reporting of data. This regionalization, determined on the basis of major climatic zones and prevailing
oceanic currents, consists of seven provinces within the continental United States, five provinces in Alaska, Hawaii,
and the Pacific territories, and a region that comprises the Great Lakes. In addition, EMAP-Estuaries is using a
classification scheme to facilitate sampling of the ecosystems within each province in proportion to their extent and
abundance, thus ensuring a statistically acceptable representation of all ecosystem types within the sampling frame.
In the Virginian Province, physical dimensions (e.g., surface area and aspect ratio) are used to classify estuarine
resources into three categories: large estuarine systems, large tidal rivers, and small estuarine systems. Complete
descriptions of the EMAP-Estuaries regionalization and classification schemes are provided in the Near Coastal
Program Plan for 1990 (Holland 1990).

The design of the EMAP-Estuaries' sampling program and the location of Virginian Province sampling sites
provide the primary focus for defining the "representativeness" of population estimates for this region. In its initial
planning stages, the EMAP-E program faced a choice between two general sampling approaches to meet the objective
of obtaining an accurate and precise representation of estuarine resource condition at the regional scale. As described
in the Near Coastal Program Plan (Holland 1990) and restated here, these two sampling approaches were: 1.) census
the nation's estuarine and coastal ecosystems and important habitats on a periodic basis (e.g., every 4 years), or 2.)
sample a subset of estuarine and coastal resources periodically, and use the data to make inferences about unsampled
area.


-------
Section 4
Page 7 of 12
Revision 2
May 1993

The census technique is the appropriate sampling method for characterizing and assessing status and trends
for some rare resources, because minimal population densities require that most of the resource must be sampled to
characterize status and to measure trends (e.g., changes in abundance of rare and endangered species or habitats). The
census technique is not a cost-effective or appropriate sampling approach for assessing the status and trends of broadly
distributed, relatively abundant resources. EMAP-E does not have the resources to conduct a regular census of the
nation's estuarine and coastal resources. Therefore, the decision was made that sampling a subset of the resources and
using the information obtained about the subset to make inferences about unsampled resources is the only approach
that is appropriate for EMAP-E.

The subset of resources sampled by EMAP-E could be (1) a sample which is determined, based on available
scientific knowledge, to be "representative" of the range of environmental settings that exist in estuarine and coastal
environments, or (2) a probability sample of estuarine and coastal resources. Collection of "representative" samples
is an extreme case of stratified sampling and assumes that the data collected at the "representative" sampling locations
can be extrapolated to broader spatial and temporal scales. Available scientific information is used to identify
"representative" sampling locations, as well as to define the spatial scale and temporal periods that the samples
represent. Periodic collection of "representative" samples is a powerful technique for measuring trends, because this
approach minimizes interactions between spatial and temporal variation. Because "representative" samples can be
located at any of a number of sites, they are generally easier to collect than probability samples and frequently can be
located at a site for which there is existing historical data.

Unfortunately, the current scientific understanding of the environmental processes that control condition and
distributions of estuarine and coastal resources is inadequate to define the bias and uncertainty associated with
extrapolating environmental quality information for "representative" locations to other sites. This is especially true
for data collected over broad geographic scales and long time periods. Therefore, EMAP-E employs a probability
sampling approach that samples resources in proportion to their abundance and distribution and obtains unbiased
estimates of resource characteristics and variability. The probability sampling approach applies systematic (e.g., grid)
sampling to facilitate characterizations of spatial patterns and to encourage broad geographic coverage.


-------
Section 4
Page 8 of 12
Revision 2
May 1993

Many of the proposed parameters that EMAP-E will measure are known to exhibit large intra-annual
variability, and EMAP-E lacks the resources to characterize this variability or to assess status for all seasons.
Therefore, sampling will be confined to a limited portion of the year (i.e., an index period), when indicators are
expected to show the greatest response to pollution stress and within-season (i.e., week-to-week) variability is expected
to be small.

For most estuarine and coastal ecosystems in the Northern Hemisphere, mid-summer (July-August) is the
period when ecological responses to pollution exposure are likely to be most severe. During this period, dissolved
oxygen concentrations are most likely to approach stressful low values. Moreover, the cycling and adverse effects of
contaminant exposure are generally greatest at the low dilution flows and high temperatures that occur in mid-summer.
Therefore, summer has been selected as the most conservative (i.e., most ecologically-stressful) index period for
EMAP-E.

Once unbiased quantitative information on the kinds, extent, condition and distribution of estuarine and
coastal resources and associated estimates of uncertainty are known, a baseline of the status of existing conditions will
be established. This baseline information will be used to develop criteria for identifying "representative" sampling sites
for future sampling (e.g., trends sites, detailed studies of processes associated with deterioration and recovery, the
magnitude of natural variation). This baseline will also be used to determine the "representativeness" of historical data
and sampling sites (e.g., NOAA Status and Trends sites). Over the long-term, EMAP-E seeks to develop a sampling
design that includes both "representative" and probability sampling, incorporating the advantages of both approaches.

The data quality attribute of "representativeness" applies not only to the overall sampling design, but also to
individual measurements and samples obtained as part of EMAP-E's monitoring efforts. Holding time requirements
for different types of samples ensure that analytical results are representative of conditions at the time of sampling;
these requirements are specified in the individual indicator sections of this document. In addition, use of QA/QC
samples which are similar in composition to samples being measured provides estimates of precision and bias that are
representative of sample measurements. Therefore, as a general program objective, the types of QA documentation
samples (i.e., performance evaluation material) used to assess the quality of analytical data will be as representative
as possible of the natural samples collected during the project with respect to both composition and concentration.


-------
Section 4

Page 9 of 12
Revision 2
May 1993

4.3 COMPLETENESS

Completeness is defined as "a measure of the amount of data collected from a measurement process compared
to the amount that was expected to be obtained under the conditions of measurement" (Stanley and Verner 1985).
EMAP-E has established a completeness goal of 100% for the various indicators being measured (Table 4-3). Given
the probability-based sampling design being employed by EMAP-E, failure to achieve this goal will not preclude the
within-year or between-year assessment of ecosystem condition. The major consequence of having less than 100%
complete data from all expected stations is a relatively minor loss of statistical power in the areal estimate of condition,
as depicted using Cumulative Distribution Functions (CDFs). The 100% completeness goal is established in an attempt
to derive the maximum statistical power from the present sampling design. Based on past years' experience, failure
to achieve this goal usually results from a field crew's inability to sample at some stations due to logistical barriers
such as insufficient depth, impenetrable substrate, or adverse weather conditions. In the limited number of instances
where these conditions may be encountered, extensive efforts will be made to re-locate the station or re-sample the
station at a later date, always in consultation with program managers at the Field Operations Center. In all cases, field
samples during shipment and laboratory processing must be followed to minimize data loss following successful sample
collection.
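
As a simple illustration of the completeness calculation implied by this definition (the following Python
sketch is illustrative only and is not a requirement of this plan; the station counts shown are hypothetical):

    # Hypothetical sketch: percent completeness per the Stanley and Verner (1985)
    # definition used above.
    def percent_completeness(n_obtained: int, n_expected: int) -> float:
        """Amount of valid data obtained vs. amount expected, as a percentage."""
        return 100.0 * n_obtained / n_expected

    # Example: 6 of 220 planned stations could not be sampled (e.g., due to
    # insufficient depth, impenetrable substrate, or adverse weather).
    print(f"{percent_completeness(214, 220):.1f}% complete")  # -> 97.3% complete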

4.4 COMPARABILITY

Comparability is defined as "the confidence with which one data set can be compared to another" (Stanley
and Verner 1985). Comparability of reporting units and calculations, data base management processes, and
interpretative procedures must be assured if the overall goals of EMAP are to be realized. One goal of the EMAP-
Estuaries program is to generate a high level of documentation for the above topics to ensure that future EMAP efforts
can be made comparable. For example, both field and laboratory methods are described in full detail in manuals which
will be made available to all field personnel and analytical laboratories. Field crews will undergo intensive training
in a single four week session prior to the start of field work. In addition, the comparability of laboratory measurements
is monitored through the interlaboratory comparison exercises and the use of field split or duplicate performance
evaluation samples. The results of this comparability monitoring will be presented and evaluated in a quality assurance


-------
Section 4
Page 10 of 12
Revision 2
May 1993

report prepared by the program's QA personnel following each year's sampling effort. Comparability will be assessed
through application of appropriate statistical tests (e.g., t-tests, ANOVA), and results will be considered comparable
if there are no significant differences. Failure to achieve this comparability goal will result in corrective actions which
may include, but are not limited to, changes in field and laboratory methodology and/or concomitant changes in the
program's QA/QC requirements.
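
The following Python sketch shows the kind of statistical comparison described above, applied to split-sample
results from two laboratories; the data values, sample sizes, and 0.05 significance level are assumptions of the
sketch, not program requirements.

    # Hypothetical sketch: testing two laboratories' split-sample results for a
    # significant difference using a t-test and one-way ANOVA (scipy required).
    from scipy import stats

    lab_a = [4.1, 3.8, 4.4, 4.0, 3.9]   # hypothetical split-sample results
    lab_b = [4.3, 3.7, 4.5, 4.2, 4.0]

    t_stat, p_ttest = stats.ttest_ind(lab_a, lab_b)
    f_stat, p_anova = stats.f_oneway(lab_a, lab_b)

    # Results are considered comparable if no significant difference is found.
    print(f"t-test p = {p_ttest:.3f}; ANOVA p = {p_anova:.3f}")
    print("comparable" if p_ttest > 0.05 else "corrective action indicated")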

4.5 ACCURACY (BIAS), PRECISION, AND TOTAL ERROR

The term "accuracy", which is used synonymously with the term bias in this plan, is defined as the difference
between a measured value and the true or expected value, and represents an estimate of systematic error or net bias
(Kirchner 1983; Hunt and Wilson 1986; Taylor 1987). Precision is defined as the degree of mutual agreement among
individual measurements, and represents an estimate of random error (Kirchner 1983; Hunt and Wilson 1986; Taylor
1987). Collectively, accuracy and precision can provide an estimate of the total error or uncertainty associated with
an individual measured value. Measurement quality objectives for the various indicators are expressed separately as
accuracy (i.e., bias) and precision requirements (Table 4-3). Accuracy and precision requirements may not be definable
for all parameters due to the nature of the measurement type. For example, accuracy measurements are not possible
for toxicity testing and fish pathology identifications because "true" or expected values do not exist for these
measurement parameters (see Table 4-3). In order to evaluate the MQOs for accuracy and precision, various QA/QC
samples will be collected and analyzed for most data collection activities. Table 4-4 presents the types of samples to
be used for quality assurance/quality control for each of the various data acquisition activities except sediment and fish
tissue contaminant analyses. The frequency of QA/QC measurements and the types of QA data resulting from these
samples or processes are also presented in Table 4-4. Because several different types of QA/QC samples are required
for the complex analyses of chemical contaminants in sediment and tissue samples, they are presented and discussed
separately in Section 5 along with presentation of warning and control limits for the various chemistry QC sample
types.
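
As an illustration of these definitions (the measurement values below are hypothetical), the following Python
sketch estimates bias and precision from repeated measurements of a sample with a known "true" value; together,
the two estimates characterize the total error of an individual measured value.

    # Hypothetical sketch: accuracy (bias) and precision from repeat measurements.
    import statistics

    true_value = 10.0
    measurements = [9.6, 9.8, 10.1, 9.7, 9.9]

    bias = statistics.mean(measurements) - true_value   # systematic error (net bias)
    precision = statistics.stdev(measurements)          # random error (1 standard deviation)

    print(f"bias = {bias:+.2f}, precision (s) = {precision:.2f}")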


-------
Section 4
Page 11 of 12
Revision 2
May 1993

TABLE 4-4. Quality assurance sample types, frequency of use, and types of data generated for EMAP-Estuaries
Virginian Province monitoring (see Table 5-4 for chemical analysis QA/QC sample types).

Variable                  QA Sample Type or       Frequency           Data Generated for
                          Measurement             of Use              Measurement Quality
                          Procedure                                   Definition

Sediment toxicity         Reference toxicant      Each experiment     Variance of replicated
tests                                                                 tests over time

Benthic Species
Composition and Biomass:

  Sorting                 Resort of sample        10% of each         No. animals found
                                                  tech's work         in resort

  Sample counting         Recount and ID of       10% of each         No. of count and ID
  and ID                  sorted animals          tech's work         errors

  Biomass                 Duplicate weights       10% of samples      Duplicate results

Sediment grain size       Splits of a sample      10% of each         Duplicate results
                                                  tech's work

Organic carbon and        Duplicates and          Each batch          Duplicate results
acid volatile sulfide     analysis of                                 and standard
                          standards                                   recoveries

Dissolved Oxygen          Comparison of           Each CTD cast       Difference between
conc. (CTD)               calibrated YSI                              CTD and YSI
                          and CTD values

Dissolved Oxygen          Comparison with         Once per shift      Difference between
conc. (YSI)               Winkler value                               YSI and Winkler value

(continued)


-------
Section 4
Page 12 of 12
Revision 2
May 1993

Table 4-4 (continued).

Variable                  QA Sample Type or       Frequency           Data Generated for
                          Measurement             of Use              Measurement Quality
                          Procedure                                   Definition

Salinity                  Refractometer           Each CTD cast       Difference between
                          reading                                     CTD probe and
                                                                      refractometer readings

Temperature               Thermometer             Each CTD cast       Difference between
                          reading                                     probe and thermometer

Depth                     Check bottom depth      Each CTD cast       Difference
                          against depth finder                        from actual

pH                        QC check with           Once each day       Difference from
                          standard                                    standard

Fish identification       Fish preserved for      Twice/crew for      Number of mis-
                          verification by         each species        identifications
                          taxonomist

Fish counts/length        Remeasured and          One audit for       Difference between
                          recounted during        each crew/season    original and recount/
                          field QA audits                             remeasurement

Fish gross                Specimens preserved     At least once       Number of mis-
pathology                 for confirmation        per crew shift      identifications

Fish                      Confirmation by         5% of slides        Number of
histopathology            second technician                           confirmations

-------
Section 5
Page 1 of 31
Revision 2
May 1993

SECTION 5

ANALYSIS OF CHEMICAL CONTAMINANTS IN SEDIMENT
AND FISH TISSUE SAMPLES

5.1 OVERVIEW

Quality assurance of chemical measurements has many diverse aspects. This section presents EMAP-Estuaries
QA/QC protocols and requirements covering a range of activities, from sample collection and laboratory analysis to
final validation of the resultant data. Much of the guidance provided in this section is based on protocols developed
for EPA's Puget Sound Estuary Program (U.S. EPA 1989), as well as those developed over many years on the National
Oceanic and Atmospheric Administration's (NOAA) National Status and Trends (NS&T) Program. This guidance
is applicable to low parts per billion analyses of both estuarine sediment and tissue samples unless otherwise noted.

The EMAP-E program measures a variety of organic and inorganic contaminants in estuarine sediment and
fish tissue samples (Tables 5-1 and 5-2); these compounds are the same as those measured in the NOAA NS&T
Program, with a few additions. These contaminants are being measured for the purpose of environmental monitoring,
with the understanding that the data will not be used for litigation purposes. Therefore, legal and contracting
requirements as stringent as those used in the U.S. EPA Contract Laboratory Program, for example, have not been
applied to EMAP-E. Rather, EMAP-E requires its laboratories to demonstrate comparability continuously through
strict adherence to common QA/QC procedures, routine analysis of Certified Reference Materials1, and regular
participation in an on-going series of interlaboratory comparison exercises (round-robins). This is a "performance-

1 Certified Reference Materials (CRMs) are samples in which chemical concentrations have been determined accurately
using a variety of technically valid procedures; these samples are accompanied by a certificate or other documentation
issued by a certifying body (e.g., agencies such as the National Research Council of Canada (NRCC), U.S. EPA, U.S.
Geological Survey, etc.). Standard Reference Materials (SRMs) are CRMs issued by the National Institute of Standards
and Technology (NIST), formerly the National Bureau of Standards (NBS). A useful catalogue of marine science
reference materials has been compiled by Cantillo (1992).


-------
Section 5
Page 2 of 31
Revision 2
May 1993

TABLE 5-1. Chemicals to be measured in sediments by EMAP-E Virginian Province.

Polyaromatic Hydrocarbons (PAHs)

DDT and its metabolites

Acenaphthene

Anthracene

Benz(a)anthracene

Benzo(a)pyrene

Benzo(e)pyrene

Biphenyl

Chrysene

Dibenz(a,h)anthracene
2,6-dimethylnaphthalene
Fluoranthene
Fluorene

2-methylnaphthalene

1-methylnaphthalene

1-methylphenanthrene

Naphthalene

Perylene

Phenanthrene

Pyrene

Benzo(b)fluoranthene

Acenaphthylene

Benzo(k)fluoranthene

Benzo(g,h,i)perylene

Indeno(1,2,3-c,d)pyrene

2,3,5-trimethylnaphthalene

18 PCB Congeners:

PCB No.	Compound name

8	2,4'-dichlorobiphenyl

18	2,2',5-trichlorobiphenyl

28	2,4,4'-trichlorobiphenyl

44	2,2',3,5'-tetrachlorobiphenyl

52	2,2',5,5'-tetrachlorobiphenyl

66	2,3',4,4'-tetrachlorobiphenyl

101	2,2',4,5,5'-pentachlorobiphenyl

105	2,3,3',4,4'-pentachlorobiphenyl

118	2,3',4,4',5-pentachlorobiphenyl

128	2,2',3,3',4,4'-hexachlorobiphenyl

138	2,2',3,4,4',5'-hexachlorobiphenyl

153	2,2',4,4',5,5'-hexachlorobiphenyl

170	2,2',3,3',4,4',5-heptachlorobiphenyl

180	2,2',3,4,4',5,5'-heptachlorobiphenyl

187	2,2',3,4',5,5',6-heptachlorobiphenyl

195	2,2',3,3',4,4',5,6-octachlorobiphenyl

206	2,2',3,3',4,4',5,5',6-nonachlorobiphenyl

209	2,2',3,3',4,4',5,5',6,6'-decachlorobiphenyl

Other measurements

Acid volatile sulfide

Total organic carbon

Tetra-, Tri-, Di-, and Monobutyltin

2,4'-DDD
4,4'-DDD
2,4'-DDE
4,4'-DDE
2,4'-DDT
4,4'-DDT

Chlorinated pesticides
other than DDT

Endrin
Aldrin

Alpha-Chlordane

Endosulfan

Trans-Nonachlor

Dieldrin

Heptachlor

Heptachlor epoxide

Hexachlorobenzene

Lindane (gamma-BHC)

Mirex

Major Elements

Aluminum
Iron

Manganese

Trace Elements

Antimony

Arsenic

Cadmium

Chromium

Copper

Lead

Mercury

Nickel

Selenium

Silver

Tin

Zinc


-------
Section 5
Page 3 of 31
Revision 2
May 1993

TABLE 5-2. Chemicals to be measured in fish tissue by EMAP-E Virginian Province.

DDT and its metabolites

2,4'-DDD
4,4'-DDD
2,4'-DDE
4,4'-DDE
2,4'-DDT
4,4'-DDT

Chlorinated pesticides
other than DDT

Aldrin

Alpha-Chlordane

Trans-Nonachlor

Dieldrin

Endosulfan

Endrin

Heptachlor

Heptachlor epoxide

Hexachlorobenzene

Lindane (gamma-BHC)

Mirex

Trace Elements

Aluminum

Arsenic

Cadmium

Chromium

Copper

Iron

Lead

Mercury

Nickel

Selenium

Silver

Tin

Zinc

Other chemicals

Tetra-, Tri-, Di- and Monobutyltin

18 PCB Congeners:

PCB No.	Compound name

8	2,4'-dichlorobiphenyl

18	2,2',5-trichlorobiphenyl

28	2,4,4'-trichlorobiphenyl

44	2,2',3,5'-tetrachlorobiphenyl

52	2,2',5,5'-tetrachlorobiphenyl

66	2,3',4,4'-tetrachlorobiphenyl

101	2,2',4,5,5'-pentachlorobiphenyl

105	2,3,3',4,4'-pentachlorobiphenyl

118	2,3',4,4',5-pentachlorobiphenyl

128	2,2',3,3',4,4'-hexachlorobiphenyl

138	2,2',3,4,4',5'-hexachlorobiphenyl

153	2,2',4,4',5,5'-hexachlorobiphenyl

170	2,2',3,3',4,4',5-heptachlorobiphenyl

180	2,2',3,4,4',5,5'-heptachlorobiphenyl

187	2,2',3,4',5,5',6-heptachlorobiphenyl

195	2,2',3,3',4,4',5,6-octachlorobiphenyl

206	2,2',3,3',4,4',5,5',6-nonachlorobiphenyl

209	2,2',3,3',4,4',5,5',6,6'-decachlorobiphenyl


-------
Section 5
Page 4 of 31
Revision 2
May 1993

based" approach for quality assurance of low-level contaminant analyses, involving continuous laboratory evaluation
through the use of accuracy-based materials (e.g., CRMs), laboratory fortified sample matrices, laboratory reagent
blanks, calibration standards, and laboratory and field replicates. The definition and use of each of these types of
quality control samples are explained in later sections.

No single analytical method has been approved officially for low-level (i.e., low parts per billion) analysis of
organic and inorganic contaminants in estuarine sediments and fish tissue. Recommended methods for the EMAP-E
program are those used in the NOAA NS&T Program (Lauenstein et al. 1993), as well as those documented in the
EMAP-E Laboratory Methods Manual (U.S. EPA 1992, in revision). Under the EMAP-E performance-based chemistry
QA program, laboratories are not required to use a single, standard analytical method for each type of analysis, but
rather are free to choose the best or most feasible method within the constraints of cost and equipment. Each laboratory
must, however, continuously demonstrate proficiency and data comparability through routine analysis of accuracy-
based performance evaluation samples and reference materials representing real-life matrices.

5.2 QUALITY CONTROL PROCEDURES: SAMPLE COLLECTION,
PRESERVATION AND HOLDING

Field personnel must strictly adhere to EMAP-E protocols to ensure the collection of representative,
uncontaminated sediment and fish tissue chemistry samples. These sample collection protocols are described in detail
in the Virginian Province Field Operations Manual (Reifsteck et al. 1993). Briefly, the key aspects of quality control
associated with chemistry sample collection are as follows:

1.) field personnel must be thoroughly trained in the proper use of sample collection gear and must be able to
distinguish acceptable versus unacceptable sediment grab samples or fish trawls in accordance with pre-established
criteria;

2.) field personnel must be thoroughly trained to recognize and avoid potential sources of sample contamination (e.g.,
engine exhaust, winch wires, deck surfaces, ice used for cooling);

3.) samplers and utensils which come in direct contact with the sample should be made of non-contaminating
materials (e.g., glass, high-quality stainless steel and/or Teflon®) and should be thoroughly cleaned between sampling
stations (e.g., Alconox® scrub followed by thorough rinse with ambient seawater or river water and final rinse with
deionized water);

4.) sample containers should be of the recommended type (Table 5-3) and must be free of contaminants (i.e., carefully
pre-cleaned); and

5.) conditions for sample collection, preservation and holding times should be followed (Table 5-3).


-------
Section 5
Page 5 of 31
Revision 2
May 1993

Table 5-3. Summary of chemistry sample collection, preservation, and holding time conditions to be followed
for EMAP-E Virginian Province monitoring.

Parameter        Container            Sample      Sample       Sample         Max. Sample    Max. Extract
                                      Volume      Size         Preservation   Holding Time   Holding Time

Sediment         250-ml HDPE          100 to      100 to       Cool, 4°C      6 months       --a
Metals           wide-mouth           150 ml      150 g
(except Hg)      bottle                           (approx.)

Sediment Hg      same as              same as     same as      same as        28 days        --a
and TOC          above                above       above        above

Sediment         500-ml pre-          250 to      250 to       Cool, 4°C      14 daysb       40 days
Organics         cleaned glass        300 ml      300 g
(including       wide-mouth                       (approx.)
butyltins)       jar

Sediment         125-ml poly-         125 mlc     125 g        Cool, 4°C      14 days        --
Acid Volatile    propylene
Sulfide (AVS)    wide-mouth
                 bottle

Fish Tissue      Whole fish are       NA          NA           Freeze         36 hours;      40 days
(Organics and    placed in water-                              (-18°C)        1 yeard
Inorganics)      tight plastic bags

a No EPA criteria exist. Every effort should be made to analyze the sample as soon as possible following extraction
or, in the case of metals, digestion.

b Every effort should be made to analyze these samples as soon as possible. If extractions are not to be performed
within 14 days, these samples should be frozen (-18°C) and extracted within 1 year.

c AVS containers should be filled to the top to minimize or eliminate headspace; containers should be capped tightly.
Every effort should be made to minimize contact of the sediment with air and to analyze these samples as soon
as possible.

d No EPA criteria exist for holding times of tissue samples. This is a maximum suggested holding time.


-------
Section 5

Page 6 of 31
Revision 2
May 1993

5.3 QUALITY CONTROL PROCEDURES: LABORATORY OPERATIONS

5.3.1 Overview

The QA/QC requirements presented in the following sections are intended to provide a common foundation
for each laboratory's protocols; the resultant QA/QC data will enable an assessment of the comparability of results
generated by different laboratories and different analytical procedures. It should be noted that the QA/QC requirements
specified in this plan represent the minimum requirements for any given analytical method. Additional requirements
which are method-specific should always be followed, as long as the minimum requirements presented in this document
have been met.

The performance-based EMAP-E QA program for analytical chemistry laboratories consists of two basic
elements: 1.) initial demonstration of laboratory capability (e.g., performance evaluation) and 2.) ongoing
demonstration of capability. Prior to the analysis of samples, each laboratory must demonstrate proficiency in several
ways: written protocols for the analytical methods to be employed for sample analysis must be submitted to the Program
for review, method detection limits for each analyte must be calculated, an initial calibration curve must be established
for all analytes, and acceptable performance must be shown on a known or blind accuracy-based material. Following
a successful first phase, the laboratory must demonstrate its continued capabilities in several ways: participation in an
on-going series of interlaboratory comparison exercises, repeated analysis of Certified Reference Materials, calibration
checks, and analysis of laboratory reagent blanks and fortified samples. These steps are detailed in the following
sections and summarized in Table 5-4. The sections are arranged to mirror the elements in Table 5-4 to provide easy
cross-reference for the reader.

The results for the various QA/QC samples should be reviewed by laboratory personnel immediately following
the analysis of each sample batch. These results should be used to determine whether warning and control limit
criteria have been met and whether corrective actions must be taken before processing a subsequent sample batch. When
warning limit criteria have not been met, the laboratory is not obligated to halt analyses, but the analyst(s) is advised
to investigate the cause of the exceedance. When control limit criteria are not met, specific corrective actions are
required before the analyses may proceed. Warning and control limit criteria and recommended frequency of analysis
for each QA/QC element or sample type required in the EMAP-E program also are summarized in Table 5-4.


-------
Section 5
Page 7 of 31
Revision 2
May 1993

TABLE 5-4. Key elements of laboratory quality control for EMAP-Estuaries chemical analyses (see text for
detailed explanations).

Element or                      Warning Limit             Control Limit             Frequency
Sample Type                     Criteria                  Criteria

1.) Initial Demonstration
of Capability (Prior to
Analysis of Samples):

- Instrument Calibration        NA                        NA                        Initial and then
                                                                                    prior to analyzing
                                                                                    each batch of samples

- Calculation of Method         Must be equal to or less than target                At least once
  Detection Limits              values (see Table 5-5)                              each year

- Blind Analysis of             NA                        NA                        Initial
  Accuracy-Based
  Material

2.) On-going Demonstration
of Capability:

- Blind Analysis of             NA                        NA                        Regular intervals
  Interlaboratory                                                                   throughout the
  Comparison Exercise                                                               year
  Samples

- Continuing Calibration        NA                        Should be within          At a minimum,
  Checks using Calibration                                ±15% of initial           middle and end
  Standard Solutions                                      calibration on            of each sample
                                                          average for all           batch
                                                          analytes, not to
                                                          exceed ±25% for
                                                          any one analyte

(continued)


-------
Section 5
Page 8 of 31
Revision 2
May 1993

TABLE 5-4 (continued).

Element or                      Warning Limit             Control Limit             Frequency
Sample Type                     Criteria                  Criteria

- Analysis of Certified
  Reference Material (CRM)
  or Laboratory Control
  Material (LCM):                                                                   One with each
                                                                                    batch of samples

  Precision (see NOTE 1):       NA                        Value obtained for        Value plotted on
                                                          each analyte should       control chart after
                                                          be within 3s control      each analysis of
                                                          chart limits              the CRM

  Relative Accuracy
  (see NOTE 2):

    PAHs                        Lab's value should        Lab's value should
                                be within ±25% of         be within ±30% of
                                true value on             true value on
                                average for all           average for all
                                analytes; not to          analytes; not to
                                exceed ±30% of            exceed ±35% of
                                true value for            true value for
                                more than 30% of          more than 30% of
                                individual analytes       individual analytes

    PCBs/pesticides             same as above             same as above

    Inorganic elements          Lab should be within      Lab should be within
                                ±15% of true value        ±20% of true value
                                for each analyte          for each analyte

NOTE 1: The use of control charts to monitor precision for each analyte of interest should follow generally accepted
practices (e.g., Taylor 1987 and section 3.2.5 of this document). Upper and lower control limits, based on 99%
confidence intervals around the mean, should be updated at regular intervals.

NOTE 2: "True" values in CRMs may be either "certified" or "non-certified" (it is recognized that absolute accuracy
can only be assessed using certified values, hence the term relative accuracy). Relative accuracy is computed by
comparing the laboratory's value for each analyte against either end of the range of values (i.e., 95% confidence limits)
reported by the certifying agency. The laboratory's value must be within ±35% of either the upper or lower 95%
confidence interval value. Accuracy control limit criteria only apply for analytes having CRM concentrations > 10
times the laboratory's MDL.

(continued)


-------
Section 5

Page 9 of 31
Revision 2
May 1993

TABLE 5-4 (continued).

Element or                      Warning Limit             Control Limit             Frequency
Sample Type                     Criteria                  Criteria

- Laboratory Reagent            Analysts should use       No analyte should         One with each
  Blank                         best professional         be detected at >3         batch of samples
                                judgement if analytes     times the MDL
                                are detected at <3
                                times the MDL

- Laboratory Fortified          NA                        Recovery should be        At least
  Sample Matrix                                           within the range          5% of total
  (Matrix Spike)                                          50% to 120% for at        number of
                                                          least 80% of the          samples
                                                          analytes

NOTE: Samples to be spiked should be chosen at random; matrix spike solutions should contain all the analytes of
interest. The final spiked concentration of each analyte in the sample should be at least 10 times the calculated MDL.

- Laboratory Fortified          NA                        RPD1 must be              Same as
  Sample Matrix Duplicate                                 < 30 for each             matrix spike
  (Matrix Spike Duplicate)                                analyte

- Field Duplicates              NA                        NA                        5% of total
  (Field Splits)                                                                    number of
                                                                                    samples

- Internal Standards            Lab develops              Recovery must be          Each sample
  (Surrogates)                  its own                   within the range
                                                          30% to 150%

- Injection Internal            Lab develops              NA                        Each sample
  Standards                     its own

1 RPD = Relative percent difference between matrix spike and matrix spike duplicate results (see appropriate
section for equation).


-------
Section 5
Page 10 of 31
Revision 2
May 1993

5.3.2 Initial Demonstration of Capability

Instrument Calibration

Equipment should be calibrated prior to the analysis of each sample batch, after each major equipment
disruption, and whenever on-going calibration checks do not meet recommended control limit criteria (Table 5-4).
All calibration standards should be traceable to a recognized organization for the preparation and certification of
QA/QC materials (e.g., National Institute of Standards and Technology, U.S. Environmental Protection Agency, etc.).
Calibration curves must be established for each element and batch analysis from a calibration blank and a minimum
of three analytical standards of increasing concentration, covering the range of expected sample concentrations. The
calibration curve should be well-characterized and must be established prior to the analysis of samples. Only data
resulting from quantification within the demonstrated working calibration range may be reported by the laboratory
(i.e., quantification based on extrapolation is not acceptable). Samples outside the calibration range should be diluted
or concentrated, as appropriate, and reanalyzed.
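
The following Python sketch illustrates this procedure (the concentrations and instrument responses shown are
hypothetical): a curve is fit to a calibration blank plus three standards of increasing concentration, and
quantification is refused outside the demonstrated working range.

    # Hypothetical sketch: linear calibration from a blank plus three standards,
    # with no quantification by extrapolation (numpy required).
    import numpy as np

    conc = np.array([0.0, 5.0, 25.0, 100.0])        # blank + 3 standards
    response = np.array([0.02, 1.10, 5.30, 21.0])   # instrument responses

    slope, intercept = np.polyfit(conc, response, 1)

    def quantify(sample_response: float) -> float:
        value = (sample_response - intercept) / slope
        if not (conc.min() <= value <= conc.max()):
            raise ValueError("outside calibration range: dilute or concentrate "
                             "the sample and reanalyze")
        return value

    print(f"{quantify(4.8):.1f}")   # concentration, in the units of the standards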

Initial Documentation of Method Detection Limits

Analytical chemists have coined a variety of terms to define "limits" of detectability; definitions for some of
the more commonly-used terms are provided in Keith et al. (1983) and in Keith (1991). In the EMAP-E program, the
Method Detection Limit (MDL) will be used to define the analytical limit of detectability. The MDL represents a
quantitative estimate of low-level response detected at the maximum sensitivity of a method. The Code of Federal
Regulations (40 CFR Part 136) gives the following rigorous definition: "the MDL is the minimum concentration of
a substance that can be measured and reported with 99% confidence that the analyte concentration is greater than zero
and is determined from analysis of a sample in a given matrix containing the analyte." Confidence in the apparent
analyte concentration increases as the analyte signal increases above the MDL.

Each EMAP-E analytical laboratory must calculate and report an MDL for each analyte of interest in each
matrix of interest (sediment or tissue) prior to the analysis of field samples for a given year. Each laboratory is required
to follow the procedure specified in 40 CFR Part 136 (Federal Register, Oct. 28, 1984) to calculate MDLs for each


-------
Section 5
Page 11 of 31
Revision 2
May 1993

analytical method employed. The matrix and the amount of sample (i.e., dry weight of sediment or tissue) used in
calculating the MDL should match as closely as possible the matrix of the actual field samples and the amount of
sample typically used. In order to ensure comparability of results among different laboratories, MDL target values have
been established for the EMAP-E program (Table 5-5). The initial MDLs reported by each laboratory should be equal
to or less than these specified target values before the analysis of field samples may proceed. Each laboratory must
periodically (i.e., at least once each year) re-evaluate its MDLs for the analytical methods used and the sample matrices
typically encountered.
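
For illustration, the following Python sketch applies the 40 CFR Part 136 calculation, MDL = t(n-1, 0.99) x s,
where s is the standard deviation of n (at least seven) replicate analyses of a low-level spiked matrix; the
replicate values shown are hypothetical.

    # Hypothetical sketch: MDL per 40 CFR Part 136 (scipy required).
    import statistics
    from scipy import stats

    replicates = [1.02, 0.94, 1.10, 0.98, 1.05, 0.91, 1.07]  # e.g., ng/g dry weight

    n = len(replicates)
    s = statistics.stdev(replicates)
    t_99 = stats.t.ppf(0.99, df=n - 1)   # one-sided Student's t at 99% confidence

    mdl = t_99 * s
    print(f"MDL = {mdl:.3f} (n = {n}, s = {s:.3f}, t = {t_99:.3f})")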

TABLE 5-5. Target method detection limits for EMAP-Estuaries analytes.

INORGANICS (NOTE: concentrations in µg/g (ppm), dry weight)

                          Tissue            Sediments

Aluminum                  10.0              1500
Antimony                  not measured      0.2
Arsenic                   2.0               1.5
Cadmium                   0.2               0.05
Chromium                  0.1               5.0
Copper                    5.0               5.0
Iron                      50.0              500
Lead                      0.1               1.0
Manganese                 not measured      1.0
Mercury                   0.01              0.01
Nickel                    0.5               1.0
Selenium                  1.0               0.1
Silver                    0.01              0.01
Tin                       0.05              0.1
Zinc                      50.0              2.0

ORGANICS (NOTE: concentrations in ng/g (ppb), dry weight)

                          Tissue            Sediments

PAHs                      not measured      10
PCB congeners             2.0               1.0
Chlorinated pesticides    2.0               1.0


-------
Section 5
Page 12 of 31
Revision 2
May 1993

Initial Blind Analysis of a Representative Sample

A representative sample matrix which is uncompromised, homogeneous and contains the analytes of interest
at concentrations of interest will be provided to each analytical laboratory new to the EMAP-E program; this sample
will be used to evaluate laboratory performance prior to the analysis of field samples. The sample used for this initial
demonstration of laboratory capability typically will be distributed blind (i.e., the laboratory will not know the
concentrations of the analytes of interest) as part of the interlaboratory comparison exercises. Based on results that
have typically been attained by experienced EMAP-Estuaries laboratories, a new laboratory's performance generally
will be considered acceptable if its submitted values are within ±30% (for organic analyses) and ±20% (for inorganic
analyses) of the known concentration of each analyte of interest in the sample. These criteria apply only for analyte
concentrations equal to or greater than 10 times the MDL established by the laboratory. If the results for the initial
analysis fail to meet these criteria, the laboratory will be required to repeat the analysis until the performance criteria
are met, prior to the analysis of real samples.

5.3.3 On-going Demonstration of Capability

Participation in Interlaboratory Comparison Exercises

Through an interagency agreement, NOAA's NS&T Program and EPA's EMAP-E program jointly sponsor
an on-going series of interlaboratory comparison exercises (round-robins). All EMAP-E analytical laboratories are
required to participate in these exercises, which are conducted jointly by the National Institute of Standards and
Technology (NIST) and the National Research Council of Canada (NRCC). These exercises provide a tool for
continuous improvement of laboratory measurements by helping analysts identify and resolve problems in methodology
and/or QA/QC. The results of these exercises also are used to evaluate both the individual and collective performance
of the participating analytical laboratories on a continuous basis. The EMAP-E laboratories are required to initiate
corrective actions if their performance in these comparison exercises falls below certain pre-determined minimal
standards, described in later sections.

Typically, three or four different exercises are conducted over the course of a year. In a typical exercise, either
NIST or NRCC will distribute performance evaluation samples in common to each laboratory, along with detailed


-------
Section 5
Page 13 of 31
Revision 2
May 1993

instructions for analysis. A variety of performance evaluation samples have been utilized in the past, including
accuracy-based solutions, sample extracts, and representative matrices (e.g., sediment or tissue samples). Laboratories
are required to analyze the sample(s) "blind" and must submit their results in a timely manner both to the EMAP-E
QA Coordinator, as well as to either NIST or NRCC (as instructed). Laboratories which fail to maintain acceptable
performance may be required to provide an explanation and/or undertake appropriate corrective actions. At the end
of each calendar year, coordinating personnel at NIST and NRCC hold a QA workshop to present and discuss the
comparison exercise results. Representatives from each laboratory are expected to participate in the annual QA
workshops, which provide a forum for discussion of analytical problems brought to light in the comparison exercises.

Routine Analysis of Certified Reference Materials or Laboratory Control Materials

Certified Reference Materials (CRMs) generally are considered the most useful QC samples for assessing the
accuracy of a given analysis (i.e., the closeness of a measurement to the "true" value). Certified Reference Materials
can be used to assess accuracy because they have "certified" concentrations of the analytes of interest, as determined
through replicate analyses by a reputable certifying agency using two independent measurement techniques for
verification. In addition, the certifying agency may provide "non-certified" or "informational" values for other analytes
of interest. Such values are determined using a single measurement technique, which may introduce unrecognized
bias. Therefore, non-certified values must be used with caution in evaluating the performance of a laboratory using
a method which differs from the one used by the certifying agency. A list of reference materials commonly used by
EMAP-E laboratories is presented in Table 5-6.

A Laboratory Control Material (LCM) is similar to a Certified Reference Material in that it is a homogeneous
matrix which closely matches the samples being analyzed. A "true" LCM is one which is prepared (i.e., collected,
homogenized and stored in a stable condition) strictly for use in-house by a single laboratory. Alternately, the material
may be prepared by a central laboratory and distributed to others (so-called regional or program control materials).
Unlike CRMs, concentrations of the analytes of interest in LCMs are not certified but are based upon a statistically
valid number of replicate analyses by one or several laboratories. In practice, this material can be used to assess the
precision (i.e., consistency) of a single laboratory, as well as to determine the degree of comparability among different


-------
Section 5
Page 14 of 31
Revision 2
May 1993

Table 5-6. Certified Reference Materials commonly used by EMAP-E laboratories. SRMs are available from
NIST (phone 301-975-6776); all other reference materials listed are available from NRC (phone 613-
993-2359).

Calibration Solutions:

SRM 1491	Aromatic Hydrocarbons in Hexane/Toluene

SRM 1492	Chlorinated Pesticides in Hexane

SRM 1493	Chlorinated Biphenyl Congeners in 2,2,4-Trimethylpentane

SRM 2260	Aromatic Hydrocarbons in Toluene

SRM 2261	Chlorinated Pesticides in Hexane

SRM 2262	Chlorinated Biphenyl Congeners in 2,2,4-Trimethylpentane
Environmental Matrices (Organics):

SRM 1941a    Organics in Marine Sediment
SRM 1974     Organics in Mussel Tissue (Mytilus edulis)

Environmental Matrices (Inorganics):

SRM 1646     Estuarine Sediment           BCSS-1       Marine Sediment
MESS-1       Estuarine Sediment           PACS-1       Harbor Sediment
BEST-1       Marine Sediment              DORM-1       Dogfish Muscle
DOLT-1       Dogfish Liver                SRM 1566a    Oyster Tissue

laboratories. If available, LCMs may be preferred for routine (i.e., day-to-day) analysis because CRMs are relatively
expensive. However, CRMs still must be analyzed at regular intervals (e.g., monthly or quarterly) to provide a check
on accuracy.

Routine analysis of Certified Reference Materials or, when available, Laboratory Control Materials represents
a particularly vital aspect of the "performance-based" EMAP-E QA philosophy. At least one CRM or LCM must be
analyzed along with each batch of 25 or fewer samples (Table 5-4). For CRMs, both the certified and non-certified
concentrations of the target analytes should be known to the analyst(s) and should be used to provide an immediate
check on performance before proceeding with a subsequent sample batch. Performance criteria for both precision and
accuracy have been established for analysis of CRMs or LCMs (Table 5-4); these criteria are discussed in detail in the
following paragraphs. If the laboratory fails to meet either the precision or accuracy control limit criteria for a given


-------
Section 5
Page 15 of 31
Revision 2
May 1993

analysis of the CRM or LCM, the data for the entire batch of samples are suspect. Calculations and instruments should
be checked; the CRM or LCM may have to be reanalyzed (i.e., reinjected) to confirm the results. If the values are still
outside the control limits in the repeat analysis, the laboratory is required to find and eliminate the source(s) of the
problem and repeat the analysis of that batch of samples until control limits are met, before continuing with further
sample processing. The results of the CRM or LCM analysis should never be used by the laboratory to "correct" the
data for a given sample batch.

Precision criteria: Each laboratory is expected to maintain control charts for use by analysts in monitoring the
overall precision of the CRM or LCM analyses. Upper and lower control chart limits (e.g., warning limits and control
limits) should be updated at regular intervals; control limits based on 99% confidence intervals around the
mean are recommended. Following the analysis of all samples in a given year, an RSD (relative standard deviation,
a.k.a. coefficient of variation) will be calculated for each analyte of interest in the CRM. Based on typical results
obtained by experienced analysts, an overall RSD of less than 30% will be considered acceptable precision for each
analyte having a CRM concentration > 10 times the laboratory's MDL. Failure to meet this goal will result in a
thorough review of the laboratory's control charting procedures and analytical methodology to determine if
improvements in precision are possible.
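
The following Python sketch illustrates both calculations (the CRM results shown are hypothetical, and the use of
the normal-distribution multiplier for the 99% confidence limits is an assumption of the sketch).

    # Hypothetical sketch: 99% control-chart limits and end-of-year RSD for
    # repeated CRM analyses of a single analyte.
    import statistics

    crm_results = [4.05, 3.90, 4.20, 3.98, 4.11, 3.85, 4.02]

    mean = statistics.mean(crm_results)
    s = statistics.stdev(crm_results)

    ucl, lcl = mean + 2.576 * s, mean - 2.576 * s   # 99% confidence about the mean
    rsd = 100.0 * s / mean                          # a.k.a. coefficient of variation

    print(f"control limits: {lcl:.2f} to {ucl:.2f}")
    print(f"RSD = {rsd:.1f}% -> " + ("acceptable" if rsd < 30 else "review methodology"))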

Accuracy criteria: The "absolute" accuracy of an analytical method can be assessed using CRMs only when
certified values are provided for the analytes of interest. However, the concentrations of many analytes of interest to
EMAP-E are provided only as non-certified values in some of the more commonly-used CRMs. Therefore, control
limit criteria are based on "relative accuracy", which is evaluated for each analysis of the CRM or LCM by comparison
of a given laboratory's values relative to the "true" or "accepted" values in the LCM or CRM. In the case of CRMs,
this includes both certified and noncertified values and encompasses the 95% confidence interval for each value as
described in Table 5-4.

Based on typical results attained by experienced analysts in the past, accuracy control limit criteria have been
established both for individual compounds and combined groups of compounds (Table 5-4). There are two combined
groups of compounds for the purpose of evaluating relative accuracy for organic analyses: PAHs and PCBs/pesticides.
The laboratory's value should be within ±30% of the true value on average for each combined group of organic
compounds, and the laboratory's value should be within ±35% of either the upper or lower 95% confidence limit for


-------
Section 5
Page 16 of 31
Revision 2
May 1993

at least 70% of the individual compounds in each group. For inorganic analyses, the laboratory's value should be
within ±20% of either the upper or lower 95% confidence limit for each analyte of interest in the CRM. Due to the
inherent variability in analyses near the method detection limit, control limit criteria for relative accuracy only apply
to analytes having CRM true values which are > 10 times the MDL established by the laboratory.
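
The following Python sketch illustrates this evaluation for a combined group of organic compounds (all values
are hypothetical, and the interpretation of the on-average criterion as the mean of the individual deviations is an
assumption of the sketch).

    # Hypothetical sketch: relative accuracy vs. the nearer end of each analyte's
    # certified 95% confidence range (Table 5-4 control limits for organics).
    def relative_error(lab_value: float, ci_low: float, ci_high: float) -> float:
        """Percent deviation from the nearest end of the certified 95% CI."""
        if ci_low <= lab_value <= ci_high:
            return 0.0
        nearest = ci_low if lab_value < ci_low else ci_high
        return 100.0 * abs(lab_value - nearest) / nearest

    # (laboratory value, certified CI low, certified CI high) for each PAH
    results = [(105.0, 90.0, 110.0), (60.0, 75.0, 95.0), (210.0, 180.0, 230.0)]

    errors = [relative_error(*r) for r in results]
    frac_within = sum(e <= 35.0 for e in errors) / len(errors)
    mean_error = sum(errors) / len(errors)

    print("pass" if frac_within >= 0.70 and mean_error <= 30.0 else "fail")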

Continuing Calibration Checks

The initial instrument calibration performed prior to the analysis of each batch of samples is checked through
the analysis of calibration check samples (i.e., calibration standard solutions) inserted as part of the sample stream.
Calibration standard solutions used for the continuing calibration checks should contain all the analytes of interest.
At a minimum, analysis of the calibration check solution should occur somewhere in the middle and at the end of each
sample batch. Analysts should use best professional judgement to determine if more frequent calibration checks are
necessary or desirable.

If the control limit for analysis of the calibration check standard is not met (Table 5-4), the initial calibration
will have to be repeated. If possible, the samples analyzed before the calibration check sample that failed the control
limit criteria should be reanalyzed following the recalibration. The laboratory should begin by reanalyzing the last
sample analyzed before the calibration standard which failed. If the relative percent difference (RPD) between the
results of this reanalysis and the original analysis exceeds 30 percent, the instrument is assumed to have been out of
control during the original analysis. If possible, reanalysis of samples should progress in reverse order until it is
determined that there is an RPD of less than 30 between the initial and reanalysis results. Only the re-analysis results should
be reported by the laboratory. If it is not possible or feasible to perform reanalysis of samples, all earlier data (i.e., since
the last successful calibration control check) is suspect. In this case, the laboratory should prepare a narrative
explanation to accompany the submitted data.
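
The following Python sketch illustrates the drift check applied to a calibration check standard (the analytes and
response factors shown are hypothetical).

    # Hypothetical sketch: continuing calibration check against the initial
    # calibration (within ±15% on average; ±25% for any single analyte).
    initial_rf = {"naphthalene": 1.00, "pyrene": 0.85, "chrysene": 0.92}
    check_rf = {"naphthalene": 1.08, "pyrene": 0.80, "chrysene": 0.98}

    drift = {a: 100.0 * abs(check_rf[a] - initial_rf[a]) / initial_rf[a]
             for a in initial_rf}

    mean_drift = sum(drift.values()) / len(drift)
    in_control = mean_drift <= 15.0 and max(drift.values()) <= 25.0

    print(f"mean drift = {mean_drift:.1f}%; worst = {max(drift.values()):.1f}%")
    print("in control" if in_control else "recalibrate and reanalyze")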

Laboratory Reagent Blank

Laboratory reagent blanks (also called method blanks or procedural blanks) are used to assess laboratory
contamination during all stages of sample preparation and analysis. For both organic and inorganic analyses, one
laboratory reagent blank should be run in every sample batch. The reagent blank should be processed through the


-------
Section 5
Page 17 of 31
Revision 2
May 1993

entire analytical procedure in a manner identical to the samples. Warning and control limits for blanks (Table 5-4)
are based on the laboratory's method detection limits as documented prior to the analysis of samples. A reagent blank
concentration between the MDL and 3 times the MDL for one or more of the analytes of interest should serve as a
warning limit requiring further investigation based on the best professional judgement of the analyst(s). A reagent
blank concentration equal to or greater than 3 times the MDL for one or more of the analytes of interest requires
definitive corrective action to identify and eliminate the source(s) of contamination before proceeding with sample
analysis.
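
The following Python sketch illustrates this screening logic, keyed to the laboratory's documented MDLs (the
analytes and concentrations shown are hypothetical).

    # Hypothetical sketch: reagent blank screening against the MDL-based
    # warning (between 1x and 3x MDL) and control (>= 3x MDL) limits.
    mdls = {"PCB 153": 1.0, "4,4'-DDE": 1.0}    # documented MDLs, ng/g dry weight
    blank = {"PCB 153": 0.4, "4,4'-DDE": 3.6}   # reagent blank results

    for analyte, conc in blank.items():
        if conc >= 3 * mdls[analyte]:
            print(f"{analyte}: control limit exceeded; find and eliminate the source")
        elif conc > mdls[analyte]:
            print(f"{analyte}: warning limit; investigate (professional judgement)")
        else:
            print(f"{analyte}: acceptable")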

Internal Standards

Internal standards (commonly referred to as "surrogates", "surrogate spikes" or "surrogate compounds") are
compounds chosen to simulate the analytes of interest in organic analyses. The internal standard represents a reference
analyte against which the signal from the analytes of interest is compared directly for the purpose of quantification.
Internal standards must be added to each sample, including QA/QC samples, prior to extraction. The reported
concentration of each analyte should be adjusted to correct for the recovery of the internal standard, as is done
in the NOAA National Status and Trends Program. The internal standard recovery data therefore should be carefully
monitored; each laboratory must report the percent recovery of the internal standard(s) along with the target analyte
data for each sample. If possible, isotopically-labeled analogs of the analytes should be used as internal standards.

Control limit criteria for internal standard recoveries are provided in Table 5-4. Each laboratory should set
its own warning limit criteria based on the experience and best professional judgement of the analyst(s). It is the
responsibility of the analyst(s) to demonstrate that the analytical process is always "in control" (i.e., highly variable
internal standard recoveries are not acceptable for repeat analyses of the same certified reference material and for the
matrix spike/matrix spike duplicate).
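
The following Python sketch illustrates the recovery calculation and the corresponding correction of a reported
analyte concentration (the amounts shown are hypothetical).

    # Hypothetical sketch: internal standard (surrogate) recovery and
    # recovery-corrected analyte concentration, as in the NOAA NS&T approach.
    def surrogate_recovery(measured_amount: float, spiked_amount: float) -> float:
        return 100.0 * measured_amount / spiked_amount

    def recovery_corrected(raw_conc: float, pct_recovery: float) -> float:
        return raw_conc / (pct_recovery / 100.0)

    recovery = surrogate_recovery(measured_amount=82.0, spiked_amount=100.0)  # 82%
    print(f"recovery = {recovery:.0f}%; "
          f"corrected concentration = {recovery_corrected(12.3, recovery):.1f} ng/g")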

Injection Internal Standards

For gas chromatography (GC) analysis, injection internal standards (also referred to as "internal standards"
by some analysts) are added to each sample extract just prior to injection to enable optimal quantification, particularly
of complex extracts subject to retention time shifts relative to the analysis of standards. Injection internal standards


-------
Section 5
Page 18 of 31
Revision 2
May 1993

are essential if the actual recovery of the internal standards added prior to extraction is to be calculated. The injection
internal standards also can be used to detect and correct for problems in the GC injection port or other parts of the
instrument. The compounds used as injection internal standards must be different from those already used as internal
standards. The analyst(s) should monitor injection internal standard retention times and recoveries to determine if
instrument maintenance or repair, or changes in analytical procedures, are indicated. Corrective action should be
initiated based on the experience of the analyst(s) and not because warning or control limits are exceeded. Instrument
problems that may have affected the data or resulted in the reanalysis of the sample should be documented properly
in logbooks and internal data reports and used by the laboratory personnel to take appropriate corrective action.

Matrix Spike and Matrix Spike Duplicate

A laboratory fortified sample matrix (commonly called a matrix spike, or MS) and a laboratory fortified
sample matrix duplicate (commonly called a matrix spike duplicate, or MSD) will be used both to evaluate the effect
of the sample matrix on the recovery of the compound(s) of interest and to provide an estimate of analytical precision.
A minimum of 5% of the total number of samples submitted to the laboratory in a given year should be selected at
random for analysis as matrix spikes/matrix spike duplicates. Each MS/MSD sample is first homogenized and then
split into three subsamples. Two of these subsamples are fortified with the matrix spike solution and the third
subsample is analyzed as is to provide a background concentration for each analyte of interest. The matrix spike
solution should contain all the analytes of interest. The final spiked concentration of each analyte in the sample should
be at least 10 times the MDL for that analyte, as previously calculated by the laboratory.

Recovery data for the fortified compounds ultimately will provide a basis for determining the prevalence of
matrix effects in the sediment samples analyzed during the project. If the percent recovery for any analyte in the MS
or MSD is less than the recommended warning limit of 50 percent, the chromatograms and raw data quantitation
reports should be reviewed. If an explanation for a low percent recovery value is not discovered, the instrument
response may be checked using a calibration standard. Low matrix spike recoveries may be a result of matrix
interferences and further instrument response checks may not be warranted, especially if the low recovery occurs in
both the MS and MSD and the other QC samples in the batch indicate that the analysis was "in control". An


-------
Section 5
Page 19 of 31
Revision 2
May 1993

explanation for low percent recovery values for MS/MSD results should be discussed in a cover letter accompanying
the data package. Corrective actions taken and verification of acceptable instrument response must be included.

Analysis of the MS/MSD also is useful for assessing laboratory precision. The relative percent difference
(RPD) between the MS and MSD results should be less than 30 for each analyte of interest (see Table 5-4). The RPD
is calculated as follows:

RPD = [(C1 - C2) x 100] / [(C1 + C2)/2]

where:  C1 is the larger of the duplicate results for a given analyte
        C2 is the smaller of the duplicate results for a given analyte

If results for any analytes do not meet the RPD < 30% control limit criteria, calculations and instruments
should be checked. A repeat analysis may be required to confirm the results. Results which repeatedly fail to meet
the control limit criteria indicate poor laboratory precision. In this case, the laboratory is obligated to halt the analysis
of samples and eliminate the source of the imprecision before proceeding.
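
The following Python sketch applies the RPD equation and control limit given above to a hypothetical matrix
spike/matrix spike duplicate pair.

    # Hypothetical sketch: RPD for an MS/MSD pair, checked against the < 30 limit.
    def rpd(c1: float, c2: float) -> float:
        """Relative percent difference between duplicate results."""
        hi, lo = max(c1, c2), min(c1, c2)
        return (hi - lo) * 100.0 / ((hi + lo) / 2.0)

    ms, msd = 48.2, 41.5   # matrix spike and duplicate results for one analyte
    value = rpd(ms, msd)
    print(f"RPD = {value:.1f} -> " + ("in control" if value < 30 else "check and repeat"))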

Field Duplicates and Field Splits

For the EMAP-E program, sediment will be collected at each station using a grab sampler. Each time the
sampler is retrieved, the top 2 cm of sediment (approximately) will be scraped off, placed in a large mixing container
and homogenized, until a sufficient amount of material has been obtained. At six pre-selected stations (one for each
field crew), the homogenized material will be placed in four separate sample containers for subsequent chemical
analysis. Two of the sample containers will be submitted as blind field duplicates to the primary analytical laboratory.
The other two containers, also called field duplicates, will be sent blind to a second laboratory. Together, the two pairs
of duplicates are called field splits. The analysis of the field duplicates will provide an assessment of single-laboratory
(intra-laboratory) precision, while the analysis of the field splits will provide an assessment of inter-laboratory
precision.


-------
Section 5
Page 20 of 31
Revision 2
May 1993

5.4 OTHER SEDIMENT MEASUREMENTS

The preceding sections presented QA/QC requirements covering laboratory analysis of sediment and fish
tissue samples for organics (i.e., PAHs, PCBs and chlorinated pesticides) and inorganics (i.e., metals). In addition to
these "conventional" contaminants, EMAP-E laboratories are required to measure several ancillary sediment
parameters, such as total organic carbon (TOC), acid volatile sulfide (AVS), and tri-, di- and monobutyltin (TBT, DBT,
MBT) concentrations. The laboratory QA/QC requirements associated with these "other sediment measurements" are
presented in the following sections.

5.4.1 Total Organic Carbon

As a check on precision, each laboratory should analyze at least one total organic carbon (TOC) sample in
duplicate for each batch of 25 or fewer samples. Based on typical results attained by experienced analysts, the relative
percent difference (RPD) between the two duplicate measurements should be less than 20%. If this control limit is
exceeded, analysis of subsequent sample batches should stop until the source of the discrepancy is determined and the
system corrected.

At least one certified reference material (CRM) or, if available, one laboratory control material (LCM) should
be analyzed along with each batch of 25 or fewer TOC samples. Any one of several marine sediment CRMs distributed
by the National Research Council of Canada's Marine Analytical Chemistry Standards Program (e.g., the CRMs named
"BCSS-1", "MESS-1" and "PACS-1", see Table 5-6) have certified concentrations of total carbon and are recommended
for this use. Prior to analysis of actual samples, it is recommended that each laboratory perform several total organic
carbon analyses using a laboratory control material or one of the aforementioned CRMs to establish a control chart (the
values obtained by the laboratory for total organic carbon should be slightly less than the certified value for total carbon
in the CRM). The control chart then should be used to assess the laboratory's precision for subsequent analyses of the
LCM or CRM with each sample batch. In addition, a method blank should be analyzed with each sample batch. Total
organic carbon concentrations should be reported as µg/g (ppm) dry weight of the unacidified sediment sample. Data
reported for each sample batch should include QA/QC sample results (duplicates, CRMs or LCMs, and method blanks).
Any factors that may have influenced data quality should be discussed in a cover letter accompanying the submitted
data, both on paper and in electronic file format (i.e., text file).


-------
Section 5
Page 21 of 31
Revision 2
May 1993

5.4.2 Acid Volatile Sulfide

Quality control of acid volatile sulfide (AVS) measurements is achieved through the routine analysis of a
variety of QA/QC samples. These are outlined in the following section and described in full detail in the EMAP-E
Laboratory Methods Manual (U.S. EPA, in preparation). Prior to the analysis of samples, the laboratory must establish
a calibration curve and determine a limit of reliable detection for sulfide for the analytical method being employed.
Following this, laboratory performance will be assessed through routine analysis of laboratory duplicates, calibration
check standards, laboratory fortified blanks (i.e., spiked blanks), and laboratory fortified sample matrices (i.e., matrix
spikes).

One sample in every batch of 25 or fewer samples should be analyzed in duplicate as a check on laboratory
precision. Based on typical results attained by experienced analysts, the relative percent difference (RPD) between the
two analyses should be less than 20%. If the RPD exceeds 20%, a third analysis should be performed. If the relative
standard deviation of the three determined concentrations exceeds 20%, the individual analyses should be examined
to determine if non-random errors may have occurred. As previously discussed, field duplicates and splits also will
be collected for AVS determination to assess both inter- and intra-laboratory precision.

Due to the instability of acid volatile sulfides to drying and handling in air, CRMs have not been developed
for assessing overall measurement accuracy. Therefore, each laboratory must analyze at least one calibration check
standard, one laboratory fortified blank and one laboratory fortified sample matrix in each batch of 25 or fewer samples
as a way of determining the accuracy of each step entailed in performing the analysis. The concentration of sulfide
in each of these three types of accuracy check samples will be known to the analyst; the calculated concentration of
sulfide in each sample should be within ± 15% of the known concentration.

If the laboratory is not within ± 15% of the known concentration for the calibration check solution,
instruments used for AVS measurement must be recalibrated and/or the stock solutions redetermined by titration. If
the laboratory fails to achieve the same accuracy (within ± 15% of the true value) for AVS in the laboratory fortified
blank, sources of error (e.g., leaks, excessive gas flows, poor sample-acid slurry agitation) should be determined for
the analytical system prior to continuing. If AVS recovery falls outside the 85% to 115% range for the matrix spike,
the system should be evaluated for sources of error and the analysis should be repeated. If recovery remains


-------
Section 5
Page 22 of 31
Revision 2
May 1993

unacceptable, it is possible that matrix interferences are occurring. If possible, the analysis should be repeated using
smaller amounts of sample to reduce the interferant effects. Results for all QA/QC samples (duplicates, calibration
check standards, spiked blanks and matrix spikes) should be submitted by the laboratory as part of the data package
for each batch of samples, along with a narrative explanation for results outside control limits.

5.4.3 Butyltins

Assessment of the distribution and environmental impact of butyltin species of interest to the EMAP-E
program (tributyltin, dibutyltin and monobutyltin) requires their measurement in marine sediment and tissue samples
at trace levels. Quality control of these measurements consists of checks on laboratory precision and accuracy. One
laboratory reagent blank must be run with each batch of 25 or fewer samples. A reagent blank concentration between
the MDL and 3 times the MDL should serve as a warning limit requiring further investigation based on the best
judgement of the analyst(s). A reagent blank concentration equal to or greater than 3 times the MDL requires
corrective action to identify and eliminate the source(s) of contamination, followed by reanalysis of the samples in the
associated batch.
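
To make the two-tier blank criterion explicit, the following sketch (Python, illustrative only) classifies a reagent blank result against the laboratory's previously determined MDL.

    # Reagent blank screening for each butyltin species.
    def blank_status(blank_conc, mdl):
        """Return 'ok', 'warning', or 'action' per the 1x / 3x MDL limits."""
        if blank_conc < mdl:
            return "ok"
        if blank_conc < 3.0 * mdl:
            return "warning"   # investigate, at the analysts' best judgement
        return "action"        # eliminate contamination; reanalyze the batch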

One laboratory fortified sample matrix (commonly called a matrix spike) or laboratory fortified blank (i.e.,
spiked blank) should be analyzed along with each batch of 25 or fewer samples to evaluate the recovery of the butyltin
species of interest. The butyltins should be added at 5 to 10 times their MDLs as previously calculated by the
laboratory. If the percent recovery for any of the butyltins in the matrix spike or spiked blank is outside the range 70
to 130 percent, analysis of subsequent sample batches should stop until the source of the discrepancy is determined and
the system corrected.

The NRCC sediment reference material "PACS-1", which has certified concentrations of the three butyltin
species of interest, also should be analyzed along with each batch of 25 or fewer sediment samples as a check on
accuracy and reproducibility (i.e., batch-to-batch precision). If values obtained by the laboratory for butyltins in
"PACS-1" are not within ±30% of the certified values, the data for the entire batch of samples are suspect. Calculations
and instruments should be checked; the CRM may have to be reanalyzed to confirm the results. If the values are still
outside the control limits in the repeat analysis, the laboratory is required to determine the source(s) of the problem
and repeat the analysis of that batch of samples until control limits are met, before continuing with further sample
processing.



5.5 QUALITY CONTROL PROCEDURES: INFORMATION MANAGEMENT

5.5.1	Sample Tracking

EMAP-E information management personnel have developed a comprehensive system for barcode labeling
of sample containers, recording sampling information in the field and tracking sample shipments. A complete
description of this system is provided in the EMAP-E Information Management Plan (Adams et al. 1993) and also
summarized in Section 11 of this plan. Each analytical laboratory must designate a sample custodian, authorized to
check the condition of and sign for incoming field samples, obtain documents of shipment and verify sample custody
records. This individual is required, upon receipt of samples, to record and transmit all tracking information to the
Province Information Management center. The use of barcode labels and readers provided by the Province will
facilitate this process. Laboratory personnel should be aware of the required sample holding times and conditions (see
Table 5-3), and the laboratory must have clearly-defined and documented custody procedures for sample handling,
storage, and disbursement.

5.5.2	Data Reporting Requirements

As previously indicated, laboratory personnel must verify that the measurement process was "in control" (i.e.,
all specified QA/QC requirements were met) for each batch of samples before proceeding with the analysis of a
subsequent batch. In addition, each laboratory must establish a system for detecting and eliminating transcription
and/or calculation errors prior to reporting data. It is recommended that an individual not involved directly in sample
processing be designated as laboratory QA Officer to perform these verification checks independent of day-to-day
laboratory operations.

Only data that have met QA requirements should be submitted by the laboratory. When QA requirements
have not been met, the samples should be reanalyzed and only the results of the reanalysis should be submitted,
provided they are acceptable. Each data package should consist of the following:

• A cover letter, both on paper and in electronic file format, providing a brief description of the procedures and
instrumentation used (including the procedure(s) used to calculate MDLs), as well as a narrative explanation
of analytical problems (if any) or failure(s) to meet quality control limits.



•	Tabulated results in hard copy form, including sample size, wet weight, dry weight, and concentrations of the
analytes of interest (reported in clearly identified units, to three significant figures unless otherwise justified).
Concentration units should be ng/g or µg/g dry weight.

TABLE 5-7. Codes for denoting QA/QC samples in submitted data packages.

Code      Description                              Unit of Measure
CLC       Continuing Calibration Check Sample      Percent recovery
LRB       Lab Reagent Blank                        varies
LCM       Lab Control Material                     µg/g or ng/g dry wt.
LCMPR     Lab Control Material % Recovery          Percent recovery
LF1       Lab Spiked Sample - 1st Member           µg/g or ng/g dry wt.
LF1PR     Lab Spiked Sample - 1st Mem. % Rec.      Percent recovery
LF2       Lab Spiked Sample - 2nd Member           µg/g or ng/g dry wt.
LF2PR     Lab Spiked Sample - 2nd Mem. % Rec.      Percent recovery
MSDRPD    Rel % Difference: LF1 to LF2             Percent
LFB       Lab Fortified Blank                      Percent recovery
LSFPR     Lab Spiked Sample % Rec.                 Percent recovery
LDRPD     Lab Duplicate Relative % Diff.           Percent

The data qualifier code "b" should be used to flag any reported values that are below the laboratory's MDL. The
"b" code has the following meaning: "The reported concentration is below or equal to the detection limit. The detection
limit (MDL) is reported as a separate variable."

There may be a limited number of situations where sample re-analysis is not possible or practical (e.g., a minor
exceedance of a single control limit criterion). The laboratory is expected to provide a detailed explanation of any factors
affecting data quality or interpretation; this explanation should be in the form of a cover letter, both on paper and in
electronic file format (i.e., text file), accompanying each submitted data package. The narrative explanation is in lieu
of additional data qualifier codes supplied by the laboratory (other than the "a" and "b" codes). Over time, depending
on the nature of these narrative explanations, the EMAP-E program expects to develop a limited list of codes for
qualifying data in the database (in addition to the "a" and "b" codes).

5.5.3 Data Evaluation Procedures

It is the responsibility of the Province Manager to acknowledge initial receipt of the data package(s), verify
that the four data evaluation steps identified in the following paragraph are completed, notify the analytical laboratory
of any additional information or corrective actions deemed necessary as a result of the Province's data evaluation and,
following satisfactory resolution of all "corrective action" issues, take final action by notifying the laboratory in writing
that the submitted results have been officially accepted as a completed deliverable in fulfillment of contract
requirements. It may be necessary or desirable for a team of individuals (e.g., the Province QA Coordinator and/or
analytical chemists on the Province staff) to assist the Province Manager in technical evaluation of the submitted data
packages. While the Province Manager has ultimate responsibility for maintaining official contact with the analytical
laboratory and verifying that the data evaluation process is completed, it is the responsibility of the Province QA
Coordinator to closely monitor and formally document each step in the process as it is completed. This documentation
should be in the form of a data evaluation tracking form or checklist that is filled in as each step is completed. This
checklist should be supplemented with detailed memos to the project file outlining any concerns with data omissions,
analysis problems, or descriptions of questionable data identified by the laboratory.

Evaluation of the data package should commence as soon as possible following its receipt, since delays
increase the chance that information may be misplaced or forgotten and (if holding times have been exceeded) can
sometimes limit options for reanalysis. The following steps are to be followed and documented in evaluating EMAP-E
chemistry data:

1.)	Checking data completeness (verification)

2.)	Assessing data quality (validation)

3.)	Assigning data qualifier codes

4.)	Taking final actions

The specific activities required to complete each of these steps are illustrated in Figure 5-1 and described in the
following sections, which are adapted in large part from the document "A Project Manager's Guide to Requesting and
Evaluating Chemical Analyses" (EPA 1991).

Checking Data Completeness

The first part of data evaluation is to verify that all required information has been provided in the data
package. On the EMAP-E program, this should include the following specific steps:



[Figure 5-1. Steps to be followed in the assessment and evaluation of EMAP-E chemistry data (from
U.S. EPA 1991). The figure diagrams each evaluation step from its information source through the
evaluation criteria and technical conclusion to the resulting management action.]



•	Province personnel should verify that the package contains the following: narrative explanations signed by
the laboratory manager, hard copies of all results (including QA/QC results), and accompanying computer
diskettes.

•	The electronic data file(s) should be parsed and entered into the EMAP Province database to verify that the
correct format has been supplied.

•	Once the data have been entered into the Province database, automated checks should be run to verify that
results have been reported for all expected samples and all analytes.

The Province Manager should contact the laboratory and request any missing information as soon as possible
after receipt of the data package. If information was omitted because required analyses were not completed, the
laboratory should provide and implement a plan to correct the deficiency. This plan may include submittal of a revised
data package and possible reanalysis of samples.

Assessing Data Quality

Data validation, or the process of assessing data quality, can begin after Province personnel have determined
that the data package is complete. Normally, the first major part of validation involves checking 100 percent of the
data for possible errors resulting from transcription of tabulated results, misidentification, or miscalculation.
However, EMAP-E laboratories are expected to submit data that already have been tabulated and checked 100% for
accuracy, and the raw data reports needed by Province personnel to perform these checks (e.g., chromatograms,
original quantitation reports) are not submitted as part of the data package. The laboratory is required to maintain this
raw data in an orderly manner and to have these records available for review by EMAP-E personnel upon request (i.e.,
the data may be audited at any time following appropriate notification of the laboratory). The first-step validation
checks performed by Province personnel will be limited to the following: 1.) a check to verify that all reporting units
and numbers of significant figures are correct; 2.) a check to verify that all of the laboratory's calculated percent
recovery values (for calibration check samples, Laboratory Control Materials, and matrix spikes) and relative percent
difference values (for duplicates) are correct; and 3.) a check to verify that the reported concentrations for each analyte
fall within "environmentally-realistic" ranges, determined from previous studies and expert judgement. In addition,
past studies indicate that the different compounds in each class of chemicals being measured on EMAP-E (e.g., PAHs,
PCBs, DDTs and other chlorinated pesticides) typically occur in the environment in somewhat fixed ratios to one
another. For example, the DDT breakdown products p,p'-DDD and p,p'-DDE typically can be expected to occur at
higher concentrations than p,p'-DDT in estuarine sediments of the East Coast. If anomalous departures from such
expected ratios are found, it may indicate a problem in the measurement or data reduction process requiring further
investigation.
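
As a purely hypothetical illustration of such a ratio screen (the logic below is not a program requirement), a reviewer might flag records in which the parent compound exceeds both of its breakdown products:

    # Hypothetical plausibility screen for the DDT family in one sample record.
    def ddt_ratio_suspect(pp_ddd, pp_dde, pp_ddt):
        """Flag samples where p,p'-DDT exceeds both p,p'-DDD and p,p'-DDE,
        contrary to the pattern typically seen in East Coast estuarine sediments."""
        return pp_ddt > pp_ddd and pp_ddt > pp_dde

A flagged record would not be rejected outright; it simply prompts the further investigation of the measurement and data reduction process described above.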

The second major aspect of data validation is to compare the QA/QC data against established criteria for
acceptable performance, as specified earlier in this plan. This will involve the following specific steps:

1.)	Results for QA/QC samples should be tabulated, summarized and evaluated. Specifically, a set of summary
tables should be prepared from the Province database showing the percent recovery values and relative percent
difference values (where applicable) for the following QA/QC samples: continuing calibration check samples,
laboratory control material(s), and matrix spike/matrix spike duplicate samples. The tables should indicate
the percent recovery values for these samples for each individual batch of samples, as well as the average,
standard deviation, coefficient of variation, and range for all batches combined (see the sketch following this
list).

2.)	Similar summary tables should be prepared for the laboratory reagent blank QA/QC samples.

3.)	The summary results, particularly those for the Laboratory Control Material (i.e., Certified Reference
Material), should be evaluated by comparing them against the QA/QC warning and control limit criteria for
accuracy, precision, and blank contamination specified in Table 5-4.

4.)	Method detection limits reported by the laboratory for each analyte should be tabulated and compared against
the target values in Table 5-5.
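
The summary statistics called for in step 1 are straightforward; the sketch below (Python, illustrative only) compiles them for one analyte's percent recoveries across all batches.

    # Sketch of the across-batch recovery summary described in step 1.
    from statistics import mean, stdev

    def summarize_recoveries(recoveries):
        """Mean, standard deviation, CV (%), and range of percent recoveries."""
        m, s = mean(recoveries), stdev(recoveries)
        return {"mean": m, "sd": s, "cv": s / m * 100.0,
                "range": (min(recoveries), max(recoveries))}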

There are several possible courses of action to be taken if the reported data are found to be deficient (i.e.,
warning and/or control limits exceeded) during the assessment of data quality:



1.)	The laboratory's cover letter (narrative explanation) should be consulted to determine if the problems were
satisfactorily addressed.

2.)	If only warning limits were exceeded, then it is appropriate for the laboratory to report the results.
Exceedance of control limits, however, will result in one of the following courses of action: (a) all associated
results will be qualified in the database as estimated values (as explained in the following section), or (b) the
data will be rejected and deleted from the database because the analysis was judged to be out of control (based
on the professional judgement of the reviewer). Rejection of data due to failure of the laboratory's quality
control system could ultimately result in disqualification of the laboratory from further participation in the
EMAP-Estuaries program.

Assigning Data Qualifier Codes

Data qualifier codes are notations used by laboratories and data reviewers to briefly describe, or qualify, data
and the systems producing data. As previously indicated, EMAP-E laboratories are expected to assign only two data
qualifier codes ("a" and "b") to data values before submitting them to the program. EMAP-E data reviewers, in turn,
will assign an additional data qualifier code in situations where there are exceedances of control limit criteria. The
most typical situation is when a laboratory fails to meet the accuracy control limit criteria for a particular analyte in
a Certified Reference Material or matrix spike sample. In these situations, the QA reviewer should verify that the
laboratory did meet the control limit criteria for precision. If the lack of accuracy is found to be consistent (i.e., control
limit criteria for precision were met), then it is likely that the laboratory experienced a true bias for that particular
analyte. In these situations, all reported values for that particular analyte will be qualified with a "c" code. The "c"
code has the following meaning: "The reported concentration is considered an estimate because control limits for this
analyte were exceeded in one or more quality control samples."
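
The decision rule described above can be summarized in a short sketch (Python, illustrative only); the string returned in the out-of-control case is a placeholder for the reviewer's judgement, not a program code.

    # Illustrative logic for assigning the "c" qualifier to an analyte.
    def reviewer_qualifier(accuracy_in_control, precision_in_control):
        if accuracy_in_control:
            return None        # no additional qualifier needed
        if precision_in_control:
            return "c"         # consistent bias: qualify values as estimates
        return "review"        # placeholder: reviewer rejects or requests reanalysis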

Because some degree of expert judgement and subjectivity typically is necessary to evaluate chemistry QA/QC
results and assign data qualifier codes, data validation should be conducted only by qualified personnel. It is the
philosophy of the program that data which are qualified as estimates because of minor exceedance of a control limit
in a QA/QC sample ("c" code) are still usable for most assessment and reporting purposes. However, it is important
to note that all QA/QC data will be readily available in the database along with the results data, so that interested data
users can make their own estimation of data quality.

Taking Final Action

Upon completion of the above steps, a report summarizing the QA review of the data package should be
prepared, samples should be properly stored or disposed of, and laboratory data and accompanying explanatory
narratives should be archived both in a storage file and in the database. Technical interpretation of the data begins
after the QA review has been completed.

Reports documenting the results of the QA review of a data package should summarize all conclusions
concerning data acceptability and should note significant quality assurance problems that were found. These reports
are useful in providing data users with a written record of data concerns and a documented rationale for why certain
data were accepted as estimates or were rejected. The following specific items should be addressed in the QA report:

•	Summary of overall data quality, including a description of data that were qualified.

•	Brief descriptions of analytical methods and the method(s) used to determine detection limits.

•	Description of data reporting, including any corrections made for transcription or other reporting errors, and
description of data completeness relative to objectives stated in the QA plan.

•	Descriptions of initial and ongoing calibration results, blank contamination, and precision and bias relative
to QA plan objectives (including tabulated summary results for Certified Reference Materials and matrix
spike/matrix spike duplicates).

The chemistry QA results will be presented in the Program Annual Quality Assurance Report and will also
become a permanent part of the database documentation (i.e., metadata). The QA/QC data collected by the Program
will be used not only to assess the accuracy and precision of individual laboratory measurements, but ultimately to
assess the comparability of data generated by multiple laboratories.



SECTION 6
SEDIMENT PARTICLE SIZE ANALYSIS

6.1	OVERVIEW

Particle size is used to characterize the physical nature of sediments. Because particle size influences
both chemical and biological variables, it can be used to normalize chemical concentrations according to sediment
characteristics and to account for some of the variability found in biological assemblages. For 1993 EMAP-E
monitoring in the Virginian Province, only the percent silt-clay will be determined for the particle size samples.

6.2	QUALITY CONTROL PROCEDURES: SAMPLE COLLECTION,
PRESERVATION AND HOLDING

EMAP-E protocols for collecting particle size samples are described in detail in the Virginian Province Field
Operations and Safety Manual (Reifsteck et al. 1993). Samples will be collected in plastic Whirl-pak® containers; a
minimum sample size of 100 grams is recommended. Samples should be held and shipped on ice (NOT dry ice) and
may be stored at 4 °C for up to one year before analysis. Samples must not be frozen or dried prior to analysis, as
either process may change the particle size distribution.

6.3	QUALITY CONTROL PROCEDURES: LABORATORY OPERATIONS

Quality control of sediment particle size analysis is accomplished by strict adherence to protocol and
documentation of quality control checks. Certain procedures are critical to the collection of high quality data. For
example, it is essential that each sample be homogenized thoroughly in the laboratory before a subsample is taken for
analysis. Laboratory homogenization should be conducted even if samples were homogenized in the field.
Furthermore, all screens used for dry sieving must be clean before conducting analysis, and all of the sample must be
retrieved from them. To clean a screen, it should be inverted and tapped on a table, while making sure that the rim
hits the table evenly. Further cleaning of brass screens may be performed by gentle scrubbing with a stiff bristle nylon
brush. Stainless steel screens may be cleaned with a nylon or brass brush.



The most critical aspect of the pipet analysis is knowledge of the temperature of the silt-clay suspension. An
increase of only 1 °C will increase the settling velocity of a particle 50 µm in diameter by 2.3 percent. It is generally
recommended that the pipet analysis be conducted at a constant temperature of 20 °C. However, Plumb (1981)
provides a table to correct for settling velocities at other temperatures; this table is included in the EMAP-E Laboratory
Methods Manual (U.S. EPA, in preparation). If the mass of sediment used for pipet analysis exceeds 25 g, a subsample
should be taken as described by Plumb (1981). Silt-clay samples in excess of 25 g may give erroneous results because
of electrostatic interactions between the particles. Silt-clay samples less than 5 g yield a large experimental error in
weighing relative to the total sample weight. Thorough mixing of the silt-clay suspension at the beginning of the
analysis also is critical. A perforated, plexiglass disc plunger is very effective for this purpose. Once the pipet analysis
begins, the settling cylinders must not be disturbed, as this will alter particle settling velocities. Care must be taken
to disturb the sample as little as possible when pipet extractions are made.
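
The temperature sensitivity cited above follows directly from Stokes' law, in which settling velocity varies inversely with the viscosity of the suspension. The sketch below (Python) illustrates the effect using nominal literature values for the dynamic viscosity of water; those values are assumptions for illustration, not program constants.

    # Stokes' law illustration of temperature sensitivity for a 50 um particle.
    def settling_velocity(d_m, mu, rho_s=2650.0, rho_w=1000.0, g=9.81):
        """Stokes settling velocity (m/s) of a sphere of diameter d_m (m)
        in water of dynamic viscosity mu (Pa s)."""
        return (rho_s - rho_w) * g * d_m ** 2 / (18.0 * mu)

    v20 = settling_velocity(50e-6, 1.002e-3)   # nominal viscosity at 20 deg C
    v21 = settling_velocity(50e-6, 0.978e-3)   # nominal viscosity at 21 deg C
    print((v21 / v20 - 1.0) * 100.0)           # roughly a 2 to 2.5% increase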

The analytical balance, drying oven, sieve shaker, and temperature bath used in the analysis should be
calibrated at least monthly. Dried samples should be cooled in a desiccator and held there until they are weighed. If
a desiccator is not used, the sediment will accumulate ambient moisture and the sample weight will be overestimated.
A color-indicating desiccant is recommended so that spent desiccant can be detected easily. Also, the seal on the
desiccator should be checked periodically, and, if necessary, the ground glass rims should be greased or the "O" rings
should be replaced.

Quality control for the sediment analysis procedures will be accomplished primarily by reanalyzing a randomly
selected subset of samples from each batch, as described in full detail in the EMAP-E Laboratory Methods Manual
(U.S. EPA, in preparation). A batch of samples is defined as a set of samples of a single textural classification (e.g.,
silt/clay, sand, gravel) processed by a single technician using a single procedure. Approximately 10% of each batch
completed by the same technician should be reanalyzed (i.e., reprocessed) in the same manner as the original sample
batch. Based on results typically attained by experienced technicians, if the absolute difference between the original
value and the second value is greater than 10% (in terms of the percent of the most abundant sediment size class), then
a third analysis will be completed by a different technician. The earlier value closest to the third value will be entered into
the database. In addition, all the other samples in the same batch must be re-analyzed, and the laboratory protocol
and/or technician's practices should be reviewed and corrected to bring the measurement error under control. If the
percents of the most abundant sediment size class in the original and reanalyzed samples differ by less than 10
percentage points, the original value will not be changed and the sediment analysis process will be considered in control.
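
The reanalysis logic described in the preceding paragraph amounts to the following sketch (Python, illustrative only), applied to the percent of the most abundant size class:

    # Duplicate-reanalysis rule for one reprocessed sample.
    def particle_size_duplicate_check(original, duplicate, third=None, limit=10.0):
        """Return the value to enter in the database, or None when a third
        analysis by a different technician is still required."""
        if abs(original - duplicate) <= limit:
            return original          # in control; original value stands
        if third is None:
            return None              # third analysis required
        # enter the earlier value closest to the third determination
        return min((original, duplicate), key=lambda v: abs(v - third))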

Additional quality control for particle size analyses will be accomplished by reanalyzing samples that fail
either a range check or a recovery check. For the range check, any sample whose results fall outside expected ranges
(i.e., fraction percentages totaling greater than 100%) will be reanalyzed. For the recovery check, if the total weight
of the recovered sands differs from the starting weight of sands by more than 10%, the sample must be reanalyzed.
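
Both screens are simple arithmetic checks, sketched below (Python, illustrative only):

    # Range and recovery screens that trigger reanalysis of a sample.
    def fails_range_check(fraction_percents):
        """True when the summed size-fraction percentages exceed 100%."""
        return sum(fraction_percents) > 100.0

    def fails_recovery_check(recovered_sand_g, starting_sand_g, tol=0.10):
        """True when recovered sand weight departs more than 10% from the start."""
        return abs(recovered_sand_g - starting_sand_g) > tol * starting_sand_g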

6.4 QUALITY CONTROL PROCEDURES: INFORMATION MANAGEMENT

6.4.1	Sample Tracking

EMAP-E information management personnel have developed a comprehensive system for barcode labeling
of sample containers, recording sampling information in the field and tracking sample shipments. A complete
description of this system is provided in the EMAP-E Information Management Plan (Adams et al. 1993) and also
summarized in Section 11 of this plan. The laboratory responsible for processing the sediment particle size samples
must designate a sample custodian, authorized to check the condition of and sign for the incoming field samples, obtain
documents of shipment and verify sample custody records. This individual is required, upon receipt of samples, to
record and transmit all tracking information to the Province Information Management center. The use of barcode
labels and readers provided by the Province will facilitate this process. Laboratory personnel should be aware of the
required sample holding times and conditions for particle size samples, and there must be clearly-defined custody
procedures for sample handling, storage, and disbursement in the laboratory.

6.4.2	Data Reporting Requirements and Evaluation Procedures

The weight of each sediment fraction should be reported to the nearest 0.0001 gram dry weight. The
laboratory should report the results for all samples analyzed (including QC duplicates) both in hard copy and in a
computer-readable format specified by the Province Information Manager. In addition, both the paper and electronic
data packages should include a cover letter with a summary of all quality control checks performed and a narrative
explanation of any problems that may have influenced data quality.

It is the responsibility of the Province Manager to acknowledge initial receipt of the data package(s), verify
that the four data evaluation steps identified in the following paragraph are completed, notify the laboratory of any
additional information or corrective actions deemed necessary as a result of the Province's data evaluation and,
following satisfactory resolution of all "corrective action" issues, take final action by notifying the laboratory in writing
that the submitted results have been officially accepted as a completed deliverable in fulfillment of contract
requirements. It may be necessary or desirable for the Province Manager to delegate the technical evaluation of the
data to the QA Coordinator or other qualified staff member. It is the responsibility of the Province QA Coordinator
to closely monitor and formally document each step in the data evaluation process as it is completed. This
documentation should be in the form of a data evaluation tracking form or checklist that is filled in as each step is
completed. This checklist should be supplemented with detailed memos to the electronic and paper project files
outlining the concerns with data omissions, analysis problems, or descriptions of questionable data identified by the
laboratory.

Evaluation of the data package should commence as soon as possible following its receipt, since delays
increase the chance that information may be misplaced or forgotten and (if holding times have been exceeded) can
sometimes limit options for reanalysis. The first part of data evaluation is to verify that all required information has
been provided in the data package. On the EMAP-E program, this should include the following specific steps:

•	Province personnel should verify that the package contains a cover letter signed by the laboratory manager,
hard copies of all results (including QA/QC results), and accompanying computer diskettes.

•	The electronic data file(s) should be parsed and entered into the EMAP Province database to verify that the
correct format has been supplied.

•	Once the data have been transferred to the Province database, automated checks should be run to verify that
results have been reported for all expected samples and all analytes.



The Province Manager should contact the laboratory and request any missing information as soon as possible
after receipt of the data package. If information was omitted because required analyses were not completed, the
laboratory should provide and implement a plan to correct the deficiency. This plan may include submittal of a revised
data package and possible reanalysis of samples.

Data validation, or the process of assessing data quality, should begin after Province personnel have
determined that the data package is complete. Data validation for particle size data should consist of the following:
1.) a check to verify that all reporting units and numbers of significant figures are correct; 2.) a check to verify that
the cumulative percentage of the particle size fractions never exceeds 100% (i.e., the range check); 3.) a check to
verify that the results for duplicate samples do not differ by more than 10%; and 4.) calculation of the relative
standard deviation (RSD) for the three particle size samples obtained at each station. For any station having an RSD
greater than 20%, all raw data and calculations should be checked by the laboratory to ascertain that the difference truly
reflects natural spatial variability among the three grab samples and not measurement error.
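
The station-level RSD screen in step 4 can be sketched as follows (Python, illustrative only):

    # RSD screen for the three particle size samples collected at a station.
    from statistics import mean, stdev

    def station_rsd(silt_clay_percents):
        """Relative standard deviation (%) of the three station replicates."""
        return stdev(silt_clay_percents) / mean(silt_clay_percents) * 100.0

    # e.g., station_rsd([42.1, 45.0, 58.3]) is about 18% and would pass;
    # values above 20% send the raw data back to the laboratory for checking.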

6.4.3 Assigning Data Qualifier Codes and Taking Final Action

Data qualifier codes are notations used by laboratories and data reviewers to briefly describe, or qualify, data
and the systems producing data. To date, the Virginian Province particle size data have been accepted without
qualification, and no data qualifier codes have been developed. All QA/QC data associated with particle size analyses
will be readily available in the database along with the results data, so that interested data users can perform their own
assessments of data quality.

Upon completion of all data evaluation steps, a report summarizing the QA review of the data package should
be prepared, samples should be properly stored or disposed of, and laboratory data should be archived both in a storage
file and in the database. Reports documenting the results of the QA review of the data package should summarize all
conclusions concerning data acceptability and should note significant quality assurance problems that were found.
These reports are useful in providing data users with a written record of data concerns and a documented rationale for
why certain data were accepted as estimates or were rejected. The following specific items should be addressed in the
QA report:



•	Summary of overall data quality, including a description of data that were qualified.

•	Brief descriptions of sample collection and analysis methods.

•	Description of data reporting, including any corrections made for transcription or other reporting errors, and
description of data completeness relative to objectives stated in the QA plan.

The particle size QA results will be included in the annual Program Quality Assurance Report and will also
become a permanent part of the database documentation (i.e., metadata).



SECTION 7
SEDIMENT TOXICITY TESTING

7.1	OVERVIEW

The toxicity of sediments collected by field crews will be determined as an integral part of the benthic
indicator suite, using 10-day acute toxicity tests with the marine amphipod Ampelisca abdita. The various aspects of
the test for which QA/QC procedures are specified in this section include the following: sample collection, preservation
and holding, the condition of testing facilities and equipment, the source and condition of test organisms, test
conditions, instrument calibration, use of reference toxicants, record keeping, data reporting requirements and data
evaluation. In addition, any laboratory which has not previously performed the sediment toxicity test using Ampelisca
abdita will be required to perform an initial demonstration of capability, as described below.

7.2	QUALITY CONTROL PROCEDURES: SAMPLE COLLECTION,
PRESERVATION AND HOLDING

Protocols for sample collection, preservation and holding are presented in the Field Operations and Safety
Manual (Reifsteck et al. 1993). Sediment samples for toxicity testing should be chilled to 4 °C when collected, shipped
on ice, and stored in the dark at 4 °C until used. The minimum volume of sediment required per station is 3000 ml
(i.e., 3 liters). The sediment should be stored for no longer than four weeks before the initiation of the test and should
not be frozen or allowed to dry. Sample containers should be made of chemically inert materials (e.g., glass or high
density polyethylene jars with Teflon® lined lids) to prevent contamination, which might result in artificial changes
in toxicity.

Sediment for toxicity testing is taken from the same homogenate used for the sediment chemistry sample; this
homogenate consists of the top 2 cm layer (approximate) taken from multiple grabs at each station. As with the
sediment chemistry sample, contamination is to be avoided in obtaining the sediment toxicity sample. This is
accomplished through strict adherence to protocol during sample collection. For example, all sampling devices and
any other instruments in contact with the sediment should be cleaned with water and a mild detergent and thoroughly
rinsed between stations, and all utensils in contact with the sample should be made of chemically inert materials, such
as Teflon® or high quality stainless steel (see Reifsteck et al. 1993).

7.3 QUALITY CONTROL PROCEDURES: LABORATORY OPERATIONS

Complete descriptions of the methods employed for the sediment toxicity test are provided in the EMAP-E
Laboratory Methods Manual (U.S. EPA 1992, in revision); these protocols are based on American Society for Testing
and Materials (ASTM) Standard Methods (ASTM 1991).

7.3.1	Facilities and Equipment

Laboratory and bioassay temperature control equipment must be adequate to maintain recommended test
temperatures. Recommended materials must be used in the fabrication of the test equipment in contact with the water
or sediment being tested, as specified in the EMAP-E Laboratory Methods Manual (U.S. EPA 1992, in revision).

7.3.2	Initial Demonstration of Capability

Laboratories which have not previously conducted sediment toxicity tests with Ampelisca abdita must
demonstrate the ability to collect (if applicable), hold and test the organisms without significant loss or mortality, prior
to initiating testing of actual samples. There are two types of tests which must be performed as an initial demonstration
of capability; these tests will serve to indicate the overall ability of laboratory personnel to handle the organism
adequately and obtain consistent, precise results. First, the laboratory must perform a minimum of five successive
reference toxicant tests, using sodium dodecyl sulfate (SDS) as the reference toxicant. For Ampelisca abdita,
short-term (i.e., 96-hour) tests without sediments (i.e., seawater only) can be used for this purpose.

The trimmed Spearman-Karber method (Hamilton et al. 1977), a nonparametric procedure, or the monotonic
regression analysis developed by DeGraeve et al. (1988) can be used to determine an LC50 value for each 96-hour
reference toxicant test. The LC50 values should be recorded on a control chart maintained in the laboratory (as
described previously in section 3.2.5 of this document). Precision then can be described by the LC50 mean, standard
deviation, and percent relative standard deviation (coefficient of variation, or CV) of the five (or more) replicate
reference toxicant tests. If the laboratory fails to achieve an acceptable level of precision in the five preliminary
reference toxicant tests, the test procedure should be examined for defects and the appropriate corrective actions should
be taken. Precision is considered acceptable when the LC50 values for five consecutive reference toxicant tests fall
within the 95% confidence interval warning limits on the control chart. Additional tests should be performed until
acceptable precision is demonstrated.
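
For readers unfamiliar with the calculation, the sketch below (Python) implements the classical untrimmed Spearman-Karber estimator; the program specifies the trimmed variant of Hamilton et al. (1977), which additionally trims the tails of the mortality curve, so this is a simplified illustration only.

    # Untrimmed Spearman-Karber LC50 estimate from a water-only SDS test.
    import math

    def spearman_karber_lc50(concs, prop_dead):
        """concs: ascending toxicant concentrations bracketing 0-100% mortality;
        prop_dead: nondecreasing mortality proportions with first 0 and last 1."""
        assert all(b >= a for a, b in zip(prop_dead, prop_dead[1:]))
        x = [math.log10(c) for c in concs]
        m = sum((prop_dead[i + 1] - prop_dead[i]) * (x[i] + x[i + 1]) / 2.0
                for i in range(len(x) - 1))
        return 10.0 ** m

    # e.g., spearman_karber_lc50([3.2, 5.6, 10.0, 18.0, 32.0],
    #                            [0.0, 0.1, 0.5, 0.9, 1.0]) is about 10.0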

The second series of tests which must be performed successfully prior to the testing of actual samples are 10-
day, "non-toxicant" exposures of Ampelisca abdita, in which test chambers contain the control sediment and seawater
that will be used under actual testing conditions. These "control" tests should be performed concurrent with the
reference toxicant tests used to assess single laboratory precision. At least five replicate test chambers should be used
in each test. The tests should be run in succession until two consecutive tests each have mean survival equal to or
greater than 85% and survival in the individual test chambers is not less than 80%. These are the control survival rates
which must be achieved during actual testing if a test is to be considered acceptable (ASTM 1991; see section 7.3.5);
therefore, the results of this preliminary demonstration will provide evidence that facilities, water, control sediment,
and handling techniques are adequate to result in successful testing of samples. The testing facility is required to
submit the results of the initial demonstration of performance to the Province Manager and receive written approval
to initiate testing of actual samples.

7.3.3 Quality of Test Organisms

Amphipods used in the tests should appear to be healthy (i.e., active with full guts) and should be positively
identified to species by a qualified individual. If the amphipods are collected from the field prior to testing, they should
be obtained from an area known to be free of toxicants and should be held in clean, uncontaminated water and
facilities. Amphipods held prior to testing should be checked daily, and individuals which appear unhealthy or dead
should be discarded. If greater than 5% of the organisms in holding containers are dead or appear unhealthy during
the 48 hours preceding a test, the entire group should be discarded and not used in the test.



Test organisms should be as uniform as possible in age and size. For EMAP-E sediment toxicity testing,
juvenile Ampelisca abdita in the size range 2 to 4 mm should be used for testing. Only active, apparently healthy
individuals should be selected for testing; care should be taken not to select gravid females or males nearing sexual
maturity. To verify that the individuals used are of the appropriate size, at least one additional group of 20 to 30
amphipods must be sorted at random at the beginning of each test. This extra group should be preserved in 5%
buffered formalin or 70% ethanol for later length measurement. The length of each individual in the group should be
determined using a dissecting microscope and measuring from the base of the first antennae to the base of the telson.
The mean, standard deviation, and range of these length measurements should be used by laboratory personnel to verify
that correctly-sized individuals are being used in the tests; the length measurement data also should be reported along
with the results for each test.

The sensitivity of each batch of organisms obtained for testing must be evaluated with the reference toxicant
sodium dodecyl sulfate (SDS) in a short-term toxicity test performed concurrently with the sediment toxicity tests. The
use of SDS as the reference toxicant is required as a means of standardizing test results among different laboratories.
For Ampelisca abdita, a 96-hour reference toxicant test without sediment is used to generate LC50 values, as previously
described in section 7.3.2.

These LC50 values should be recorded on the same control chart used to record the results of the five (or more)
reference toxicant tests performed for the initial demonstration of capability. The control chart represents a "running
plot" of the toxicity values (LC50s) from successive reference toxicant tests. The mean LC50 and the upper and lower
warning and control limits (95% and 99% confidence interval around the mean, respectively) are recalculated with
each successive point until the statistics stabilize. Outliers, which are values which fall outside the upper and lower
control limits, are readily identified. The plotted values are used both to evaluate trends in organism sensitivity and
to verify the overall ability of laboratory personnel to obtain consistent results.
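
A minimal sketch of the control chart limits follows (Python, illustrative only); normal-theory z factors are used here to approximate the stated 95% and 99% intervals, and laboratories may equivalently chart log-transformed LC50 values.

    # Running warning and control limits for successive reference toxicant LC50s.
    from statistics import mean, stdev

    def chart_limits(lc50s):
        """Recompute chart limits after each new reference toxicant test."""
        m, s = mean(lc50s), stdev(lc50s)
        return {"mean": m,
                "warning": (m - 1.96 * s, m + 1.96 * s),    # ~95% limits
                "control": (m - 2.576 * s, m + 2.576 * s)}  # ~99% limits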

Reference toxicant test LC50 values which fall outside control chart limits should serve as a warning to
laboratory personnel. At the P=0.05 probability level, one in twenty tests would be expected to fall outside warning
limits by chance alone. The laboratory should try to determine the cause of the outlying LC50 value, but a retest of the
samples is not necessarily required. If the reference toxicant test results are outside control chart limits on the next
consecutive test, the sensitivity of the organisms and the overall credibility of the test are suspect. The test procedure
again should be examined for defects and additional reference toxicant tests performed. Testing of samples should not
resume until acceptable reference toxicant results can be obtained; this may require the use of a different batch of test
organisms.

7.3.4	Test Conditions

Parameters such as water temperature, salinity (conductivity), dissolved oxygen, and pH should be checked
as required for each test and maintained within specified limits (U.S. EPA 1992, in revision). Instruments used for
routine measurements must be calibrated and standardized according to instrument manufacturer's procedures. All
routine chemical and physical analyses must include established quality assurance practices as outlined in Agency
methods manuals (U.S. EPA 1979a and b) and SOPs based upon them.

Overlying water must meet the requirements for uniform quality specified in the EMAP-E Laboratory Methods
Manual (U.S. EPA 1992, in revision). The minimum requirement for acceptable overlying water is that it allows
acceptable control survival without signs of organism disease or apparent stress (i.e., unusual behavior or changes in
appearance). The overlying water used in the sediment toxicity tests with Ampelisca should have a salinity of 30 ‰
and may be natural seawater, diluted hypersaline brine prepared from natural seawater, or artificial seawater prepared
from sea salts. If natural seawater is used, it should be obtained from an uncontaminated area known to support a
healthy, reproducing population of the test organism or a comparably sensitive species.

7.3.5	Test Acceptability

Survival of organisms in control treatments should be assessed during each test as an indication of both the
validity of the test and the overall health of the test organism population. The amphipod tests with Ampelisca abdita
are acceptable if mean control survival is greater than or equal to 85 percent, and if survival in individual control test
chambers exceeds 80 percent. If these control survival rates are not achieved, the test must be re-run. Additional
guidelines for acceptability of individual sediment toxicity tests are presented in the EMAP-E Laboratory Methods
Manual (U.S. EPA 1992, in revision). An individual test may be conditionally acceptable if temperature, dissolved
oxygen (DO), and other specified conditions fall outside specifications, depending on the degree of the departure and
the objectives of the tests. Any deviations from test specifications must be noted in a cover letter when reporting the
data so that a determination can be made of test acceptability by the Virginian Province Manager.

7.4 QUALITY CONTROL PROCEDURES: INFORMATION MANAGEMENT

7.4.1	Sample Tracking

EMAP-E information management personnel have developed a comprehensive system for barcode labeling
of sample containers, recording sampling information in the field and tracking sample shipments. A complete
description of this system is provided in the EMAP-E Information Management Plan (Adams et al. 1993) and also is
summarized in Section 11 of this plan. The laboratory responsible for performing the sediment toxicity tests must
designate a sample custodian, authorized to check the condition of and sign for the incoming field samples, obtain
documents of shipment and verify sample custody records. This individual is required, upon receipt of samples, to
record and transmit all tracking information to the Province Information Management center. The use of barcode
labels and readers provided by the Province will facilitate this process. Laboratory personnel must adhere to the
required sample holding times and conditions for sediment toxicity samples, and there must be clearly-defined custody
procedures to establish and document sample handling, storage, and disbursement in the laboratory.

7.4.2	Record Keeping and Data Reporting Requirements

It is mandatory for the toxicity testing facility to maintain thorough and complete records. Bound notebooks
must be used to maintain records of the test organisms such as species, source, age, date of collection and/or receipt,
and other pertinent information relating to their history and health, and information on the calibration of equipment
and instruments, test conditions employed, size of organisms used in the test and test results. Signed annotations
should be made on a real-time basis to prevent loss of information.

Laboratory personnel should verify that all specified QA/QC requirements are met for a given test, or, if not,
that specified corrective actions are implemented and problems resolved, before proceeding with subsequent tests. In
addition, each laboratory must establish a system for detecting and eliminating transcription or calculation errors and
assigning data qualifier codes prior to reporting data. It is recommended that an individual not involved directly in
sample processing be designated as laboratory QA Officer to perform these verification checks independent of day-to-
day laboratory operations.

The laboratory should submit only data which either have met all QA requirements or have been qualified
properly using designated QA codes. Samples will be retested whenever QA requirements have not been met, and only
the results of the retesting (meeting QA requirements) should be submitted. The laboratory should report the results
for all successfully-tested samples both in hard copy and in a computer-readable format specified by the Province
Information Manager. At a minimum, the following information should be included:

•	EMAP sample ID, laboratory sample ID (if applicable), laboratory test number (allows EMAP to identify all
field samples and associated controls comprising a single test), organism percent mortality for each replicate,
mean percent mortality for each sample, and results of the significance test for toxicity (t-test of each sample
versus the control).

•	Data for all water quality measurements made during testing (i.e., dissolved oxygen, temperature, salinity,
and pH) and for all QA/QC variables, such as tabulated reference toxicant test results and associated control
charts and the mean, standard deviation, and range in length of the organisms, should be submitted by the
laboratory along with the test results.

•	Both the hard copy and electronic data packages should include a cover letter with a summary of all quality control
checks performed and a narrative explanation of any problems that may have influenced data quality.

7.4.3 Data Evaluation Procedures

It is the responsibility of the Province Manager to acknowledge initial receipt of the data package(s), verify
that the data evaluation procedures identified in the following paragraphs are completed, notify the laboratory of any
additional information or corrective actions deemed necessary as a result of the Province's data evaluation and,
following satisfactory resolution of all "corrective action" issues, take final action by notifying the laboratory in writing
that the submitted results have been officially accepted as a completed deliverable in fulfillment of contract
requirements. It may be necessary or desirable for the Province Manager to delegate the technical evaluation of the
data to the QA Coordinator or other qualified staff member. It is the responsibility of the Province QA Coordinator
to monitor closely and formally document each step in the data evaluation process as it is completed. This
documentation should be in the form of a data evaluation tracking form or checklist that is updated as each step is
completed. This checklist should be supplemented with detailed memos to both the paper and electronic project file
outlining the concerns with data omissions, analysis problems, or descriptions of questionable data identified by the
laboratory.

Evaluation of the data package should commence as soon as possible following its receipt, since delays
increase the chance that information may be misplaced or forgotten and (if holding times have been exceeded) can
sometimes limit options for reanalysis. The first part of data evaluation is to verify that all required information has
been provided in the data package. First, Province personnel should verify that the package contains the following:
a cover letter signed by the laboratory manager, hard copies of all results (including copies of control charts and other
QA/QC results), and accompanying computer diskettes. Second, the electronic data file(s) should be parsed and
entered into the EMAP Province database (SAS datasets) to verify that the correct format has been supplied. Third,
once the data have been transferred to the Province database, automated checks should be run to verify that results have
been reported for all expected samples and that no errors occurred in converting the data into SAS datasets. This can
be accomplished by visual comparison of SAS printouts and frequency distributions versus printouts of the original
data supplied by the laboratory. The printouts should be used to re-verify the completeness of the data sets and to verify
that values reported for all variables are correct.

The Province Manager should contact the laboratory and request any missing information as soon as possible
after receipt of the data package. If information was omitted because required analyses were not completed, the
laboratory should provide and implement a plan to correct the deficiency. This plan may include submittal of a revised
data package and possible reanalysis of samples.

Data validation, or the process of assessing data quality, should begin after Province personnel have
determined that the data package is complete. Data validation for sediment toxicity testing data should consist of the
following:



•	A random check of 20% of the reported results to verify that the statistical test of significance (t-test) was
performed without error by the laboratory (a sketch of this check follows this list). If no errors are found, it
can be assumed that this test was applied correctly to all results and no further action is necessary. If one or
more errors are found, the significance tests for the entire data set must be recalculated and a request made to
the laboratory for a written explanation of the error(s) and a corrective action plan.

•	A review of the water quality data submitted as part of the data package to verify that all specified test
conditions were met.

•	The QA/QC data submitted as part of the data package should be reviewed to verify that specified limits for
control survival and/or reference toxicant test LC50 values were not exceeded, or, if exceeded, that the proper
data qualifier codes were assigned by the laboratory (explained in the following section).
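
A sketch of the t-test recalculation named in the first bullet is given below; it assumes the SciPy library is available and treats per-replicate survival proportions as the test data, which is an illustrative simplification of the laboratory's actual procedure.

    # One-tailed t-test of a sample's replicate survival against the control.
    from scipy import stats

    def sample_is_toxic(control_survival, sample_survival, alpha=0.05):
        """True when control survival significantly exceeds sample survival."""
        t, p = stats.ttest_ind(control_survival, sample_survival,
                               alternative="greater")
        return p < alpha

    # e.g., sample_is_toxic([0.95, 0.90, 1.00, 0.85, 0.95],
    #                       [0.55, 0.60, 0.45, 0.50, 0.65]) returns True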

7.4.4 Assigning Data Qualifier Codes

Data qualifier codes are notations used by laboratories and data reviewers to briefly describe, or qualify, data
and the systems producing data. To date, EMAP-E has developed a limited list of data qualifier codes which are
allowed in situations where the laboratory either experienced an exceedance of a quality control limit or there was a
minor deviation from the required test design or test conditions. Normally, when control limits are exceeded the
laboratory is required to repeat the test for the samples in question. However, limitations on the amount of sample
collected sometimes prevent retesting and data qualifier codes are required. Qualified data are still usable for most
assessment purposes, but data users are alerted to the possible limitations on data use and can make their own
judgements. The qualifier codes developed for EMAP-E sediment toxicity data are listed in Table 7-1 and explained
in the following section. Personnel at the toxicity testing facility are responsible for reviewing the data and assigning
all of these qualifier codes, except for the ST-L code, prior to submitting the data package to EMAP-E.



Table 7-1. Qualifier Codes for EMAP-E Sediment Toxicity Data.

Code    Description
ST-C    Fewer than 5 replicates were tested
ST-D    Mean control survival less than 85%
ST-E    Sample held for longer than 30 days prior to testing
ST-G    No reference toxicant test was run
ST-H    Hardness and alkalinity not measured (Virginian Province 1990 only)
ST-I    Control survival in one replicate was less than 80%
ST-J    Physical parameters out of bounds
ST-K    Less than 20 animals used per replicate
ST-L    Not used in Province Assessment

The ST-C code is assigned to the results for a given sample whenever the laboratory must use fewer than the
required 5 replicates for that sample in a test. This usually occurs for a limited number of samples where an
insufficient amount of sediment has been collected for testing. At a minimum, three replicates must be used for a
sample's results to be considered valid, as this will still allow the laboratory to perform the statistical test for
significance at test completion. Results flagged with the ST-C code will be used for EMAP-E assessments.

The ST-D code is assigned to the results for all samples from a given test when the mean survival in the test
control was less than the required 85%. Normally, this invalidates the results for the test and a re-test is required, but
the ST-D code is assigned when re-testing cannot be performed because there is insufficient sample remaining or
sample holding times have been grossly exceeded. Results flagged with the ST-D code typically are not used for
EMAP-E assessments and are of limited value.

The ST-G code is assigned to all samples from a test in which the laboratory failed to conduct the associated
96-hour, water-only reference toxicant test, as required. The reference toxicant test represents an important "positive"
control which is used to assess both laboratory performance and the relative sensitivity of the test organisms. Failure
to conduct this test represents an omission that does not necessarily invalidate the test results, but will necessitate closer
scrutiny of the laboratory's control charts of the reference toxicant test results and a review of procedures. This check
will be done to verify that all reference toxicant tests which were performed had results within required control chart
limits. Results flagged with the ST-G code typically will be used for EMAP-E assessments.

The ST-H code is assigned to certain results from the freshwater amphipod tests conducted as part of the
Virginian Province 1990 Demonstration Project. The code indicates that the laboratory failed to measure hardness and
alkalinity in the test chambers, as required. This does not necessarily invalidate the test, and results flagged with this
code have been used for EMAP-E assessments in the past. The freshwater amphipod test was only used in 1990 and
is no longer conducted in the program.

The ST-I code is assigned to all results from a test in which survival in one of the control replicates was less
than the required 80%. The laboratory normally is required to repeat the test whenever this occurs, but this may not
always be possible. If the mean control survival in the test was greater than 85%, then the data are used by EMAP-E
for assessment purposes, but data users should be aware that all QA/QC requirements for control survival were not met.

The ST-J code is allowed in a limited number of situations where there was minor exceedance of a required
control limit for one of the physical parameters measured in each test (e.g., dissolved oxygen, temperature, salinity,
or pH). Minor exceedances typically do not invalidate the test results, but the laboratory must provide a written
explanation of the nature and extent of the exceedance. Based on this explanation, the Province Manager, in
consultation with the Province QA Officer or others, will make the final decision to allow or disallow this code
assignment. The laboratory may be required to repeat the test in certain instances. Results qualified with this code
are used for EMAP-E assessments.

The ST-K code is assigned to the results for any sample where the laboratory failed to use the required 20
animals per test chamber. This can occur when the laboratory failed to collect or receive from a supplier an adequate
number of organisms to conduct a given test. In these instances, it is preferable to conduct the test with fewer
organisms in each test chamber than to use organisms which are unhealthy or outside the acceptable size (age)
range. Results from tests in which fewer than 20 organisms were used per replicate typically are usable for most
assessment purposes.

The ST-L code is assigned to results which are not acceptable for use in Province assessments (e.g., Annual
Statistical Reports or Assessment Reports). Typically, results from tests in which mean control survival was less than
the required 85% (ST-D code) are considered invalid and are not used for assessment purposes.

7.4.5 Data Quality Reports

All QA/QC data and interpretive commentary associated with EMAP-E sediment toxicity testing will be
readily available in the database along with the results data, so that interested data users can perform their own
assessment of data usability. Upon completion of all data evaluation steps, a report summarizing the QA review of the
data package should be prepared, samples should be properly stored or disposed of, and laboratory data should be
archived both in a storage file and in the database. Reports documenting the results of the review of the data package
should summarize all conclusions concerning data acceptability and should note significant quality assurance problems
that were found. These reports are useful in providing data users with a written explanation of why certain data
qualifier codes were assigned and/or why some data were rejected. The following specific items should be addressed
in the QA report:

•	Summary of overall data quality, including a description of data that were qualified.

•	Brief descriptions of sample collection and testing methods.

•	Description of data reporting, including any corrections made for transcription or other reporting errors, and
description of data completeness relative to objectives stated in the QA plan.

The sediment toxicity testing QA reports will be included in the Program Quality Assurance Report and will
also become a permanent part of the database documentation (i.e., metadata).



SECTION 8

MACROBENTHIC COMMUNITY ASSESSMENT

8.1	OVERVIEW

This section presents EMAP-Virginian Province QA/QC protocols and requirements for macrobenthic
community assessment, from sample collection and laboratory analysis to validation of the resultant data and
construction of a benthic index. Replicate benthic samples are obtained at each station, representing the contents of
different individual grab samples. Each sample is processed individually in the laboratory to obtain an accurate
assessment of the number of individuals of each species present and their biomass (i.e., weight). This information is
then aggregated in various ways to construct a benthic index to discriminate between degraded and undegraded
estuarine conditions.

8.2	QUALITY CONTROL PROCEDURES: SAMPLE COLLECTION,
PRESERVATION AND HOLDING

Sediment samples for macrobenthic community assessments will be collected at each station using a Young-
modified Van Veen grab sampler. In order to be considered acceptable, each grab sample must be obtained following
the protocols specified in the Virginian Province Field Operations and Safety Manual (Reifsteck et al. 1993). In
particular, field personnel should be thoroughly trained in the proper techniques for sieving and sample preservation
(using a stained and buffered formalin solution). In addition, each sediment sample must be inspected carefully before
being accepted for benthic community assessment. Each of the following acceptability criteria must be satisfied (from
U. S. EPA 1991):

•	Sediment should not be extruded from the upper face of the sampler such that organisms may be lost.

•	Overlying water should be present (indicates minimal leakage).

•	The sediment surface should be relatively flat (indicates minimal disturbance or winnowing).

•	The entire surface of the sample should be included in the sampler.

•	The grab sampler must have penetrated the sediment to a minimum depth of 7 cm.

If a grab sample does not meet any one of these criteria, it should be rejected.
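
The acceptability screen above is mechanical enough to express in a few lines of code. The following Python sketch is illustrative only; the function and argument names are hypothetical, and the criteria follow the list above (from U.S. EPA 1991).

def grab_is_acceptable(no_extrusion, overlying_water, flat_surface,
                       full_surface, penetration_cm):
    """Return True only if a grab meets all five acceptability criteria."""
    return (no_extrusion             # no sediment lost from the upper face
            and overlying_water      # overlying water indicates minimal leakage
            and flat_surface         # flat surface indicates minimal disturbance
            and full_surface         # entire sample surface inside the sampler
            and penetration_cm >= 7.0)  # minimum penetration depth of 7 cm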



In the laboratory, stored samples must be easily retrieved and protected from environmental extremes.
Samples cannot be allowed to freeze and should be stored above 5 °C to prevent the formation of paraformaldehyde.
Temperatures greater than 30 °C should be avoided so as to retard evaporative losses. Stored and archived samples
should be checked once every three months for excessive evaporative losses due to loosely-fitting or cracked container
lids, or inadequately sealed jars. Exposure to direct sunlight should be minimized since long-term exposure can
degrade the vital stain rose bengal.

8.3 QUALITY CONTROL PROCEDURES: LABORATORY OPERATIONS

In the laboratory, QA/QC involves a series of check systems for organism sorting, counting and taxonomic
identification. These checks are described briefly in the following sections; more complete details can be found in the
EMAP-E Laboratory Methods Manual (U.S. EPA, in preparation).

8.3.1 Sorting

The quality control check on each technician's efficiency at sorting (i.e., separating organisms from sediment
and debris) consists of an independent resort by a second, experienced sorter. A minimum of 10% of all samples sorted
by each technician must be resorted (i.e., the sediment and debris remaining after the original sort is completely re-
examined) to monitor performance and thus provide feedback necessary to maintain acceptable standards. These
resorts should be conducted on a regular basis on at least one sample chosen at random from each batch of 10 samples
processed by a given sorter. Inexperienced sorters require a more intensive QC check system. It is recommended that
experienced sorters or taxonomists check each sample processed by inexperienced sorters until proficiency in organism
extraction is demonstrated. Once proficiency has been demonstrated, the checks may be performed at the required
frequency of one in every ten samples. Bound laboratory logbooks must be maintained and used to record the number
of samples processed by each technician, as well as the results of all sample re-sorts.



For each sample that is resorted, percent sorting efficiency should be calculated using the following formula:

Percent sorting efficiency = [# of organisms originally sorted /
(# of organisms originally sorted + additional # found in resort)] x 100

The results of sample resorts may require that certain actions be taken for specific technicians. If sorting
efficiency is greater than 95%, no action is required. If sorting efficiency is between 90% and 95%, problem areas
should be identified and the technician should be re-trained. Laboratory supervisors must be particularly sensitive to
systematic errors (e.g., consistent failure to extract specific taxonomic groups) which may suggest the need for further
training. Sorting efficiencies below 90% will require resorting and recounting of all samples in the associated batch
and continuous monitoring of that technician to improve efficiency.

If sorting efficiency is less than 90%, organisms found in the resort should be added to the original data sheet
and, if possible, to the appropriate vials for biomass determination. If sorting efficiency is 90% or greater, the QC
results should be recorded in the appropriate logbook, but the animals should be kept separate from the original sample
and not used for biomass determinations. Once all quality control criteria associated with the sample resort have been
met, the sample residue (e.g., sediment and debris) may be discarded.
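
Because the resort calculation and its action thresholds are fully specified above, they can be captured in a short script. The Python sketch below is a minimal illustration; the function names are hypothetical.

def sorting_efficiency(n_original, n_found_in_resort):
    """Percent sorting efficiency, per the formula given above."""
    return 100.0 * n_original / (n_original + n_found_in_resort)

def resort_action(efficiency):
    """Corrective action for a given percent sorting efficiency."""
    if efficiency > 95.0:
        return "no action required"
    if efficiency >= 90.0:
        return "identify problem areas and re-train the technician"
    # Below 90%: organisms found in the resort are also added to the
    # original data sheet and, if possible, to the biomass vials.
    return "resort and recount all samples in the batch; monitor technician"

For example, a technician who originally sorted 95 organisms from a sample in which the resort finds 5 more has an efficiency of 95% and would be re-trained.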

8.3.2 Species Identification and Enumeration

Only senior taxonomists are qualified to perform re-identification quality control checks. A minimum of 10%
of all samples (i.e., one sample chosen at random out of every batch of ten samples) processed by each taxonomic
technician must be checked by a second qualified taxonomist to verify the accuracy of species identification and
enumeration. This control check establishes the level of accuracy with which identification and counts are performed
and offers feedback to taxonomists in the laboratory so that a high standard of performance is maintained. Samples
should never be re-checked by the technician who originally processed the sample.



Ideally, each batch of ten samples processed by an individual taxonomic technician should be from a similar
habitat type (e.g., all oligohaline stations). The recheck of one out of the ten samples in a batch should be done
periodically and in a timely manner so that subsequent processing steps (e.g., biomass determinations) and data entry
may proceed. As each taxon is identified and counted during the recheck, the results should be compared to the
original data sheet. Discrepancies should be double-checked to be sure of correct final results. Following re-
identification, specimens should be returned to the original vials and set aside for biomass determination.

When the entire sample has been re-identified and recounted, the total number of errors should be computed.
The total number of errors will be based upon the number of misidentifications and miscounts. Numerically, percent
accuracy will be represented in the following manner:

Percent accuracy = [(Total # of organisms in QC recount - Total # of errors) /
(Total # of organisms in QC recount)] x 100

where the following three types of errors are included in the total # of errors:

1.)	Counting errors (for example, counting 11 individuals of a given species as 10).

2.)	Identification errors (for example, identifying Species X as Species Y, where both are present).

3.)	Unrecorded taxa errors (for example, not identifying Species X when it is present).

Each taxonomic technician must maintain an identification and enumeration accuracy of 90% or greater
(calculated using the above formula). If results fall below this level, the entire sample batch must be re-identified and
counted. If taxonomic efficiency is between 90% and 95%, the original technician should be advised and species
identifications reviewed. All changes in species identification should be recorded on the original data sheet (along with
the date and the initials of the person making the change) and these changes should be entered into the database.
However, the numerical count for each taxonomic group should not be corrected unless the overall accuracy for the
sample is below 90%. Additional details on this protocol are provided in the EMAP-E Laboratory Methods Manual
(U.S. EPA, in preparation). The results of all QC rechecks of species identification and enumeration should be
recorded in a timely manner in a separate logbook maintained for this purpose.
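
The accuracy calculation and its 90% control limit lend themselves to the same treatment. The following Python sketch is illustrative; the names are hypothetical and the error categories follow the three types listed above.

def percent_accuracy(n_recount, counting_errors, identification_errors,
                     unrecorded_taxa_errors):
    """Percent identification/enumeration accuracy, per the formula above."""
    total_errors = (counting_errors + identification_errors
                    + unrecorded_taxa_errors)
    return 100.0 * (n_recount - total_errors) / n_recount

# A sample batch must be re-identified and recounted whenever
# percent_accuracy(...) falls below the 90% control limit.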



Taxonomic identifications should be consistent within a given laboratory, and with the identifications of other
regional laboratories. Consistent identifications are achieved by implementing the procedures described above and by
maintaining informal, but constant, interaction among the taxonomists working on each major group. As organisms
are identified, a voucher specimen collection should be established. This collection should consist of representative
specimens of each species identified in samples from an individual Province in a given year. For some species, it may
be appropriate to include in the voucher specimen collection individuals sampled from different geographic locations
within the Province. At the end of the year, the voucher specimen collection should be sent to recognized experts for
verification of the laboratory's taxonomic identifications. The verified specimens should then be placed in a permanent
taxonomic reference collection. Continued collection of verified species does not require additional expert verification,
because the reference collection can be used to confirm the identification. In addition, the reference collection should
be used to train new taxonomists. Participation of the laboratory staff in a regional taxonomic standardization program
(if available) is recommended, to ensure regional consistency and accuracy of identification.

The laboratory is required to notify the Virginian Province Manager of any taxonomic identification errors
discovered by outside experts, as this may necessitate database corrections. Such corrections will be made only after
further consultation with the laboratory personnel and the outside expert(s) and will be supported by written
documentation which clearly explains the nature of and rationale for the changes.

All specimens in the reference collection should be preserved in 70% ethanol in labeled vials that are
segregated by species and sample. More than one specimen may be in each vial. The labels placed in these vials
should be made of waterproof, 100-percent rag paper and filled out in pencil. Paper with less than a
100-percent rag content, or that is not waterproofed, will disintegrate in the 70-percent alcohol mixture. It is important
to complete these labels, because future workers may not be familiar with the project, station locations, and other details
of the work in progress. In addition, the reverse side of the label should contain information about the confirmation
of the identification by experts in museums or other institutions (if appropriate). To reduce evaporation of alcohol, the
lids of vials and jars can be sealed with plastic tape wrapped in a clockwise direction. The species (and other
taxonomic designation) should be written clearly on the outside and on an internal label. Reference specimens should
be archived alphabetically within major taxonomic groups. A listing of each species name, the name and affiliation
of the person who verified the identification, the location of the individual specimen in the laboratory, the status of the
sample if it has been loaned to outside experts, and references to pertinent literature should be maintained by the
laboratory performing the identifications.

Reference collections are invaluable, and should be retained at the location where the identifications were
performed. In no instance should this collection be destroyed. A single person should be identified as curator of the
reference collection and should be responsible for its integrity. Its upkeep will require periodic checking to ensure that
alcohol levels are adequate. When refilling the jars, it is advisable to use full-strength alcohol (i.e., 95 percent),
because the alcohol in the 70-percent solution will tend to evaporate more rapidly than the water.

8.3.3 Biomass Measurements

Performance checks of the balance used for biomass determinations should be performed routinely using a
set of standard reference weights (ASTM Class 3, NIST Class S-1, or equivalents). In addition, a minimum of 10%
of all pans and crucibles in each batch processed by a given technician must be re-weighed by a second technician as
a continuous monitor on performance. Samples to be reweighed should be selected randomly from the sample batch;
the results of the reweigh should be compared against the original final weight recorded on the biomass data sheet.
Weighing efficiency should be calculated using the following formula:

Weighing efficiency = (Original final weight / Reweighed final weight) x 100

Based on results typically obtained by experienced technicians, if weighing efficiency is between 95% and
105%, the sample has met the acceptable quality control criteria and no action is necessary. If weighing efficiency is
either between 90% and 95% or between 105% and 110%, the sample has met acceptable criteria, but the technician who completed
the original weighing should be consulted and proper measurement practices reviewed. If the weighing efficiency is
less than 90% or greater than 110%, the sample has failed the quality control criteria and all samples in the associated
batch must be reweighed (following technician retraining and/or troubleshooting of laboratory equipment to determine
and eliminate the source(s) of the inconsistency). Corrections to the original data sheet should only be made in those
cases where weighing efficiency is less than 90% or greater than 110%. The results of all QC reweighings should be
recorded in a timely manner in a separate logbook or data sheet and maintained as part of the documentation associated
with the biomass data.
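
As with the sorting and taxonomy checks, the weighing-efficiency bands described above can be encoded directly. A minimal Python sketch follows, with hypothetical names.

def weighing_efficiency(original_weight, reweighed_weight):
    """Percent weighing efficiency, per the formula above."""
    return 100.0 * original_weight / reweighed_weight

def reweigh_action(efficiency):
    """QC action for a given percent weighing efficiency."""
    if 95.0 <= efficiency <= 105.0:
        return "acceptable; no action necessary"
    if 90.0 <= efficiency <= 110.0:
        return "acceptable; review measurement practices with the technician"
    return "failed; reweigh the entire batch and correct the original data sheet"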

8.4 QUALITY CONTROL PROCEDURES: INFORMATION MANAGEMENT

8.4.1	Sample Tracking

EMAP-E information management personnel have developed a comprehensive system for barcode labeling
of sample containers, recording sampling information in the field and tracking sample shipments. A complete
description of this system is provided in the EMAP-E Information Management Plan (Adams et al. 1993) and also
summarized in Section 11 of this plan. The laboratory responsible for processing the macrobenthic community samples
must designate a sample custodian, authorized to check the condition of and sign for the incoming field samples, obtain
documents of shipment and verify sample custody records. This individual is required, upon receipt of samples, to
record and transmit all tracking information to the Province Information Management center. The use of barcode
labels and readers provided by the Province will facilitate this process. In addition, the laboratory must have clearly-
defined custody procedures for sample handling, storage, and disbursement in the laboratory and must maintain
accurate and timely records of the location and status of all samples.

8.4.2	Record Keeping and Data Reporting Requirements

It is mandatory for the laboratory responsible for processing the macrobenthic community samples to maintain
thorough and complete records. All data generated in the laboratory should be recorded directly onto standardized data
forms, modeled after those presented in the EMAP Laboratory Methods Manual (U. S. EPA, in preparation). These
forms are prepared for each benthic sample prior to laboratory processing and are already filled out with species names,
the biomass group for each species and an 8-character code for each species consisting of the first four letters each of
the genus and species names. Preparation of data sheets prior to sample processing facilitates sample tracking, sample
processing, QA/QC procedures, and data entry and helps to minimize transcription and other errors. Data forms
should be designed so that all necessary information is recorded clearly and unambiguously; data should be recorded
in ink and signed by the responsible person. Data forms should be linked to specific samples using the bar-coded
sample numbers assigned by the Province Information Management team prior to field sampling. Completed data
sheets and QA/QC forms should be kept in bound notebooks arranged by type; these forms should be made available
to the Province Manager upon request and will be inspected for adequacy during QA audits.
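
The 8-character species code described above (the first four letters of the genus name followed by the first four letters of the species name) can be generated automatically when the data forms are prepared. A one-function Python sketch follows; the upper-case convention and the example species shown are assumptions for illustration, not Program specifications.

def species_code(genus, species):
    """8-character code from the first four letters of genus and species."""
    return (genus[:4] + species[:4]).upper()

# e.g., species_code("Nephtys", "incisa") would yield "NEPHINCI"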

Laboratory managers should verify that all specified QA/QC requirements are met for a given batch of
samples, or, if not, that specified corrective actions are implemented and problems resolved, before a technician is
permitted to proceed with sample processing. The laboratory must establish a comprehensive information
management system that allows responsible personnel to detect and eliminate transcription and/or calculation errors
prior to submission of the final data package in computer readable format. This might include, for example, data entry
procedures that involve double entry of information from the laboratory datasheets into separate databases and
subsequent comparison to ensure a high level of data transcription accuracy. Data transcription errors also can be
minimized through the use of computer data entry forms that duplicate or closely mirror the format of the hard copy
data sheets used in the laboratory. The laboratory's manager or QA Officer should perform manual checks on a random
subset of all transcribed data sheets (at least 10% of the total) to verify transcription accuracy.
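
The double-entry comparison described above can be implemented as a straightforward file comparison. The sketch below is a minimal illustration in Python; the file layout (one CSV file per entry pass, keyed by a sample ID column) and all names are hypothetical.

import csv

def compare_double_entry(file_a, file_b, key="sample_id"):
    """Return sample IDs whose records differ between two independent entries."""
    def load(path):
        with open(path, newline="") as f:
            return {row[key]: row for row in csv.DictReader(f)}
    entries_a, entries_b = load(file_a), load(file_b)
    # Any disagreement (or a record present in only one file) is flagged
    # for manual checking against the original laboratory data sheet.
    return sorted(sid for sid in set(entries_a) | set(entries_b)
                  if entries_a.get(sid) != entries_b.get(sid))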

The laboratory should report the results for all samples both in hard copy and in a computer-readable format
specified by the Province Information Manager. At a minimum, the following information should be included: EMAP
sample ID, laboratory sample ID (if applicable), numbers of individuals per sample for each species (i.e., abundance),
and biomass measurements for each biomass group expressed as dry weight to the nearest 0.1 mg. Tables summarizing the
results of QC checks (e.g., resorts, recounts, reidentifications and reweighings) must be included as part of the data
package, as well as a cover letter signed by the Laboratory Manager containing a narrative explanation of any problems
that may have influenced data quality.

8.4.3 Data Evaluation Procedures

It is the responsibility of the Province Manager to acknowledge initial receipt of the data package(s), verify
that the data evaluation procedures are completed, notify the laboratory of any additional information or corrective
actions deemed necessary as a result of the Province's data evaluation and, following satisfactory resolution of all
"corrective action" issues, take final action by notifying the laboratory in writing that the submitted results have been
officially accepted as a completed deliverable in fulfillment of contract requirements. It may be necessary or desirable
for the Province Manager to delegate the technical evaluation of the data to the QA Coordinator or other qualified staff
member. It is the responsibility of the Province QA Coordinator to monitor closely and formally document each step
in the data evaluation process as it is completed. This documentation should be in the form of a data evaluation
tracking form or checklist that is filled in as each step is completed. This checklist should be supplemented with
detailed memos to the project file outlining the concerns with data omissions, analysis problems, or descriptions of
questionable data identified by the laboratory.

Evaluation of the data package should commence as soon as possible following its receipt, since delays
increase the chance that information may be misplaced or forgotten. The first part of data evaluation is to verify that
all required information has been provided in the data package. First, Province personnel should verify that the
package contains the following: a cover letter in both electronic (i.e., computer text file) and paper formats (signed by
the laboratory manager), hard copies of all results (including tables summarizing the results of all QA/QC checks),
and accompanying computer diskettes. Second, the electronic data file(s) should be parsed into the EMAP Province
database (SAS datasets) to verify that the correct format has been supplied. Third, once the data have been transferred
to the Province database, automated checks should be run to verify that results have been reported for all expected
samples and that no errors occurred in converting the data into SAS datasets. This can be accomplished by visual
comparison of SAS printouts against printouts of the original data supplied by the laboratory. The printouts should
be used to verify the completeness of the data sets and to verify that values reported for all variables are correct.
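
The automated completeness check described above reduces to a comparison of expected and reported sample identifiers. A minimal Python sketch, with hypothetical names:

def check_completeness(expected_ids, reported_ids):
    """Flag samples missing from, or unexpected in, a submitted data package."""
    expected, reported = set(expected_ids), set(reported_ids)
    return {"missing": sorted(expected - reported),
            "unexpected": sorted(reported - expected)}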

The Province Manager should contact the laboratory and request any missing information as soon as possible
after receipt of the data package. If information was omitted because required analyses were not completed, the
laboratory should provide and implement a plan to correct the deficiency. This plan may include submittal of a revised
data package and possible reanalysis of samples.

Data validation, or the process of assessing data quality, should begin after Province personnel have
determined that the data package is complete. Data validation for the benthic community assessment should consist
of a thorough review of the summarized QA/QC data submitted as part of the data package to verify that specified
control limits for sample resorts, species recounts and reidentifications, and biomass reweighings were not exceeded,
or, if exceeded, that specified corrective actions were implemented and are explained in adequate detail in an
accompanying cover letter. If all specified control limits were met during sample processing and/or problems
adequately explained, the data can be accepted for use without qualification. To date, no data qualifier codes have been
needed for the Virginian Province benthic community data sets.

8.4.4 Data Quality Reports

All QA/QC data associated with the laboratory processing of benthic samples will be presented in Virginian
Province reports and publications along with the results data, so that interested data users can make their own
assessment of data usability. Upon completion of all data evaluation steps, a report summarizing the QA review of the
data package should be prepared, samples should be properly stored or disposed of, and laboratory data and associated
commentary should be archived both in a storage file and in the database. Reports documenting the results of the
review of the data package should summarize all conclusions concerning data acceptability and should note significant
quality assurance problems that were found. These reports are useful in providing data users with a written explanation
of why certain data qualifier codes were assigned and/or why some data were rejected. The following specific items
should be addressed in the QA report:

•	Summary of overall data quality, including a description of data that were qualified.

•	Brief descriptions of sample collection and testing methods.

•	Description of data reporting, including any corrections made for transcription or other reporting errors, and
description of data completeness relative to objectives stated in the QA plan.

The benthic community assessment QA data will be presented in the Quality Assurance section of the Province
Annual Statistical Summary and will also become a permanent part of the database documentation (i.e., metadata).

8.5 DEVELOPMENT AND VALIDATION OF THE BENTHIC INDEX

Benthic assemblages have many attributes that make them reliable and sensitive indicators of the ecological
condition of estuarine environments. Based on this supposition, the EMAP-E Program is attempting to construct a
benthic index which reliably discriminates between degraded and undegraded estuarine conditions. Construction of
a benthic index and subsequent validation of the index are ongoing processes both in the Virginian and Louisianian
Provinces. EMAP-E's first attempt at construction of a benthic index occurred in 1991 using benthic community
abundance and biomass data collected as part of the 1990 Virginian Province Demonstration Project. Detailed
descriptions of the methods used to construct the 1990 benthic index and subsequently to validate this index are
provided in the 1990 Demonstration Project Report (Weisberg et al. 1993) and in a series of Virginian Province
documents (Rosen 1993; U.S. EPA, in prep.). Briefly, the following major steps are followed in constructing and
validating the benthic index (an illustrative code sketch of steps 4 through 7 follows the numbered list):

1.)	Degraded and undegraded (i.e., reference) stations are identified on the basis of measured near-bottom
dissolved oxygen concentrations, sediment contaminant concentrations and sediment toxicity.

2.)	A list of "candidate" parameters is developed using the species abundance and biomass data. This list
includes metrics having ecological relevance (e.g., species diversity indices, numbers of suspension-feeding
organisms, numbers of deposit-feeding organisms, etc.) that potentially might be used to discriminate between
degraded and reference areas.

3.)	A value for each candidate parameter is calculated for each of the previously-identified degraded and reference
stations.

4.)	A series of t-tests is performed to reduce the list of candidate parameters to a manageable number from which
it is highly probable that a subset(s) can be identified to discriminate reliably between degraded and
undegraded areas.

5.)	The parameters resulting from step 4 are entered into a canonical discriminant analysis to develop a
discriminant function incorporating those parameters which best discriminate degraded and reference areas.
As part of this iterative process, the frequency with which reference sites are incorrectly classified as degraded
(i.e., false positives), and the frequency with which degraded sites are classified as reference areas (i.e., false
negatives), are calculated.

6.)	The index is scaled so that values range between 1 and 10 (for ease of understanding). The mean between
the highest value which reliably discriminates the degraded stations and the lowest value which reliably
discriminates the reference stations is defined as the critical value. A discriminant score is then calculated
for the a priori degraded and reference stations to determine rates of correct and incorrect classification. In
addition, a cross-validation procedure is performed in which each station is removed from the calibration data
set and used as a test case for validation.

7.)	The index is validated using an independent data set (e.g., a different set of degraded and reference stations
from the set used to construct the index) to determine rates of correct and incorrect classification (i.e.,
classification efficiency). If the rate of correct classification is unacceptably low (i.e., less than 80%), the
index is reconstructed and eventually re-validated beginning at the first step. The objective is to construct a
benthic index which consistently results in high rates of correct classification (i.e., greater than 80%).
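
The Program performs these analyses in SAS; purely as an illustration of steps 4 through 7, the Python sketch below uses scipy and scikit-learn stand-ins (a t-test screen followed by linear discriminant analysis with leave-one-out cross-validation). The data matrix X (stations by candidate metrics, as a numeric array) and the 0/1 degraded labels y are hypothetical.

from scipy.stats import ttest_ind
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score

def screen_candidates(X, y, alpha=0.05):
    """Step 4: retain metrics that separate degraded (1) from reference (0)."""
    return [j for j in range(X.shape[1])
            if ttest_ind(X[y == 1, j], X[y == 0, j]).pvalue < alpha]

def build_index(X, y):
    """Steps 5-6: fit the discriminant function; leave-one-out validate."""
    keep = screen_candidates(X, y)
    lda = LinearDiscriminantAnalysis()
    rate = cross_val_score(lda, X[:, keep], y, cv=LeaveOneOut()).mean()
    lda.fit(X[:, keep], y)
    return lda, keep, rate  # step 7: reconstruct the index if rate < 0.80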

From a quality assurance perspective, there are several important issues that must be addressed in the
development and application of the benthic index. These issues exist at several levels. At the most basic level,
construction of the benthic index can be viewed as a multistep process involving many data manipulations (i.e., several
levels of data aggregation and calculation of numerous parameters) followed by an iterative series of statistical tests.
At this level, a concomitant series of independent checks must be performed to verify that each of the many data
transformations and aggregations is performed without error. In addition, it is important to verify that certain data
aggregations and calculations which are "generic" in nature are performed in a manner that is consistent and
comparable between years and among different Provinces. Principal investigators, with the oversight of the Province
QA Coordinator, are responsible for developing a system of independent checks and for confirming and documenting
that they are implemented at each step in the construction of the benthic index. As a required part of this verification
procedure, the personnel directly involved in constructing the index must provide, for review, detailed written
documentation of each step, including documentation of computer programs that are used to manipulate data and
perform calculations.

It is also essential in construction of the benthic index that there is consistency between years and among
Provinces in the statistical methods employed. As part of the required series of checks prescribed above, there should
be an independent review of these procedures by one or more qualified individuals who are not directly involved in
constructing the index. There are two aspects to this review. First, there should be independent verification that the
correct statistical tests are being employed. Second, there should be verification that the chosen statistical tests are
being performed correctly. Again, it is the responsibility of the Province QA Coordinator to confirm and document
that these independent reviews are conducted.

Another potential QA/QC concern with respect to the benthic index is the classification of different species
into certain descriptive categories based on their presumed ecological niche or behavioral characteristics (e.g., "deposit
feeder", "suspensionfeeder", "equilibrium species", "opportunistic species", etc.). This categorization is accomplished
using information from the scientific literature supplemented by expert opinion. Because reliance on expert opinion
introduces a certain level of subjectivity into the process of constructing a benthic index, it is important that adequate
documentation be developed to justify the species classifications used at any given time. Personnel responsible for
constructing the index should enlist the help of one, or, preferably, several qualified benthic ecologists in classifying
species and preparing this documentation.

On another level, a primary concern regarding the benthic index is how well it fulfills the objective of
discriminating between degraded and undegraded estuarine conditions. This concern will be addressed on a continuous
basis, using the cross-validation and year-to-year independent validation steps (steps 6 and 7 above) which are integral
aspects of the ongoing iterative procedures involved in constructing an index. In future development of the index,
additional sites will be added to the calibration data set so that it includes the full range of environmental habitats and
stressors present. Furthermore, as more is learned about other measures that are effective for discriminating sites of
differing environmental quality, they can be incorporated into the calibrations. The flexibility of the index development
process will allow these additional selected measures to be incorporated so that eventually, a consistently high level
of classification efficiency will be achieved.



SECTION 9

MEASUREMENTS OF FISH COMMUNITY STRUCTURE

AND PATHOLOGY

9.1	OVERVIEW

This section presents EMAP-Virginian Province QA/QC protocols and requirements for fish community
structure analyses, from sample collection and laboratory analysis to final validation of the resultant data. Collection
and analysis methods are documented in the 1993 Field Operations and Safety Manual (Reifsteck et al. 1993). Data
on species identification, enumeration, and length measurements are generated by the field crews, whereas pathology
data result from laboratory analyses.

Field crews are expected to conduct a "standard" 10-minute trawl at all stations. The contents of the net are
examined and fish identified to species, measured, and examined for evidence of gross external pathologies. Those
fish suspected of having a pathology are preserved in a fixative and shipped to a laboratory-based pathologist for further
examination.

9.2	QUALITY CONTROL PROCEDURES: FIELD OPERATIONS

9.2.1 Trawling

Fish community structure data (species identification, enumeration, and length) are significantly influenced
by the collection methods. It is therefore critical that strict adherence to prescribed sampling protocols be maintained.
Factors influencing the catch include the gear itself, how the trawl is fished, trawl duration, and trawl speed. All crews must be provided
with "standard" nets to assure comparability of gear, and the importance of keeping the trawl duration and speed within
established limits should be stressed during training. During sampling, crews must record "speed over bottom" and
trawl duration on the fish trawl datasheet. In addition, the computerized data acquisition system contains a clock
which automatically monitors trawl time and records the trawl duration in an electronic data file. As part of the crew
chief's daily review of the electronic data, he/she must compare all data in the electronic file with those on the hard-
copy datasheets. Any discrepancies must be investigated and corrected.

Adherence to collection methodology will be monitored during initial certification of the field crew and during
all subsequent audits and field inspections conducted by senior Program personnel throughout the sampling season.

9.2.2 Species Identification, Enumeration, and Length Measurements

Fish species identification, enumeration and individual lengths must be determined in the field following
protocols presented in the Virginian Province Field Operations and Safety Manual (Reifsteck et al. 1993). The quality
of fish identifications, enumerations and length measurements is assured principally through rigorous training and
certification of field personnel prior to sampling. Qualified taxonomists will provide independent confirmation of all
fish identifications, enumerations and length measurements made by crew members during laboratory training sessions.
Fish identifications, enumerations and length measurements also will be confirmed by the QA Coordinator, Province
Manager, or their designee(s) during field visits. In addition, each field crew is required to save two "voucher"
specimens of each species identified in the field. These voucher specimens should be preserved in fixative and sent
back to the Field Operations Center prior to the completion of each "work shift" throughout the field season. A
qualified fish taxonomist must verify the species identifications and provide immediate feedback to the field crews
whenever errors are found. All erroneous identifications for a given field crew should be corrected in the database,
and the crew informed of the taxonomist's findings prior to their returning to the field. Preserved voucher fish will
be saved to provide a reference collection for use in subsequent years' training.

The accuracy of length measurements and individual counts will be checked during all QA audits and field
visits conducted by senior Program personnel. To maintain a consistent level of field crew performance, the EMAP-E
program has established an overall accuracy goal of 90% (i.e., less than 10% errors) for all fish identifications,
enumerations and length measurements in a given sampling season. If this goal is not met, corrective actions will
include increased emphasis on training and more rigorous testing of field crews prior to the next year's sampling
season.



9.3 QUALITY CONTROL PROCEDURES: GROSS EXTERNAL PATHOLOGY
AND HISTOPATHOLOGY

Fish collected in standard trawls must be examined by the field crew for evidence of gross external pathologies
(lumps, growths, ulcers, and fin erosion) according to the protocols outlined in the Virginian Province Field Operations
and Safety Manual (Reifsteck et al. 1993). As with fish identification and enumeration, the quality of gross pathology
determinations can be assured principally through rigorous training and certification of field personnel prior to
sampling. Qualified pathologists will be responsible for planning and overseeing all crew training for this indicator.
Because of the potential difficulty in the proper identification of pathologies by inexperienced personnel, all definitive
examinations will be conducted by a qualified pathologist. Field crews will be instructed to observe all fish and
preserve any suspected of having one of the four pathologies listed above. These will be returned to the laboratory with
a sample ID tag and the suspected pathology noted.

Upon receipt of a sample at the laboratory, the pathologist will examine these fish and provide the QA
Coordinator with the results. When there is disagreement between the field observation and the pathologist's
interpretation, a second pathologist will be consulted to verify the results from the primary laboratory.

Crews also will be required to preserve "pathology-free" fish collected at selected stations for examination by
the pathologist to determine the potential error rate of "false negatives". Stations where these reference fish are to be
collected are listed in the Field Operations and Safety Manual (Reifsteck et al. 1993). Fish collected for
histopathological examination must be preserved according to the protocols described in the Field Operations and
Safety Manual. Failure to follow these protocols will result in inadequate penetration of the fixative into the internal
organs, rendering the samples useless.

A series of internal and external laboratory QC checks should be employed to provide verification of the fish
histopathology identifications. In laboratories having multiple pathologists, all cases bearing significant lesions should
be examined and verified by the senior pathologist. At least 5% of the slides read by one pathologist should be selected
at random and read by a second pathologist without knowledge of the diagnoses made by the initial reader. For the
external QC check, at least 5% of the slides should be submitted for independent diagnosis to a pathologist not involved
with the laboratory. These slides should represent the range of pathological conditions found during the study, and
the external pathologist should not be aware of the diagnoses made by the laboratory personnel.

Each laboratory also should maintain a reference collection of slides that represent every type of pathological
condition identified in the EMAP-E fish. Each of these slides should be verified by an external pathologist having
experience with the species in question. The reference slide collection then can be used to verify the diagnoses made
in future years to ensure intralaboratory consistency. The reference slides also can be compared with those of other
laboratories to ensure interlaboratory consistency. A reference collection of photographs also will be made for training
purposes.

9.4 QUALITY CONTROL PROCEDURES: INFORMATION MANAGEMENT

9.4.1 Sample Tracking

EMAP-E information management personnel have developed a comprehensive system for barcode labeling
of fish specimens, recording sampling information in the field, and tracking sample shipments. A complete description
of this system is provided in the EMAP-E Information Management Plan (Adams et al. 1993) and is also summarized
in Section 11 of this plan. Field crews must carefully and thoroughly complete all shipment datasheets and transmit
this information to the Field Operations Center during the next scheduled electronic transfer of data.

Each analytical laboratory receiving fish for verification of species identifications, gross pathology or further
histopathological examination must designate a sample custodian who is authorized to check the condition of and sign
for the incoming samples, obtain documents of shipment, and verify sample custody records. This individual is
required, upon receipt of fish samples, to record and transmit all tracking information to the Province Information
Management Center. The use of barcode labels and readers provided by the Province will facilitate this process. There
must be clearly-defined custody procedures for handling, storage, and disbursement of the fish samples in the
laboratory.



9.4.2	Data Reporting Requirements

All field data must be entered into the field computer within one day of collection. Crew chiefs must review
all data prior to electronic transfer to the Field Operations Center the following evening. Hard-copy original datasheets
must be returned to the Field Operations Center no later than the end of the crew's work shift.

Following laboratory examination of the fish, only data which have met QA/QC requirements should be
submitted to EMAP-E. Each data package submitted by the laboratory should consist of the following:

•	A cover letter providing a brief description of the procedures and instrumentation used for verification of
species identifications, gross pathology or further histopathological examination, as well as a narrative
explanation of any problems encountered or failure(s) to meet required quality control limits. A copy of the
cover letter in electronic format (i.e., computer-readable text file) must also be submitted.

•	Tabulated results in hard-copy form, including sample ID, external pathologies (only lumps, growths, ulcers,
fin erosion), and internal pathologies noted.

•	Tabulated results in computer-readable form (e.g., diskette) included in the same shipment as the hard-copy data,
but packaged in a diskette mailer to prevent damage. Data must be provided in a format acceptable to the Province
Information Manager for transfer to the Province database.

•	All QA/QC data (e.g., results of internal and external QC checks) must be submitted by the laboratory as part of
the data package, but should be included in separate tables and files from the actual data.

9.4.3	Data Evaluation Procedures

It is the responsibility of the Province Manager to acknowledge initial receipt of the data package(s), verify
that the four data evaluation steps identified in the following paragraph are completed, notify the analytical laboratory
(or contract field coordinator) of any additional information or corrective actions deemed necessary as a result of the
Province's data evaluation, and, following satisfactory resolution of all "corrective action" issues, take final action by
notifying the laboratory or field operations coordinator in writing that the submitted results have been officially
accepted as a completed deliverable in fulfillment of contract requirements. It may be necessary or desirable for
additional personnel (e.g., the Province QA Coordinator) to assist the Province Manager in the technical evaluation
of the submitted data packages. While the Province Manager has ultimate responsibility for maintaining official
contact with the analytical laboratory and verifying that the data evaluation process is completed, it is the responsibility
of the Province QA Coordinator to closely monitor and formally document each step in the process as it is completed.
This documentation should be in the form of a data evaluation tracking form or checklist that is filled in as each step
is completed. This checklist should be supplemented with detailed memos to the project file (both hardcopy and in
electronic format) outlining the concerns with data omissions, analysis problems, or descriptions of questionable data
identified by the laboratory.

Evaluation of the data package should commence as soon as possible following its receipt, since delays
increase the chance that information may be misplaced or forgotten. The following steps are to be followed and
documented in evaluating EMAP-E data:

1.	Checking data completeness (verification)

2.	Assessing data quality (validation)

3.	Assigning data qualifier codes

4.	Taking final actions

Checking Data Completeness

The first part of data evaluation is to verify that all required information has been provided in the data
package. For field-generated data (i.e., fish identification, enumeration, and length measurements), the crew chief
must review all data files to assure they are complete and correct prior to uploading the data to the Field Operations
Center. Once the data are received at the FOC, the Virginian Province data librarian should perform a 100%
comparison of the electronic files to the original hard-copy datasheets, followed by an additional 10% check. These
steps serve not only to ensure that all data contained on the datasheets are present in the database, but also as a check
against transcription errors.

EMAP-E laboratories are expected to submit data which have already been tabulated and 100% checked for
accuracy. The submitted data will be compared to the data expected based on field observations (i.e., there should be
gross external pathology data for each fish sent in for examination). The Province Manager should contact the
laboratory and request any missing information as soon as possible after receipt of the data package. If information
was omitted because required analyses were not completed, the laboratory should provide and implement a plan to
correct the deficiency. This plan may include submittal of a revised data package and possible reanalysis of samples.

Assessing Data Quality

Data validation, or the process of assessing data quality, can begin after Province personnel have verified that
the data package is complete and free of transcription errors. For fish community data, the first step in validation will
be automatic range checks. For example, all occurrences of each species are compared to the minimum and maximum
latitudes, the salinity tolerances, and the maximum length for that species. These ranges will be determined from well-
established sources. If a species is reported from a location where it would not be expected (based on salinity and
latitude), or the reported length exceeds the maximum length expected for that species, the record will be flagged for
further investigation. This can include checking the record against the original datasheet, checking the taxonomy QA
results if applicable, or questioning the crew chief. If no explanation can be identified, the original record will remain
unchanged. An additional verification step that must be performed is a check on the trawl duration and speed. A trawl
duration between 8 and 12 minutes and a speed between 1 and 3 knots are considered acceptable. Data collected from
any trawl that did not meet these acceptability criteria will be rejected.
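
A hedged sketch of these range and trawl-acceptability checks in Python follows; the reference-range table, record fields, and function names are all hypothetical, while the trawl limits are those stated above.

def flag_fish_record(rec, ranges):
    """Return reasons a fish record should be flagged for investigation."""
    r = ranges[rec["species"]]
    flags = []
    if not r["min_lat"] <= rec["latitude"] <= r["max_lat"]:
        flags.append("latitude outside expected range")
    if not r["min_sal"] <= rec["salinity"] <= r["max_sal"]:
        flags.append("salinity outside known tolerance")
    if rec["length_mm"] > r["max_length_mm"]:
        flags.append("length exceeds species maximum")
    return flags  # flagged records are checked against the original datasheet

def trawl_is_acceptable(duration_min, speed_kt):
    """Reject data from trawls outside 8-12 minutes or 1-3 knots."""
    return 8.0 <= duration_min <= 12.0 and 1.0 <= speed_kt <= 3.0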

During the fish data validation process, the results of the laboratory confirmation of the fish species and
pathology identifications will be reviewed. These results will be used to improve training in subsequent years, by
increasing emphasis on those species and/or pathologies which were consistently misidentified by the field personnel.
The fish species which were misidentified will be corrected in the database with the approval of the Province QA
Coordinator. Data qualifier codes also will be assigned to the fish pathology results, as described in the following
section.

Assigning Data Qualifier Codes

The independent laboratory confirmation of each pathology noted by the field crews will be used to qualify
these results in the database. The database codes are self-explanatory (Table 9-1) and are assigned based on whether
or not a given pathology was reviewed and confirmed by the laboratory experts; an illustrative code-assignment sketch follows the table.



Table 9-1. Qualifier codes assigned to fish pathology data.

Code	Definition

FP-A	LUMP not observed in field and not observed by laboratory pathologist

FP-B	LUMP not observed in field but was observed by laboratory pathologist

FP-C	LUMP observed in field but not confirmed by laboratory pathologist

FP-D	LUMP observed in field and confirmed by laboratory pathologist

FP-E	LUMP observed in field, but fish not examined by laboratory pathologist

FP-F	LUMP not observed in field, fish not examined by laboratory pathologist

FP-G	GROWTH not observed in field and not observed by laboratory pathologist

FP-H	GROWTH not observed in field but was observed by laboratory pathologist

FP-I	GROWTH observed in field but not confirmed by laboratory pathologist

FP-J	GROWTH observed in field and confirmed by laboratory pathologist

FP-K	GROWTH observed in field, but fish not examined by laboratory pathologist

FP-L	GROWTH not observed in field, fish not examined by laboratory pathologist

FP-M	ULCER not observed in field and not observed by laboratory pathologist

FP-N	ULCER not observed in field but was observed by laboratory pathologist

FP-O	ULCER observed in field but not confirmed by laboratory pathologist

FP-P	ULCER observed in field and confirmed by laboratory pathologist

FP-Q	ULCER observed in field, but fish not examined by laboratory pathologist

FP-R	ULCER not observed in field, fish not examined by laboratory pathologist

FP-S	FIN EROSION not observed in field and not observed by laboratory pathologist

FP-T	FIN EROSION not observed in field but was observed by laboratory pathologist

FP-U	FIN EROSION observed in field but not confirmed by laboratory pathologist

FP-V	FIN EROSION observed in field and confirmed by laboratory pathologist

FP-W	FIN EROSION observed in field, but fish not examined by laboratory pathologist

FP-X	FIN EROSION not observed in field, fish not examined by laboratory pathologist

FP-Y	Fish not examined for gross external pathology
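
Because Table 9-1 follows a regular pattern (six codes per pathology, in a fixed order of field and laboratory outcomes), code assignment can be automated. The Python sketch below is illustrative only; function and argument names are hypothetical. The FP-Y case (fish not examined for gross external pathology at all) would be handled separately.

PATHOLOGIES = ("LUMP", "GROWTH", "ULCER", "FIN EROSION")

def fp_code(pathology, field_observed, lab_examined, lab_confirmed=False):
    """Return the Table 9-1 qualifier code for one pathology on one fish."""
    if not lab_examined:
        idx = 4 if field_observed else 5
    elif field_observed:
        idx = 3 if lab_confirmed else 2
    else:
        idx = 1 if lab_confirmed else 0
    offset = 6 * PATHOLOGIES.index(pathology)
    return "FP-" + "ABCDEFGHIJKLMNOPQRSTUVWX"[offset + idx]

# e.g., fp_code("ULCER", field_observed=True, lab_examined=True,
#               lab_confirmed=True) returns "FP-P"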



Taking Final Action

Upon completion of the above steps, a report summarizing the QA review of the data package should be
prepared, samples should be properly stored or disposed of, and laboratory data should be archived both in a storage
file and in the database. Technical interpretation of the data begins after the QA review has been completed.

Reports documenting the results of the QA review of a data package should summarize all conclusions
concerning data acceptability and should note significant quality assurance problems that were found. These reports
are useful in providing data users with a written record on data concerns and a documented rationale for why certain
data were accepted as estimates or were rejected. The following specific items should be addressed in the QA report:

•	Summary of overall data quality, including a description of data that were qualified.

•	Description of data reporting, including any corrections made for transcription or other reporting errors, and
description of data completeness relative to objectives stated in the QA Plan.

Fish species identification, enumeration, length measurement and pathology QA results will be included in
the annual Program Quality Assurance Report and will also become a permanent part of the database documentation
(i.e., the metadata). The QA/QC data collected by the Program will be used not only to assess the quality of individual
measurements, but ultimately to assess the comparability of data generated by multiple laboratories and field crews.



SECTION 10
WATER QUALITY MEASUREMENTS

10.1	OVERVIEW

This section presents EMAP-Virginian Province QA/QC protocols and requirements for water quality
measurements, from collection to final validation. Collection and analysis methods are documented in the 1993 Field
Operations and Safety Manual (Reifsteck et al. 1993). With the exception of Total Suspended Solids, all data are
generated by the field crews.

Characterization of the water column is accomplished using the Seabird SBE 25 Sealogger® CTD to obtain
point-in-time, vertical profiles of temperature, salinity, dissolved oxygen, pH, light transmission, chlorophyll a
fluorescence, and photosynthetically active radiation (PAR). A hand-held dissolved oxygen meter manufactured by
Yellow Springs Instruments (YSI®) is used to make an additional point measurement of surface and near-bottom
dissolved oxygen as a check on, and back-up to, the Seabird CTD measurement. A single surface water sample is
obtained at each station for the determination of Total Suspended Solids (TSS) concentration.

Quality control of the water column measurements made with these electronic instruments has several aspects:
calibration, QC checks on the calibration, QC checks prior to deployment, and systematic review of the resultant data
and QC results. Calibration checks are conducted after the initial calibration and at regular intervals to determine the
need for recalibration.

10.2	QUALITY CONTROL PROCEDURES: FIELD MEASUREMENTS

The Seabird CTD is a sophisticated instrument designed to collect high-quality data for the parameters
measured. Intensive training of all personnel expected to operate this instrument is necessary to assure reliable operation
and acceptable data. Crew chiefs and any other potential operators must be certified in the use of this instrument
during training.


The 1990 EMAP-NC Demonstration Project in the Virginian Province shed light on several CTD deployment
problems that had an adverse effect on the performance of the dissolved oxygen sensor. The most commonly
encountered problems were: 1.) air bubbles trapped in the dissolved oxygen plumbing loop, 2.) mud being drawn
through the conductivity cell and into the plumbing loop upon contact of the instrument with the bottom, and 3.)
insufficient thermal equilibration time of the dissolved oxygen sensor. Deployment procedures were modified to
eliminate these problems. Protocols specified in the 1993 Field Operations and Safety Manual (Reifsteck et al. 1993)
must be followed to assure data quality and equipment maintenance.

A YSI Model 58 dissolved oxygen meter will be used to measure the dissolved oxygen concentration at the
surface and in water collected in a Go-Flo® bottle from approximately one meter off the bottom at each station at the
same time the Seabird CTD is deployed. Where possible, the Go-Flo® should be attached to the CTD to assure
comparability. Comparison of the YSI and CTD dissolved oxygen measurements provides a check on the operation
of the CTD dissolved oxygen sensor during deployment. In addition, the YSI meter is used for side-by-side QC checks
of the Seabird CTDs (once each week).

All water quality measurement activities will be monitored during field crew certification and during QA
audits or inspections conducted by senior Program personnel.

10.2.1 Instrument Calibration

Dissolved oxygen and pH sensors on the CTD must be calibrated under controlled laboratory conditions by
trained technicians following procedures specified by the manufacturer. For the dissolved oxygen sensor, a two point
calibration procedure utilizing a zero adjustment (sodium sulfite solution or nitrogen gas) and a slope adjustment with
air-saturated freshwater is employed. The pH probe is calibrated at three points using pH 4, 7 and 10 standard buffer
solutions.

Calibrations will be conducted prior to the field sampling and as needed throughout the field season.
Immediately following calibration, the dissolved oxygen and pH sensors should be checked for accuracy using Winkler
titrations and pH standards, respectively. The instruments' fluorometers are calibrated against algal cultures of known
chlorophyll a concentration to assure comparability among units and between years. Temperature, conductivity, light
transmission, and PAR sensors are calibrated by their manufacturers. If calibration checks of these sensors reveal a
problem (see the following section), the instrument should be returned to the manufacturer for troubleshooting and/or
recalibration.

The YSI dissolved oxygen meters must be calibrated immediately prior to use at each station using the water-
saturated air calibration procedure recommended by the manufacturer.

10.2.2 Instrument Calibration Checks

Performance checks should be conducted on the CTD units at the beginning and end of the field season. This
procedure involves setting up the four CTD units to simultaneously log data in a well-mixed seawater tank. Overall
variability among instruments should be assessed by comparing the simultaneous readings in the tank. The accuracy
of the dissolved oxygen measurements can be assessed by comparing the CTD readings against Winkler titration
values. The accuracy of the CTD salinity (conductivity) measurements is assessed through comparison with readings
obtained with a laboratory salinometer (Guildline AutoSal Model 8400) calibrated with IAPSO Standard Seawater
(a.k.a. "Copenhagen" water). The accuracy of the CTD temperature measurements is assessed by comparisons against
a NIST-certified thermometer. The instruments should then be removed from the tank and further tested: the
transmissometer and fluorometer voltage endpoints (open and blocked light path) are recorded as described by the
manufacturer, and the pH sensor readings are checked using three standard pH buffer solutions (pH 4, 7 and 10).

Field QC checks of the CTD temperature, salinity, dissolved oxygen and pH readings must be conducted at
least once each week. For this weekly check, real-time CTD readings from just below the surface should be compared
to simultaneous measurements with a thermometer, refractometer, and calibrated YSI dissolved oxygen meter. The
pH readings are checked using the pH 10 standard buffer solution. If maximum acceptable differences are exceeded
(Table 10-1), the CTD instrument must be checked thoroughly and a determination made of the need for recalibration.
If it is determined that a probe is malfunctioning and/or requires recalibration, the instrument must be sent back to the
Virginian Province Field Operations Center and replaced with a back-up unit.
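
These weekly comparisons reduce to a simple tolerance test. The following sketch is illustrative only (the Program's checks are performed by the field crews and reviewed at the Field Operations Center); the function and variable names are hypothetical, and the limits are those of Table 10-1.

    # Illustrative sketch of the weekly CTD field QC check (Table 10-1 limits).
    MAX_DIFF = {"temperature": 2.0,  # deg C, checked against thermometer
                "salinity": 3.0,     # ppt, checked against refractometer
                "do": 0.5,           # mg/L, checked against YSI meter
                "ph": 0.5}           # pH units, checked against pH 10 buffer

    def check_ctd(ctd, reference):
        """Return the parameters whose CTD-vs-reference difference
        exceeds the Table 10-1 maximum acceptable difference."""
        return [p for p, limit in MAX_DIFF.items()
                if abs(ctd[p] - reference[p]) > limit]

    failed = check_ctd(
        {"temperature": 21.3, "salinity": 31.5, "do": 7.9, "ph": 8.1},
        {"temperature": 21.0, "salinity": 27.0, "do": 7.6, "ph": 8.0})
    if failed:
        print("Check instrument; consider recalibration:", failed)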


Calibration QC checks of the YSI meter should be conducted at weekly intervals in the mobile laboratories.
Following calibration, the YSI probe should be immersed into a bucket of air-saturated water (bubbled at least 12
hours) and allowed to stabilize. The dissolved oxygen of the water bath is determined by Winkler titration and
compared to the YSI reading. The temperature of the water bath should be measured with an alcohol thermometer and
compared to the YSI temperature reading. If the dissolved oxygen or temperature difference exceeds the specified
limits (Table 10-1), the instrument must be checked thoroughly and a determination made of the need for recalibration
or oxygen sensor replacement.

A Hach digital titrator is employed for performing Winkler titrations. This method employs a concentrated
solution of sodium thiosulfate as the titrant; therefore, small errors in the delivery of the titrant can have a significant
effect on the final DO value calculated. All personnel conducting titrations must be thoroughly trained in the operation
of the titrator and must demonstrate proficiency during training. In addition, each time a set of samples is titrated, the
titrator and thiosulfate cartridge must be standardized against an iodide-iodate solution.
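
The standardization can be thought of as deriving a correction factor that scales each calculated DO value. The sketch below is a hypothetical illustration of that arithmetic, not the Hach procedure itself; the digit counts and DO value are invented for the example.

    # Hypothetical sketch: derive a thiosulfate correction factor from the
    # iodide-iodate standardization and apply it to a calculated DO value.
    def thiosulfate_factor(expected_digits, delivered_digits):
        """Ratio of the titrant volume expected for the iodide-iodate
        standard to the volume actually delivered (titrator digits)."""
        return expected_digits / delivered_digits

    def corrected_do(raw_do_mg_per_l, factor):
        # A Winkler DO value scales linearly with the titrant factor.
        return raw_do_mg_per_l * factor

    f = thiosulfate_factor(expected_digits=200.0, delivered_digits=204.0)
    print(round(corrected_do(8.20, f), 2))  # -> 8.04 mg/L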

Table 10-1. Maximum Acceptable Differences for Instrument Field Calibration Checks

Instrument	Frequency of Check	Parameter	Checked Against	Maximum Acceptable Difference

Seabird CTD	Once each week	Temperature	Thermometer	± 2 °C
		Salinity	Refractometer	± 3 ppt
		DO	YSI meter	± 0.5 mg/L
		pH	pH buffer solution	± 0.5 units

YSI D.O. Meter	Once each week	D.O.	Winkler titration	± 0.5 mg/L
		Temperature	Thermometer	± 2 °C


10.2.3 Instrument Deployment Checks

Each CTD cast data file must be reviewed in the field immediately for evidence of deployment problems. A
standard check on the data file should consist of a comparison of the downcast versus the upcast for all parameters,
with particular attention to dissolved oxygen, salinity and light transmission. The dissolved oxygen profile should be
further evaluated by comparing the surface dissolved oxygen values at the beginning and end of the cast, and by
comparing the surface and bottom dissolved oxygen values to those recorded by the hand-held YSI meter. If either of
these dissolved oxygen differences exceeds 0.5 mg/L, the field crew should recalibrate the YSI and redeploy the CTD
to obtain a second profile. If the deployment QC criteria are still not met on the second CTD profile, the field crew
should still save the data, but the dissolved oxygen values used in the assessment of water quality will be those from
the YSI. The field crew should determine the cause of the discrepancy and either make the necessary repairs in the
field or ship the instrument back to the field operations center for servicing. Salinity and temperature should also be
checked at the surface and bottom using a refractometer and thermometer, respectively.
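
The dissolved oxygen comparisons in this check can be automated as below. This is a minimal sketch, assuming one surface DO pair and one bottom DO pair per cast; all names and values are illustrative.

    # Illustrative sketch of the CTD deployment DO checks (0.5 mg/L criteria).
    DO_LIMIT = 0.5  # mg/L, from the deployment QC criteria above

    def deployment_do_ok(surface_do_start, surface_do_end,
                         ctd_surface_do, ysi_surface_do,
                         ctd_bottom_do, ysi_bottom_do):
        """True when all three DO comparisons are within 0.5 mg/L."""
        diffs = (abs(surface_do_start - surface_do_end),
                 abs(ctd_surface_do - ysi_surface_do),
                 abs(ctd_bottom_do - ysi_bottom_do))
        return all(d <= DO_LIMIT for d in diffs)

    if not deployment_do_ok(8.1, 7.9, 8.1, 8.0, 5.2, 5.9):
        print("Recalibrate the YSI and redeploy the CTD")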

10.3 QUALITY CONTROL PROCEDURES: TOTAL SUSPENDED SOLIDS

A surface water sample should be obtained at each station to determine the concentration of Total Suspended
Solids. This sample should be collected during the surface soak of the CTD according to the protocols outlined in the
Field Operations and Safety Manual (Reifsteck et al. 1993). This sample should be placed on ice immediately
following collection and shipped at the same temperature (4 °C).

Upon receipt of the sample at the laboratory, the sample custodian must log in the sample and assure proper
storage (4 °C). Strict adherence to the protocols outlined in the EMAP-E Laboratory Methods Manual (U.S. EPA
1992, in revision) is mandatory. Samples must be stored for no longer than three months prior to analysis.

The analytical balance and drying oven used in the analysis should be calibrated at least monthly. Quality
assurance for the TSS analysis procedures should be accomplished primarily by analyzing a randomly selected subset
of 10% of the samples in each batch in duplicate, as described in full detail in the EMAP-E Laboratory Methods
Manual (U.S. EPA 1992, in revision). If the relative percent difference (RPD) between the duplicate values is greater
than 10%, then a third analysis must be completed by a different technician. The values closest to the third value
should be entered into the database. In addition, all the other samples in the same batch must be reanalyzed, and the
laboratory protocol and/or technician's practices should be reviewed and corrected to bring the measurement error
under control. If the RPD is less than 10%, the original value should not be changed and the analysis process
can be considered in control. The RPD is calculated as follows:

Relative Percent Difference (RPD) = [(C1 - C2) / ((C1 + C2) / 2)] x 100

where:	C1 is the larger of the duplicate results for a measurement
	C2 is the smaller of the duplicate results for a measurement
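
For reference, the RPD computation and the 10% duplicate check translate directly into code. The sketch below is illustrative, with invented duplicate values.

    # Sketch of the duplicate TSS check: RPD > 10% triggers a third analysis.
    def rpd(c1, c2):
        """Relative percent difference of duplicate results."""
        hi, lo = max(c1, c2), min(c1, c2)
        return (hi - lo) / ((hi + lo) / 2.0) * 100.0

    dup1, dup2 = 24.0, 21.0  # mg/L TSS, invented duplicate results
    if rpd(dup1, dup2) > 10.0:  # 13.3% here
        print("Third analysis by a different technician is required")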

10.4 QUALITY CONTROL PROCEDURES: INFORMATION MANAGEMENT

10.4.1 Sample Tracking

EMAP-E information management personnel have developed a comprehensive system for barcode labeling of
sample containers, recording sampling information in the field, and tracking sample shipments. A complete
description of this system is provided in the EMAP-E Information Management Plan (Adams et al. 1993) and is also
summarized in Section 11 of this plan. Field crews must carefully and thoroughly complete all shipment datasheets
and transmit this information to the Field Operations Center during the electronic transfer of data.

The analytical laboratory responsible for processing the TSS samples must designate a sample custodian who
is authorized to check the condition of and sign for incoming field samples, obtain documents of shipment, and verify
sample custody records. This individual is required, upon receipt of samples, to record and transmit all tracking
information to the Province Information Management Center. The use of barcode labels and readers provided by the
Province will facilitate this process. There must be clearly-defined custody procedures for sample handling, storage,
and disbursement in the laboratory.


10.4.2 Data Reporting Requirements

Data for the electronic water quality measurements exist in the form of computer files which are entered into
the field computer automatically at the time of collection. Crew chiefs must review these data prior to the electronic
transfer of the files to the Field Operations Center. Hard-copy original datasheets must be returned to the Field
Operations Center no later than the end of the crew's work shift.

For the TSS measurements, only laboratory data which have met QA requirements should be submitted to
EMAP. When QA requirements have not been met, the sample should be reanalyzed and only the results of the
reanalysis submitted, provided they are acceptable. Each data package submitted by the laboratory should consist of
the following:

•	A cover letter providing a brief description of the procedures and instrumentation used, as well as a narrative
explanation of analytical problems (if any), departures from protocols, or failure(s) to meet required quality
control limits.

•	Tabulated results in hard-copy form, including sample ID, filter weights, total weights (filter + sample),
volume of water filtered, and concentration of TSS in milligrams per liter.

•	Tabulated results in computer-readable form (e.g., diskette) included in the same shipment as the hard-copy
data, but packaged in a diskette mailer to prevent damage. Data must be provided in a format acceptable to
the Province Information Manager.

•	Results for all QA/QC samples (e.g., results of duplicate analyses) must be submitted by the laboratory as part
of the data package, but should be included in tables and files separate from the actual data.

No QA qualifier codes or "flags" are currently acceptable for TSS data. QA codes for water column
measurements are assigned during data evaluation at the Field Operations Center.


10.4.3 Data Evaluation Procedures

It is the responsibility of the Province Manager to acknowledge initial receipt of the data package(s), verify
that the four data evaluation steps identified in the following paragraph are completed, notify the analytical laboratory
(or contract field coordinator) of any additional information or corrective actions deemed necessary as a result of the
Province's data evaluation, and, following satisfactory resolution of all "corrective action" issues, take final action by
notifying the laboratory or field operations contractor in writing that the submitted results have been officially accepted
as a completed deliverable in fulfillment of contract requirements. It may be necessary or desirable for additional
personnel (e.g., the Province QA Coordinator) to assist the Province Manager in the technical evaluation of the
submitted data packages. While the Province Manager has ultimate responsibility for maintaining official contact with
the analytical laboratory and verifying that the data evaluation process is completed, it is the responsibility of the
Province QA Coordinator to closely monitor and formally document each step in the process as it is completed. This
documentation should be in the form of a data evaluation tracking form or checklist that is filled in as each step is
completed. This checklist should be supplemented with detailed memos to the project file outlining the concerns with
data omissions, analysis problems, or descriptions of questionable data identified by the laboratory.

Evaluation of the data package should commence as soon as possible following its receipt, since delays
increase the chance that information may be misplaced or forgotten. The following steps are to be followed in
evaluating EMAP-E data:

1.	Checking data completeness (verification)

2.	Assessing data quality (validation)

3.	Assigning data qualifier codes

4.	Taking final actions

Checking Data Completeness

The first part of data evaluation is to verify that all required information has been provided in the data
package. For field-generated data (i.e., water quality measurements), the crew chief must review all data files to assure
they are complete and correct prior to uploading the data to the Field Operations Center. Once the data are received
at the Center, the Virginian Province Data Librarian should perform a 100% comparison of the electronic files to the
original hard-copy datasheets, followed by an additional 10% check. These steps serve not only to ensure that all data
contained on the datasheets are present in the database, but also as a check against transcription errors.

EMAP-E laboratories are expected to submit data which have already been tabulated and 100% checked for
accuracy. Data received from the analytical laboratory should be compared to the data expected based on field
observations (i.e., there should be a TSS value for each sample shipped). The Province Manager should contact the
laboratory and request any missing information as soon as possible after receipt of the data package. If information
was omitted because required analyses were not completed, the laboratory should provide and implement a plan to
correct the deficiency. This plan may include submittal of a revised data package and possible reanalysis of samples.

Because the CTD profile consists of an electronic file, only a small portion of which is recorded on hard-copy
data sheets, an additional step is required in the verification of these data. This step consists of a check to verify that
the CTD file is associated with the correct station and event. Although the field computer system has been designed
to virtually eliminate this error, this check should still be conducted. The bottom depth, DO, and salinity values in the
CTD file should be compared to those recorded by the field crew on the hard-copy data sheet to assure the file was
correctly identified. This step can be automated. DO and salinity values should match exactly, and the CTD bottom
depth should match the fathometer reading within 3 meters. Any CTD file that does not match recorded values should
be flagged for investigation.
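
A minimal sketch of how this automated check might look, assuming a single DO, salinity, and depth value is extracted from each CTD file; the field names and values are hypothetical.

    # Illustrative sketch of the automated CTD file identification check.
    def ctd_file_matches_sheet(ctd, sheet):
        """DO and salinity must match exactly; CTD bottom depth must be
        within 3 m of the fathometer reading on the datasheet."""
        return (ctd["do"] == sheet["do"] and
                ctd["salinity"] == sheet["salinity"] and
                abs(ctd["depth"] - sheet["fathometer_depth"]) <= 3.0)

    cast = {"do": 6.4, "salinity": 28.1, "depth": 12.2}
    sheet = {"do": 6.4, "salinity": 28.1, "fathometer_depth": 14.0}
    if not ctd_file_matches_sheet(cast, sheet):
        print("Flag CTD file for investigation")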

Assessing Data Quality

Data validation, or the process of assessing data quality, can begin after Province personnel have determined
that the data package is complete. Each CTD profile must be examined, both manually and via automatic range
checks, as part of the validation. Plots of depth versus dissolved oxygen, salinity and temperature should be examined
to determine if there is a visible lag between the depth and these parameters as evidenced by the separation of
downcasts and upcasts through the pycnocline. At a well-mixed station, there will not be a visible lag in these
parameters and alignment is unnecessary. For those profiles which are misaligned, a delay factor for oxygen and/or
salinity/temperature (averaging 1 second for salinity/temperature and 5 seconds for dissolved oxygen) should be applied
to the raw data file using the Seabird "ALIGNCTD" software. The file should then be reprocessed and plotted and the
entire process repeated with different delays until the upcast and downcast align at the pycnocline. The analyst
conducting this operation must be qualified in the use of the Seabird software and must understand the scientific
aspects of water quality measurements.
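
The actual correction is applied with Seabird's ALIGNCTD software; the sketch below only illustrates the underlying idea of advancing a lagging channel by a fixed number of one-second records relative to depth. The series and delay are invented.

    # Illustrative sketch of sensor alignment: advance the DO channel by
    # `delay` records (one record per second) relative to depth.
    def align_channel(values, delay):
        """Advance a channel by `delay` records, padding the tail
        with the final value to preserve the series length."""
        if delay <= 0:
            return list(values)
        return list(values[delay:]) + [values[-1]] * delay

    do = [8.0, 8.0, 7.9, 7.5, 6.8, 6.1, 5.9, 5.8]  # invented 1-s records
    print(align_channel(do, delay=5))  # DO shifted 5 s earlier in time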

Each CTD cast also must be visually inspected to identify any unusual patterns or spikes that necessitate
further review. This can be performed in conjunction with the alignment discussed above. Specific parameters which
should be checked are:

Amount of time at the surface - should be at least 120 seconds;

Stability of dissolved oxygen at the end of the surface soak - readings for the last 30 seconds prior
to the downcast should not vary by more than 0.5 mg/L;

Stability of the dissolved oxygen values at the beginning and end of the bottom soak - difference
should not exceed 0.5 mg/L;

Stability of the salinity values at the beginning and end of the bottom soak - values should not differ
by more than 1 ppt;

Unexpected patterns or trends in the downcast or upcast (e.g., spikes or dissolved oxygen values
increasing with depth);

A match between downcast and upcast values (compare the last dissolved oxygen record in the pre-
deployment surface soak with the last record in the post-deployment surface soak. Flag if the
difference is greater than 0.5 mg/L);

Amount of time at the bottom - should be at least 120 seconds;

Indications that CTD was lowered into the sediments (large change in oxygen and/or salinity, or a
spike in light transmission values).
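
Two of these inspection criteria are shown below as a sketch, assuming dissolved oxygen is logged at one record per second; the thresholds are those listed above, and everything else is illustrative.

    # Illustrative sketch of two of the cast inspection checks.
    def surface_soak_flags(do_records, seconds_at_surface):
        """do_records: DO logged once per second during the surface soak."""
        flags = []
        if seconds_at_surface < 120:
            flags.append("surface soak shorter than 120 s")
        last30 = do_records[-30:]
        if max(last30) - min(last30) > 0.5:
            flags.append("DO unstable in last 30 s before downcast")
        return flags

    print(surface_soak_flags([8.0] * 100 + [8.0, 7.2] * 15,
                             seconds_at_surface=130))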

In addition to examining the profiles themselves, the following range checks should be conducted (either manually
or automatically) on the downcast and bottom soak values:

Depth - acceptable range = 0.3 - 50 meters,

Temperature - acceptable range = 10 - 35 °C,

Salinity - acceptable range = 0 - 35 ppt,

Dissolved Oxygen - acceptable range = 0 - 15 mg/L,

pH - acceptable range = 6 - 11 pH units,

Light Transmission - acceptable range = 0 - 100%,

Fluorescence - acceptable range = 0 - 30 units,

PAR - acceptable range = 0 - 6000 microeinsteins s-1 m-2,

Sigma-t - acceptable range = 0 - 25.

Any values falling outside of these ranges should be flagged for investigation. The values and flags should
be output in a QA/QC report for each cast.
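
A minimal sketch of the automated form of these range checks follows; the parameter keys and sample record are illustrative, and the ranges are those listed above.

    # Sketch of the automated range checks on downcast and bottom soak values.
    RANGES = {"depth": (0.3, 50.0), "temperature": (10.0, 35.0),
              "salinity": (0.0, 35.0), "do": (0.0, 15.0),
              "ph": (6.0, 11.0), "light_transmission": (0.0, 100.0),
              "fluorescence": (0.0, 30.0), "par": (0.0, 6000.0),
              "sigma_t": (0.0, 25.0)}

    def range_flags(record):
        """Return {parameter: value} for every value outside its range."""
        return {p: v for p, v in record.items()
                if p in RANGES and not RANGES[p][0] <= v <= RANGES[p][1]}

    print(range_flags({"depth": 7.5, "temperature": 38.2, "salinity": 29.0}))
    # -> {'temperature': 38.2}; flagged in the cast's QA/QC report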

Assigning Data Qualifier Codes

After the above checks are made, a database QA code should be assigned to the cast. A listing of these codes
is presented in Table 10-2. There are 54 codes describing the acceptability of the different water quality parameters
in different sections of the cast. No codes are currently accepted for TSS measurements.

Taking Final Actions

Upon completion of the above steps, a report summarizing the QA review of the data package should be
prepared, samples should be properly stored or disposed of, and laboratory data should be archived both in a storage
file and in the database. Technical interpretation of the data begins after the QA review has been completed.

Reports documenting the results of the QA review of a data package should summarize all conclusions
concerning data acceptability and should note significant quality assurance problems that were found. These reports
are useful in providing data users with a written record on data concerns and a documented rationale for why certain
data were accepted as estimates or were rejected. The following specific items should be addressed in the QA report:

•	Summary of overall data quality, including a description of data that were qualified.

•	Summary of all QA data (e.g., field QC checks, calibrations, calibration checks).



• Description of data reporting, including any corrections made for transcription or other reporting errors, and
description of data completeness relative to objectives stated in the QA Plan.

The water quality QA reports will be included in the annual Program Quality Assurance Report and also will
become a permanent part of the database documentation (i.e., the metadata). The QA/QC data collected by the
Program will be used not only to assess the accuracy and precision of individual measurements, but ultimately to assess
the comparability of data generated by multiple laboratories and field crews.


Table 10-2. QA codes assigned to CTD files.

Code	Definition	

C-A	Reject entire CTD cast (all parameters)

C-B	Accept entire CTD cast (all parameters)

C-IA	Reject surface values (all parameters)

C-IB	Reject pre-deployment soak, accept post-deployment soak (all parameters)

C-IC	Reject entire bottom soak, no bottom values available (all parameters)

C-ID	Reject entire downcast file (all parameters)

C-IE	Reject bottom soak, use last value of downcast (all parameters)

C-IF	Reject average of bottom soak but accept last value (all parameters)

C-IG	Shallow station with pre-deployment soak and bottom soak only (no profile)

C-IH	Shallow station: surface and bottom values equal. Bottom file used for both.

C-II	Depth values questionable

C-IJ	Reject surface dissolved oxygen (pre and post)

C-IK	Reject pre cast dissolved oxygen but accept post cast dissolved oxygen

C-IL	Reject downcast dissolved oxygen

C-IM	Reject bottom dissolved oxygen

C-IN	Reject bottom soak dissolved oxygen but use last value of downcast

C-IO	Reject average bottom dissolved oxygen but use last value of bottom file

C-IP	Reject surface salinity (pre and post)

C-IQ	Reject pre cast salinity but accept post cast salinity

C-IR	Reject downcast salinity

C-IS	Reject bottom salinity

C-IT	Reject bottom soak salinity but use last value of downcast

C-IU	Reject average bottom salinity but use last record of bottom file

C-IV	Reject surface temperature (pre and post cast)

C-IW	Reject pre cast temperature but accept post cast temperature

C-IX	Reject downcast temperature

C-IY	Reject bottom temperature

C-IZ	Reject bottom soak temperature but use last value of downcast

C-JA	Reject average bottom temperature but use last value of bottom file

C-JB	Reject surface pH (pre and post)

C-JC	Reject pre cast pH but accept post cast pH

C-JD	Reject downcast pH

C-JE	Reject bottom pH

C-JF	Reject bottom soak pH but use last value of downcast file

C-JG	Reject average bottom pH but use last value of bottom file

C-JH	Reject surface PAR (pre and post soak)

C-JI	Reject pre-cast PAR but accept post-cast PAR

C-JJ	Reject downcast PAR

C-JK	Reject bottom PAR

C-JL	Reject bottom soak PAR but use last value of downcast

C-JM	Reject average bottom PAR but use last value of bottom file

C-JN	Reject surface transmissometry (pre and post)

C-JO	Reject pre cast transmissometry but accept post cast transmissometry

C-JP	Reject downcast transmissometry

C-JQ	Reject bottom transmissometry

C-JR	Reject bottom soak transmissometry but use last value of downcast

C-JS	Reject average bottom transmissometry but use last value of bottom file

C-JT	Reject surface fluorescence (pre and post)

C-JU	Reject pre cast fluorescence but accept post cast fluorescence

C-JV	Reject downcast fluorescence

C-JW	Reject bottom fluorescence

C-JX	Reject bottom soak fluorescence but use last value of downcast

C-JY	Reject average bottom fluorescence but use last value of bottom file

C-JZ	Fluorescence off-scale


-------
Section 11

Page 1 of 9
Revision 2
May 1993

SECTION 11
INFORMATION MANAGEMENT

11.1	SYSTEM DESCRIPTION

The Information Management System developed for the EMAP-E Program is designed to perform the
following functions:

•	Document sampling activities and standard methods,

•	Support program logistics, sample tracking and shipments,

•	Process and organize both field and laboratory data,

•	Perform range checks on selected numerical data,

•	Facilitate the dissemination of information, and

•	Provide interaction with the EMAP Central Information System.

A complete and detailed description of the EMAP-E Information Management System (IMS) is provided in
Adams et al. (1993) and will not be repeated here.

11.2	QUALITY ASSURANCE/QUALITY CONTROL

Two general types of problems must be resolved in developing QA/QC protocols for information and
data management: (1) erroneous individual values, which must be corrected or removed, and (2) inconsistencies that
damage the integrity of the data base. The following features of the EMAP-E IMS will provide a foundation for the
management and quality assurance of all data collected and reported during the life of the project.

11.2.1 Standardization

A systematic numbering system will be developed for unique identification of individual samples, sampling
events, stations, shipments, equipment, and diskettes. The sample numbering system will contain codes which will
allow the computer system to distinguish among several different sample types (e.g., actual samples, quality control
samples, sample replicates, etc.). This system will be flexible enough to allow changes during the life of the project,
while maintaining a structure which allows easy comprehension of the sample type.
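
The numbering system itself is defined by the information management team. Purely as an illustration of how a type code embedded in a SAMPLEID might be decoded, consider the sketch below; the format and codes are hypothetical.

    # Hypothetical sketch only: the real numbering scheme is defined by
    # the IM team. Here a SAMPLEID is assumed to embed a type code.
    SAMPLE_TYPES = {"S": "actual sample",
                    "Q": "quality control sample",
                    "R": "sample replicate"}

    def sample_type(sampleid):
        """Assume (hypothetically) the first character encodes the type."""
        return SAMPLE_TYPES.get(sampleid[0], "unknown type")

    print(sample_type("Q930154"))  # -> quality control sample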

A clearly-written instruction manual on the use of the field computer system will be developed for training
field personnel and to allow easy reference in the field. Contingency plans also will be stated explicitly in the event
that the field systems fail.

11.2.2	Prelabeling of Equipment and Sample Containers

Whenever possible, sample containers, equipment, and diskettes will be prelabeled to eliminate potential
confusion in the field and thereby reduce the number of incorrect or poorly-affixed labels. Containers with all the
required prelabeled sample containers, sample sheets, and data diskettes will be prepared for the field crews prior to
each sampling event (an event is defined as a single visit by a crew to a sampling site). These containers will be called
"event boxes". Each event box will have the event number affixed to it using both handwritten and bar code labels.

11.2.3	Data Entry, Transcription, and Transfer

In addition to paper data sheets, all data collected by field crews are recorded in a series of electronic forms
on a laptop computer. There is a one-to-one correspondence between the electronic forms (or records) and the paper
forms. Data entered in each field of the electronic forms can be checked automatically by the software, which will then
provide a warning when data do not fall in an expected range. In many instances, the use of bar code labels and
readers in the field will eliminate the need for manual entry of routine sample information and help avoid transcription
errors.

Following the initial entry of data into the field computer system, the data are printed onto hard copy and checked
100% against the original paper data sheets. This check is performed by the field crew chief, who may correct
transcription errors and ultimately is responsible for assigning an acceptance code to the entered data. Once the data
have been checked and accepted by the crew chief, the field personnel no longer have the ability to make changes.


A record of each day's computer activities is kept by the field computer software and used by the
communications program to compress the data files before transmission. A 9600 baud, error checking modem, which
checks each byte as it's sent and eliminates garbled transmissions, transmits the compressed data files to the VAX
computer at the field operations center. Paper data sheets are mailed (or hand carried) to the FOC after all sampling
activities for a week have been completed.

On the field operations center VAX computer, a program which is run automatically at a specified time of
night (after the field crews have transmitted data) performs the following tasks: 1.) unpacks the compressed data files,
2.) distributes the data files to appropriate directories on the VAX, 3.) processes and plots CTD profile files, and 4.)
parses the incoming data into SAS-importable files. A SAS program is subsequently run to process the information and
automatically generate reports indicating stations visited and activities performed the previous day. This enables a
verification check to be performed in which the information received electronically is compared with what the crews
reported doing via a daily phone call. Phone logs are also computerized at the Field Operations Center. If there are
discrepancies between the two reports, the field crews are notified. The SAS program additionally performs range
checks on certain types of critical data. Furthermore, each day's data can be viewed by the Province Manager, Field
Coordinator, and/or members of the QA staff.

After all data sheets have been received from a field team for a given window (about 6 days), the Virginian
Province data librarian performs a 100% manual check of the data sheets against the electronic data stored on the
VAX. Any erroneous data values identified in this check or in the previously-generated SAS reports are changed to
correct values, with authorization from the Province QA Coordinator. In addition, suspicious data are flagged for further
investigation. Whenever a change to the data is required, the data librarian is required to enter a computerized data
change form indicating the data sheet, variable, and reason for change. This information is written to a SAS-
importable file and is used in compiling error rate statistics for data entry. When satisfied that the data are 100%
correct, the data librarian assigns an acceptance code.

11.2.4 Automated Data Verification

Whenever possible, erroneous numeric data will be identified using automatic range checks and filtering
algorithms. When data fall outside of an acceptable range, they will be flagged in a report for review by the Province
Manager, the Province Quality Assurance Coordinator (QAC), or their designee. This type of report will be generated
routinely and should detail the files processed and the status of the QA checks. The report will be generated both on
disk and in hard copy for permanent filing. The Province Manager or Quality Assurance Coordinator will review the
report and release data which have passed the QA check for addition to the data base. All identified errors must be
corrected before flagged files can be added to a data base. If it is found that the data check ranges are not reasonable,
the values should be changed by a written request which includes a justification for the change.

Data base entries which are in the form of codes should be compared to lists of valid values (e.g., look-up
tables) established by experts for specific data types. These lists of valid codes will be stored in a central data base for
easy access by users. When a code cannot be verified in the appropriate look-up table, the observation should be
flagged in a written report for appropriate corrective action (e.g., update of the look-up table or removal of the
erroneous code).
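
Such a look-up comparison is straightforward to script. A minimal sketch follows, with an invented look-up table of species-style codes; the table contents and function names are illustrative.

    # Sketch of code verification against a look-up table of valid values.
    VALID_CODES = {"ALOSAEST", "MOROAMER", "PARADENT"}  # invented table

    def invalid_codes(observations, valid):
        """Return the entered codes not found in the look-up table."""
        return [code for code in observations if code not in valid]

    bad = invalid_codes(["MOROAMER", "MOROAMRE"], VALID_CODES)
    if bad:
        print("Flag for corrective action:", bad)  # fix code or update table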

11.2.5 Sample Tracking

Real-time tracking of all sample shipments will be performed at the Virginian Province Field Operations
Center (FOC). The tracking of sample shipments from the field crews to the analytical laboratories is extremely
important in order to minimize loss of samples by the field crews, shipping carrier, or receiving laboratory or as a result
of improper packaging. Shipment tracking is performed in two ways: by the transfer of shipment and receipt
information via daily telephone calls from the field crews and receiving labs, and by the comparison of electronic
shipment and receipt files transmitted to the FOC.

All shipments sent to the analytical laboratories by the field crews will be tracked by Virginian Province FOC
personnel using a six- or seven-digit shipment number. These shipment numbers are printed on barcodes for ease of
entry into computerized shipment and receipt data sheets. All field samples collected are to be associated with a
shipment number, whether they are shipped using a carrier (e.g., UPS or Federal Express) or hand carried to a
laboratory by a crew member. The association of field samples with the shipment numbers will make it possible to
track numerous individual samples through a single number.


As previously indicated, field crews will be required to inform FOC personnel via telephone of daily field and
shipping activities. All shipment numbers, shipment dates, sample types, destinations, and carrier identification
numbers listed during the telephone call will be carefully recorded by FOC personnel on a phone log. The information
on the phone log will then be entered into a SAS dataset on the VAX computer and will be output by SAS in a daily
field activities report.

The analytical laboratories will be instructed to place a telephone call to the FOC upon receipt of an EMAP-E
sample shipment. The information transmitted in this telephone call will include the EMAP shipment number, date
of receipt, and the condition of the samples. This information will also be entered into a SAS dataset on the VAX and
will be included in the daily field activities report. The telephone call from the laboratories will constitute verbal
confirmation of shipment receipt. If an analytical laboratory informs the FOC that a sample was missing from the
shipment or rejected due to improper processing or packaging, this information will be immediately conveyed to the
field coordinator in the FOC.

If verbal confirmation of receipt of a package is not received within three days of the shipment date, the data
librarian will place a telephone call to the analytical laboratory to confirm that the shipment was not received. If the
shipment has not been received, the field coordinator will contact the carrier to begin a trace of the shipment.

All of the field crews will be required to complete an electronic shipment data sheet in the field computer
system. This data sheet will contain general shipment information as well as descriptions of each sample in the
shipment. A paper shipment form will be completed if the field computer system is not available; however, the
information on this form will be entered into the field computer shipment form when a field computer system becomes
available. The computer system will maintain a list of all samples collected by each crew along with the status of each
sample (i.e., collected, collected/shipped). When a crew member ships a set of samples and enters the SAMPLEIDs
into the shipment program, the field computer system will automatically update the status of each of the samples in
the list of samples collected. Upon exiting the shipping program, the crew member will be warned if any samples
collected have not yet been shipped, if a SAMPLEID already shipped has been entered a second time, or if there is no
record of collection of a SAMPLEID entered in the shipment file. This electronic system will improve the accuracy
of the shipment files and facilitate sample tracking by the field crews.


A printout of the computer data sheet or a copy of the hand-written shipment form will be included in all
shipments to serve as a packing list. The electronic shipment information will be transmitted to the VAX computer
overnight and loaded into SAS datasets. SAS will immediately produce a series of reports from which Field Operations
Center personnel can track the status of each sample collected through the shipment process. In addition, a printout
of the shipment information will be sent to the Field Operations Center weekly accompanied by diskettes storing copies
of the electronic shipment files. Final data archiving will be on optical storage media within 6 months of receipt.

All of the analytical laboratories also are required to transmit an electronic file containing shipment receipt
information to the FOC on the day each shipment is received. If a receipt file listing all of the SAMPLEIDs in the
shipment is not transmitted to the FOC within several days of the verbal confirmation of receipt, the analytical
laboratory will be contacted and requested to submit a file as soon as possible.

The goal of electronic shipment tracking is to automate the tracking of shipments and samples as much as
possible. The SAMPLEIDs of all field samples collected are stored in a SAS dataset (SAMPLOG). By comparing
electronic shipment and receipt files to each other and to SAMPLOG and the electronic phone log file, it is possible
to flag missing or extra samples. Every night during the sampling season after field data have been transmitted by the
crews, a series of SAS reports will be automatically output listing the status of all shipments.

Each week during the sampling season, a report of all samples in the sample log not yet listed in a receipt file
and all samples listed in receipt files but not in the sample log will be produced to allow the further tracing of "missing"
or "extra" samples. The data librarian will account for each sample in this report by examining the raw shipped and
receipt files, by reviewing the field data sheets, and by contacting the analytical laboratories. If any corrections to the
shipment or receipt datasets must be made (i.e., to correct typographical errors), they will be approved by the Virginian
Province QA Coordinator, performed by the data librarian, and documented in a memo.
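
The nightly and weekly comparisons amount to set differences among SAMPLOG, the shipment files, and the receipt files. A minimal sketch of that logic, with invented SAMPLEIDs (the Program's own implementation is in SAS):

    # Sketch of the sample reconciliation using set differences.
    samplog = {"0930101", "0930102", "0930103"}   # invented SAMPLEIDs
    received = {"0930101", "0930103", "0930999"}  # union of receipt files

    missing = sorted(samplog - received)  # collected, never acknowledged
    extra = sorted(received - samplog)    # received, not in the sample log
    print("missing:", missing, "extra:", extra)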

11.2.6 Reporting

Following analysis of the samples, the summary data packages transmitted from the laboratories will include
results, QA/QC information, and accompanying text. If the laboratory has assigned internal identification numbers
to the samples, the results should include the original sample number and the internal number used by the laboratory.
Specific data reporting requirements associated with each indicator are discussed in the corresponding section of this
plan. Analytical laboratories are responsible for permanent archiving of all raw data used in generating results for a
minimum period of seven years.

11.2.7 Redundancy (Backups)

All files in the EMAP-E IMS will be backed up regularly. At least one copy of the entire system will be
maintained off-site to enable the information management team to reconstruct the data base in the event that one
system is destroyed or incapacitated. In the field, all information will be recorded both on paper data sheets and
in the computer. All information saved to the hard drive will also be copied to a diskette simultaneously. In
addition, at the end of each day the field computers will be "equalized" to assure that the information contained on both
is identical. At this point all data will be contained on the hard drives of both field computers and on a diskette. At
the EMAP-E Virginian Province Information Management Center in Narragansett, incremental backups to removable
disk will be performed on all files which have changed on a daily basis. In addition, backups of all EMAP directories
and intermediate files will be performed on a weekly basis to provide a backup in the event of a complete loss of the
EMAP-E Information Center facility.

All original data files will be saved on-line for at least two years, after which the files will be permanently
archived. Archiving of data will be on a non-volatile medium such as an optical "WORM" disk, and one copy of this
will be kept off-site. All original files, especially those containing the raw field data, will be protected so that they can
be read only (i.e., write and delete privileges will be removed from these files).

11.3 DOCUMENTATION AND RELEASE OF DATA

Comprehensive documentation of information relevant to users of the EMAP-E IMS will be maintained and
updated as necessary. Most of this documentation will be accessible on-line, in data bases which describe and interact
with the system. The documentation will include a data base dictionary, access control, and data base directories
(including directory structures), code tables, and continuously-updated information on field sampling events, sample
tracking, and data availability.


A limited number of personnel will be authorized to make changes to the EMAP-E data base. All changes
will be carefully documented and controlled by the senior data librarian. Data bases which are accessible to outside
authorized users will be available in "read only" form. Access to data by unauthorized users will be limited through
the use of standard DEC VAX security procedures. Information on access rights to all EMAP-E directories, files, and
data bases will be provided to all potential users.

The release of data from the EMAP-E IMS will occur on a graduated schedule. Different classes of users will
be given access to the data only after they have passed a specified level of quality assurance review. Each group will use
the data on a restricted basis, under explicit agreements with the Estuaries Resource Group. The following four groups
are defined for access to data:

I.	The Virginian Province central group, including the information management team, the field
coordinator, the Province Manager, the QA Coordinator and the field crew chiefs.

II.	EMAP-Estuaries primary users - ERL-Narragansett personnel, ERL-Gulf Breeze personnel, NOAA
EMAP-E personnel, and EMAP quality assurance personnel.

III.	EMAP data users - All other task groups within EPA, NOAA, and other federal, state and municipal
agencies.

IV.	General Public - University personnel and the research community.

Prior to release at level IV (general public), all files will be checked and/or modified to assure that values
contain the appropriate number of significant figures. The purpose is to assure that the data released do not imply
greater accuracy than was realized. This will be especially important in files where data were summarized. In such
cases additional figures beyond the decimal point may have been added by the statistical program during averaging
or other manipulations. It will be the responsibility of the Quality Assurance Coordinator to determine the appropriate
number of significant figures for each measurement.
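
A sketch of the rounding involved, assuming the Quality Assurance Coordinator has already specified the number of significant figures for a measurement; the helper name and example value are illustrative.

    # Illustrative sketch of the significant-figures pass before release.
    from math import floor, log10

    def round_sig(value, sig):
        """Round `value` to `sig` significant figures."""
        if value == 0:
            return 0.0
        return round(value, sig - int(floor(log10(abs(value)))) - 1)

    print(round_sig(7.63333333, 3))  # a summarized mean -> 7.63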


Requests for premature release of Virginian Province data will be submitted to the Information Management
Team through the Province Manager. The Province Information Manager and the Quality Assurance Coordinator,
in consultation with the Province Manager, will determine if the data can be released. The final authority on the
release of all data is the Technical Director of EMAP-Estuaries. The long-term goal for the EMAP-E Information
Management Team will be to develop a user interface through which all data will be accessed directly on the computer.
This will improve control of security and monitoring of access to the data, and it will help ensure that only the proper
data files are being accessed.


-------
Section 12
Page 1 of 1
Revision 2
May 1993

SECTION 12

QUALITY ASSURANCE REPORTS TO MANAGEMENT

A quality assurance report will be prepared by the Province QA Coordinator following each year's sampling
efforts. This report will summarize the measurement error estimates for the various data types using the QA/QC
sample data. Precision, accuracy, comparability, completeness, and representativeness of the data will be addressed
in this document.

Within 30 days of each audit (field or laboratory), the QA Coordinator will submit a report to the Province
Manager. This report will describe the results of the audit in full detail and note any deficiencies requiring
management action. The QA Coordinator will monitor the implementation of corrective actions in response to negative
findings, and will make regular reports to the Province Manager in this regard.

In addition to the formal reports described above, the Province QA Coordinator will report regularly to the
Province Manager on an informal basis, through E-mail, conference calls, and/or direct contact. One of the primary
responsibilities of the QA Coordinator is to keep the Province Manager informed of any issue or problem which might
have a negative effect on the data collected.

The EMAP-E Program Quality Assurance Coordinator, with assistance from the Province QA Coordinators,
will prepare a Quality Assurance Annual Report and Work Plan (QAARWP) for the Estuaries Resource Group. The
QAARWP summarizes the quality assurance activities conducted during the previous fiscal year, and describes
activities planned for the upcoming fiscal year. This report will be prepared following the guidelines presented in the
approved Quality Assurance Management Plan for EMAP (Kirkland, in preparation). The QAARWP will be
completed, approved by the EMAP-E Technical Director, and delivered to the EMAP QA Coordinator by September
30 of each year.


-------
Section 13
Page 1 of 3
Revision 2
May 1993

SECTION 13

REFERENCES

Adams, M., J. S. Rosen, H. Buffum, J. Beaulieu, and M. Hughes. 1991. Information Management Plan
for the EMAP-Near Coastal Program. U.S. Environmental Protection Agency, Environmental
Research Laboratory, Office of Research and Development, Gulf Breeze, FL.

American Society for Testing and Materials. 1984. Annual Book of ASTM Standards, Vol. 11.01.
Standard Specifications for Reagent Water D1193-77 (reapproved 1983). American Society for
Testing and Materials, Philadelphia, PA.

American Society for Testing and Materials. 1991. Guide for conducting 10-day static sediment toxicity
tests with marine and estuarine amphipods. ASTM Standard Methods Volume 11.04, Method
Number E-1367-90. American Society for Testing and Materials, Philadelphia, PA.

Baker, J. R. and G. D. Merritt. 1990. Environmental Monitoring and Assessment Program: Guidelines for
Preparing Logistics Plans. EPA 600/4-91-001. U. S. Environmental Protection Agency, Las Vegas,
Nevada.

Cantillo, A. Y. 1992. Standard and Reference Materials for Marine Sciences, Third Edition. NOAA
Technical Memorandum NOS ORCA 68, National Ocean Service, Office of Ocean Resources
Conservation and Assessment, Silver Spring, MD.

Degraeve, G. M., N. G. Reichenbach, J. D. Cooney, P. I. Feder, and D. I. Mount. 1988. New
developments in estimating endpoints for chronic toxicity tests. Abstract, Am. Soc. Test. Mater.
12th Symp. Aquat. Toxicol. Hazard Assess., Sparks, Nev.

Federal Register, Part VIII, EPA. "Guidelines Establishing Test Procedures for the Analysis of Pollutants
Under the Clean Water Act: Final Rule and Proposed Rule." 40 CFR Part 136, Oct. 28, 1984.

Hamilton, M. A., R. C. Russo, and R. V. Thurston. 1977. Trimmed Spearman-Karber method for
estimating median lethal concentrations in toxicity bioassays. Environ. Sci. Technol. 11:714-719;
Correction 12:417 (1978).

Holland, A. F. (ed.). 1990. Near Coastal Program Plan for 1990: Estuaries. EPA 600/4-90/033. U.S.
Environmental Protection Agency, Environmental Research Laboratory, Office of Research and
Development, Narragansett, RI.

Hunt, D. T. E., and A. L. Wilson. 1986. The Chemical Analysis of Water: General Principles and
Techniques. 2nd ed. Royal Society of Chemistry, London, England. 683 pp.

Keith, L. H., W. Crummett, J. Deegan, Jr., R. A. Libby, J. K. Taylor, and G. Wentler. 1983. Principles of
environmental analysis. Anal. Chem. 55:2210-2218.


Keith, L. H. 1991. Environmental Sampling and Analysis: A Practical Guide. Lewis Publishers, Chelsea,
MI. 143 pp.

Kirchner, C. J. 1983. Quality control in water analysis. Environ. Sci. and Technol. 17(4):174A-181A.

Kirkland, L., in preparation. Quality Assurance Management Plan for the Environmental Monitoring and
Assessment Program. U.S. Environmental Protection Agency, Washington, D.C.

Lauenstein, G. L., A. Y. Cantillo, and S. Dolvin (eds.). 1993. A Compendium of Methods Used in the
NOAA National Status and Trends Program. National Ocean Service, Office of Ocean Resources
Conservation and Assessment, Silver Spring, MD (in press).

Olsen, A. R. (ed.). 1992. The Indicator Development Strategy for the Environmental Monitoring and
Assessment Program. U. S. Environmental Protection Agency, Environmental Research
Laboratory, Corvallis, OR.

Plumb, R. H., Jr. 1981. Procedures for handling and chemical analysis of sediment and water samples.
Technical Report EPA/CE-81-1. U.S. Environmental Protection Agency/U.S. Corps of Engineers
Technical Committee on Criteria for Dredged and Fill Material, U.S. Army Waterways Experiment
Station, Vicksburg, MS. 471 pp.

Reifsteck, D., C. J. Strobel, and S. C. Schimmel. 1993. EMAP-Estuaries 1993 Virginian Province Field
Operations and Safety Manual. U.S. Environmental Protection Agency, Environmental Research
Laboratory, Office of Research and Development, Narragansett, RI.

Rosen, J. S. 1993. Documentation of the Calculation of the EMAP Estuaries Virginian Province Benthic
Index. Unpublished manuscript, U.S. Environmental Protection Agency, Environmental Research
Laboratory, Office of Research and Development, Narragansett, RI.

Stanley, T. W., and S. S. Verner. 1983. Interim Guidelines and Specifications for Preparing Quality
Assurance Project Plans. EPA/600/4-83/004. U.S. Environmental Protection Agency, Washington,
D.C.

Stanley, T. W., and S. S. Verner. 1985. The U. S. Environmental Protection Agency's quality assurance
program, pp 12-19 In: J. K. Taylor and T. W. Stanley (eds.). Quality Assurance for Environmental
Measurements, ASTM STP 867. American Society for Testing and Materials, Philadelphia,
Pennsylvania.

Taylor, J. K. 1987. Quality Assurance of Chemical Measurements. Lewis Publishers, Inc., Chelsea,
Michigan. 328 pp.

U.S. Environmental Protection Agency. 1992. EMAP Laboratory Methods Manual: Estuaries. U. S.
Environmental Protection Agency, Environmental Monitoring Systems Laboratory, Office of
Research and Development, Cincinnati, Ohio (in revision).


U.S. Environmental Protection Agency, in preparation. Statistical Summary: EMAP-Estuaries Virginian
Province - 1991. U. S. Environmental Protection Agency, Environmental Research Laboratory,
Office of Research and Development, Narragansett, RI.

U.S. Environmental Protection Agency. 1979a. Methods for chemical analysis of water and wastes.
EPA-600/4-79/020. U. S. Environmental Protection Agency, Environmental Monitoring Systems
Laboratory, Office of Research and Development, Cincinnati, Ohio (revised March 1983).

U.S. Environmental Protection Agency. 1979b. Handbook for analytical quality control in water and
wastewater laboratories. U. S. Environmental Protection Agency, Environmental Monitoring and
Support Laboratory, Cincinnati, Ohio, EPA/600/4-79/019.

U.S. Environmental Protection Agency. 1989. Recommended Protocols for Measuring Selected
Environmental Variables in Puget Sound. U.S. Environmental Protection Agency, Puget Sound
Estuary Program, Office of Puget Sound, Seattle, Washington.

U.S. Environmental Protection Agency. 1991. A Project Manager's Guide to Requesting and Evaluating
Chemical Analyses. EPA 910/9-90-024. U.S. Environmental Protection Agency, Puget Sound
Estuary Program, Office of Coastal Waters, Region 10, Seattle, Washington.

Weisberg, S. B., J. B. Frithsen, A. F. Holland, J. F. Paul, K. J. Scott, J. K. Summers, H. T. Wilson, R. M.
Valente, D. G. Heimbuch, J. Gerritsen, S. C. Schimmel, and R. W. Latimer. 1993. EMAP-Estuaries
Virginian Province 1990 Demonstration Project Report. EPA 600/R-92/100. U.S. Environmental
Protection Agency, Environmental Research Laboratory, Narragansett, RI.

