DRAFT 4/92

ENVIRONMENTAL MONITORING AND ASSESSMENT PROGRAM
NEAR COASTAL
VIRGINIAN PROVINCE
QUALITY ASSURANCE PROJECT PLAN

by

R. Valente and J. Schoenherr
Science Applications International Corporation
27 Tarzwell Drive
Narragansett, Rhode Island 02882

U.S. ENVIRONMENTAL PROTECTION AGENCY
OFFICE OF RESEARCH AND DEVELOPMENT
ENVIRONMENTAL RESEARCH LABORATORY
NARRAGANSETT, RHODE ISLAND 02882

-------

PREFACE

This document outlines the integrated quality assurance plan for the Environmental Monitoring and Assessment Program's Near Coastal Monitoring in the Virginian Province. The quality assurance plan is prepared following the guidelines and specifications provided by the Quality Assurance Management Staff of the U.S. Environmental Protection Agency Office of Research and Development.

Objectives for five data quality indicators (completeness, representativeness, comparability, precision, and accuracy) are established for the Near Coastal Monitoring in the Virginian Province. The primary purpose of the integrated quality assurance plan is to maximize the probability that data collected over the duration of the project will meet or exceed these objectives, and thus provide scientifically sound interpretations of the data in support of the project goals.

Various procedures are specified in the quality assurance plan to: (1) ensure that collection and measurement procedures are standardized among all participants; (2) monitor the performance of the measurement systems being used in the program to maintain statistical control and to provide rapid feedback so that corrective measures can be taken before data quality is compromised; (3) allow for the periodic assessment of the performance of these measurement systems and their components; and (4) verify and validate that reported data are sufficiently representative, unbiased, and precise to be suitable for their intended use.
These activities will provide users with information regarding the degree of uncertainty associated with the various components of the EMAP Near Coastal data base.

This quality assurance plan has been submitted in partial fulfillment of Contract Number 68-C1-0006 to Science Applications International Corporation under the sponsorship of the U.S. Environmental Protection Agency. Mention of trade names and commercial products does not constitute endorsement or recommendation for use.

-------

Table of Contents Revision 2 Date 4/92

TABLE OF CONTENTS

Preface
Acknowledgments

1 INTRODUCTION
  1.1 OVERVIEW
  1.2 QUALITY ASSURANCE PROJECT PLAN SPECIFICATIONS

2 PROJECT ORGANIZATION
  2.1 MANAGEMENT STRUCTURE

3 PROJECT DESCRIPTION
  3.1 PURPOSE

4 QUALITY ASSURANCE OBJECTIVES
  4.1 DATA QUALITY OBJECTIVES
  4.2 REPRESENTATIVENESS
  4.3 COMPLETENESS
  4.4 COMPARABILITY
  4.5 ACCURACY (BIAS), PRECISION, AND TOTAL ERROR

5 QUALITY ASSURANCE/QUALITY CONTROL PROTOCOLS, CRITERIA, AND CORRECTIVE ACTION
  5.1 CHEMICAL ANALYSIS OF SEDIMENT AND TISSUE SAMPLES
    5.1.1 General QA/QC Requirements
    5.1.2 Initial Calibration
    5.1.3 Initial Documentation of Detection Limits
    5.1.4 Initial Blind Analysis of Representative Sample
    5.1.5 Laboratory Participation in Intercomparison Exercises
    5.1.6 Routine Analysis of Certified Reference Materials or Laboratory Control Materials
    5.1.7 Continuing Calibration Check
    5.1.8 Laboratory Reagent Blank
    5.1.9 Internal Standards
    5.1.10 Injection Internal Standards
    5.1.11 Matrix Spike and Matrix Spike Duplicate
    5.1.12 Field Duplicates and Field Splits
    5.1.13 Analytical Chemistry Data Reporting Requirements
  5.2 OTHER SEDIMENT MEASUREMENTS
    5.2.1 Total organic carbon
    5.2.2 Acid volatile sulfide
    5.2.3 Butyltins
    5.2.4 Sediment grain size
    5.2.5 Apparent RPD depth
  5.3 SEDIMENT TOXICITY TESTING
    5.3.1 Facilities and Equipment
    5.3.2 Initial Demonstration of Capability
    5.3.3 Sample Handling and Storage
    5.3.4 Quality of Test Organisms
    5.3.5 Test Conditions
    5.3.6 Test Acceptability
    5.3.7 Record Keeping and Reporting
  5.4 MACROBENTHIC COMMUNITY ASSESSMENT
    5.4.1 Sorting
    5.4.2 Species Identification and Enumeration
    5.4.3 Biomass Measurements
  5.5 FISH SAMPLING
    5.5.1 Species Identification, Enumeration and Length Measurements
    5.5.2 Fish Gross Pathology and Histopathology
  5.6 WATER COLUMN MEASUREMENTS
    5.6.1 Seabird SBE 25 Sealogger
    5.6.2 Hydrolab Datasonde 3
    5.6.3 YSI Dissolved Oxygen Meter
  5.7 NAVIGATION

6 FIELD OPERATIONS AND PREVENTIVE MAINTENANCE
  6.1 TRAINING AND SAFETY
  6.2 FIELD QUALITY CONTROL AND AUDITS
  6.3 PREVENTIVE MAINTENANCE

7 LABORATORY OPERATIONS
  7.1 LABORATORY PERSONNEL, TRAINING, AND SAFETY
  7.2 QUALITY CONTROL DOCUMENTATION
  7.3 ANALYTICAL PROCEDURES
  7.4 LABORATORY PERFORMANCE AUDITS

8 QUALITY ASSURANCE AND QUALITY CONTROL FOR MANAGEMENT OF DATA AND INFORMATION
  8.1 SYSTEM DESCRIPTION
  8.2 QUALITY ASSURANCE/QUALITY CONTROL
    8.2.1 Standardization
    8.2.2 Prelabeling of Equipment and Sample Containers
    8.2.3 Data Entry and Transfer
    8.2.4 Automated Data Verification
    8.2.5 Sample Tracking
    8.2.6 Reporting
    8.2.7 Redundancy (Backups)
    8.2.8 Human Review
  8.3 DOCUMENTATION AND RELEASE OF DATA

9 QUALITY ASSURANCE REPORTS TO MANAGEMENT

10 REFERENCES

-------
ACKNOWLEDGMENTS

The following individuals contributed to the development of this document: J. Pollard, K. Peres and T. Chiang, Lockheed Engineering and Sciences Company, Las Vegas, Nevada; C. Strobel, C. Eller, and D. Cobb, Science Applications International Corporation, Narragansett, Rhode Island; D. Bender and L. Johnson, Technology Applications Inc., Cincinnati, Ohio; R. Graves, U.S. Environmental Protection Agency, Environmental Monitoring Systems Laboratory, Cincinnati, Ohio; C.A. Manen, National Oceanic and Atmospheric Administration, Rockville, Maryland; K. Summers, U.S. Environmental Protection Agency, Environmental Research Laboratory, Gulf Breeze, Florida; R. Pruell and S. Schimmel, U.S. Environmental Protection Agency, Environmental Research Laboratory, Narragansett, Rhode Island; F. Holland and S. Weisberg, Versar, Inc., Columbia, Maryland.

The assistance provided by R. Graves in the development of measurement quality objectives for analytical chemistry is especially appreciated.

-------

Section 1 Revision 1 Date 7/91 DRAFT 1 Page 1 of 4

SECTION 1
INTRODUCTION

1.1 OVERVIEW

The U.S. Environmental Protection Agency (EPA), in cooperation with other Federal agencies and state organizations, has designed the Environmental Monitoring and Assessment Program (EMAP) to monitor indicators of the condition and health of the Nation's ecological resources. Specifically, EMAP is intended to respond to the growing demand for information characterizing the condition of our environment and the type and location of changes in our environment. Simultaneous monitoring of pollutants and environmental indicators will allow for the identification of the likely causes of adverse changes. When EMAP has been fully implemented, the program will answer the following critical questions:

o What is the status, extent and geographic distribution of the nation's important ecological resources?

o What proportion of these resources is declining or improving? Where, and at what rate?
o What are the factors that are likely to be contributing to declining condition?

o Are control and mitigation programs achieving overall improvement in ecological conditions?

o Which resources are at greatest risk from pollution impacts?

To answer these questions, the Near Coastal component of EMAP (EMAP-NC) has set four major objectives:

o Provide a quantitative assessment of the regional extent of coastal environmental problems by measuring pollution exposure and ecological condition.

o Measure changes in the regional extent of environmental problems for the nation's estuarine and coastal ecosystems.

o Identify and evaluate associations between the ecological condition of the nation's estuarine and coastal ecosystems and pollutant exposure, as well as other factors known to affect ecological condition (e.g., climatic conditions, land use patterns).

o Assess the effectiveness of pollution control actions and environmental policies on a regional scale (i.e., large estuaries like Chesapeake Bay, major coastal regions like the mid-Atlantic and Gulf Coasts) and nationally.

The Near Coastal component of EMAP will monitor the status and trends in environmental quality of the coastal waters of the United States. This program will complement, and eventually may merge with, the National Oceanic and Atmospheric Administration's (NOAA) existing National Status and Trends Program for Marine Environmental Quality to produce a single, cooperative, coastal and estuarine monitoring program.

The strategy for implementation of the Near Coastal project is a regional, phased approach which started with the 1990 Demonstration Project in the Virginian Province. This biogeographical province covers an area from Cape Cod, Massachusetts to Cape Henry, Virginia (Holland 1990).
In 1991, monitoring will continue in the Virginian Province and begin in the Louisianian Province (Gulf of Mexico from near Tampa Bay, Florida to the Texas-Mexico border at the Rio Grande). Additional provinces will be added in future years, eventually resulting in full national implementation of EMAP-Near Coastal.

1.2 QUALITY ASSURANCE PROJECT PLAN SPECIFICATIONS

The quality assurance policy of the EPA requires every monitoring and measurement project to have a written and approved quality assurance plan (Stanley and Verner 1983). This requirement applies to all environmental monitoring and measurement efforts authorized or supported by the EPA through regulations, grants, contracts, or other means.

The quality assurance plan for the project specifies the policies, organization, objectives, and functional activities for the project. The plan also describes the quality assurance and quality control activities and measures that will be implemented to ensure that the data will meet all criteria for data quality established for the project. All project personnel must be familiar with the policies and objectives outlined in this quality assurance plan to assure proper interactions among the various data acquisition and management components of the project.

EPA guidance (Stanley and Verner 1983) states that the 15 items shown in Table 1-1 should be addressed in the QA Project Plan. Some of these items are extensively addressed in other documents for this project and therefore are only summarized or referenced in this document.

TABLE 1-1. SECTIONS IN THIS REPORT THAT ADDRESS THE 15 SUBJECTS REQUIRED IN A QUALITY ASSURANCE PROJECT PLAN.
Quality Assurance Subject                      This Report

Title page                                     Title page
Table of contents                              Table of contents
Project description                            Section 3
Project organization and responsibility        Section 2
QA objectives                                  Section 4
Sampling procedures                            Section 6
Sample custody                                 Section 8
Calibration procedures                         Sections 5, 6, 7
Analytical procedures                          Section 7
Data reduction, validation, and reporting      Sections 8, 9
Internal QC checks                             Section 5
Performance and system audits                  Sections 5, 6, 7
Preventive maintenance                         Section 6
Corrective action                              Section 5
QA reports to management                       Section 9

-------

Section 2 Revision 1 Date 7/91 DRAFT 1 Page 1 of 3

SECTION 2
PROJECT ORGANIZATION

2.1 MANAGEMENT STRUCTURE

For the EMAP-Near Coastal monitoring in the Virginian Province, expertise in specific research and monitoring areas will be provided by several EPA laboratories and their contracting organizations. The Environmental Research Laboratory in Narragansett, Rhode Island (ERL-N) has been designated as the principal laboratory for EMAP-NC monitoring in the Virginian Province, and therefore will provide direction and support for all activities. Technical support is provided to ERL-N by Science Applications International Corporation (SAIC), Versar Incorporated, and Computer Sciences Corporation. The Environmental Monitoring Systems Laboratory in Cincinnati, Ohio (EMSL-CIN) will provide additional technical support for quality assurance activities and analysis of chemical contaminants in sediment and tissue samples. The Environmental Research Laboratory in Gulf Breeze, Florida (ERL-GB) has been designated as the principal laboratory for the statistical design of the Near Coastal monitoring effort.

Figure 2-1 illustrates the management structure for the EMAP-NC 1991 Virginian Province monitoring. All key personnel involved in the 1991 Virginian Province monitoring are listed in Table 2-1.

Figure 2-1. Management structure for the 1991 EMAP-NC Virginian Province monitoring.
TABLE 2-1. LIST OF KEY PERSONNEL, AFFILIATIONS, AND RESPONSIBILITIES FOR THE EMAP-NEAR COASTAL 1991 VIRGINIAN PROVINCE MONITORING.

NAME            ORGANIZATION            RESPONSIBILITY

F. Kutz         U.S. EPA-DC             EMAP Director
D. McKenzie     U.S. EPA-RTP            Deputy Director
J. Paul         U.S. EPA-Narragansett   NC Associate Director
R. Latimer      U.S. EPA-Narragansett   NC Technical Director
K. Summers      U.S. EPA-Gulf Breeze    NC Design Lead
S. Schimmel     U.S. EPA-Narragansett   Virginian Province Manager
C. Strobel      SAIC                    Virginian Province Field Coordinator
B. Graves       U.S. EPA-Cincinnati     EMAP QA Coordinator
R. Valente      SAIC                    EMAP-NC QA Officer
J. Schoenherr   SAIC                    Virginian Province QA Officer
B. Thomas       U.S. EPA-Cincinnati     Contaminant Analyses-Sediments
J. Brooks       Texas A&M Univ.         Contaminant Analyses-Tissue
J. Scott        SAIC                    Toxicology/Sampling
G. Thursby      SAIC                    Toxicology
G. Gardner      U.S. EPA-Narragansett   Histopathology
D. Keith        U.S. EPA-Narragansett   Sediment Physical Analyses
N. Mountford    Cove Corporation        Benthic Analyses
J. Baker        LESC                    Logistics Support
F. Holland      Versar, Inc.            Technical Support
S. Weisberg     Versar, Inc.            Technical Support
J. Frithsen     Versar, Inc.            Technical Support
J. Rosen        CSC                     Information Management
A. Cantillo     NOAA                    NOAA QA Liaison

-------

Section 3 Revision 1 Date 7/91 DRAFT 1 Page 1 of 2

SECTION 3
PROJECT DESCRIPTION

3.1 PURPOSE

Complete descriptions of the EMAP-NC monitoring approach and rationale, sampling design, indicator strategy, logistics, and data assessment plan are provided in the Near Coastal Program Plan for 1990: Estuaries (Holland 1990). Briefly, the objectives of the 1991 Near Coastal Virginian Province monitoring are to:

o Obtain estimates of the variability associated with Near Coastal indicators which will allow establishment of program-level data quality objectives (DQOs).

o Evaluate the utility, sensitivity, and applicability of the EMAP-Near Coastal indicators on a regional scale.
o Determine the effectiveness of the EMAP network design for quantifying the extent and magnitude of pollution problems in the Virginian Province.

o Demonstrate the usefulness of results for the purposes of planning, prioritization, and determining the effectiveness of existing pollutant control actions.

o Develop methods for indicators that can be transferred to EMAP-NC user groups.

o Identify and resolve logistical issues associated with implementing the network design in the Virginian Province.

The strategy for accomplishing the above objectives will be to continue to field test the sensitivity of the proposed Near Coastal indicators and network design through a second year of sampling in the Virginian Province estuaries. Estuaries were selected as the target ecosystem because their natural circulation patterns concentrate and retain pollutants. Estuaries are spawning and nursery grounds for many species of living resources, and the estuarine watersheds receive a great proportion of the pollutants discharged into the waterways of the U.S.

-------

Section 4 Revision 1 Date 7/91 DRAFT 1 Page 1 of 8

SECTION 4
QUALITY ASSURANCE OBJECTIVES

4.1 DATA QUALITY OBJECTIVES

The EMAP-Near Coastal personnel are making a variety of measurements to monitor a defined set of parameters (i.e., indicators of estuarine and coastal environmental quality). Complete descriptions of the program's objectives and indicator strategy are presented in the Near Coastal Program Plan (Holland 1990) and will not be repeated here. To successfully meet the objectives, the program's assessments of ecosystem health must be based on scientifically sound interpretations of the data. To achieve this end, and as required by EPA for all monitoring and measurement programs, objectives must be established for data quality based on the proposed uses of the data (Stanley and Verner 1985).
The primary purpose of the quality assurance program is to maximize the probability that the resulting data will meet or exceed the data quality objectives (DQOs) specified for the project. Data quality objectives established for the EMAP-Near Coastal project, however, are based on control of the measurement system, because error bounds cannot, at present, be established for end use of indicator response data. As a consequence, management decisions balancing the cost of higher quality data against program objectives are not presently possible. As data are accumulated on indicators and the error rates associated with them are established, end-use DQOs can be established and quality assurance systems implemented to assure acceptable data quality to meet pre-established program objectives.

Data quality objectives for the various measurements being made on EMAP-Near Coastal can be expressed in terms of accuracy, precision, and completeness goals (Table 4-1). These data quality objectives can more accurately be termed "measurement quality objectives" (MQOs), because they are based solely on the likely magnitude of error generated through the measurement process. The MQOs for the Near Coastal project were established by obtaining estimates of the most likely data quality that is achievable, based on either the instrument manufacturer's specifications or historical data. Scientists familiar with each particular data type provided estimates of likely measurement error for a given measurement process. The MQOs presented in Table 4-1 are used as quality control criteria in both field and laboratory measurement processes to set the bounds of acceptable measurement error. Generally speaking, DQOs or MQOs are established for five aspects of data quality: representativeness, completeness, comparability, accuracy, and precision (Stanley and Verner 1985).
These terms are defined below with general guidelines for establishing MQOs for each QA parameter.

4.2 REPRESENTATIVENESS

Representativeness is defined as "the degree to which the data accurately and precisely represent a characteristic of a population parameter, variation of a property, a process characteristic, or an operational condition" (Stanley and Verner 1985). Representativeness applies to the location of sampling or monitoring sites, to the collection of samples or field measurements, to the analysis of those samples, and to the types of samples being used to evaluate various aspects of data quality.

The location of sampling sites and the design of the sampling program for EMAP-Near Coastal monitoring in the Virginian Province provide the primary focus for defining representative population estimates from this region of the estuarine environment. The proposed sampling design combines the strengths of systematic and random sampling with an understanding of estuarine systems to collect data that will provide unbiased estimates of the status of the Nation's estuarine resources. Field protocols are documented in the Near Coastal Field Operations and Safety Manual (Strobel and Schimmel 1991) for future reference and protocol standardization, as are laboratory measurement protocols in the Laboratory Methods Manual (U.S. EPA, in preparation). The types of QA documentation samples (i.e., performance evaluation material) used to assess the quality of chemical data will be as representative as possible of the natural samples collected during the project with respect to both composition and concentration.

TABLE 4-1. MEASUREMENT QUALITY OBJECTIVES FOR EMAP-NEAR COASTAL INDICATORS AND ASSOCIATED DATA.

                                      Maximum Allowable   Maximum Allowable
                                      Accuracy (Bias)     Precision           Completeness
Indicator/Data Type                   Goal                Goal                Goal

Sediment contaminant analyses:
  Organics                            30%                 30%                 90%
  Inorganics                          15%                 15%                 90%

Fish tissue contaminant analyses:
  Organics                            30%                 30%                 90%
  Inorganics                          15%                 15%                 90%

Sediment toxicity                     NA                  NA                  90%

Benthic species composition and biomass:
  Sorting                             10%                 NA                  90%
  Counting                            10%                 NA                  90%
  Taxonomy                            10%                 NA                  90%
  Biomass                             NA                  10%                 90%

Sediment characteristics:
  Grain size analyses                 NA                  10%                 90%
  Total organic carbon                10%                 10%                 90%
  Acid volatile sulfide               10%                 10%                 90%

Water column characteristics:
  Dissolved oxygen                    ± 1.0 mg/L          10%                 90%
  Salinity                            ± 1.0 ppt           10%                 90%
  Depth                               ± 0.5 m             10%                 90%
  pH                                  ± 0.2 units         NA                  90%
  Temperature                         ± 0.5 °C            NA                  90%
  Total suspended solids              NA                  10%                 90%

Gross pathology of fish               NA                  10%                 90%

Fish community composition:
  Counting                            10%                 NA                  90%
  Taxonomic identification            10%                 NA                  90%
  Length determinations               ± 5 mm              NA                  90%

Fish histopathology                   NA                  NA                  NA

Apparent RPD depth                    ± 5 mm              NA                  90%

4.3 COMPLETENESS

Completeness is defined as "a measure of the amount of data collected from a measurement process compared to the amount that was expected to be obtained under the conditions of measurement" (Stanley and Verner 1985). A criterion of 75 to 90 percent valid data from a given measurement process is suggested as reasonable for the Near Coastal project. As data are compiled for the various indicators, more realistic criteria for completeness can be developed. The suggested criterion for each data type to be collected is presented in Table 4-1.

4.4 COMPARABILITY

Comparability is defined as "the confidence with which one data set can be compared to another" (Stanley and Verner 1985).
Comparability of reporting units and calculations, data base management processes, and interpretative procedures must be assured if the overall goals of EMAP are to be realized. One goal of the EMAP-Near Coastal program is to generate a high level of documentation for the above topics to ensure that future EMAP efforts can be made comparable. For example, both field and laboratory methods are described in full detail in manuals which will be made available to all field personnel and analytical laboratories. Field crews will undergo intensive training in a single three-week session prior to the start of field work. Finally, the sampling design for the Virginian Province monitoring has been made flexible enough to allow for analytical adjustments, when necessary, to ensure data comparability.

4.5 ACCURACY (BIAS), PRECISION, AND TOTAL ERROR

The term "accuracy", which is used synonymously with the term bias in this plan, is defined as the difference between a measured value and the true or expected value, and represents an estimate of systematic error or net bias (Kirchner 1983; Hunt and Wilson 1986; Taylor 1987). Precision is defined as the degree of mutual agreement among individual measurements, and represents an estimate of random error (Kirchner 1983; Hunt and Wilson 1986; Taylor 1987). Collectively, accuracy and precision can provide an estimate of the total error or uncertainty associated with an individual measured value.

Measurement quality objectives for the various indicators are expressed separately as maximum allowable accuracy (i.e., bias) and precision goals (Table 4-1). Accuracy and precision goals may not be definable for all parameters, due to the nature of the measurement type. For example, accuracy measurements are not possible for toxicity testing and fish pathology identifications because "true" or expected values do not exist for these measurement parameters (see Table 4-1).
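The accuracy, precision, and completeness goals discussed above reduce to simple arithmetic checks. The following sketch is illustrative only and is not part of the EMAP protocols: the function names and example values are invented, and the 30%/30%/90% thresholds in the example are taken from the sediment-organics row of Table 4-1.

```python
# Illustrative MQO checks (hypothetical helpers; not EMAP-specified code).

def percent_bias(measured: float, certified: float) -> float:
    """Systematic error: signed deviation from a certified (true) value, in percent."""
    return 100.0 * (measured - certified) / certified

def relative_percent_difference(x1: float, x2: float) -> float:
    """Random error for a duplicate pair: absolute difference relative to the mean, in percent."""
    mean = (x1 + x2) / 2.0
    return 100.0 * abs(x1 - x2) / mean

def percent_complete(n_valid: int, n_expected: int) -> float:
    """Amount of valid data obtained versus the amount expected."""
    return 100.0 * n_valid / n_expected

# Example: a sediment organics batch, checked against the Table 4-1 goals
# (bias <= 30%, precision <= 30%, completeness >= 90%). Values are invented.
bias = percent_bias(measured=82.0, certified=100.0)       # -18.0 %
rpd = relative_percent_difference(45.0, 55.0)             # 20.0 %
complete = percent_complete(n_valid=19, n_expected=20)    # 95.0 %

assert abs(bias) <= 30.0 and rpd <= 30.0 and complete >= 90.0
```

Note that bias requires a known reference (certified) value, which is why Table 4-1 lists "NA" for parameters such as toxicity tests, where no true value exists.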
In order to evaluate the MQOs for accuracy and precision, various QA/QC samples will be collected and analyzed for most data collection activities. Table 4-2 presents the types of samples to be used for quality assurance/quality control for each of the various data acquisition activities except sediment and fish tissue contaminant analyses. The frequency of QA/QC measurements and the types of QA data resulting from these samples or processes are also presented in Table 4-2. Because several different types of QA/QC samples are required for the complex analyses of chemical contaminants in sediment and tissue samples, they are presented and discussed separately in Section 5.1, along with the warning and control limits for the various chemistry QC sample types.

TABLE 4-2. QUALITY ASSURANCE SAMPLE TYPES, FREQUENCY OF USE, AND TYPES OF DATA GENERATED FOR EMAP-NEAR COASTAL VIRGINIAN PROVINCE MONITORING (SEE TABLE 5-1 FOR CHEMICAL ANALYSIS QA/QC SAMPLE TYPES).

Measurement Variable | QA Sample Type or Procedure | Frequency of Use | Data Generated for Measurement Quality Definition

Sediment toxicity tests | Reference toxicant | Each experiment | Variance of replicated tests over time

Benthic species composition and biomass:
  Sorting | Resort of complete sample, including debris | 10% of each tech's work | No. of animals resorted
  Sample counting and ID | Recount and ID of sorted animals | 10% of each tech's work | No. of count and ID errors
  Biomass | Duplicate weights | 10% of samples | Duplicate results

Sediment grain size | Splits of a sample | 10% of each tech's work | Duplicate results

Organic carbon and acid volatile sulfide | Duplicates and analysis of standards | Each batch | Duplicate results and standard recoveries

Dissolved oxygen conc. | Comparison of Hydrolabs and Seabird CTDs with YSI dissolved oxygen meter | Each cast (CTD); before and after retrieval (Hydrolab) | Difference between Hydrolab or CTD and YSI meter values

Salinity | Refractometer reading | Once each day | Difference between probe and refractometer

Temperature | Thermometer check | Once each day | Difference between probe and thermometer

Depth | Check bottom depth against depth finder on boat | Once at each sampling location | Replicated difference from actual

% Transmission | Duplicate suspended solids samples from surface | 10% of stations | Difference between duplicates

pH | QC check with buffer solution standard | Once each day | Difference from standard

Fish identification | Check specimens sent back to laboratory for confirmation | Once/crew for each target species | Number of misidentifications

Fish counts | Duplicate counts | 10% of trawls | Replicated difference between determinations

Fish gross pathology | Check specimens sent back to laboratory for confirmation | Regular intervals | Number of misidentifications

Fish histopathology | Independent confirmation by second technician | 5% of slides | Number of confirmations

Apparent RPD depth | Duplicate measurements | 10% of samples | Duplicate results

-------

Section 5 Revision 2 Date 4/92 Page 1 of 49

SECTION 5
QUALITY ASSURANCE/QUALITY CONTROL PROTOCOLS, CRITERIA, AND CORRECTIVE ACTION

Complete and detailed protocols for field and laboratory measurements can be found in the 1991 Virginian Province Field Operations and Safety Manual (Strobel and Schimmel 1991) and in the EMAP-Estuaries Laboratory Methods Manual (U.S. EPA, in prep.), respectively. Specific QA/QC procedures to be followed during the 1991 Virginian Province monitoring are presented in the following sections.
5.1 CHEMICAL ANALYSIS OF SEDIMENT AND FISH TISSUE SAMPLES

The EMAP-E program will measure a variety of organic and inorganic contaminants in estuarine sediment and fish tissue samples (Tables 5-1 and 5-2); these compounds are identical to those measured in NOAA's National Status and Trends (NS&T) Program. No single analytical method has been approved officially for low-level (i.e., low parts per billion) analysis of organic and inorganic contaminants in estuarine sediments and fish tissue. Recommended methods for the EMAP-E program are those used in the NS&T Program (Lauenstein in prep.), as well as those documented in the EMAP-E Laboratory Methods Manual (U.S. EPA in prep.). EMAP-E does not require that a single, standardized analytical method be followed, but rather that participating laboratories demonstrate proficiency and comparability through routine analysis of Certified Reference Materials(1) (CRMs) or similar types of accuracy-based materials.

(1) Certified Reference Materials are samples in which chemical concentrations have been determined accurately using a variety of technically valid procedures; these samples are accompanied by a certificate or other documentation issued by a certifying body (e.g., agencies such as the National Research Council of Canada (NRCC), U.S. EPA, U.S. Geological Survey, etc.). Standard Reference Materials (SRMs) are CRMs issued by the National Institute of Standards and Technology (NIST), formerly the National Bureau of Standards (NBS). A useful catalogue of marine science reference materials has been compiled by Cantillo (1990).

TABLE 5-1. CHEMICALS TO BE MEASURED IN SEDIMENTS BY EMAP-E VIRGINIAN PROVINCE.
Polyaromatic Hydrocarbons (PAHs):
  Acenaphthene, Acenaphthylene, Anthracene, Benz(a)anthracene, Benzo(a)pyrene, Benzo(b)fluoranthene, Benzo(e)pyrene, Benzo(g,h,i)perylene, Benzo(k)fluoranthene, Biphenyl, Chrysene, Dibenz(a,h)anthracene, 2,6-dimethylnaphthalene, Fluoranthene, Fluorene, Indeno(1,2,3-c,d)pyrene, 1-methylnaphthalene, 2-methylnaphthalene, 1-methylphenanthrene, Naphthalene, Perylene, Phenanthrene, Pyrene, 2,3,5-trimethylnaphthalene

DDT and its metabolites:
  2,4'-DDD, 4,4'-DDD, 2,4'-DDE, 4,4'-DDE, 2,4'-DDT, 4,4'-DDT

Chlorinated pesticides other than DDT:
  Aldrin, Alpha-Chlordane, Trans-Nonachlor, Dieldrin, Heptachlor, Heptachlor epoxide, Hexachlorobenzene, Lindane (gamma-BHC), Mirex

18 PCB Congeners (PCB No., compound name):
  8    2,4'-dichlorobiphenyl
  18   2,2',5-trichlorobiphenyl
  28   2,4,4'-trichlorobiphenyl
  44   2,2',3,5'-tetrachlorobiphenyl
  52   2,2',5,5'-tetrachlorobiphenyl
  66   2,3',4,4'-tetrachlorobiphenyl
  101  2,2',4,5,5'-pentachlorobiphenyl
  105  2,3,3',4,4'-pentachlorobiphenyl
  118  2,3',4,4',5-pentachlorobiphenyl
  128  2,2',3,3',4,4'-hexachlorobiphenyl
  138  2,2',3,4,4',5'-hexachlorobiphenyl
  153  2,2',4,4',5,5'-hexachlorobiphenyl
  170  2,2',3,3',4,4',5-heptachlorobiphenyl
  180  2,2',3,4,4',5,5'-heptachlorobiphenyl
  187  2,2',3,4',5,5',6-heptachlorobiphenyl
  195  2,2',3,3',4,4',5,6-octachlorobiphenyl
  206  2,2',3,3',4,4',5,5',6-nonachlorobiphenyl
  209  2,2',3,3',4,4',5,5',6,6'-decachlorobiphenyl

Major Elements:
  Aluminum, Iron, Manganese

Trace Elements:
  Antimony, Arsenic, Cadmium, Chromium, Copper, Lead, Mercury, Nickel, Selenium, Silver, Tin, Zinc

Other measurements:
  Acid volatile sulfide, Total organic carbon, Tributyltin, Dibutyltin, Monobutyltin

TABLE 5-2. CHEMICALS TO BE MEASURED IN FISH TISSUE BY EMAP-E VIRGINIAN PROVINCE.
DDT and its metabolites:
2,4'-DDD, 4,4'-DDD, 2,4'-DDE, 4,4'-DDE, 2,4'-DDT, 4,4'-DDT

Chlorinated pesticides other than DDT:
Aldrin, Alpha-Chlordane, Trans-Nonachlor, Dieldrin, Heptachlor, Heptachlor epoxide, Hexachlorobenzene, Lindane (gamma-BHC), Mirex

Trace Elements:
Aluminum, Arsenic, Cadmium, Chromium, Copper, Iron, Lead, Mercury, Nickel, Selenium, Silver, Tin, Zinc

18 PCB Congeners (PCB No. / Compound name):
8 2,4'-dichlorobiphenyl
18 2,2',5-trichlorobiphenyl
28 2,4,4'-trichlorobiphenyl
44 2,2',3,5'-tetrachlorobiphenyl
52 2,2',5,5'-tetrachlorobiphenyl
66 2,3',4,4'-tetrachlorobiphenyl
101 2,2',4,5,5'-pentachlorobiphenyl
105 2,3,3',4,4'-pentachlorobiphenyl
118 2,3',4,4',5-pentachlorobiphenyl
128 2,2',3,3',4,4'-hexachlorobiphenyl
138 2,2',3,4,4',5'-hexachlorobiphenyl
153 2,2',4,4',5,5'-hexachlorobiphenyl
170 2,2',3,3',4,4',5-heptachlorobiphenyl
180 2,2',3,4,4',5,5'-heptachlorobiphenyl
187 2,2',3,4',5,5',6-heptachlorobiphenyl
195 2,2',3,3',4,4',5,6-octachlorobiphenyl
206 2,2',3,3',4,4',5,5',6-nonachlorobiphenyl
209 2,2',3,3',4,4',5,5',6,6'-decachlorobiphenyl

Furthermore, through an interagency agreement with NOAA's NS&T Program, all EMAP-E analytical laboratories are required to participate in an on-going series of laboratory intercomparison exercises (round-robins), conducted jointly by the National Institute of Standards and Technology (NIST) and the National Research Council of Canada (NRCC). Laboratories must participate in these QA intercomparison exercises both to demonstrate initial capability (i.e., prior to the analysis of actual samples) and on a continuing basis throughout the project. The EMAP-E laboratories will be required to initiate corrective actions if their performance in these intercomparison exercises falls below certain pre-determined minimum standards, described in later sections.
As discussed earlier, the data quality objectives for EMAP-E were developed with the understanding that the data will not be used for litigation purposes. Therefore, legal and contracting requirements as stringent as those used in the U.S. EPA Contract Laboratory Program, for example, need not be applied to EMAP-E. Rather, it is the philosophy of EMAP-E that as long as required QA/QC procedures are followed and comparable analytical performance is demonstrated through the routine analysis of Certified Reference Materials and through the on-going QA intercomparison exercises, the multiple procedures used by different laboratories for the analysis of different compound classes should yield comparable results. This represents a "performance-based" approach to quality assurance for low-level contaminant analyses, involving continuous laboratory evaluation through the use of accuracy-based materials (CRMs), laboratory fortified sample matrices, laboratory reagent blanks, calibration standards, and laboratory and field replicates. The conceptual basis for the use of each of these types of quality control samples is presented in the following sections.

5.1.1 General QA/QC Requirements

The guidance provided in the following sections is based largely on the protocols developed for the Puget Sound Estuary Program (U.S. EPA 1989); it is applicable to low parts-per-billion analyses of both sediment and tissue samples unless otherwise noted. The QA/QC requirements are intended to provide a common foundation for each laboratory's protocols; the resultant QA/QC data will enable an assessment of the comparability of results generated by different laboratories and different analytical procedures. The QA/QC requirements specified in this plan represent the minimum requirements for any given analytical method.
Additional method-specific requirements should always be followed, provided the minimum requirements presented in this document have been met. Data for all QA/QC variables must be submitted by the laboratory as part of the data package; the completeness of each submitted data package will be checked by the Virginian Province Manager, Quality Assurance Coordinator, or their designee(s). Data validation will be conducted by qualified personnel to ascertain that control limits for QA/QC samples have been met or, if exceeded, that acceptable narrative explanations have been provided by the laboratory along with the submitted data (a more detailed description of data reporting requirements is provided in Section 5.1.13). The QA/QC data will be used initially to assess the accuracy and precision of individual laboratory measurements, and ultimately to assess the comparability of data generated by different laboratories.

The results for the various QA/QC samples should be reviewed by laboratory personnel immediately following the analysis of each sample batch. These results should then be used to determine, before a subsequent sample batch is processed, whether warning and control limit criteria have been exceeded and corrective actions must be taken. When warning limit criteria have not been met, the laboratory is not obligated to halt analyses, but the analyst(s) is advised to investigate the cause of the exceedance. When control limit criteria are not met, specific corrective actions are required before the analyses may proceed. Warning and control limit criteria and the recommended frequency of analysis for each QA/QC element or sample type required in the EMAP-E program are summarized in Table 5-3. Descriptions of the use, frequency of analysis, type of information obtained, and corrective actions for each of these QA/QC sample types or elements are provided in the following sections.
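The distinction drawn above between warning limits (investigate, but continue) and control limits (halt and correct) can be summarized in the following illustrative sketch. The function name and return strings are hypothetical and are not part of the EMAP-E specification; actual limits are specific to each QC element in Table 5-3.

```python
# Minimal sketch of the warning/control-limit triage described above.
# Names are illustrative only; real limits are QC-element-specific (Table 5-3).

def qc_action(value, warning_limit, control_limit):
    """Classify a QC result: proceed, investigate (warning), or halt (control)."""
    if value > control_limit:
        # Control limit exceeded: specific corrective action is required
        # before the next sample batch may be processed.
        return "halt: corrective action required"
    if value > warning_limit:
        # Warning limit exceeded: analyses may continue, but the analyst
        # is advised to investigate the cause of the exceedance.
        return "continue: investigate exceedance"
    return "continue"
```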
5.1.2 Initial Calibration

Equipment should be calibrated prior to the analysis of each sample batch, after each major equipment disruption, and whenever on-going calibration checks do not meet recommended control limit criteria (Table 5-3). All calibration standards should be traceable to a recognized organization for the preparation and certification of QA/QC materials (e.g., the National Institute of Standards and Technology, the U.S. Environmental Protection Agency, etc.). Calibration curves must be established for each element and batch analysis from a calibration blank and a minimum of three analytical standards of increasing concentration, covering the range of expected sample concentrations. The calibration curve should be well-characterized and must be established prior to the analysis of samples. Only data resulting from quantification within the demonstrated working calibration range may be reported by the laboratory (i.e., quantification based on extrapolation is not acceptable). Samples outside the calibration range should be diluted or concentrated, as appropriate, and reanalyzed.

TABLE 5-3. KEY ELEMENTS FOR QUALITY CONTROL OF EMAP-ESTUARIES CHEMICAL ANALYSES (SEE TEXT FOR DETAILED EXPLANATIONS).

Element or Sample Type / Warning Limit Criteria / Control Limit Criteria / Frequency

1.) Initial Demonstration of Capability (Prior to Analysis of Samples):

- Instrument Calibration: Warning: NA; Control: NA; Frequency: initial, and then prior to analyzing each batch of samples.

- Calculation of Method Detection Limits: Control: must be equal to or less than target values (see Table 5-4); Frequency: at least once each year.

- Blind Analysis of Accuracy-Based Material: Warning: NA; Control: NA; Frequency: initial.

2.) On-going Demonstration of Capability:

- Blind Analysis of Laboratory Intercomparison Exercise Samples: Warning: NA; Control: NA; Frequency: regular intervals throughout the year.
- Continuing Calibration Checks using Calibration Standard Solutions: Warning: NA; Control: should be within ±15% of initial calibration on average for all analytes, not to exceed ±25% for any one analyte; Frequency: at a minimum, middle and end of each sample batch.

3.) Analysis of Certified Reference Material (CRM) or Laboratory Control Material (LCM):

- Precision (see NOTE 1): Warning: NA; Control: value obtained for each analyte should be within 3s control chart limits; Frequency: one with each batch of samples; value plotted on control chart after each analysis of the CRM.

- Relative Accuracy (see NOTE 2):
  PAHs: Warning: lab's value should be within ±25% of true value on average for all analytes, not to exceed ±30% of true value for more than 30% of individual analytes; Control: lab's value should be within ±30% of true value on average for all analytes, not to exceed ±35% of true value for more than 30% of individual analytes.
  PCBs/pesticides: same as above.
  Inorganic elements: Warning: lab should be within ±15% of true value for each analyte; Control: lab should be within ±20% of true value for each analyte.

NOTE 1: The use of control charts to monitor precision for each analyte of interest should follow generally accepted practices (e.g., Taylor 1987). Upper and lower control limits, based on three standard deviations (3s) of the mean, should be updated at regular intervals.

NOTE 2: "True" values in CRMs may be either "certified" or "non-certified" (it is recognized that absolute accuracy can only be assessed using certified values, hence the term relative accuracy). Relative accuracy is computed by comparing the laboratory's value for each analyte against either end of the range of values (i.e., the 95% confidence limits) reported by the certifying agency. The laboratory's value must be within ±35% of either the upper or lower 95% confidence interval value.
Accuracy control limit criteria apply only for analytes having CRM concentrations > 10 times the laboratory's MDL.

4.) Laboratory Reagent Blank: Warning: analysts should use best professional judgement if analytes are detected at <3 times the MDL; Control: no analyte should be detected at >3 times the MDL; Frequency: one with each batch of samples.

5.) Laboratory Fortified Sample Matrix (Matrix Spike): Warning: NA; Control: recovery should be within the range 50% to 120% for at least 80% of the analytes; Frequency: at least 5% of the total number of samples. NOTE: Samples to be spiked should be chosen at random; matrix spike solutions should contain all the analytes of interest. The final spiked concentration of each analyte in the sample should be at least 10 times the calculated MDL.

6.) Laboratory Fortified Sample Matrix Duplicate (Matrix Spike Duplicate): Warning: NA; Control: RPD1 must be <30 for each analyte; Frequency: same as matrix spike.

7.) Field Duplicates (Field Splits): Warning: NA; Control: NA; Frequency: 5% of total number of samples.

8.) Internal Standards (Surrogates): Warning: NA; Control: recovery must be within the range 30% to 150%; Frequency: each sample.

9.) Injection Internal Standards: lab develops its own criteria; Frequency: each sample.

1 RPD = relative percent difference between matrix spike and matrix spike duplicate results (see Section 5.1.11 for the equation).

5.1.3 Initial Documentation of Method Detection Limits

Analytical chemists have coined a variety of terms to define "limits" of detectability; definitions for some of the more commonly used terms are provided in Keith et al. (1983) and in Keith (1991). In the EMAP-E program, the Method Detection Limit (MDL) will be used to define the analytical limit of detectability. The MDL represents a quantitative estimate of the low-level response detected at the maximum sensitivity of a method.
The Code of Federal Regulations (40 CFR Part 136) gives the following rigorous definition: "the MDL is the minimum concentration of a substance that can be measured and reported with 99% confidence that the analyte concentration is greater than zero and is determined from analysis of a sample in a given matrix containing the analyte." Confidence in the apparent analyte concentration increases as the analyte signal increases above the MDL.

Each EMAP-E analytical laboratory must calculate and report an MDL for each analyte of interest in each matrix of interest (sediment or tissue) prior to the analysis of field samples for a given year. Each laboratory is required to follow the procedure specified in 40 CFR Part 136 (Federal Register, Oct. 28, 1984) to calculate MDLs for each analytical method employed. The matrix and the amount of sample (i.e., dry weight of sediment or tissue) used in calculating the MDL should match as closely as possible the matrix and amount of sample typically encountered in actual field samples. To ensure comparability of results among different laboratories, MDL target values have been established for the EMAP-E program (Table 5-4). The initial MDLs reported by each laboratory must be equal to or less than these specified target values before the analysis of field samples may proceed. Each laboratory must periodically (i.e., at least once each year) re-evaluate its MDLs for the analytical methods used and the sample matrices typically encountered.

TABLE 5-4.
TARGET METHOD DETECTION LIMITS FOR EMAP-ESTUARIES ANALYTES.

INORGANICS (NOTE: concentrations in µg/g (ppm), dry weight)

Analyte: Tissue / Sediments
Aluminum: 10.0 / 1500
Antimony: not measured / 0.2
Arsenic: 2.0 / 1.5
Cadmium: 0.2 / 0.05
Chromium: 0.1 / 5.0
Copper: 5.0 / 5.0
Iron: 50.0 / 500
Lead: 0.1 / 1.0
Manganese: not measured / 1.0
Mercury: 0.01 / 0.01
Nickel: 0.5 / 1.0
Selenium: 1.0 / 0.1
Silver: 0.01 / 0.01
Tin: 0.05 / 0.1
Zinc: 50.0 / 2.0

ORGANICS (NOTE: concentrations in ng/g (ppb), dry weight)

Analyte: Tissue / Sediments
PAHs: not measured / 10
PCB congeners: 2.0 / 1.0
Chlorinated pesticides: 2.0 / 1.0

5.1.4 Initial Blind Analysis of a Representative Sample

A representative sample matrix which is uncompromised, homogeneous, and contains the analytes of interest at concentrations of interest will be provided to each analytical laboratory new to the EMAP-E program; this sample will be used to evaluate laboratory performance prior to the analysis of field samples. The sample used for this initial demonstration of laboratory capability typically will be distributed blind (i.e., the laboratory will not know the concentrations of the analytes of interest) as part of the laboratory QA intercomparison exercises. A laboratory's performance generally will be considered acceptable if its submitted values are within ±30% (for organic analyses) or ±20% (for inorganic analyses) of the known concentration of each analyte of interest in the sample. These criteria apply only for analyte concentrations equal to or greater than 10 times the MDL established by the laboratory. If the results of the initial analysis fail to meet these criteria, the laboratory will be required to repeat the analysis until the performance criteria are met, prior to the analysis of real samples.
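The MDL procedure referenced in Section 5.1.3 (40 CFR Part 136, Appendix B) computes the MDL from replicate analyses of a low-level spiked sample as the replicate standard deviation multiplied by the one-tailed 99% Student's t value. The sketch below illustrates the calculation; the function name and replicate data are hypothetical, and the t table covers only a few common replicate counts.

```python
import statistics

# Sketch of the 40 CFR Part 136 MDL calculation: analyze at least 7 replicates
# of a low-level spiked sample, then MDL = t(n-1, 0.99) * s, where s is the
# standard deviation of the replicate results.
# One-tailed 99% Student's t values, keyed by number of replicates (n = df + 1);
# only a few common cases are tabulated here.
T_99 = {7: 3.143, 8: 2.998, 9: 2.896, 10: 2.821}

def method_detection_limit(replicates):
    n = len(replicates)
    if n < 7:
        raise ValueError("40 CFR 136 requires at least 7 replicates")
    return T_99[n] * statistics.stdev(replicates)

# Hypothetical replicate results (ug/g) for a spiked sediment sample:
mdl = method_detection_limit([1.02, 0.98, 1.10, 0.95, 1.05, 1.01, 0.99])
```

The computed MDL (here roughly 0.15 ug/g) would then be compared against the Table 5-4 target value for the analyte and matrix in question.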
5.1.5 Laboratory Participation in Intercomparison Exercises

The laboratory QA intercomparison exercises referred to previously are sponsored jointly by the EMAP-E and NOAA NS&T Programs to evaluate both the individual and collective performance of their participating analytical laboratories. Following the initial demonstration of capability, each EMAP-E laboratory is required to participate in these on-going intercomparison exercises as a continuing check on performance and intercomparability. Usually, three or four different exercises are conducted over the course of a year. In a typical exercise, either NIST or NRCC will distribute performance evaluation samples in common to each laboratory, along with detailed instructions for analysis. A variety of performance evaluation samples have been utilized in the past, including accuracy-based solutions, sample extracts, and representative matrices (e.g., sediment or tissue samples). Laboratories are required to analyze the sample(s) "blind" and must submit their results in a timely manner both to the Virginian Province QA Coordinator and to either NIST or NRCC (as instructed). Laboratories which fail to maintain acceptable performance may be required to provide an explanation and/or undertake appropriate corrective actions.

At the end of each calendar year, coordinating personnel at NIST and NRCC hold a QA workshop to present and discuss the intercomparison exercise results. Representatives from each laboratory are encouraged to participate in these annual QA workshops, which provide a forum for discussion of analytical problems brought to light in the intercomparison exercises.

5.1.6 Routine Analysis of Certified Reference Materials or Laboratory Control Materials

Certified Reference Materials (CRMs) generally are considered the most useful QC samples for assessing the accuracy of a given analysis (i.e., the closeness of a measurement to the "true" value).
Certified Reference Materials can be used to assess accuracy because they have "certified" concentrations of the analytes of interest, as determined through replicate analyses by a reputable certifying agency using two independent measurement techniques for verification. In addition, the certifying agency may provide "non-certified" or "informational" values for other analytes of interest. Such values are determined using a single measurement technique, which may introduce unrecognized bias. Therefore, non-certified values must be used with caution in evaluating the performance of a laboratory using a method which differs from the one used by the certifying agency.

A Laboratory Control Material (LCM) is similar to a Certified Reference Material in that it is a homogeneous matrix which closely matches the samples being analyzed. A "true" LCM is one which is prepared (i.e., collected, homogenized, and stored in a stable condition) strictly for in-house use by a single laboratory. Alternatively, the material may be prepared by a central laboratory and distributed to others (so-called regional or program control materials). Unlike CRMs, concentrations of the analytes of interest in LCMs are not certified but are based upon a statistically valid number of replicate analyses by one or several laboratories. In practice, this material can be used to assess the precision (i.e., consistency) of a single laboratory, as well as to determine the degree of comparability among different laboratories. If available, LCMs may be preferred for routine (i.e., day-to-day) analysis because CRMs are relatively expensive. However, CRMs still must be analyzed at regular intervals (e.g., monthly or quarterly) to provide a check on accuracy.

Routine analysis of Certified Reference Materials or, when available, Laboratory Control Materials represents a particularly vital aspect of the "performance-based" EMAP-E QA philosophy.
At least one CRM or LCM must be analyzed along with each batch of 25 or fewer samples (Table 5-3). For CRMs, both the certified and non-certified concentrations of the target analytes should be known to the analyst(s) and should be used to provide an immediate check on performance before proceeding with a subsequent sample batch. Performance criteria for both precision and accuracy have been established for the analysis of CRMs or LCMs (Table 5-3); these criteria are discussed in detail in the following paragraphs.

If the laboratory fails to meet either the precision or accuracy control limit criteria for a given analysis of the CRM or LCM, the data for the entire batch of samples are suspect. Calculations and instruments should be checked; the CRM or LCM may have to be reanalyzed (i.e., re-injected) to confirm the results. If the values are still outside the control limits in the repeat analysis, the laboratory is required to find and eliminate the source(s) of the problem and repeat the analysis of that batch of samples until control limits are met, before continuing with further sample processing. The results of the CRM or LCM analysis should never be used by the laboratory to "correct" the data for a given sample batch.

Precision criteria: Each laboratory is expected to maintain control charts for use by analysts in monitoring the overall precision of the CRM or LCM analyses. Upper and lower control chart limits (e.g., warning limits and control limits) should be updated at regular intervals; control limits based on 3 standard deviations of the mean generally are recommended (Taylor 1987). Following the analysis of all samples in a given year, an RSD (relative standard deviation, also known as the coefficient of variation) will be calculated for each analyte of interest in the CRM. For each analyte having a CRM concentration > 10 times the laboratory's MDL, an overall RSD of less than 30% will be considered acceptable precision.
Failure to meet this goal will result in a thorough review of the laboratory's control charting procedures and analytical methodology to determine if improvements in precision are possible.

Accuracy criteria: The "absolute" accuracy of an analytical method can be assessed using CRMs only when certified values are provided for the analytes of interest. However, the concentrations of many analytes of interest to EMAP-E are provided only as non-certified values in some of the more commonly used CRMs. Therefore, control limit criteria are based on "relative accuracy", which is evaluated for each analysis of the CRM or LCM by comparing a given laboratory's values against the "true" or "accepted" values in the LCM or CRM. In the case of CRMs, this includes both certified and non-certified values and encompasses the 95% confidence interval for each value, as described in Table 5-3.

Accuracy control limit criteria have been established both for individual compounds and for combined groups of compounds (Table 5-3). There are two combined groups of compounds for the purpose of evaluating relative accuracy for organic analyses: PAHs and PCBs/pesticides. The laboratory's value should be within ±30% of the true value on average for each combined group of organic compounds, and the laboratory's value should be within ±35% of either the upper or lower 95% confidence limit for at least 70% of the compounds in each group. For inorganic analyses, the laboratory's value should be within ±20% of either the upper or lower 95% confidence limit for each analyte of interest in the CRM. Due to the inherent variability of analyses near the method detection limit, control limit criteria for relative accuracy apply only to analytes having CRM true values > 10 times the MDL established by the laboratory.
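The combined-group relative accuracy criteria for organic analyses can be illustrated with the following sketch. It is a simplification: the certified true value is used directly in place of the 95% confidence-interval bounds discussed above, and the function name, parameters, and data are hypothetical.

```python
# Illustrative check of the combined-group accuracy criteria for organics:
# lab values within +/-30% of true on average, and within +/-35% for at
# least 70% of compounds. Simplification: true values stand in for the
# 95% confidence-interval bounds described in the text.

def group_relative_accuracy_ok(pairs, avg_tol=0.30, single_tol=0.35, frac=0.70):
    """pairs: list of (lab_value, true_value) for analytes whose CRM
    concentration is >= 10 x the laboratory's MDL."""
    errors = [abs(lab - true) / true for lab, true in pairs]
    avg_ok = sum(errors) / len(errors) <= avg_tol
    frac_ok = sum(e <= single_tol for e in errors) / len(errors) >= frac
    return avg_ok and frac_ok
```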
5.1.7 Continuing Calibration Checks

The initial instrument calibration performed prior to the analysis of each batch of samples is checked through the analysis of calibration check samples (i.e., calibration standard solutions) inserted as part of the sample stream. Calibration standard solutions used for the continuing calibration checks should contain all the analytes of interest. At a minimum, analysis of the calibration check solution should occur in the middle and at the end of each sample batch. Analysts should use best professional judgement to determine whether more frequent calibration checks are necessary or desirable.

If the control limit for analysis of the calibration check standard is not met (Table 5-3), the initial calibration must be repeated. If possible, the samples analyzed before the calibration check sample that failed the control limit criteria should be reanalyzed following the re-calibration. The laboratory should begin by reanalyzing the last sample analyzed before the calibration standard which failed. If the relative percent difference (RPD) between the results of this reanalysis and the original analysis exceeds 30 percent, the instrument is assumed to have been out of control during the original analysis. If possible, reanalysis of samples should progress in reverse order until it is determined that the RPD between the initial and reanalysis results is less than 30. Only the reanalysis results should be reported by the laboratory. If reanalysis of samples is not possible or feasible, all earlier data (i.e., since the last successful calibration control check) are suspect. In this case, the laboratory should prepare a narrative explanation to accompany the submitted data.
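The continuing-calibration control limit from Table 5-3 can be sketched as follows. This is an illustration only: "on average for all analytes" is interpreted here as the mean of the absolute percent deviations, and the function name and data are hypothetical.

```python
# Sketch of the Table 5-3 continuing-calibration control limit: the check
# standard should be within +/-15% of the initial calibration on average
# over all analytes, and no single analyte may deviate by more than +/-25%.
# "On average" is interpreted as the mean absolute deviation (an assumption).

def calibration_check_ok(percent_errors):
    """percent_errors: signed percent deviations from the initial
    calibration, one per analyte (hypothetical input data)."""
    abs_errors = [abs(e) for e in percent_errors]
    return (sum(abs_errors) / len(abs_errors) <= 15.0
            and max(abs_errors) <= 25.0)
```

A failing check would trigger re-calibration and the reverse-order reanalysis procedure described above.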
5.1.8 Laboratory Reagent Blank

Laboratory reagent blanks (also called method blanks or procedural blanks) are used to assess laboratory contamination during all stages of sample preparation and analysis. For both organic and inorganic analyses, one laboratory reagent blank should be run with every sample batch. The reagent blank should be processed through the entire analytical procedure in a manner identical to the samples. Warning and control limits for blanks (Table 5-3) are based on the laboratory's method detection limits as documented prior to the analysis of samples (see Section 5.1.3). A reagent blank concentration between the MDL and 3 times the MDL for one or more of the analytes of interest should serve as a warning limit requiring further investigation based on the best professional judgement of the analyst(s). A reagent blank concentration equal to or greater than 3 times the MDL for one or more of the analytes of interest requires definitive corrective action to identify and eliminate the source(s) of contamination before proceeding with sample analysis.

5.1.9 Internal Standards

Internal standards (commonly referred to as "surrogates", "surrogate spikes", or "surrogate compounds") are compounds chosen to simulate the analytes of interest in organic analyses. The internal standard represents a reference analyte against which the signal from the analytes of interest is compared directly for the purpose of quantification. Internal standards must be added to each sample, including QA/QC samples, prior to extraction. The reported concentration of each analyte should be adjusted to correct for the recovery of the internal standard, as is done in the NOAA National Status and Trends Program. The internal standard recovery data therefore should be monitored carefully; each laboratory must report the percent recovery of the internal standard(s) along with the target analyte data for each sample.
If possible, isotopically labeled analogs of the analytes should be used as internal standards. Control limit criteria for internal standard recoveries are provided in Table 5-3. Each laboratory should set its own warning limit criteria based on the experience and best professional judgement of the analyst(s). It is the responsibility of the analyst(s) to demonstrate that the analytical process is always "in control" (i.e., highly variable internal standard recoveries are not acceptable for repeat analyses of the same certified reference material or for the matrix spike/matrix spike duplicate).

5.1.10 Injection Internal Standards

For gas chromatography (GC) analysis, injection internal standards (also referred to as "internal standards" by some analysts) are added to each sample extract just prior to injection to enable optimal quantification, particularly of complex extracts subject to retention time shifts relative to the analysis of standards. Injection internal standards are essential if the actual recovery of the internal standards added prior to extraction is to be calculated. The injection internal standards also can be used to detect and correct for problems in the GC injection port or other parts of the instrument. The compounds used as injection internal standards must be different from those already used as internal standards. The analyst(s) should monitor injection internal standard retention times and recoveries to determine whether instrument maintenance or repair, or changes in analytical procedures, are indicated. Corrective action should be initiated based on the experience of the analyst(s) and not because warning or control limits are exceeded. Instrument problems that may have affected the data or resulted in the reanalysis of a sample should be documented properly in logbooks and/or internal data reports and used by laboratory personnel to take appropriate corrective action.
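The surrogate recovery correction described in Section 5.1.9 can be sketched as below. The function name is hypothetical; the division by the recovered fraction reflects the general recovery-correction convention described in the text, and the 30-150% bounds are the Table 5-3 control limits.

```python
# Sketch of internal standard (surrogate) recovery correction, as described
# in Section 5.1.9: the reported analyte concentration is adjusted for the
# fraction of the surrogate recovered from the sample.

def recovery_corrected(measured_conc, surrogate_recovery):
    """surrogate_recovery is a fraction, e.g. 0.80 for 80% recovery."""
    if not 0.30 <= surrogate_recovery <= 1.50:
        # Outside the 30%-150% control limits of Table 5-3: data suspect.
        raise ValueError("surrogate recovery outside control limits")
    return measured_conc / surrogate_recovery
```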
5.1.11 Matrix Spike and Matrix Spike Duplicate

A laboratory fortified sample matrix (commonly called a matrix spike, or MS) and a laboratory fortified sample matrix duplicate (commonly called a matrix spike duplicate, or MSD) will be used both to evaluate the effect of the sample matrix on the recovery of the compound(s) of interest and to provide an estimate of analytical precision. A minimum of 5% of the total number of samples submitted to the laboratory in a given year should be selected at random for analysis as matrix spikes/matrix spike duplicates. Each MS/MSD sample is first homogenized and then split into three subsamples. Two of these subsamples are fortified with the matrix spike solution, and the third subsample is analyzed as is to provide a background concentration for each analyte of interest. The matrix spike solution should contain all the analytes of interest. The final spiked concentration of each analyte in the sample should be at least 10 times the MDL for that analyte, as previously calculated by the laboratory (see Section 5.1.3). Recovery data for the fortified compounds ultimately will provide a basis for determining the prevalence of matrix effects in the sediment samples analyzed during the project.

If the percent recovery for any analyte in the MS or MSD is less than the recommended warning limit of 50 percent, the chromatograms and raw data quantitation reports should be reviewed. If an explanation for a low percent recovery value is not discovered, the instrument response may be checked using a calibration standard. Low matrix spike recoveries may be a result of matrix interferences, and further instrument response checks may not be warranted, especially if the low recovery occurs in both the MS and MSD and the other QC samples in the batch indicate that the analysis was "in control".
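The MS/MSD evaluation in this section can be summarized in the following sketch. The spike-recovery formula (measured increase over the unspiked background, relative to the amount added) is the conventional calculation; the function names and input data are hypothetical, and the RPD formula is the one given in this section.

```python
# Sketch of the MS/MSD checks described in Section 5.1.11.

def spike_recovery(spiked_result, background, spike_added):
    """Percent recovery of a matrix spike: the measured increase over the
    unspiked (background) subsample, relative to the amount added."""
    return 100.0 * (spiked_result - background) / spike_added

def rpd(c1, c2):
    """Relative percent difference between MS and MSD results."""
    hi, lo = max(c1, c2), min(c1, c2)
    return (hi - lo) * 100.0 / ((hi + lo) / 2.0)

def ms_msd_ok(recoveries, rpds):
    """Table 5-3 control limits: recovery within 50-120% for at least 80%
    of analytes, and RPD < 30 for each analyte (hypothetical input data)."""
    frac_in = sum(50.0 <= r <= 120.0 for r in recoveries) / len(recoveries)
    return frac_in >= 0.80 and all(d < 30.0 for d in rpds)
```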
An explanation of low percent recovery values for MS/MSD results should be provided in a cover letter accompanying the data package, including any corrective actions taken and verification of acceptable instrument response.

Analysis of the MS/MSD also is useful for assessing laboratory precision. The relative percent difference (RPD) between the MS and MSD results should be less than 30 for each analyte of interest (see Table 5-3). The RPD is calculated as follows:

RPD = [(C1 - C2) x 100%] / [(C1 + C2)/2]

where:
C1 is the larger of the duplicate results for a given analyte
C2 is the smaller of the duplicate results for a given analyte

If results for any analyte do not meet the RPD < 30 control limit criterion, calculations and instruments should be checked. A repeat analysis may be required to confirm the results. Results which repeatedly fail to meet the control limit criteria indicate poor laboratory precision. In this case, the laboratory is obligated to halt the analysis of samples and eliminate the source of the imprecision before proceeding.

5.1.12 Field Duplicates and Field Splits

For the EMAP-E program, sediment will be collected at each station using a grab sampler. Each time the sampler is retrieved, the top 2 cm of sediment will be scraped off, placed in a large mixing container, and homogenized, until a sufficient amount of material has been obtained. At approximately 5% of the stations, the homogenized material will be placed in four separate sample containers for subsequent chemical analysis. Two of the sample containers will be submitted as blind field duplicates to the primary analytical laboratory. The other two containers, also called field duplicates, will be sent blind to a second, reference laboratory. Together, the two pairs of duplicates are called field splits. The analysis of the field duplicates will provide an assessment of single-laboratory precision.
The analysis of the field duplicates and field splits will provide an assessment of both inter- and intra-laboratory precision, as well as an assessment of the efficacy of the field homogenization technique.

5.1.13 Analytical Chemistry Data Reporting Requirements

As previously indicated, data for all QA/QC samples (e.g., CRMs, calibration check samples, blanks, matrix spike/matrix spike duplicates, etc.) must be submitted by the laboratory as part of the data package for each batch of samples analyzed. The laboratory should denote QA/QC samples using the recommended codes (abbreviations) provided in Table 5-5. The QA/QC results and associated data will be subject to review by the Province Manager, QA Coordinator, or their designee(s).

TABLE 5-5. CODES FOR DENOTING QA/QC SAMPLES IN SUBMITTED DATA PACKAGES.

Code      Description                                      Unit of measure
CCCS      Continuing calibration check standard            Percent recovery
CECS      Calibration end check standard                   Percent recovery
CRM       Certified Reference Material                     µg/g or ng/g (dry weight)
PRCRM     Percent recovery for CRM                         Percent recovery
LRB       Laboratory reagent blank                         µg/g or ng/g (dry weight)
LFSM      Laboratory fortified sample matrix               µg/g or ng/g (dry weight)
PRLFSM    Percent recovery for the LFSM                    Percent recovery
LFSMD     Laboratory fortified sample matrix duplicate     µg/g or ng/g (dry weight)
PRLFSMD   Percent recovery for the LFSMD                   Percent recovery
RPD       Relative percent difference between LFSM/LFSMD   Percent

EMAP-E laboratories are responsible for assigning only two data qualifier codes or "flags" to the submitted data. If an analyte is not detected, the laboratory should report the result as "ND", followed by the letter "a". The "a" code will have the following meaning: "The analyte was not detected. The method detection limit for this analyte has been supplied by the laboratory and can be found in an accompanying dataset."
If a quantifiable signal is observed, the laboratory should report a concentration for the analyte; the data qualifier code "b" then should be used to flag any reported values which are below the laboratory's MDL. The "b" code will have the following meaning: "The analyte was detected at a concentration less than or equal to the method detection limit. This reported concentration is an estimate which may not accurately reflect the actual concentration of this analyte in the sample."

Only data which have met QA requirements should be submitted by the laboratory. When QA requirements have not been met, the samples should be re-analyzed and only the results of the re-analysis should be submitted, provided they are acceptable. There may be a limited number of situations where sample re-analysis is not possible or practical (e.g., a minor exceedance of a single control limit criterion). The laboratory is expected to provide a detailed explanation of any factors affecting data quality or interpretation; this explanation should be in the form of a cover letter accompanying each submitted data package. The narrative explanation is in lieu of additional data qualifier codes supplied by the laboratory (other than the "a" and "b" codes). Over time, depending on the nature of these narrative explanations, the EMAP-E program expects to develop a limited list of codes for qualifying data in the database (in addition to the "a" and "b" codes).

5.2 OTHER SEDIMENT MEASUREMENTS

5.2.1 Total organic carbon

As a check on precision, each laboratory should analyze at least one TOC sample in duplicate for each batch of 25 or fewer samples. The relative percent difference (RPD) between the two duplicate measurements should be less than 20%. If this control limit is exceeded, analysis of subsequent sample batches should stop until the source of the discrepancy is determined and the system corrected.
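The TOC duplicate check uses the same RPD formula defined in Section 5.1.11, with a 20% control limit here rather than the 30% limit applied to MS/MSD pairs. A minimal sketch (the function names are illustrative, not from the plan):

```python
# Relative percent difference between duplicate results, per Section
# 5.1.11: RPD = (C1 - C2) x 100% / ((C1 + C2)/2), where C1 is the
# larger and C2 the smaller of the two results.

def relative_percent_difference(result_1, result_2):
    c1, c2 = max(result_1, result_2), min(result_1, result_2)
    return (c1 - c2) * 100.0 / ((c1 + c2) / 2.0)

def within_control_limit(result_1, result_2, limit=20.0):
    """True if a duplicate pair meets the RPD control limit
    (20% for TOC duplicates; 30% for MS/MSD pairs)."""
    return relative_percent_difference(result_1, result_2) < limit

# Hypothetical duplicate TOC results of 12.0 and 10.0 mg/g:
print(round(relative_percent_difference(12.0, 10.0), 1))  # 18.2
```

Because the RPD is normalized by the mean of the pair, it treats the two duplicate results symmetrically, which is why the formula designates C1 as the larger value.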
At least one certified reference material (CRM) or, if available, one laboratory control material (LCM) should be analyzed along with each batch of 25 or fewer TOC samples. Any one of several marine sediment CRMs distributed by the National Research Council of Canada's Marine Analytical Chemistry Standards Program (e.g., the CRMs named "BCSS-1", "MESS-1" and "PACS-1") has certified concentrations of total carbon and is recommended for this use.

Prior to analysis of actual samples, it is recommended that each laboratory perform several total organic carbon analyses using a laboratory control material or one of the aforementioned CRMs to establish a control chart (the values obtained by the laboratory for total organic carbon should be slightly less than the certified value for total carbon in the CRM). The control chart then should be used to assess the laboratory's precision for subsequent analyses of the LCM or CRM with each sample batch. In addition, a method blank should be analyzed with each sample batch.

Total organic carbon concentrations should be reported as µg/g (ppm) dry weight of the unacidified sediment sample. Data reported for each sample batch should include QA/QC sample results (duplicates, CRMs or LCMs, and method blanks). Any factors that may have influenced data quality should be discussed in a cover letter accompanying the submitted data.

5.2.2 Acid volatile sulfide

Quality control of acid volatile sulfide (AVS) measurements is achieved through the routine analysis of a variety of QA/QC samples. These are outlined in the following section and described in full detail in the EMAP-E Laboratory Methods Manual (U.S. EPA, in preparation). Prior to the analysis of samples, the laboratory must establish a calibration curve and determine a limit of reliable detection for sulfide for the analytical method being employed.
Following this, laboratory performance will be assessed through routine analysis of laboratory duplicates, calibration check standards, laboratory fortified blanks (i.e., spiked blanks), and laboratory fortified sample matrices (i.e., matrix spikes). One sample in every batch of 25 or fewer samples should be analyzed in duplicate as a check on laboratory precision. The relative percent difference (as calculated by the formula given in Section 5.1.11) between the two analyses should be less than 20%. If the RPD exceeds 20%, a third analysis should be performed. If the relative standard deviation of the three determined concentrations exceeds 20%, the individual analyses should be examined to determine if non-random errors may have occurred.

Because acid volatile sulfides are unstable to drying and handling in air, CRMs have not been developed for assessing overall measurement accuracy. Therefore, each laboratory must analyze at least one calibration check standard, one laboratory fortified blank and one laboratory fortified sample matrix in each batch of 25 or fewer samples as a way of determining the accuracy of each step entailed in performing the analysis. The concentration of sulfide in each of these three types of accuracy check samples will be known to the analyst; the calculated concentration of sulfide in each sample should be within ±15% of the known concentration. If the laboratory is not within ±15% of the known concentration for the calibration check solution, instruments used for AVS measurement must be recalibrated and/or the stock solutions redetermined by titration. If the laboratory fails to achieve the same accuracy (within ±15% of the true value) for AVS in the laboratory fortified blank, sources of error (e.g., leaks, excessive gas flows, poor sample-acid slurry agitation) should be determined for the analytical system prior to continuing.
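The two numeric criteria above, agreement within ±15% of the known sulfide concentration and a 20% relative standard deviation limit for triplicate analyses, can be sketched as follows. This is an illustrative check only; the function names are not part of the plan.

```python
# Illustrative AVS QC checks: the +/- 15% accuracy criterion for check
# samples of known concentration, and the percent relative standard
# deviation used when a duplicate pair fails the 20% RPD limit and a
# third analysis is run.

def within_15_percent(measured, known):
    """True if the calculated concentration is within +/- 15% of the
    known sulfide concentration of the accuracy check sample."""
    return abs(measured - known) <= 0.15 * known

def relative_std_dev(values):
    """Percent relative standard deviation of replicate analyses
    (sample standard deviation divided by the mean, times 100)."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / (len(values) - 1)
    return (var ** 0.5) / mean * 100.0

# A hypothetical check sample with known concentration 10.0 measured
# at 8.4 falls outside the 85-115% window and requires corrective action.
print(within_15_percent(8.4, 10.0))  # False
```

The same ±15% test applies to all three accuracy check sample types (calibration check standard, fortified blank, and matrix spike); only the corrective action differs.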
If AVS recovery falls outside the 85% to 115% range for the matrix spike, the system should be evaluated for sources of error and the analysis should be repeated. If recovery remains unacceptable, it is possible that matrix interferences are occurring. If possible, the analysis should be repeated using smaller amounts of sample to reduce the interferent effects. Results for all QA/QC samples (duplicates, calibration check standards, spiked blanks and matrix spikes) should be submitted by the laboratory as part of the data package for each batch of samples, along with a narrative explanation for results outside control limits.

5.2.3 Butyltins

Assessment of the distribution and environmental impact of the butyltin species of interest to the EMAP-E program (tributyltin, dibutyltin and monobutyltin) requires their measurement in marine sediment and tissue samples at trace levels (parts per billion to parts per trillion). Quality control of these measurements consists of checks on laboratory precision and accuracy.

One laboratory reagent blank must be run with each batch of 25 or fewer samples. A reagent blank concentration between the MDL and 3 times the MDL should serve as a warning limit requiring further investigation based on the best professional judgement of the analyst(s). A reagent blank concentration equal to or greater than 3 times the MDL requires corrective action to identify and eliminate the source(s) of contamination, followed by re-analysis of the samples in the associated batch.

One laboratory fortified sample matrix (commonly called a matrix spike) or laboratory fortified blank (i.e., spiked blank) should be analyzed along with each batch of 25 or fewer samples to evaluate the recovery of the butyltin species of interest. The butyltins should be added at 5 to 10 times their MDLs as previously calculated by the laboratory (see Section 5.1.3).
If the percent recovery for any of the butyltins in the matrix spike or spiked blank is outside the range of 70 to 130 percent, analysis of subsequent sample batches should stop until the source of the discrepancy is determined and the system corrected. The NRCC sediment reference material "PACS-1", which has certified concentrations of the three butyltin species of interest, also should be analyzed along with each batch of 25 or fewer sediment samples as a check on accuracy and reproducibility (i.e., batch-to-batch precision). If values obtained by the laboratory for butyltins in "PACS-1" are not within ±30% of the certified values, the data for the entire batch of samples are suspect. Calculations and instruments should be checked; the CRM may have to be reanalyzed to confirm the results. If the values are still outside the control limits in the repeat analysis, the laboratory is required to determine the source(s) of the problem and repeat the analysis of that batch of samples until control limits are met, before continuing with further sample processing.

5.2.4 Sediment grain size

Quality control of sediment grain size analyses is accomplished by strict adherence to protocol and documentation of quality control checks. Several procedures are critical to the collection of high quality particle size data. Most important to the dry sieve analysis is that the screens are clean before conducting the analysis, and that all of the sample is retrieved from them. To clean a screen, it should be inverted and tapped on a table, while making sure that the rim hits the table evenly. Further cleaning of brass screens may be performed by gentle scrubbing with a stiff bristle nylon brush. Stainless steel screens may be cleaned with a nylon or brass brush. The most critical aspect of the pipet analysis is knowledge of the temperature of the silt-clay suspension.
An increase of only 1 °C will increase the settling velocity of a particle 50 µm in diameter by 2.3 percent. It is generally recommended that the pipet analysis be conducted at a constant temperature of 20 °C. However, Plumb (1981) provides a table to correct for settling velocities at other temperatures; this table is included in the EMAP-E Laboratory Methods Manual (U.S. EPA, in preparation). Thorough mixing of the silt-clay suspension at the beginning of the analysis is also critical. A perforated, plexiglass disc plunger is very effective for this purpose. If the mass of sediment used for pipet analysis exceeds 25 g, a subsample should be taken as described by Plumb (1981). Silt-clay samples in excess of 25 g may give erroneous results because of electrostatic interactions between the particles. Silt-clay samples less than 5 g yield a large experimental error in weighing relative to the total sample weight.

The analytical balance, drying oven, sieve shaker, and temperature bath used in the analysis should be calibrated at least monthly. Quality assurance for the sediment analysis procedures will be accomplished primarily by reanalyzing a randomly selected subset of samples from each batch, as described in full detail in the EMAP-E Laboratory Methods Manual (U.S. EPA, in preparation). A batch of samples is defined as a set of samples of a single textural classification (e.g., silt/clay, sand, gravel) processed by a single technician using a single procedure. Approximately 10% of each batch completed by the same technician will be reanalyzed (i.e., reprocessed) in the same manner as the original sample batch. If the absolute difference between the original value and the second value is greater than 10% (in terms of the percent of the most abundant sediment size class), then a third analysis will be completed by a different technician. The value closest to the third value will be entered into the database.
In addition, all the other samples in the same batch must be re-analyzed, and the laboratory protocol and/or technician's practices should be reviewed and corrected to bring the measurement error under control. If the percent of the most abundant sediment size class in the original sample and the re-analyzed sample differs by less than 10%, the original value will not be changed and the sediment analysis process will be considered in control.

5.2.5 Apparent RPD Depth

The depth of the apparent RPD (redox potential discontinuity) will be determined in the field through visual observation of clear plastic cores inserted into undisturbed sediment grab samples at each station. In fine-grained sediments, the apparent RPD depth is measured from the sediment surface to the point at depth where the color changes from light to dark. As a QC check, sediment cores will be re-measured by the QA Coordinator during field audits. The field crew's original measurement should be within ±5 mm of the re-measurement; failure to achieve this agreement will result in re-training of the crew.

5.3 SEDIMENT TOXICITY TESTING

The toxicity of sediments collected in the field will be determined as an integral part of the benthic indicator suite, using 10-day acute toxicity tests with the marine amphipod Ampelisca abdita. Complete descriptions of the methods employed for the sediment toxicity test are provided in the Laboratory Methods Manual (U.S. EPA, in preparation). The various aspects of the test for which quality assurance/quality control procedures are specified include the following: the condition of facilities and equipment, sample handling and storage, the source and condition of test organisms, test conditions, instrument calibration, use of replicates, use of reference toxicants, record keeping, and data evaluation.
In addition, any laboratory which has not previously performed the sediment toxicity test using Ampelisca abdita will be required to perform an initial demonstration of capability, as described below.

5.3.1 Facilities and Equipment

Laboratory and bioassay temperature control equipment must be adequate to maintain recommended test temperatures. Recommended materials must be used in the fabrication of the test equipment in contact with the water or sediment being tested, as specified in the EMAP-E Laboratory Methods Manual (U.S. EPA, in preparation).

5.3.2 Initial Demonstration of Capability

Laboratories which have not previously conducted sediment toxicity tests with Ampelisca abdita must demonstrate the ability to collect (if applicable), hold and test the organisms without significant loss or mortality, prior to performing tests of actual samples. There are two types of tests which must be performed as an initial demonstration of capability; these tests will serve to indicate the overall ability of laboratory personnel to handle the organism adequately and obtain consistent, precise results. First, the laboratory must perform a minimum of five successive reference toxicant tests, using sodium dodecyl sulfate (SDS) as the reference toxicant. For Ampelisca abdita, short-term (i.e., 96-hour) tests without sediments (i.e., seawater only) can be used for this purpose. The trimmed Spearman-Karber method of regression analysis (Hamilton et al. 1977) or the monotonic regression analysis developed by DeGraeve et al. (1988) can be used to determine an LC-50 value for each 96-hour reference toxicant test. The LC-50 values should be recorded on a control chart maintained in the laboratory (described in greater detail in section 5.3.4, to follow).
Precision then can be described by the LC-50 mean, standard deviation, and percent relative standard deviation (coefficient of variation, or CV) of the five (or more) replicate reference toxicant tests. If the laboratory fails to achieve an acceptable level of precision in the five preliminary reference toxicant tests, the test procedure should be examined for defects and the appropriate corrective actions should be taken. Additional tests should be performed until acceptable precision is demonstrated.

The second series of tests which must be performed successfully prior to the testing of actual samples are 10-day, "non-toxicant" exposures of Ampelisca abdita, in which test chambers contain the control sediment and seawater that will be used under actual testing conditions. These "control" tests should be performed concurrent with the reference toxicant tests used to assess single laboratory precision. At least five replicate test chambers should be used in each test. The tests should be run in succession until two consecutive tests each have mean survival equal to or greater than 90% and survival in the individual test chambers is not less than 80%. These are the control survival rates which must be achieved during actual testing if a test is to be considered acceptable (see section 5.3.6); therefore, the results of this preliminary demonstration will provide evidence that the facilities, water, control sediment, and handling techniques are adequate to result in successful testing of samples.

5.3.3 Sample Handling and Storage

Techniques for sample collection, handling, and storage are described in the field methods manual (Strobel and Schimmel 1991). Sediment samples for toxicity testing should be chilled to 4°C when collected, shipped on ice, and stored in the dark in a refrigerator at 4°C until used.
Sediments should be stored for no longer than four weeks before the initiation of the test, and should not be frozen or allowed to dry. Sample containers should be made of chemically inert materials to prevent contamination, which might result in artificial changes in toxicity. To avoid contamination during collection, all sampling devices and any other instruments in contact with the sediment should be cleaned with water and a mild detergent and thoroughly rinsed between stations (see Strobel and Schimmel 1991). Only sediments not in contact with the sides of the sampling device should be subsampled, composited, and subsequently homogenized using teflon or stainless steel instruments and containers.

5.3.4 Quality of Test Organisms

All amphipods used in the tests should be disease-free and should be positively identified to species. If the amphipods are collected from the field prior to testing, they should be obtained from an area known to be free of toxicants and should be held in clean, uncontaminated water and facilities. Amphipods held prior to testing should be checked daily, and individuals which appear unhealthy or dead should be discarded. If greater than 5% of the organisms in holding containers are dead or appear unhealthy during the 48 hours preceding a test, the entire group should be discarded and not used in the test.

The sensitivity of each batch of test organisms obtained from an outside source (e.g., field collected or obtained from an outside culture facility) must be evaluated with the reference toxicant sodium dodecyl sulfate (SDS) in a short-term toxicity test performed concurrently with the sediment toxicity tests. The use of the reference toxicant SDS is required as a means of standardizing test results among different laboratories. For Ampelisca abdita, a 96-hour reference toxicant test without sediment is used to generate LC-50 values, as previously described in section 5.3.2.
These LC-50 values should be recorded on the same control chart used to record the results of the five (or more) reference toxicant tests performed for the initial demonstration of capability. The control chart represents a "running plot" of the toxicity values (LC50s) from successive reference toxicant tests. The mean LC50 and the upper and lower control limits (±2S) are recalculated with each successive point until the statistics stabilize. Outliers, which are values that fall outside the upper and lower control limits, are readily identified. The plotted values are used to evaluate trends in organism sensitivity, as well as the overall ability of laboratory personnel to obtain consistent results.

Reference toxicant test results (i.e., LC50 values) which fall outside control chart limits should serve as a warning to laboratory personnel. At the P=0.05 probability level, one in twenty tests would be expected to fall outside control limits by chance alone. The laboratory should try to determine the cause of the outlying LC50 value, but a re-test of the samples is not necessarily required. If the reference toxicant test results are outside control chart limits on the next consecutive test, the sensitivity of the organisms and the overall credibility of the test are suspect. The test procedure again should be examined for defects and additional reference toxicant tests performed. Testing of samples should not resume until acceptable reference toxicant results can be obtained; this may require the use of a different batch of test organisms.

5.3.5 Test Conditions

Parameters such as water temperature, salinity (conductivity), dissolved oxygen, and pH should be checked as required for each test and maintained within the specified limits (U.S. EPA, in preparation). Instruments used for routine measurements must be calibrated and standardized according to the instrument manufacturer's procedures.
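The control-chart bookkeeping described in section 5.3.4, a running mean with ±2 standard deviation limits recalculated as each reference toxicant result is added, can be sketched as follows. This is an illustrative sketch only; the function names and the example LC50 values are hypothetical.

```python
# Illustrative control-chart logic for reference toxicant LC50s: the
# mean and +/- 2S limits are recomputed from the tests to date, and a
# new result outside the limits is flagged as a warning (not an
# automatic rejection; see section 5.3.4).

import statistics

def control_limits(lc50_values):
    """Lower limit, mean, and upper limit (+/- 2S) of the LC50s to date."""
    mean = statistics.mean(lc50_values)
    s = statistics.stdev(lc50_values)
    return mean - 2 * s, mean, mean + 2 * s

def is_outlier(new_lc50, prior_lc50s):
    """True if a new reference toxicant result falls outside the
    control limits computed from the prior tests."""
    lower, _, upper = control_limits(prior_lc50s)
    return not (lower <= new_lc50 <= upper)

history = [5.2, 4.8, 5.5, 5.0, 5.1]   # hypothetical LC50s (e.g., mg/L SDS)
print(is_outlier(9.0, history))       # True: well outside the +/- 2S band
```

Because the limits are set at ±2S, roughly one result in twenty will fall outside them by chance alone, which is why a single excursion is treated as a warning rather than grounds for re-testing.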
All routine chemical and physical analyses must include established quality assurance practices as outlined in Agency methods manuals (U.S. EPA 1979a and b). Overlying water must meet the requirements for uniform quality specified in the method (U.S. EPA, in preparation). The minimum requirement for acceptable overlying water is that it allows acceptable control survival without signs of organism disease or apparent stress (i.e., unusual behavior or changes in appearance). The overlying water used in the sediment toxicity tests with Ampelisca may be natural seawater, hypersaline brine (100 o/oo) prepared from natural seawater, or artificial seawater prepared from sea salts. If natural seawater is used, it should be obtained from an uncontaminated area known to support a healthy, reproducing population of the test organism or a comparably sensitive species.

5.3.6 Test Acceptability

Survival of organisms in control treatments should be assessed during each test as an indication of both the validity of the test and the overall health of the test organism population. The amphipod tests with Ampelisca abdita are acceptable if mean control survival is greater than or equal to 90 percent, and if survival in individual control test chambers exceeds 80 percent. Additional guidelines for acceptability of individual sediment toxicity tests are presented in the EMAP-E Laboratory Methods Manual (U.S. EPA, in preparation). An individual test may be conditionally acceptable even if temperature, dissolved oxygen (DO), or other specified conditions fall outside specifications, depending on the degree of the departure and the objectives of the tests. Any deviations from test specifications must be noted and reported to the QA Coordinator when reporting the data so that a determination can be made of test acceptability.

5.3.7 Record Keeping and Reporting

Proper record keeping is mandatory.
Bound notebooks should be used to maintain detailed records of the test organisms, such as species, source, age, date of receipt, and other pertinent information relating to their history and health, as well as information on the calibration of equipment and instruments, test conditions employed, and test results. Annotations should be made on a real-time basis to prevent loss of information. Data for all QA/QC variables, such as reference toxicant test results and copies of control charts, should be submitted by the laboratory along with test results.

5.4 MACROBENTHIC COMMUNITY ASSESSMENT

Sediment samples for macrobenthic community assessments will be collected at each station using a Young-modified Van Veen grab sampler. In order to be considered acceptable, each grab sample must be obtained following the specified protocol and must meet certain pre-established quality control criteria, as described in detail in the Field Operations Manual (Strobel and Schimmel 1991). The collected sediment will be sieved in the field through a 0.5 mm screen and the material collected on the screen preserved and returned to the laboratory for processing. In the laboratory, QA/QC involves a series of check systems for organism sorting, counting and taxonomic identification. These checks are described briefly in the following sections; more complete details can be found in the EMAP-E Laboratory Methods Manual (U.S. EPA, in preparation).

5.4.1 Sorting

The quality control check on each technician's efficiency at sorting (i.e., separating organisms from sediment and debris) consists of an independent re-sort by a second, experienced sorter. A minimum of 10% of all samples sorted by each technician must be re-sorted to monitor performance and thus provide the feedback necessary to maintain acceptable standards.
These re-sorts should be conducted on a regular basis, on at least one sample chosen at random from each batch of 10 samples processed by a given sorter. Inexperienced sorters require a more intensive QC check system. It is recommended that experienced sorters or taxonomists check each sample processed by inexperienced sorters until proficiency in organism extraction is demonstrated. Once proficiency has been demonstrated, the checks may be performed at the required frequency of one in every ten samples. Logbooks must be maintained in the laboratory and used to record the number of samples processed by each technician, as well as the results of all sample re-sorts. For each sample that is re-sorted, sorting efficiency should be calculated using the following formula:

Sorting efficiency (%) = (# of organisms originally sorted x 100) / (# of organisms originally sorted + additional # found in re-sort)

The results of sample re-sorts may require that certain actions be taken for specific technicians. If sorting efficiency is greater than 95%, no action is required. If sorting efficiency is between 90% and 95%, problem areas should be identified and the technician should be re-trained. Laboratory supervisors must be particularly sensitive to systematic errors (e.g., consistent failure to extract specific taxonomic groups) which may suggest the need for further training. Re-sort efficiencies below 90% will require re-sorting of all samples in the associated batch and continuous monitoring of that technician to improve efficiency. If sorting efficiency is less than 90%, organisms found in the re-sort should be added to the original data sheet and, if possible, to the appropriate vials for biomass determination. If sorting efficiency is 90% or greater, the QC results should be recorded in the appropriate logbook, but the animals should not be added to the original sample or used for biomass determinations.
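The sorting-efficiency formula and the tiered actions above can be sketched as a short script. The thresholds are from this section; the function names and example counts are illustrative only.

```python
# Illustrative implementation of the sorting-efficiency QC check:
# efficiency is the share of organisms recovered by the original sorter,
# and the action tiers follow the 95% / 90% thresholds described above.

def sorting_efficiency(originally_sorted, found_in_resort):
    """Percent of organisms recovered by the original sorter."""
    return originally_sorted * 100.0 / (originally_sorted + found_in_resort)

def resort_action(efficiency):
    """Action required by the QC rules for a given efficiency (percent)."""
    if efficiency > 95.0:
        return "no action required"
    if efficiency >= 90.0:
        return "identify problem areas and re-train the technician"
    return "re-sort all samples in the batch; monitor the technician"

# Hypothetical re-sort: 190 organisms found originally, 10 more in re-sort.
eff = sorting_efficiency(190, 10)
print(round(eff, 1), "->", resort_action(eff))
```

Note that the formula counts organisms missed by the original sorter in the denominator only, so efficiency can never exceed 100%.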
Once all quality control criteria associated with the sample re-sort have been met, the sample residues may be discarded.

5.4.2 Species Identification and Enumeration

Only senior taxonomists are qualified to perform re-identification quality control checks. A minimum of 10% of all samples (i.e., one sample chosen at random out of every batch of ten samples) processed by each taxonomic technician must be checked to verify the accuracy of species identification and enumeration. This control check establishes the level of accuracy with which identifications and counts are performed and offers feedback to taxonomists in the laboratory so that a high standard of performance is maintained. Samples should never be rechecked by the technician who originally processed the sample. Ideally, each batch of ten samples processed by an individual taxonomic technician should be from a similar habitat type (e.g., all oligohaline stations). The re-check of one out of the ten samples in a batch should be done periodically and in a timely manner so that subsequent processing steps (e.g., biomass determinations) and data entry may proceed. As each taxon is identified and counted during the re-check, the results should be compared to the original data sheet. Discrepancies should be double-checked to be sure of correct final results. Following re-identification, specimens should be returned to the original vials and set aside for biomass determination. When the entire sample has been re-identified and re-counted, the total number of errors should be computed. The total number of errors will be based upon the number of misidentifications and miscounts.
Numerically, accuracy will be represented in the following manner:

Accuracy (%) = [(Total # of organisms in QC recount - Total # of errors) x 100] / Total # of organisms in QC recount

where the following three types of errors are included in the total # of errors:

1.) Counting errors (for example, counting 11 individuals of a given species as 10).
2.) Identification errors (for example, identifying Species X as Species Y, where both are present).
3.) Unrecorded taxa errors (for example, not identifying Species X when it is present).

Each taxonomic technician must maintain an identification and enumeration accuracy of 90% or greater (calculated using the above formula). If results fall below this level, the entire sample batch must be re-identified and counted. If taxonomic efficiency is between 90% and 95%, the original technician should be advised and the species identifications reviewed. All changes in species identification should be recorded on the original data sheet (along with the date and the initials of the person making the change) and these changes should be entered into the database. However, the numerical count for each taxonomic group should not be corrected unless the overall accuracy for the sample is below 90%. Additional details on this protocol are provided in the EMAP-E Laboratory Methods Manual (U.S. EPA, in preparation). The results of all QC rechecks of species identification and enumeration should be recorded in a timely manner in a separate logbook maintained for this purpose.

As organisms are identified, a voucher specimen collection (taxonomic reference collection) should be established. This collection should consist of representative specimens of each species identified in samples from an individual Province in a given year. For some species, it may be appropriate to include in the reference collection individuals collected in different geographic locations within the Province.
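The accuracy formula above, with its three error types, can be expressed as a small function. This is an illustrative sketch; the function and parameter names are not from the plan.

```python
# Illustrative calculation of identification/enumeration accuracy from a
# QC recount: (total organisms - total errors) x 100 / total organisms,
# where the errors are the three types enumerated in Section 5.4.2.

def taxonomic_accuracy(total_in_qc_recount, counting_errors,
                       identification_errors, unrecorded_taxa_errors):
    """Percent accuracy of the original identification and counts."""
    total_errors = (counting_errors + identification_errors
                    + unrecorded_taxa_errors)
    return (total_in_qc_recount - total_errors) * 100.0 / total_in_qc_recount

# Hypothetical recount of 250 organisms with 6 errors in total:
acc = taxonomic_accuracy(250, counting_errors=3,
                         identification_errors=2, unrecorded_taxa_errors=1)
print(round(acc, 1))  # 97.6
```

A result below 90% would force re-identification of the entire sample batch, while a result between 90% and 95% would trigger review with the original technician.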
The reference collection should be used to train new taxonomists and should be sent to outside consultants to verify the laboratory's taxonomic identifications. Any resulting discrepancies should be resolved in consultation with the EMAP-E Province Manager and/or the Province QA Coordinator.

5.4.3 Biomass Measurements

Performance checks of the balance used for biomass determinations should be performed routinely using a set of standard reference weights (ASTM Class 3, NIST Class S-1, or equivalents). In addition, a minimum of 10% of all pans and crucibles in each batch processed by a given technician must be reweighed by a second technician as a continuous monitor on performance. Samples to be reweighed should be selected randomly from the sample batch; the results of the reweigh should be compared against the original final weight recorded on the biomass data sheet. Weighing efficiency should be calculated using the following formula:

Weighing efficiency (%) = (Original final weight / Reweighed final weight) x 100

If weighing efficiency is between 95% and 105%, the sample has met the acceptable quality control criteria and no action is necessary. If weighing efficiency is between 90% and 95% or between 105% and 110%, the sample has met acceptable criteria, but the technician who completed the original weighing should be consulted and proper measurement practices reviewed. If the weighing efficiency is less than 90% or greater than 110%, then the sample has failed the quality control criteria and all samples in the associated batch must be reweighed (following technician re-training and/or troubleshooting of laboratory equipment to determine and eliminate the source(s) of the inconsistency). Corrections to the original data sheet should only be made in those cases where weighing efficiency is less than 90% or greater than 110%.
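The weighing-efficiency calculation and its three acceptance bands could be sketched as shown below. This is purely illustrative (the function names are invented); the bands restate the 95-105%, 90-95%/105-110%, and <90%/>110% criteria given above.

```python
# Illustrative sketch of the biomass reweigh QC check.

def weighing_efficiency(original_weight, reweighed_weight):
    """Weighing efficiency (%) = original final weight / reweighed final weight x 100."""
    return original_weight / reweighed_weight * 100.0

def reweigh_action(efficiency):
    """Map a weighing efficiency (%) to the action specified in the QA plan."""
    if 95.0 <= efficiency <= 105.0:
        return "accept"
    if 90.0 <= efficiency < 95.0 or 105.0 < efficiency <= 110.0:
        return "accept, but review practices with technician"
    return "fail: reweigh entire batch"
```

For example, an original final weight of 0.92 g against a QC reweigh of 1.00 g gives a 92% efficiency, which is acceptable but triggers a review with the original technician.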
The results of all QC reweighings should be recorded in a timely manner in a separate logbook or data sheet and maintained as part of the documentation associated with the biomass data.

5.5 FISH SAMPLING

5.5.1 Species Identification, Enumeration and Length Measurements

Fish species identification, enumeration and individual lengths will be determined in the field following protocols presented in the Virginian Province Field Operations Manual (Strobel and Schimmel 1991). The quality of fish identifications, enumerations and length measurements will be assured principally through rigorous training of field personnel prior to field sampling. Qualified taxonomists will provide independent confirmation of all fish identifications, enumerations and length measurements made by crew members during field and laboratory training sessions. An emphasis will be placed on correct identification of fish "target" species to be saved by the field crews for later chemical contaminant analyses. Fish identifications, enumerations and length measurements also will be confirmed by the QA Coordinator, Province Manager, or their designee(s) during field audits. In addition, each field crew will be required to save at least one "voucher" specimen of each species identified in the field. These voucher specimens will be preserved in fixative and sent back to the Field Operations Center on a regular basis throughout the field season. A qualified fish taxonomist will verify the species identifications and provide immediate feedback to the field crews whenever errors are found. The fish sent to the ERL-N laboratory for gross pathological and histopathological examination also will be checked for taxonomic determination accuracy. All erroneous identifications for a given field crew will be corrected in the database. The preserved voucher fish will be saved to provide a reference collection for use in subsequent years' training.
The overall accuracy goal for all fish identifications, enumerations and length measurements in a given sampling season is 90% (i.e., less than 10% errors). If this goal is not met, corrective actions will include increased emphasis on training and more rigorous testing of field crews prior to the next year's sampling season. During the field season, the QA Coordinator, Province Manager and/or Field Coordinator must be informed of species misidentifications immediately so that the appropriate field crew can be contacted and the problem corrected.

5.5.2 Fish Gross Pathology and Histopathology

The field procedures for gross pathological examination of fish are detailed in the Virginian Province Field Operations and Safety Manual (Strobel and Schimmel 1991). As with fish identification and enumeration, the quality of gross pathology determinations will be assured principally through rigorous training of field personnel prior to field sampling. Qualified pathologists will be responsible for planning and overseeing all crew training and will provide independent confirmation of all pathologies noted by field personnel during the training sessions. During the actual sample collection period, these qualified pathologists also will record any gross external pathologies they find in examining fish which the crews send back to the laboratory for histopathological study. The laboratory pathologist(s) will perform these examinations without knowledge of the gross external pathologies noted by the field crews; this will provide a measure of the number and type of pathologies which were either incorrectly identified or missed in the field (i.e., false positives and false negatives). This information will be used to "customize" crew training in future years. A series of internal and external laboratory QC checks will be employed to provide verification of the fish histopathology identifications.
In laboratories having multiple pathologists, all cases bearing significant lesions should be examined and verified by the senior pathologist. At least 5% of the slides read by one pathologist should be selected at random and read by a second pathologist without knowledge of the diagnoses made by the initial reader. For the external QC check, at least 5% of the slides should be submitted for independent diagnosis to a pathologist not involved with the laboratory. These slides should represent the range of pathological conditions found during the study, and the external pathologist should not be aware of the diagnoses made by the laboratory personnel.

Each laboratory also should maintain a reference collection of slides that represent every type of pathological condition identified in the EMAP-E fish. Each of these slides should be verified by an external pathologist having experience with the species in question. The reference slide collection then can be used to verify the diagnoses made in future years to ensure intralaboratory consistency. The reference slides also can be compared with those of other laboratories to ensure interlaboratory consistency. A reference collection of photographs also can be made, but this should not substitute for a slide collection.

5.6 WATER COLUMN MEASUREMENTS

Characterization of the water column is accomplished through two types of measurements: point-in-time water column profiles and continuous, long-term near-bottom monitoring. The Seabird SBE 25 Sealogger CTD is used to obtain vertical profiles of temperature, salinity, dissolved oxygen, pH, light transmission, chlorophyll a fluorescence, and photosynthetically active radiation. The Hydrolab Datasonde3 is used to record long-term (48-72 hour) time series of temperature, salinity, dissolved oxygen, and pH in the near-bottom waters (ca. 1 meter off the bottom).
A hand-held dissolved oxygen meter manufactured by Yellow Springs Instruments (YSI) is used to make an additional point measurement of near-bottom dissolved oxygen as a check on, and back-up to, the Seabird CTD measurement. Quality control of the water column measurements made with these electronic instruments consists of three aspects: calibrations, QC checks on the calibration, and QC checks on the deployment. The frequency of calibration of the Seabird CTD and Hydrolab Datasonde3 units varies both between and among instruments. Calibration checks are conducted after each calibration and at regular intervals to determine the need for recalibration. Checks also are conducted after retrieving each instrument in order to determine if the instrument performed properly during the CTD cast or Datasonde3 deployment. Specific QC procedures for each instrument are discussed in the following sections.

5.6.1 Seabird SBE 25 Sealogger CTD

The Seabird SBE 25 Sealogger CTD provides depth profiles of temperature, salinity, dissolved oxygen, pH, light transmission, chlorophyll a fluorescence and photosynthetically active radiation. Individual sensor specifications are listed in the manufacturer's operating manual. The four CTD units used in the Virginian Province are programmed to log data internally at one second intervals. At least one vertical profile is obtained at each sampling station throughout the Province.

Calibration

Dissolved oxygen and pH sensors on the CTD are calibrated under controlled laboratory conditions by trained technicians following the procedures described in the Seabird manual. For the dissolved oxygen sensor, a two point calibration procedure is employed utilizing a zero adjustment (sodium sulfite solution or nitrogen gas) and a slope adjustment with air-saturated freshwater. The pH probe is calibrated at three points using pH 4, 7 and 10 standard buffer solutions.
Calibrations are conducted prior to the field sampling and as needed throughout the field season. Immediately following calibration, the dissolved oxygen and pH sensors are checked for accuracy using Winkler titrations and pH standards, respectively. Temperature, conductivity, light transmission, fluorescence and photosynthetically active radiation sensors are calibrated by the manufacturer. If calibration checks of these sensors reveal a problem (see the following section), the instrument is returned to the manufacturer for troubleshooting and/or re-calibration.

Calibration Checks

Performance checks are conducted on the CTD units at the beginning and end of the field season. This procedure involves setting up the four CTD units to simultaneously log data in a well-mixed, 500-gallon seawater tank. Overall variability among instruments is assessed by comparing the simultaneous readings in the tank. The accuracy of the dissolved oxygen measurements is assessed by comparing the CTD readings against Winkler titration values. The accuracy of the CTD salinity (conductivity) measurements is assessed through comparison with readings obtained with a laboratory salinometer (Guildline AutoSal Model 8400) calibrated with IAPSO Standard Seawater (a.k.a. "Copenhagen" water). The instruments are removed from the tank and further tested: the transmissometer and fluorometer voltage endpoints (open and blocked light path) are recorded as described by the manufacturer, and the pH sensor readings are checked using three standard pH buffer solutions (pH 4, 7 and 10).

Field QC checks of the CTD temperature, salinity, dissolved oxygen and pH readings are conducted once each week. Real-time CTD readings from just below the surface are compared to simultaneous measurements with a thermometer, refractometer, and YSI dissolved oxygen meter. The pH readings are checked using the pH 10 standard buffer solution.
These weekly field checks act as a gross check on the operation of the sensors; however, if specified differences are exceeded (Table 5-4), the CTD instrument will be checked thoroughly and a determination made of the need for recalibration. If it is determined that a probe is malfunctioning and/or requires re-calibration, the instrument will be sent back to the Virginian Province Field Operations Center and replaced with a back-up unit.

Table 5-4. Maximum Acceptable Differences for Instrument Field Calibration Checks

Instrument            Frequency of Check         Parameter     Checked Against       Maximum Acceptable Difference
Seabird CTD           Once each week             Temperature   Thermometer           ± 2°C
                                                 Salinity      Refractometer         ± 2 ppt
                                                 D.O.          YSI meter             ± 1 mg/L
                                                 pH            pH buffer solution    ± 0.5 pH units
Hydrolab DataSonde3   Pre- and post-deployment   Temperature   Thermometer           ± 2°C
                                                 Salinity      Refractometer         ± 2 ppt
                                                 D.O.          YSI meter             ± 1 mg/L
                                                 pH            pH buffer solution    ± 0.5 pH units
YSI D.O. Meter        Once each week             D.O.          Winkler titration     ± 0.5 mg/L
                                                 Temperature   Thermometer           ± 2°C

Deployment Checks

The 1990 EMAP-NC Demonstration Project in the Virginian Province shed light on several CTD deployment problems that affected the performance of the dissolved oxygen sensor. The most commonly encountered problems were: 1.) air bubbles trapped in the dissolved oxygen plumbing loop, 2.) mud being sucked through the conductivity cell and into the plumbing loop upon contact of the instrument with the bottom, and 3.) insufficient thermal equilibration time of the dissolved oxygen sensor. Deployment procedures have been modified in hopes of eliminating these problems (Strobel and Schimmel 1991). In addition, each CTD cast data file is reviewed in the field for evidence of deployment problems. A standard check on the data file is comparison of the downcast versus the upcast for all parameters, with particular attention to dissolved oxygen, salinity and light transmission.
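As an illustration, the comparison of instrument readings against reference measurements using the Table 5-4 limits could be sketched as follows. The function and dictionary names are invented for this sketch; the numeric limits are taken directly from Table 5-4 for the weekly CTD check.

```python
# Illustrative sketch of a field calibration check against the Table 5-4
# maximum acceptable differences (weekly CTD check shown).

MAX_DIFF = {                  # parameter -> maximum acceptable difference
    "temperature": 2.0,       # deg C, vs. thermometer
    "salinity": 2.0,          # ppt, vs. refractometer
    "dissolved_oxygen": 1.0,  # mg/L, vs. YSI meter
    "ph": 0.5,                # pH units, vs. standard buffer solution
}

def check_instrument(readings, references):
    """Return the parameters whose instrument-vs-reference difference
    exceeds the Table 5-4 limit (an empty list means all checks pass)."""
    failures = []
    for param, limit in MAX_DIFF.items():
        if abs(readings[param] - references[param]) > limit:
            failures.append(param)
    return failures
```

A non-empty return value would correspond to the situation described above: the instrument is checked thoroughly and, if necessary, sent back to the Field Operations Center and replaced with a back-up unit.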
The dissolved oxygen profile is further evaluated by comparing the surface dissolved oxygen values at the beginning and end of the cast, and by comparing the bottom dissolved oxygen value to that recorded by the hand-held YSI meter. If either of these dissolved oxygen differences exceeds 1 mg/L, the field crew should re-deploy the CTD to obtain a second profile. If the deployment QC criteria are still not met on the second CTD profile, the field crew should perform a calibration check (see preceding section) and associated troubleshooting to define the source(s) of the problem and, if necessary, ship the instrument back to the Field Operations Center by overnight express.

5.6.2 Hydrolab Datasonde3

The Hydrolab Datasonde3 instruments are used for long-term monitoring of temperature, salinity, dissolved oxygen, pH and depth at each station; individual units are moored approximately 1 meter above the bottom inside a protective PVC housing. These instruments are programmed to record data internally at 15 minute intervals throughout their 48 to 72 hour deployments.

Calibration

The Datasonde3 instruments are calibrated prior to each long-term monitoring deployment. The conductivity cell, for measuring salinity, is calibrated using a secondary seawater standard that has been standardized against IAPSO Standard Seawater using a Guildline laboratory salinometer. The dissolved oxygen probe is calibrated using the water-saturated air calibration procedure recommended by the manufacturer. The pH probe is calibrated using two standard pH buffers (7 and 10) as recommended by the manufacturer. The pressure sensor used to measure depth is calibrated by setting the depth to zero meters while holding the instrument at the water's surface (i.e., sea level). The calibration of the temperature sensor is set at the factory and cannot be changed.
Calibration Checks

Calibration QC checks are conducted at the dock on the morning that the instruments are to be deployed. The units are immersed in a bucket of local seawater or freshwater and their readings for temperature, salinity, and dissolved oxygen are compared to those recorded by a thermometer, refractometer, and the YSI dissolved oxygen meter, respectively. The pH probe readings are compared to a standard pH 7 buffer solution. If any of the specified differences are exceeded (Table 5-4), the instrument will be checked and, if necessary, recalibrated. If the instrument cannot be re-calibrated, an alternate (i.e., back-up) unit should be deployed and the malfunctioning unit should be sent back to the Field Operations Center for more detailed electronic troubleshooting and/or repair. The back-up instrument must pass all calibration QC checks prior to deployment.

Deployment Checks

The Datasonde3 instruments are checked for biological fouling of the probes (which can result in calibration drift and/or malfunction) upon retrieval from each long-term deployment. The procedures for the post-deployment QC checks are identical to the pre-deployment calibration QC checks (see previous section). If any of the sensor readings differ from the expected value by more than the specified limits (Table 5-4), the data logged during the deployment will be flagged as being outside the quality control criteria and will be reviewed for validity prior to inclusion in the database.

5.6.3 YSI Dissolved Oxygen Meter

The YSI Model 58 dissolved oxygen meter is used to measure dissolved oxygen concentration in water collected in a Go-Flo bottle from approximately one meter off the bottom at each station. The water is collected at about the same time the Seabird CTD is deployed.
Comparison of the YSI and CTD near-bottom dissolved oxygen measurements provides a check on the operation of the CTD dissolved oxygen sensor during deployment. In addition, the YSI meter is used for bucket QC checks of the Hydrolab Datasonde3 units (prior to and following each Datasonde3 deployment) and side-by-side QC checks of the Seabird CTDs (once each week).

Calibration

The YSI dissolved oxygen meters are calibrated immediately prior to use at each station using the water-saturated air calibration procedure recommended by the manufacturer.

Calibration Checks

Calibration QC checks of the YSI meter are conducted at weekly intervals in the mobile laboratories. Following calibration, the YSI probe is immersed into a bucket of air-saturated water and allowed to stabilize. The dissolved oxygen of the water bath is determined by Winkler titration and compared to the YSI reading. The temperature of the water bath is measured with an alcohol thermometer and compared to the YSI temperature reading. If the dissolved oxygen or temperature difference exceeds the specified limits (Table 5-4), the instrument will be checked thoroughly and a determination made of the need for recalibration or probe replacement.

5.7 NAVIGATION

Station location information is logged through the SAIC Environmental Data Acquisition System (EDAS), which records navigation data through the interface of the Raytheon RAYNAV 780 LORAN and RAYSTAR 920 GPS. The EDAS utilizes a Kalman filter which allows navigation through either of the available positioning systems: GPS or calibrated LORAN-C. The station location, LORAN-C calibration factors, and a series of waypoints are saved in the EDAS log files for each station. The field crews are required to maintain a navigation log book and record all LORAN-C calibration information.
In addition, the crews must record radar ranges and hand-held compass bearings for each sampling station on a station location information log sheet. These navigation logs will be checked for completeness and accuracy during the field audits. Following the completion of field activities, the range and bearing information from a randomly selected subset of stations visited by each crew will be reviewed to verify the positioning accuracy achieved using the electronic navigation system.

------- Section 6 Revision 1 Date 7/91 DRAFT 1 Page 1 of 4

SECTION 6 FIELD OPERATIONS AND PREVENTIVE MAINTENANCE

6.1 TRAINING AND SAFETY

Proper training of field personnel represents a critical aspect of quality control. Field technicians are trained to conduct a wide variety of activities using standardized protocols to insure comparability in data collection among crews and across regions. Each field team consists of a Team Leader and two 4-member crews. Each crew is headed by a Crew Chief (one of which is the Team Leader), who is captain of the boat and the ultimate on-site decision maker regarding safety, technical direction, and communication with the Field Operations Center. Minimum qualifications for the Team Leaders and Crew Chiefs include an M.S. degree in biological/ecological sciences and three years of experience in field data collection activities, or a B.S. degree and five years of experience. The remaining three crew members generally are required to hold B.S. degrees and, preferably, at least one year's experience. Prior to the actual sample collection period, each crew receives formal training and must undergo a fairly elaborate check-out procedure. Upon completion of an intensive two to three week training session, each crew chief must pass a practical examination. This examination is useful for assessing the effectiveness of the crew chief training session and serves to point out specific areas where further training is warranted.
Following the preliminary crew chief training session, both crew chiefs and their crew members participate in a second intensive training program. Both classroom and "hands-on" training are coordinated by staff members at the EMAP-VP Field Operations Center; these personnel have extensive experience instructing field technicians in routine sampling operations (e.g., collection techniques, small boat handling). The expertise of the on-site EMAP staff is supplemented by local experts in such specialized areas as fish pathology, fish identification, benthic sampling, field computer/navigation system use, and first aid (including cardiopulmonary resuscitation (CPR) training). All the sampling equipment (e.g., boats, instruments, grabs, nets, computers, etc.) is used extensively during the "hands-on" training sessions, and by the end of the course, all crew members must demonstrate proficiency in all the required sampling activities. Upon completion of the formal crew training session, another practical examination is administered to all crew chiefs and crew members. At this time all crew chiefs and their crews should be satisfactorily checked out in all pertinent areas.

Some sampling activities (e.g., fish taxonomy, gross pathology, net repair, etc.) require specialized knowledge. While all crew members are exposed to these topics during the training sessions, it is beyond the scope of the training program to develop proficiency for each individual in these areas. For each of the specialized activities, selected crew members (generally those with prior experience in a particular area) are provided with more intensive training. At the conclusion of the training program, at least one member of each crew must demonstrate proficiency in fish taxonomy, gross pathology, net repair, gear deployment, and navigation.
If any crew does not meet these minimal requirements, further training is provided prior to actual field sampling.

All aspects of field operations are detailed in the Field Operations and Safety Manual (Strobel and Schimmel 1991), which is distributed to all trainees prior to the training period. The manual includes a checklist of all equipment, instructions on equipment use, and detailed written descriptions of sample collection procedures. In addition, the manual includes flow charts and a schedule of activities to be conducted at each sampling location, along with a list of potential hazards associated with each sampling site.

In addition to the formal classroom training and practical examinations, all crews are evaluated on their field performance during "dry runs" conducted just prior to the actual sampling period. Each crew is audited during these dry runs by either the Quality Assurance Officer or the Field Coordinator. The crews also are evaluated by other personnel at the Field Operations Center for their performance on other field activities, such as data entry, communications and shipping procedures. If any deficiencies within a crew are noted, they are remedied prior to field sampling. This is accomplished by additional training or by changing the crew composition.

6.2 FIELD QUALITY CONTROL AND AUDITS

Quality control of measurements made during the actual field sampling period is accomplished through the use of a variety of QC sample types and procedures, as described in Sections 4 and 5 of this document. At least once during each field season, a formal site audit of each field crew is performed by either the QAO, the Field Coordinator, or the Province Manager to insure compliance with prescribed protocols. A checklist has been developed to insure comparability and consistency in the auditing process. Field crews will be re-trained whenever discrepancies are noted.
6.3 PREVENTIVE MAINTENANCE

The importance of proper maintenance of all gear cannot be overstated. Failure of any piece of major equipment, especially when back-up equipment is unavailable, can result in a significant loss of data. Maintenance of equipment must be performed at regular intervals, as specified in the Field Operations and Safety Manual (Strobel and Schimmel 1991). It will be the responsibility of the Team Leader to maintain a logbook of equipment usage and assure that proper maintenance is performed at the prescribed time intervals. The equipment maintenance logbook will be examined during field audits and at the end of the field season to insure that proper procedures have been followed.

------- Section 7 Revision 1 Date 7/91 DRAFT 1 Page 1 of 2

SECTION 7 LABORATORY OPERATIONS

7.1 LABORATORY PERSONNEL, TRAINING, AND SAFETY

This section addresses only general laboratory operations, while specific QA/QC requirements and procedures are presented in Sections 4 and 5. Personnel in any laboratory performing EMAP analyses should be well versed in standard safety practices; it is the responsibility of the laboratory manager and/or supervisor to ensure that safety training is mandatory for all laboratory personnel. The laboratory is responsible for maintaining a current safety manual in compliance with the Occupational Safety and Health Administration (OSHA) regulations, or equivalent state or local regulations. The safety manual should be readily available to laboratory personnel. Proper procedures for safe storage, handling and disposal of chemicals should be followed at all times; each chemical should be treated as a potential health hazard and good laboratory practices should be implemented accordingly.
7.2 QUALITY CONTROL DOCUMENTATION

In each laboratory, the following EMAP-Near Coastal documents must be current and available:

o Laboratory Methods Manual - A document containing detailed instructions about laboratory and instrument operations (U.S. EPA, in preparation).

o Quality Assurance Project Plan - A document containing clearly defined laboratory QA/QC protocols (this document).

In addition to the official EMAP-NC documents, each laboratory should maintain the following:

o Standard Operating Procedures (SOPs) - Detailed instructions for performing routine laboratory procedures, usually written in "cookbook" format. In contrast to the Laboratory Methods Manual, SOPs offer step-by-step instructions describing exactly how the method is implemented in a particular laboratory.

o Instrument performance study information - Information on instrument baseline noise, calibration standard response, precision as a function of concentration, detection limits, etc. This information usually is recorded in logbooks or laboratory notebooks.

7.3 ANALYTICAL PROCEDURES

Complete and detailed procedures for processing and analysis of samples in the field and laboratory are provided in the Field Operations and Safety Manual (Strobel and Schimmel 1991) and the Laboratory Methods Manual (U.S. EPA, in preparation), respectively, and will not be repeated here.

7.4 LABORATORY PERFORMANCE AUDITS

Initially, a QA assistance and performance audit will be performed by QA personnel to determine if each laboratory effort is in compliance with the procedures outlined in the Methods Manual and QA Project Plan and to assist the laboratory where needed. Additionally, once during the study, a formal laboratory audit will be conducted by a team composed of the QA Officer and his/her technical assistants.
------- Section 8 Revision 1 Date: 7/91 DRAFT 1 Page 1 of 8

SECTION 8 QUALITY ASSURANCE AND QUALITY CONTROL FOR MANAGEMENT OF DATA AND INFORMATION

8.1 SYSTEM DESCRIPTION

The Near Coastal Information Management System (NCIMS) is designed to perform the following functions:

o document sampling activities and standard methods,
o support program logistics, sample tracking and shipments,
o process and organize both the data collected in the field and the results generated at analytical laboratories,
o perform range checks on selected numerical data,
o facilitate the dissemination of information, and
o provide interaction with the EMAP Central Information System.

A complete and detailed description of the NCIMS is provided in Rosen et al. (1991) and will not be repeated here.

8.2 QUALITY ASSURANCE/QUALITY CONTROL

Two general types of problems must be resolved in developing QA/QC protocols for information and data management: (1) correction or removal of erroneous individual values and (2) inconsistencies that damage the integrity of the data base. The following features of the NCIMS will provide a foundation for the management and quality assurance of all data collected and reported during the life of the project.

8.2.1 Standardization

A systematic numbering system will be developed for unique identification of individual samples, sampling events, stations, shipments, equipment, and diskettes. The sample numbering system will contain codes which will allow the computer system to distinguish among several different sample types (e.g., actual samples, quality control samples, sample replicates, etc.). This system will be flexible enough to allow changes during the life of the project, while maintaining a structure which allows easy comprehension of the sample type.
Clearly stated standard operating procedures will be given to the field crews with respect to the use of the field computer systems and the entry of data in the field. Contingency plans will also be stated explicitly in the event that the field systems fail.

8.2.2 Prelabeling of Equipment and Sample Containers

Whenever possible, sample containers, equipment, and diskettes will be prelabeled to eliminate confusion in the field. The prelabeling will reduce the number of incorrect or poorly-affixed labels. Containers with all the required prelabeled sample containers, sample sheets, and data diskettes will be prepared for the field crews prior to each sampling event (an event is defined as a single visit by a crew to a sampling site). These containers will be called "event boxes". Each event box will have the event number affixed to it using both handwritten and bar code labels.

8.2.3 Data Entry, Transcription, and Transfer

To minimize the errors associated with entry and transcription of data from one medium to another, data will be captured electronically. When manual entry is required, the data should be entered twice by different data entry operators and then checked for non-matches to identify and correct errors. In many instances, the use of bar code labels should eliminate the need for manual entry of routine information. Each group transmitting data to the information center will be given a separate account on the Near Coastal VAX 3300. Standard formats for data transfer will be established by the Information Management Team. A specific format will be developed for each file type within each discipline. If data are sent to the Near Coastal Information Center in formats other than those specified, the files will be deleted and the sending laboratory or agency will be asked to resubmit the data in the established format.
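The double-entry check described in Section 8.2.3 could be sketched as follows. This is an illustrative example only; the function name and the record field names are hypothetical and not drawn from the NCIMS design.

```python
# Illustrative sketch of the double-keying QC check: two operators enter the
# same data sheet independently, and non-matching fields are reported for
# correction before the record enters the data base. Field names are invented.

def double_entry_mismatches(entry_a, entry_b):
    """Compare two independently keyed records (dicts of field -> value)
    and return the sorted list of fields whose values do not match."""
    fields = sorted(set(entry_a) | set(entry_b))
    return [f for f in fields if entry_a.get(f) != entry_b.get(f)]
```

An empty result indicates the two keyings agree; any listed field would be re-checked against the original data sheet and corrected.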
The communications protocols used to transfer data electronically will have mechanisms by which the completeness and accuracy of the transfer can be checked. In addition, the group sending the information should specify the number of bytes and the file names of the transferred files. These data characteristics should be verified upon receipt of the data; if a file cannot be verified, a new file transfer should be requested. Whenever feasible, a hard copy of all data should be provided with the transfer files.

The data files transmitted from the field will be fixed-format text files. These files will be "parsed" by the system: the parsing process involves transferring records of similar type into files containing only those types of records. For example, observations on fish species and size will be copied from the original log file transmitted from the field to a "fish" data file. After the records have been parsed from the field log files, the individual data files will be checked automatically for erroneous values, as described in the following section. Records in the field log file which are not entered into the data base (e.g., comments in text form) will be archived for documentation or future extraction.

8.2.4 Automated Data Verification

Erroneous numeric data will be identified using automatic range checks and filtering algorithms. When data fall outside of an acceptable range, they will be flagged in a report for the quality assurance officer (QAO) or his designee. This type of report will be generated routinely and should detail the files processed and the status of the QA checks. The report will be generated both on disk and in hard copy for permanent filing. The QAO will review the report and release data which have passed the QA checks for addition to the data base. All identified errors must be corrected before flagged files can be added to the data base.
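The range-check step of Section 8.2.4 might look like the sketch below; the variable names and acceptable ranges are invented for illustration, since the actual limits would be set for each data type:

```python
# Hypothetical acceptable ranges; the real limits would be established
# for each measured variable, as the plan describes.
ACCEPTABLE_RANGES = {
    "salinity_ppt": (0.0, 40.0),
    "depth_m": (0.0, 100.0),
}

def range_check(filename: str, records: list) -> list:
    """Flag out-of-range numeric values for the QAO report.

    Returns one human-readable flag line per violation; an empty list
    means every checked value passed and the file may be released.
    """
    flags = []
    for i, record in enumerate(records):
        for field, (lo, hi) in ACCEPTABLE_RANGES.items():
            value = record.get(field)
            if value is not None and not lo <= value <= hi:
                flags.append(
                    f"{filename} record {i}: {field}={value} "
                    f"outside acceptable range [{lo}, {hi}]"
                )
    return flags
```

The flag lines are exactly the kind of per-file detail the routine QA report described above would carry to the QAO for review and release.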
If the QAO finds that the data-check ranges are not reasonable, the values can be changed by written request. The written request should include a justification for changing the established ranges. If the QAO finds the need for additional codes, they can be entered by the senior data librarian. After such changes are made, the files may be passed through the QA procedure again. In the event that the QA check identifies incorrect data, the QAO will archive the erroneous file and request that the originator correct the error and retransmit the data.

Data base entries which are in the form of codes should be compared to lists of valid values (e.g., look-up tables) established by experts for specific data types. These lists of valid codes will be stored in a central data base for easy access by data base users. When a code cannot be verified in the appropriate look-up table, the observation should be flagged in the QAO report for appropriate corrective action (e.g., update of the look-up table or removal of the erroneous code).

8.2.5 Sample Tracking

Samples collected in the field will be shipped to analytical laboratories. All shipping information required to adequately track the samples (sample numbers, number of containers, shipment numbers, dates, etc.) will be transmitted by phone to the information center at the end of each sampling day, using modems built into the portable field computers. Once the field crew have transmitted the data, it will be the responsibility of the data management team to confirm that the samples arrive at their destination. Each receiving laboratory will be required, upon receipt of the samples, to record and similarly transmit all tracking information (e.g., sample identification numbers, shipment numbers, and the status of the samples) to the information center, using either microcomputers or the VAX. The use of bar code labels and readers will facilitate this process.
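The code-verification step of Section 8.2.4 above, in which coded data base entries are compared to look-up tables of valid values, can be sketched in the same spirit; the code tables here are hypothetical:

```python
# Hypothetical look-up tables of valid codes; the real tables would be
# established by experts for each data type, as the plan describes.
LOOK_UP_TABLES = {
    "gear": {"TRAWL", "GRAB", "CORE"},
    "weather": {"CLEAR", "RAIN", "FOG"},
}

def validate_codes(record: dict) -> list:
    """Compare coded entries to their look-up tables.

    Any code that cannot be verified is flagged for corrective action in
    the QAO report (update the table, or remove the erroneous code).
    """
    flags = []
    for field, valid_codes in LOOK_UP_TABLES.items():
        code = record.get(field)
        if code is not None and code not in valid_codes:
            flags.append(f"{field}={code!r} not found in look-up table")
    return flags
```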
The information management team will generate special programs to create fixed-format records containing this information.

8.2.6 Reporting

Following analysis of the samples, the summary data packages transmitted from the laboratories will include sample tracking information, results, quality assurance and quality control information, and accompanying text. If the laboratory has assigned internal identification numbers to the samples, the results should include both the original sample number and the internal number used by the laboratory. The analytical laboratories will be responsible for permanent archiving of all raw data used in generating the results.

8.2.7 Redundancy (Backups)

All files in the NCIMS will be backed up regularly. At least one copy of the entire system will be maintained off-site to enable the information management team to reconstruct the data base in the event that one system is destroyed or incapacitated.

In the field, information stored on the hard drive will be sent to the on-board printer to provide a real-time hardcopy backup. The information on the hard drive also will be copied to diskettes at the end of each day of sampling.

At the Near Coastal Information Center in Narragansett, incremental backups to removable disk will be performed daily on all files which have changed. In addition, backups of all EMAP directories and intermediate files will be performed on a weekly basis to provide protection in the event of a complete loss of the Near Coastal Information Center facility. All original data files will be saved on-line for at least two years, after which the files will be permanently archived on floppy diskette. All original files, especially those containing the raw field data, will be protected so that they can only be read (i.e., write and delete privileges will be removed from these files).
8.2.8 Human Review

All discrepancies which are identified by the computer will be documented in hard copy. These discrepancy logs will be saved as part of the EMAP archive. All identified discrepancies should be brought to the attention of the QAO or his/her designee, who will determine the appropriate corrective action to be taken. Data will not be transferred to the data base until all discrepancies have been resolved by the QAO. Once data have been entered into the data base, changes will not be made without the written consent of the QAO, who will be responsible for justifying and documenting the change. A record of all additions will be entered into a data set index and kept in hard copy.

8.3 DOCUMENTATION AND RELEASE OF DATA

Comprehensive documentation of information relevant to users of the NCIMS will be maintained and updated as necessary. Most of this documentation will be accessible on-line, in data bases which describe and interact with the system. The documentation will include a data base dictionary, access control information, data base directories (including directory structures), code tables, and continuously updated information on field sampling events, sample tracking, and data availability.

A limited number of personnel will be authorized to make changes to the Near Coastal data base. All changes will be carefully documented and controlled by the senior data librarian. Data bases which are accessible to outside authorized users will be available in "read only" form. Access to data by unauthorized users will be limited through the use of standard DEC VAX security procedures. Information on access rights to all EMAP-NC directories, files, and data bases will be provided to all potential users.

The release of data from the NCIMS will occur on a graduated schedule. Different classes of users will be given access to the data only after it reaches a specified level of quality assurance.
Each group will use the data on a restricted basis, under explicit agreements with the Near Coastal Task Group. The following four groups are defined for access to data:

I. The Virginian Province central group, including the information management team, the field coordinator, the logistics coordinator, the Province Manager, the QA officer, and the field crew chiefs.

II. Near Coastal primary users - ERL-N, VERSAR, SAIC, Gulf Breeze personnel, NOAA Near Coastal EMAP personnel, and EMAP quality assurance personnel.

III. EMAP data users - all other task groups within EPA, NOAA, and other federal agencies.

IV. General public - university personnel, other EPA offices (including regional offices), and other federal, state, and local governments.

Requests for premature release of data will be submitted to the Information Management Team. The senior data analyst and the QAO will determine if the data can be released. The final authority on the release of all data is the technical director of EMAP Near Coastal.

The long-term goal for the Near Coastal Information Management Team will be to develop a user interface through which all data will be accessed. This will improve control of security and monitoring of access to the data, and it will help ensure that the proper data files are being accessed.

------- Section 9 Revision 1 Date 7/91 DRAFT 1 Page 1 of 1

SECTION 9

QUALITY ASSURANCE REPORTS TO MANAGEMENT

A quality assurance report (or a section of the Annual Statistical Summary) will be prepared by the Province QA Officer following each year's sampling efforts. This report will summarize the measurement error estimates for the various data types using the QA/QC sample data (see Sections 4 and 5). Precision, accuracy, comparability, completeness, and representativeness of the data will be addressed in this document.
Within 30 days of each audit (field or laboratory), the QA Officer will submit an audit report to the Province Manager. The audit report will describe the results of the audit in full detail and note any deficiencies requiring management action. The QA Officer will monitor the implementation of corrective actions in response to negative audit findings, and will make regular reports to the Province Manager in this regard.

In addition to the formal reports described above, the Province QA Officer will regularly report to the Province Manager on an informal basis. One of the primary responsibilities of the QA Officer is to keep the Province Manager informed of any issue or problem which might have a negative effect on the data collected.

------- Section 10 Revision 1 Date 7/91 DRAFT 1 Page 1 of 2

SECTION 10

REFERENCES

Ballschmiter, K., W. Schafer, and H. Buchert. 1987. Fresenius Z. Anal. Chem. 326:253-257.

Cantillo, A.Y. 1990. Standard and Reference Materials for Marine Sciences. Intergovernmental Oceanographic Commission Manuals and Guides 21.

DeGraeve, G.M., N.G. Reichenbach, J.D. Cooney, P.I. Feder, and D.I. Mount. 1988. New developments in estimating endpoints for chronic toxicity tests. Abstract, Am. Soc. Test. Mater. 12th Symp. Aquat. Toxicol. Hazard Assess., Sparks, Nev.

Federal Register, Part VIII, EPA. "Guidelines Establishing Test Procedures for the Analysis of Pollutants Under the Clean Water Act: Final Rule and Proposed Rule." 40 CFR Part 136, Oct. 28, 1984.

Hamilton, M.A., R.C. Russo, and R.V. Thurston. 1977. Trimmed Spearman-Karber method for estimating median lethal concentrations in toxicity bioassays. Environ. Sci. Technol. 11:714-719; Correction 12:417 (1978).

Holland, A.F., ed. 1990. Near Coastal Program Plan for 1990: Estuaries. EPA 600/4-90/033. U.S. Environmental Protection Agency, Environmental Research Laboratory, Office of Research and Development, Narragansett, RI.

Hunt, D.T.E., and A.L. Wilson. 1986.
The Chemical Analysis of Water: General Principles and Techniques. 2nd ed. Royal Society of Chemistry, London, England. 683 pp.

Keith, L.H., W. Crummett, J. Deegan, Jr., R.A. Libby, J.K. Taylor, and G. Wentler. 1983. Principles of environmental analysis. Anal. Chem. 55:2210-2218.

Keith, L.H. 1991. Environmental Sampling and Analysis: A Practical Guide. Lewis Publishers, Chelsea, MI. 143 pp.

Kirchner, C.J. 1983. Quality control in water analysis. Environ. Sci. Technol. 17(4):174A-181A.

Krahn, M.M., C.A. Wigren, R.W. Pearce, L.K. Moore, R.G. Bogar, W.D. MacLeod, S.L. Chan, and D.W. Brown. 1988. Standard Analytical Procedures of the NOAA National Analytical Facility, 1988: New HPLC Cleanup and Revised Extraction Procedures for Organic Contaminants. NOAA Technical Memo. NMFS F/NWC-153. U.S. Dept. of Commerce, NOAA National Marine Fisheries Service, Seattle, Washington.

Lauenstein, G.L. In preparation. A Compendium of Methods Used in the NOAA National Status and Trends Program.

Plumb, R.H., Jr. 1981. Procedures for handling and chemical analysis of sediment and water samples. Technical Report EPA/CE-81-1. U.S. Environmental Protection Agency/U.S. Army Corps of Engineers Technical Committee on Criteria for Dredged and Fill Material, U.S. Army Waterways Experiment Station, Vicksburg, MS. 471 pp.

Rosen, J.S., H. Buffum, J. Beaulieu, and M. Hughes. 1991. Information Management Plan for the EMAP-Near Coastal Program. U.S. Environmental Protection Agency, Environmental Research Laboratory, Office of Research and Development, Narragansett, RI.

Stanley, T.W., and S.S. Verner. 1983. Interim Guidelines and Specifications for Preparing Quality Assurance Project Plans. EPA/600/4-83/004. U.S. Environmental Protection Agency, Washington, D.C.

Stanley, T.W., and S.S. Verner. 1985. The U.S. Environmental Protection Agency's quality assurance program. pp. 12-19 In: J.K. Taylor and T.W. Stanley (eds.).
Quality Assurance for Environmental Measurements, ASTM STP 867. American Society for Testing and Materials, Philadelphia, Pennsylvania.

Strobel, C.J., and S.C. Schimmel. 1991. Near Coastal 1991 Virginian Province Field Operations and Safety Manual. U.S. Environmental Protection Agency, Environmental Research Laboratory, Office of Research and Development, Narragansett, RI.

Taylor, J.K. 1987. Quality Assurance of Chemical Measurements. Lewis Publishers, Inc., Chelsea, Michigan. 328 pp.

U.S. Environmental Protection Agency. In preparation. EMAP Laboratory Methods Manual: Estuaries. U.S. Environmental Protection Agency, Environmental Monitoring Systems Laboratory, Office of Research and Development, Cincinnati, Ohio.

U.S. Environmental Protection Agency. 1979a. Methods for chemical analysis of water and wastes. EPA-600/4-79/020. U.S. Environmental Protection Agency, Environmental Monitoring Systems Laboratory, Office of Research and Development, Cincinnati, Ohio (revised March 1983).

U.S. Environmental Protection Agency. 1979b. Handbook for analytical quality control in water and wastewater laboratories. EPA/600/4-79/019. U.S. Environmental Protection Agency, Environmental Monitoring and Support Laboratory, Cincinnati, Ohio.

U.S. Environmental Protection Agency. 1989. Recommended Protocols for Measuring Selected Environmental Variables in Puget Sound. U.S. Environmental Protection Agency, Puget Sound Estuary Program, Office of Puget Sound, Seattle, Washington.