EPA 601-D-90-003
EPA-600/X90/XXX

United States Environmental Protection Agency
Office of Research and Development
Washington, DC 20460
April 1990

1990 Demonstration Project Quality Assurance Project Plan for EMAP Near Coastal

Environmental Monitoring and Assessment Program

ENVIRONMENTAL MONITORING AND ASSESSMENT PROGRAM
NEAR COASTAL DEMONSTRATION PROJECT
QUALITY ASSURANCE PROJECT PLAN

by

R. Valente and C. Strobel
Science Applications International Corporation
27 Tarzwell Drive
Narragansett, Rhode Island 02882

and

J.E. Pollard, K.M. Peres, and T.C. Chiang
Lockheed Engineering & Sciences Company
1050 E. Flamingo Road, Suite 209
Las Vegas, Nevada 89119

and

J. Rosen
Computer Sciences Corporation
27 Tarzwell Drive
Narragansett, Rhode Island 02882

Project Officer

D.T. Heggem
Exposure Assessment Division
Environmental Monitoring Systems Laboratory
Las Vegas, Nevada 89193-3478

ENVIRONMENTAL MONITORING SYSTEMS LABORATORY
OFFICE OF RESEARCH AND DEVELOPMENT
U.S. ENVIRONMENTAL PROTECTION AGENCY
CINCINNATI, OHIO 45268

ABSTRACT

This document outlines the integrated quality assurance plan for the Environmental Monitoring and Assessment Program's Near Coastal Demonstration Project. The quality assurance plan was prepared following the guidelines and specifications provided in 1983 by the Quality Assurance Management Staff of the U.S. Environmental Protection Agency Office of Research and Development. Objectives for five data quality indicators (completeness, representativeness, comparability, precision, and accuracy) are established for the Near Coastal Demonstration Project.
The primary purpose of the integrated quality assurance plan is to maximize the probability that data collected over the duration of the project will meet or exceed these objectives, and thus support scientifically sound interpretations of the data in support of the project goals. Various procedures are specified in the quality assurance plan to: (1) ensure that collection and measurement procedures are standardized among all participants; (2) monitor performance of the measurement systems being used in the Near Coastal Demonstration Project to maintain statistical control and to provide rapid feedback so that corrective measures can be taken before data quality is compromised; (3) allow for the periodic assessment of the performance of these measurement systems and their components; and (4) verify and validate that reported data are sufficiently representative, unbiased, and precise to be suitable for their intended use. These activities will provide users with information regarding the degree of uncertainty associated with the various components of the Near Coastal Demonstration Project data base.

This quality assurance plan has been submitted in partial fulfillment of Contract Number 68-03-3249 to Lockheed Engineering & Sciences Company, Contract Number 68-C8-0066 to Science Applications International Corporation, and Contract Number 7176-849 to Computer Sciences Corporation under the sponsorship of the U.S. Environmental Protection Agency.
Table of Contents
Revision 0
Date: 4/90
DRAFT

TABLE OF CONTENTS

Section                                                          Page

   Abstract                                                        ii
   Figures                                                         vi
   Tables                                                         vii
   Acknowledgments                                               viii

1  INTRODUCTION                                                1 of 5
   1.1  OVERVIEW                                               1 of 5
   1.2  QUALITY ASSURANCE PROJECT PLAN SPECIFICATIONS          3 of 5

2  PROJECT ORGANIZATION                                        1 of 3
   2.1  MANAGEMENT STRUCTURE                                   1 of 3

3  PROJECT DESCRIPTION                                         1 of 2
   3.1  PURPOSE                                                1 of 2

4  QUALITY ASSURANCE OBJECTIVES                               1 of 12
   4.1  DATA QUALITY OBJECTIVES                               1 of 12
   4.2  REPRESENTATIVENESS                                    5 of 12
   4.3  COMPLETENESS                                          6 of 12
   4.4  COMPARABILITY                                         7 of 12
   4.5  ACCURACY (BIAS), PRECISION, AND TOTAL ERROR           7 of 12

5  QUALITY ASSURANCE/QUALITY CONTROL PROTOCOLS, CRITERIA,
   AND CORRECTIVE ACTION                                      1 of 50
   5.1  CHEMICAL ANALYSIS OF SEDIMENT AND TISSUE SAMPLES      1 of 50
        5.1.1   General QA/QC Requirements                    3 of 50
        5.1.2   Initial Calibration                           5 of 50
        5.1.3   Initial Documentation of Detection Limits     8 of 50
        5.1.4   Initial Blind Analysis of Reference
                Material                                     10 of 50
        5.1.5   Blind Analysis of Reference Material:
                Laboratory Intercomparison Exercise          11 of 50
        5.1.6   Analysis of SRMs and Laboratory Control
                Materials                                    11 of 50
        5.1.7   Calibration Check                            13 of 50
        5.1.8   Laboratory Reagent Blank                     14 of 50
        5.1.9   Internal Standards                           15 of 50
        5.1.10  Injection Internal Standards                 16 of 50
        5.1.11  Laboratory Fortified Sample Matrix           17 of 50
        5.1.12  Laboratory Duplicates                        18 of 50
        5.1.13  Field Duplicates and Field Splits            19 of 50
   5.2  OTHER SEDIMENT MEASUREMENTS                          20 of 50
        5.2.1   Total organic carbon and acid volatile
                sulfide                                      20 of 50
        5.2.2   Clostridium perfringens spore
                concentrations                               21 of 50
        5.2.3   Sediment grain size                          22 of 50
   5.3  TOXICITY TESTING OF SEDIMENT AND WATER SAMPLES       25 of 50
        5.3.1   Sample Handling and Storage                  26 of 50
        5.3.2   Quality of Test Organisms                    27 of 50
        5.3.3   Facilities and Equipment                     28 of 50
        5.3.4   Test Conditions                              29 of 50
        5.3.5   Test Acceptability                           31 of 50
        5.3.6   Precision                                    32 of 50
        5.3.7   Control Charts                               33 of 50
        5.3.8   Record Keeping and Reporting                 34 of 50
   5.4  BENTHIC COMMUNITY ANALYSIS                           35 of 50
        5.4.1   Species Composition and Abundance            36 of 50
        5.4.2   Biomass                                      38 of 50
   5.5  LARGE BIVALVE SAMPLING                               38 of 50
   5.6  FISH SAMPLING                                        39 of 50
        5.6.1   Species Composition and Abundance            39 of 50
        5.6.2   Fish Length Measurements                     40 of 50
        5.6.3   Fish Gross Pathology                         40 of 50
   5.7  SEDIMENT-PROFILE PHOTOGRAPHY                         41 of 50
   5.8  DISSOLVED OXYGEN MEASUREMENTS                        43 of 50
   5.9  ANCILLARY MEASUREMENTS                               45 of 50
        5.9.1   Salinity                                     45 of 50
        5.9.2   Temperature                                  46 of 50
        5.9.3   pH measurements                              46 of 50
        5.9.4   Fluorometry                                  47 of 50
        5.9.5   Transmissometry                              48 of 50
        5.9.6   Photosynthetically Active Radiation          49 of 50
        5.9.7   Apparent RPD Depth                           49 of 50

6  FIELD OPERATIONS AND PREVENTIVE MAINTENANCE                 1 of 6
   6.1  TRAINING AND SAFETY                                    1 of 6
   6.2  FIELD QUALITY CONTROL                                  4 of 6
   6.3  FIELD AUDITS                                           5 of 6
   6.4  PREVENTIVE MAINTENANCE                                 5 of 6

7  LABORATORY OPERATIONS                                       1 of 4
   7.1  LABORATORY PERSONNEL, TRAINING, AND SAFETY             1 of 4
   7.2  QUALITY CONTROL DOCUMENTATION                          2 of 4
   7.3  SAMPLE PROCESSING AND PRESERVATION                     3 of 4
   7.4  SAMPLE STORAGE AND HOLDING TIMES                       3 of 4
   7.5  LABORATORY PERFORMANCE AUDITS                          4 of 4

8  QUALITY ASSURANCE AND QUALITY CONTROL FOR MANAGEMENT
   OF DATA AND INFORMATION                                    1 of 13
   8.1  SYSTEM DESCRIPTION                                    1 of 13
        8.1.1   Field Navigation and Data Logging System      2 of 13
   8.2  QUALITY ASSURANCE/QUALITY CONTROL                     2 of 13
        8.2.1   Standardization                               3 of 13
        8.2.2   Prelabeling of Equipment and Sample
                Containers                                    4 of 13
        8.2.3   Data Entry and Transfer                       4 of 13
        8.2.4   Automated Data Verification                   6 of 13
        8.2.5   Sample Tracking                               7 of 13
        8.2.6   Reporting                                     8 of 13
        8.2.7   Redundancy (Backups)                          9 of 13
        8.2.8   Human Review                                 10 of 13
   8.3  DOCUMENTATION AND RELEASE OF DATA                    10 of 13

9  QUALITY ASSURANCE REPORTS TO MANAGEMENT                     1 of 2

10 REFERENCES                                                  1 of 3

FIGURES

Figure                                                           Page

2-1  Management structure for the 1990 Virginian Province
     Demonstration Project                                     2 of 3
9-1  Example of a control chart                                2 of 2

TABLES

Table                                                            Page

1-1  Sections in this Report and in Related Documents that
     Address the 15 Subjects Required in a Quality
     Assurance Project Plan                                    5 of 5
2-1  List of Key Personnel, Affiliations, and
     Responsibilities within the EMAP Near Coastal
     Demonstration Project                                     3 of 3
4-1  Measurement Quality Objectives for EMAP Near Coastal
     Indicators and Associated Data                           3 of 12
4-2  Quality Assurance Sample Types, Frequency of Use, and
     Types of Data Generated for the EMAP-Near Coastal
     Demonstration Project                                   10 of 12
5-1  Key Elements for Quality Control of Chemical Analyses
     During the EMAP-Near Coastal Demonstration Project       6 of 50
5-2  Recommended Detection Limits for EMAP Near Coastal
     Chemical Analyses                                        9 of 50
8-1  Data Distribution Levels for the Near Coastal
     Demonstration Project                                   13 of 13

ACKNOWLEDGMENTS

We would like to thank the following individuals for their timely peer reviews of this document: D. Bender and L. Johnson, TAI, Inc., Cincinnati, Ohio; R. Graves, U.S. Environmental Protection Agency, Environmental Monitoring Systems Laboratory, Cincinnati, Ohio; C.A. Manen, National Oceanic and Atmospheric Administration, Rockville, Maryland; K. Summers, U.S. Environmental Protection Agency, Environmental Research Laboratory, Gulf Breeze, Florida; R. Pruell and S. Schimmel, U.S. Environmental Protection Agency, Environmental Research Laboratory, Narragansett, Rhode Island; and F. Holland and S. Weisberg, Versar, Inc., Columbia, Maryland. The assistance provided by R.
Graves in the development of measurement quality objectives for analytical chemistry is especially appreciated. Word processing support provided by A. Tippett and compilation of review comments by J. Aoyama, Lockheed Engineering & Sciences Company, Las Vegas, Nevada, are greatly appreciated.

Section 1
Revision 0
Date 4/90
DRAFT 1
Page 1 of 5

SECTION 1

INTRODUCTION

1.1 OVERVIEW

The U.S. Environmental Protection Agency (EPA), in cooperation with other federal and state organizations, has designed the Environmental Monitoring and Assessment Program (EMAP) to monitor indicators of the condition and health of the Nation's ecological resources. Specifically, EMAP is intended to respond to the growing demand for information characterizing the condition of our environment and the type and location of changes in our environment. Simultaneous monitoring of pollutants and environmental indicators will allow for the identification of the likely causes of adverse changes. When EMAP has been fully implemented, the program will answer the following critical questions:

o What is the current status, extent, and geographic distribution of our ecological resources (e.g., estuaries, lakes, streams, forests, grasslands)?

o What percentage of resources appear to be adversely affected by pollutants or other anthropogenic environmental stresses?

o Which resources are degrading, where, and at what rate?

o What are the most likely causes of adverse effects?

o Are adversely affected ecosystems improving as expected in response to control and mitigation programs?

To answer these types of questions, the Near Coastal Demonstration Project has set four major objectives:

o Provide a quantitative assessment of the regional extent of near coastal environmental problems by assessing pollution exposure and ecological condition.

o Measure changes in the regional extent of environmental problems for the Nation's near coastal ecosystems.
o Identify and evaluate associations among the ecological condition of the Nation's near coastal ecosystems and pollutant exposure, as well as other factors known or suspected to affect ecological condition (e.g., climatic conditions, land use patterns).

o Assess the effectiveness of pollution control actions and environmental policies on regional scales (i.e., large estuaries like Chesapeake Bay, major coastal regions like the mid-Atlantic and Gulf coasts, and nationally).

The Near Coastal component of EMAP will monitor the status and trends in environmental quality of the coastal waters of the United States. This program will complement and eventually merge with the National Oceanic and Atmospheric Administration's (NOAA) existing National Status and Trends Program for Marine Environmental Quality to produce a single, cooperative, coastal and estuarine monitoring program.

The strategy for implementation of the Near Coastal project is a regional, phased approach starting in 1990 in the Virginian Province. This biogeographical province covers an area from Cape Cod, Massachusetts, to Cape Henry, Virginia (U.S. EPA, 1989). Additional provinces will be added in future years, eventually resulting in full national implementation of the program.

1.2 QUALITY ASSURANCE PROJECT PLAN SPECIFICATIONS

The quality assurance policy of the EPA requires every monitoring and measurement project to have a written and approved quality assurance plan (Stanley and Verner, 1983). This requirement applies to all environmental monitoring and measurement efforts authorized or supported by the EPA through regulations, grants, contracts, or other means. The quality assurance plan for the project specifies the policies, organization, objectives, and functional activities for the project.
The plan also describes the quality assurance and quality control activities and measures that will be implemented to ensure that the data will meet all criteria for data quality established for the project. All project personnel must be familiar with the policies and objectives outlined in this quality assurance plan to assure proper interactions among the various data acquisition and management components of the project.

EPA guidance (Stanley and Verner, 1983) states that the 15 items shown in Table 1-1 should be addressed in the QA project plan. Some of these items are extensively addressed in other documents for this project and therefore, as allowed by the guidelines, are only summarized or referenced in this document.

This document contains proposed protocols and designs for the integrated quality assurance program that will be implemented for the project. This plan is intended to be a "living" document and, accordingly, may be revised or appended as needs warrant.

TABLE 1-1. SECTIONS IN THIS REPORT AND IN RELATED DOCUMENTS THAT ADDRESS THE 15 SUBJECTS REQUIRED IN A QUALITY ASSURANCE PROJECT PLAN (a)

Quality Assurance Subject                      This Report

Title page                                     Title page
Table of contents                              Table of contents
Project description                            Section 3
Project organization and responsibility        Section 2
QA objectives                                  Section 4
Sampling procedures                            Section 6
Sample custody                                 Section 8
Calibration procedures                         Sections 5, 6, 7
Analytical procedures                          Section 7
Data reduction, validation, and reporting      Sections 8, 9
Internal QC checks                             Section 5
Performance and system audits                  Sections 5, 6, 7
Preventive maintenance                         Section 6
Corrective action                              Section 5
QA reports to management                       Section 9

(a) Addressing these 15 QA subjects is specified in Stanley and Verner (1983).
Section 2
Revision 0
Date 4/90
DRAFT 1
Page 1 of 3

SECTION 2

PROJECT ORGANIZATION

2.1 MANAGEMENT STRUCTURE

For the Near Coastal Demonstration Project, expertise in specific research and monitoring areas will be provided by several EPA laboratories and their contracting organizations. The Environmental Research Laboratory in Narragansett, Rhode Island (ERL-NARR) has been designated as the principal laboratory for the demonstration project, and will therefore provide oversight and implementation support for all activities for the Demonstration Project. The Environmental Monitoring Systems Laboratory in Cincinnati, Ohio (EMSL-CIN) will provide technical support for quality assurance activities and analysis of chemical contaminants in sediment and tissue samples. The Environmental Monitoring Systems Laboratory in Las Vegas, Nevada (EMSL-LV) will provide quality assurance and logistics support. The Environmental Research Laboratory in Gulf Breeze, Florida (ERL-GB) has been designated as the principal laboratory for the statistical design of the Near Coastal Demonstration Project.

Figure 2-1 illustrates the management structure for the 1990 Virginian Province Near Coastal Demonstration Project. All key personnel involved in the Near Coastal Demonstration Project are listed in Table 2-1.

[Figure 2-1: organization chart showing the EMAP QA Officer, Associate Director Near Coastal, Technical Director Estuaries, QA Coordinator, Synthesis and Integration Group, Demonstration Project Manager, Processing Laboratories, Data Management Support Group, Operations Center Support Staff, and Field Activities Coordinator.]

Figure 2-1. Management structure for the 1990 Virginian Province Demonstration Project (taken from Holland et al., in preparation).

Table 2-1. List of Key Personnel, Affiliations, and Responsibilities within the EMAP Near Coastal Demonstration Project

NAME           ORGANIZATION     RESPONSIBILITY

R. Linthurst   U.S. EPA-DC      EMAP Director
J. Messer      U.S. EPA-RTP     Deputy Director
J. Paul        U.S. EPA-NARR    NC Associate Director
F. Holland     Versar           NC Acting Technical Director
K. Summers     U.S. EPA-GB      NC Design Lead
S. Schimmel    U.S. EPA-NARR    NC Demo Project Lead
R. Valente     SAIC             Project QA Officer
R. Pruell      U.S. EPA-NARR    Analytical Chemistry Support
B. Graves      U.S. EPA-CIN     EMAP QA Coordinator
B. Thomas      U.S. EPA-CIN     Contaminant Analysis Support
D. Heggem      U.S. EPA-LV      QA Support
J. Scott       SAIC             Toxicology/Sampling
C. Strobel     SAIC             Logistics Lead
S. Weisberg    Versar           Technical Support
J. Rosen       CSC              Data Base Management Lead
J. Baker       LESC             Logistics Support
J. Pollard     LESC             QA Support
R. Slagle      LESC             Data Base Management Support
K. Peres       LESC             QA Support
T. Chiang      LESC             QA Support
C. Manen       NOAA             NOAA QA Liaison

Section 3
Revision 0
Date 4/90
DRAFT 1
Page 1 of 2

SECTION 3

PROJECT DESCRIPTION

3.1 PURPOSE

The objectives of the 1990 Near Coastal Demonstration Project are to:

o Obtain estimates of the variability associated with Near Coastal indicators which will allow establishment of program level data quality objectives (DQOs).

o Evaluate the utility, sensitivity, and applicability of the EMAP Near Coastal indicators on a regional scale.

o Determine the effectiveness of the EMAP network design for quantifying the extent and magnitude of pollution problems.

o Demonstrate the usefulness of results for purposes of planning, prioritization, and determining the effectiveness of existing pollutant control actions.

o Develop methods for indicators that can be transferred to other regions and other agencies.

o Identify and resolve logistical issues associated with implementing the network design.

Information gained from the 1990 demonstration project will also be used to refine the overall EMAP design. The demonstration project itself will serve as a model for the implementation of EMAP projects for other ecosystem types and in other regions.
The strategy for accomplishing the above objectives will be to field test the proposed Near Coastal indicators and the network design through the demonstration project in the Virginian Province estuaries. Estuaries were selected as the target ecosystem because their natural circulation patterns concentrate and retain pollutants. Estuaries are spawning and nursery grounds for many species of living resources, and estuarine watersheds receive a great proportion of the pollutants discharged into the waterways of the U.S. The Virginian Province was chosen because: (1) known pollution impacts are particularly severe; (2) unacceptable levels of contaminants are known to occur in the water, sediments, and biota; and (3) the vitality of many living resources is threatened (U.S. EPA, 1989).

Section 4
Revision 0
Date 4/90
DRAFT 1
Page 1 of 12

SECTION 4

QUALITY ASSURANCE OBJECTIVES

4.1 DATA QUALITY OBJECTIVES

To address the project objectives, the conclusions of the project must be based on scientifically sound interpretations of the data base. To achieve this end, and as required by EPA for all monitoring and measurement programs, objectives must be established for data quality based on the proposed uses of the data (Stanley and Verner, 1985). The primary purpose of the quality assurance program is to maximize the probability that the resulting data will meet or exceed the data quality objectives (DQOs) specified for the project. Data quality objectives established for the EMAP Near Coastal project, however, are based on control of the measurement system because error bounds cannot, at present, be established for end use of indicator response data. As a consequence, management decisions balancing the cost of higher quality data against program objectives are not presently possible.
As data are accumulated on indicators and the error rates associated with them are established, end-use DQOs can be established and quality assurance systems implemented to assure acceptable data quality to meet preestablished program objectives.

The data quality objectives presented for accuracy, precision, and completeness (Table 4-1) can be more accurately termed "measurement quality objectives" (MQOs). These objectives are based on the likely magnitude of error generated through the measurement process. The MQOs for the Near Coastal project were established by obtaining estimates of the most likely achievable data quality based on either the instrument manufacturer's specifications or historical data. Scientists familiar with each particular data type provided estimates of likely measurement error for a given measurement process. These MQOs are then used as quality control criteria in both field and laboratory measurement processes to set the bounds of acceptable measurement error.

DQOs or MQOs are usually established for five aspects of data quality: representativeness, completeness, comparability, accuracy, and precision (Stanley and Verner, 1985). In addition, recommended detection limits are established. These terms are defined below with general guidelines for establishing DQOs for each QA parameter.

Table 4-1.
Measurement Quality Objectives for EMAP Near Coastal Indicators and Associated Data

                                     Maximum Allowable  Maximum Allowable
                                     Accuracy (Bias)    Precision          Completeness
Indicator/Data Type                  Goal               Goal               Goal

Sediment contaminant concentration
  Organics                           30%                30%                90%
  Inorganics                         15%                15%                90%
Sediment toxicity                    NA                 NA                 90%
Benthic species composition and biomass
  Sample collection                  NA                 NA                 90%
  Sorting                            10%                NA                 90%
  Counting                           10%                NA                 90%
  Taxonomic identification           10%                NA                 90%
  Biomass                            NA                 10%                90%
Sediment characteristics
  Grain size                         NA                 10% (most          90%
                                                        abundant size
                                                        class)
  Total organic carbon               10%                10%                90%
  Percent water                      NA                 10%                90%
  Acid volatile sulfides             10%                10%                90%
Dissolved oxygen concentration       0.5 mg/L           10%                90%
Salinity                             1 ppt              10%                90%
Depth                                0.5 m              10%                90%
Fluorometry                          NA                 10%                90%
Transmissometry                      NA                 10%                90%
pH                                   0.2 pH units       NA                 90%
Temperature                          0.5 °C             NA                 90%
Contaminants in fish and bivalve tissue
  Organics                           30%                30%                90%
  Inorganics                         15%                15%                90%
Gross pathology of fish              NA                 10%                90%
Fish community composition
  Sample collection                  NA                 NA                 75%
  Counting                           10%                NA                 90%
  Taxonomic identification           10%                NA                 90%
  Length determinations              ± 5 mm             NA                 90%
Relative abundance of large burrowing bivalves
  Sample collection                  NA                 NA                 75%
  Counting                           10%                NA                 90%
  Taxonomic identification           10%                NA                 90%
Histopathology of fish               NA                 NA                 NA
Apparent RPD depth                   ± 5 mm             NA                 90%
Water column toxicity                NA                 40% (Champia)      90%
                                                        50% (Arbacia)

4.2 REPRESENTATIVENESS

Representativeness is defined as "the degree to which the data accurately and precisely represent a characteristic of a population parameter, variation of a property, a process characteristic, or an operational condition" (Stanley and Verner, 1985).
Representativeness applies to the location of sampling or monitoring sites, to the collection of samples or field measurements, to the analysis of those samples, and to the types of samples being used to evaluate various aspects of data quality. The location of sampling sites and the design of the sampling program in the Near Coastal Demonstration Project provide the primary focus for defining representative population estimates for the Virginian Province near coastal estuarine environment. The proposed sampling design combines the strengths of systematic and random sampling with an understanding of estuarine systems to collect data that will provide unbiased estimates of the status of the Nation's estuarine resources. Field protocols are documented in the Near Coastal Field Operations Manual (Strobel et al., in prep.) for future reference and protocol standardization, as are laboratory measurement protocols in the Laboratory Methods Manual (Graves et al., in prep.). The types of QA documentation samples (i.e., performance evaluation material) used to assess the quality of chemical data will be as representative as possible of the natural samples collected during the project with respect to both composition and concentration.

4.3 COMPLETENESS

Completeness is defined as "a measure of the amount of data collected from a measurement process compared to the amount that was expected to be obtained under the conditions of measurement" (Stanley and Verner, 1985). An aspect of completeness that can be expressed for all data types is the amount of valid data (i.e., data not associated with some criterion of potential unacceptability) collected. A criterion ranging from 75 to 90 percent valid data from a given measurement process is suggested as being reasonable for the Near Coastal Demonstration Project. As data are compiled for the various indicators, more realistic criteria for completeness can be developed.
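For illustration only (this sketch is not part of the plan, and the sample counts used are hypothetical), the completeness calculation implied by this definition is straightforward:

```python
# Illustrative sketch of the completeness calculation described above.
# The counts used here are hypothetical, not project data.

def percent_completeness(valid: int, expected: int) -> float:
    """Percent of valid data obtained relative to the amount expected."""
    if expected <= 0:
        raise ValueError("expected count must be positive")
    return 100.0 * valid / expected

# Example: 188 valid benthic samples obtained out of 200 expected.
completeness = percent_completeness(valid=188, expected=200)
print(f"{completeness:.1f}%")  # 94.0% -- satisfies a 90 percent criterion
```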
The suggested criteria for each data type to be collected are presented in Table 4-1.

4.4 COMPARABILITY

Comparability is defined as "the confidence with which one data set can be compared to another" (Stanley and Verner, 1985). Comparability of reporting units and calculations, data base management processes, and interpretive procedures must be assured if the overall goals of EMAP are to be realized. The EMAP Near Coastal Demonstration Project will generate a high level of documentation for the above topics to ensure that future EMAP efforts can be made comparable. For example, both field and laboratory methods are described in full detail in manuals which will be made available to all field personnel and analytical laboratories. Field crews will undergo intensive training in a single three-week session prior to the start of field work. Finally, the sampling design for the Demonstration Project has been made flexible enough to allow for analytical adjustments, when necessary, to ensure data comparability.

4.5 ACCURACY (BIAS), PRECISION, AND TOTAL ERROR

The term "accuracy," which is used synonymously with the term "bias" in this plan, is defined as the difference between a measured value and the true or expected value, and represents an estimate of systematic error or net bias (Kirchner, 1983; Hunt and Wilson, 1986; Taylor, 1987). Precision is defined as the degree of mutual agreement among individual measurements, and represents an estimate of random error (Kirchner, 1983; Hunt and Wilson, 1986; Taylor, 1987). Collectively, accuracy and precision can provide an estimate of the total error or uncertainty associated with an individual measured value. Measurement quality objectives for the various indicators are expressed separately as maximum allowable accuracy (i.e., bias) and precision goals (Table 4-1).
Accuracy and precision goals may not be definable for all parameters due to the nature of the measurement type. For example, accuracy measurements are not possible for toxicity testing, sample collection activities, and fish pathology identifications because "true" or expected values do not exist for these measurement parameters (see Table 4-1).

In order to evaluate the MQOs for accuracy and precision, various QA/QC samples will be collected and analyzed for most data collection activities. Table 4-2 presents the types of samples to be used for quality assurance/quality control for each of the various data acquisition activities except sediment and fish tissue contaminant analyses. The frequency of QA/QC measurements and the types of QA data resulting from these samples or processes are also presented in Table 4-2. Because several different types of QA/QC samples are required for the complex analyses of chemical contaminants in sediment and tissue samples, they are presented and discussed separately in Section 5.1, along with presentation of warning and control limits for the various QC sample types.

Table 4-2. Quality Assurance Sample Types, Frequency of Use, and Types of Data Generated for the EMAP-Near Coastal Demonstration Project (see Table 5-1 for Chemical Contaminant Analysis QA Sample Types)

                    QA Sample Type or        Frequency        Data Generated for
Variable            Measurement Procedure    of Use           Measurement Quality Definition

Sediment            Reference toxicant       Each experiment  Variance of replicated
toxicity tests      tests                                     tests over time
Benthic species composition and biomass:
  Sorting           Resort of complete       10% of each      No. animals resorted
                    sample including         tech's work
                    debris
  Sample counting   Recount and ID of        10% of each      No. of count and ID
  and ID            sorted animals           tech's work      errors
  Biomass           Duplicate weights        10% of samples   Duplicate results
Sed. grain size     Splits of a sample       10% of each      Duplicate results
                                             tech's work
Organic carbon      Sample splits and        10% of samples   Duplicate results
and acid volatile   analysis of standards
sulfide
C. perfringens      Sample splits            10% of samples   Duplicate results
spores
Dissolved oxygen    Side-by-side collection  Once/day (CTD);  Difference between probe
conc.               and measurement by       before and       value and Winkler value
                    Winkler titration        after retrieval
                                             (Hydrolab)
Salinity            Refractometer reading    Once each day    Difference between probe
                                                              and refractometer
Temperature         Thermometer check        Once each day    Difference between probe
                                                              and thermometer
Depth               Check bottom depth       One at each      Replicated difference
                    against depth finder     sampling         from actual
                    on boat                  location
Fluorometry         Duplicate chlorophyll    10% of stations  Difference between
                    samples from surface                      duplicates
                    grab
Transmissometry     Duplicate suspended      10% of stations  Difference between
                    solids samples from                       duplicates
                    surface grab
pH                  QC check with buffer     Once each day    Difference from standard
                    solution standard
Fish community      Duplicate counts         10% of trawls    Replicated difference
composition                                                   between determinations
Fish gross          Field audits             Regular          Number of
pathology                                    intervals or     misidentifications
                                             as needed
Fish                NA                       NA               NA
histopathology
Abundance of        Random recount and       10% of           Duplicate results
large bivalves      identification           collection
Apparent RPD        Duplicate measurements   10% of samples   Duplicate results
depth
Water column        Reference toxicant       Each experiment  Variance of replicated
toxicity tests      tests                                     tests over time

Section 5
Revision 0
Date 4/90
DRAFT 1
Page 1 of 50

SECTION 5

QUALITY ASSURANCE/QUALITY CONTROL PROTOCOLS, CRITERIA, AND CORRECTIVE ACTION

Complete and detailed protocols for field and laboratory measurements can be found in Strobel et al. (in preparation) and Graves et al. (in preparation), respectively. Critical features of the QA/QC procedures to be followed during the EMAP-NC Demonstration Project are presented in the following sections.

5.1 CHEMICAL ANALYSIS OF SEDIMENT AND TISSUE SAMPLES

For analysis of the parts-per-billion levels of organic and inorganic contaminants in estuarine sediments and tissue (fish and bivalve), no procedure has been officially approved by the regulatory agencies. The recommended analytical methods for the purposes of this project are those prescribed by NOAA (MacLeod et al., 1985; Krahn et al., 1988), as well as those used in the Puget Sound Estuary Program (TetraTech, 1986a and 1986b). These procedures have been recommended both for the National Status and Trends Program and for the Puget Sound Estuary Program conducted by multiple agencies, including EPA and NOAA. These programs do not specifically require that particular analytical methods always be followed, but rather that participating laboratories demonstrate
The EMAP-Near Coastal laboratories will be required to initiate corrective actions if their performance falls below certain pre-determined minimal standards, described in later sections.

As discussed earlier, the data quality objectives for this project were developed with the understanding that the data will not be used for litigation purposes. Therefore, some of the requirements set by the EPA Contract Laboratory Program for legal and contracting purposes need not be applied to EMAP. In addition, it is the philosophy of this project that as long as proper QA/QC requirements are implemented and comparable analytical performance on standard materials is demonstrated, the different procedures used by different laboratories for the analysis of the various compound classes should yield comparable results. Based on this assumption, the QA/QC requirements for the analysis of contaminants in sediments and tissue will place special emphasis on a performance-based program, involving continuous laboratory evaluation through the use of accuracy-based materials (e.g., certified standard reference materials and laboratory control materials), laboratory fortified sample matrices, laboratory reagent blanks, calibration standards, and laboratory and field replicates. The conceptual basis for the use of these quality control samples is presented below.

5.1.1 General QA/QC Requirements

The guidance provided in the following sections is based largely on the protocol developed for the Puget Sound Estuary Program (TetraTech, 1986a and 1986b); it is applicable to low parts-per-billion analyses of both sediment and tissue samples unless otherwise noted. QA/QC requirements are the foundation of this protocol because they provide information necessary to assess the comparability of data generated by different laboratories and different analytical procedures.
It should be noted that the QA/QC requirements specified in this plan represent the minimum requirements for any given analytical method. Additional requirements which are method-specific should always be followed, as long as the minimum requirements presented in this document have been met.

Data for all QA/QC variables must be submitted by the laboratory as part of the data package. Program managers and project coordinators must verify that requested QA/QC data are included in the data package as supporting information for the summary data. A detailed QA/QC review of the entire data package (especially original quantification reports and standard calibration data) will be conducted by QA personnel at the ERL-NARR. The QA/QC data will be used initially to document the accuracy and precision of individual measurement processes, and ultimately to assess comparability among different laboratories.

The analysis results for the various QA/QC samples should be used directly by the analytical laboratory to determine when warning and control limits have been exceeded and corrective actions must be taken. Warning limits are numerical criteria that serve as flags to data reviewers and data users. When a warning limit is exceeded, the laboratory is not obligated to halt analyses, but the reported data may be qualified during subsequent QA/QC review. Control limits are numerical criteria that, when exceeded, require specific corrective action by the laboratory before the analyses may proceed. Warning and control limits and the recommended frequency of analysis for each QA/QC element or sample type required in the EMAP-Near Coastal Demonstration Project are summarized in Table 5-1.
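For illustration only, the distinction between warning and control limits described above can be sketched as a short routine; the function name and example values below are hypothetical, and the actual limits are those of Table 5-1:

```python
def evaluate_qc(measured, true_value, warning_pct, control_pct):
    """Classify a QC-sample result against percent-of-true-value limits.

    Exceeding the warning window only qualifies (flags) the reported
    data; exceeding the control window requires corrective action
    before analyses may proceed.
    """
    deviation = abs(measured - true_value) / true_value * 100.0
    if deviation > control_pct:
        return "halt: corrective action required"
    if deviation > warning_pct:
        return "continue: qualify (flag) reported data"
    return "in control"

# Laboratory control material, organic analyses:
# warning limit 80%-120% (i.e., 20%), control limit 70%-130% (i.e., 30%)
print(evaluate_qc(measured=88.0, true_value=100.0, warning_pct=20, control_pct=30))
```

A result of 88% of the true value falls inside the warning window, so analyses continue without qualification; a result of 65% would exceed the control limit and halt the run.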
Descriptions of the use, frequency of analysis, type of information obtained, and corrective actions for each of these QA/QC sample types or elements are provided in the following sections.

5.1.2 Initial Calibration

Equipment must be calibrated before any samples are analyzed, after each major equipment disruption, and whenever on-going calibration checks do not meet recommended control limit criteria (Table 5-1). Summary data documenting initial calibration, any events requiring recalibration, and the corresponding recalibration data must be included with the analytical results. All standards used for initial calibration must be obtained from a single source and should be traceable to a recognized organization for the preparation of QA/QC materials (e.g., National Institute of Standards and Technology, Environmental Protection Agency, etc.). Calibration curves must be established for each element and batch analysis from a calibration blank and a minimum of three analytical standards of increasing concentration, covering the range of expected sample concentrations. The calibration curve must be established prior to the analysis of samples. Data will be quantified only within the demonstrated working calibration range.

Table 5-1. Key Elements for Quality Control of Chemical Analyses During the EMAP-Near Coastal Demonstration Project.

Element or Sample Type                Recommended       Recommended        Recommended
                                      Warning Limit     Control Limit      Frequency

1.) Initial Demonstration of Capability (Prior to Analysis of Samples):

    - Instrument Calibration          NA                NA                 Initial
    - Documentation of
      Detection Limits                NA                NA                 Per analyte for
                                                                           each matrix
    - Blind Analysis of
      Reference Material              NA                NA                 Initial

2.) On-going Demonstration of Capability:

    - Blind Analysis of Reference
      Material (Interlaboratory
      Calibration Exercise)           NA                NA                 Three times per year
    - Analysis of Laboratory
      Control Material:
        organic analyses              80%-120%a         70%-130%           One per batch or one
        inorganic analyses            90%-110%          85%-115%           every 15 samples
    - Analysis of Standard
      Reference Material              same as above     same as above      Four times per year

3.) Calibration Check using           NA                15% of initial     Beginning and end
    Calibration Standard                                calibration on     of batch
                                                        average for all
                                                        analytes, 25% on
                                                        average/analyte

4.) Laboratory Reagent Blank          NA                less than          One per batch
                                                        detection limit

5.) Laboratory Fortified              50%b              not specified      One per batch or one
    Sample Matrix                                                          every 10 samples

6.) Laboratory Duplicate              NA                ±30% (RPD)c        One per batch

7.) Field Duplicates                  NA                NA                 10% of total no.
    (Field Splits)                                                         of samples

8.) Internal Standards                Lab develops its own                 Each sample
    (Surrogate Analytes)

9.) Injection Internal Standards      Lab develops its own                 Each sample

a Percent of true value
b Percent recovery
c RPD = Relative percent difference

Table 5-2. Recommended Detection Limits for EMAP-Near Coastal Chemical Analyses

Analyte                       Tissue         Sediments

Inorganics (concentrations in ppm, dry weight)

Al                            10.0           1500
Si                            _a             10000
Cr                            0.1            5.0
Mn                            _a             1.0
Fe                            50.0           500.0
Ni                            0.5            1.0
Cu                            5.0            5.0
Zn                            50.0           2.0
As                            2.0            1.5
Se                            1.0            0.1
Ag                            0.01           0.01
Cd                            0.2            0.05
Sn                            0.05           0.1
Sb                            _a             0.2
Hg                            0.01           0.01
Pb                            0.1            1.0

Organics (concentrations in ppb, dry weight)

PAH's                         _a             5.0
PCB congeners                 1.0            0.1
DDD, DDE, and DDT species     1.0            0.1

a Not measured in tissue.
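The initial-calibration requirement of Section 5.1.2 (a calibration blank plus at least three standards of increasing concentration, with quantification only within the demonstrated working range) can be sketched as follows. This is an illustration only; the linear least-squares fit and all numeric values are assumptions for the example, not procedures prescribed by this plan:

```python
def build_calibration(concs, responses):
    """Fit a linear calibration curve and return a quantification
    function restricted to the demonstrated working range."""
    if len(concs) < 4:  # a calibration blank plus at least three standards
        raise ValueError("need a blank and at least three standards")
    n = len(concs)
    mean_x = sum(concs) / n
    mean_y = sum(responses) / n
    sxx = sum((x - mean_x) ** 2 for x in concs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(concs, responses))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    lo, hi = min(concs), max(concs)

    def quantify(response):
        conc = (response - intercept) / slope
        if not (lo <= conc <= hi):  # outside the working calibration range
            raise ValueError("response outside demonstrated working range")
        return conc

    return quantify

# Blank plus three standards (hypothetical concentrations and responses):
quantify = build_calibration([0.0, 10.0, 50.0, 100.0], [0.2, 10.2, 50.2, 100.2])
print(round(quantify(25.2), 3))
```

A response falling above the highest standard raises an error rather than being extrapolated, mirroring the requirement that data be quantified only within the calibrated range.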
5.1.4 Initial Blind Analysis of Reference Material

A representative sample matrix which is homogeneous and contains known concentrations of the analytes of interest typically is used as a reference material to evaluate the performance of each analytical laboratory prior to the analysis of samples. In some instances, the material analyzed will be a standard reference material (SRM) which has been certified by a recognized authority (e.g., NIST, EPA, or the National Research Council of Canada (NRCC)). However, other materials may be distributed by the NOAA National Status and Trends Program for the initial demonstration of laboratory capability, provided the material is a representative matrix which is uncompromised, readily available, and contains the analytes of interest at the concentrations of interest.

The initial analysis of whatever reference material is provided must be blind (i.e., the laboratory must not know the concentrations of the analytes of interest). The control limit for this analysis generally will be ±15% of the actual value of each analyte or measurement parameter. If any of the values resulting from the initial analysis are outside the control limit, the laboratory will be required to repeat the analysis until the control limit is met, prior to the analysis of real samples.

5.1.5 Blind Analysis of Reference Material: Laboratory Intercomparison Exercise

The NOAA National Status and Trends Program conducts an intercomparison exercise three times a year to evaluate both the individual and collective performance of its participating analytical laboratories. Each laboratory in the EMAP-NC program will participate in these intercomparison exercises as a continuing check on performance and intercomparability.
Each intercomparison exercise involves the blind analysis of a reference material, similar to what has been described for the initial demonstration of laboratory capability. Laboratories which fail to achieve acceptable performance in any intercomparison exercise must provide an explanation and may be required to undertake corrective actions, as appropriate.

5.1.6 Analysis of SRM's and Laboratory Control Materials

Standard reference materials generally are considered one of the most useful QC samples for assessing the accuracy of a given analysis (i.e., the closeness of a measurement to the true value). The SRM concentrations of the target analytes should be known to the analyst. If the values are outside the control limits (Table 5-1), the SRM should be reanalyzed to confirm the results. If the values are still outside the control limits in the repeat analysis, the laboratory is required to determine the source(s) of the problem and repeat the analysis until control limits are met, before continuing with sample analyses.

A laboratory control material is like an SRM in that it is a matrix similar to the sample matrices being analyzed, and the concentrations of certain analytes of interest in the matrix are known with reasonable accuracy (i.e., as a result of a statistically valid number of replicate analyses by one or several laboratories). In practice, this material is not certified, but is kept in-house by a single laboratory for use as an "internal SRM." A laboratory control material should be analyzed along with each batch of samples. An SRM should be analyzed at the frequency specified in Table 5-1, to provide a further check on both accuracy and precision. In situations where certified SRM's cannot be run at the stated frequency because they are unavailable or prohibitively expensive, a laboratory control material may be used exclusively.
Analysis results for laboratory control materials should be reported along with the results for each sample batch, and also plotted on control charts maintained in the laboratory. Quarterly SRM results must also be reported. Warning and control limits and corrective actions for laboratory control materials and SRM's are provided in Table 5-1.

5.1.7 Calibration Check

The initial instrument calibration is checked through the analysis of a calibration standard. The calibration standard solution used for the calibration check should be obtained from a different source than the initial calibration standards, so that it provides an independent check on both the calibration and the accuracy of the standard solutions. Analysis of the calibration standard should occur at the beginning of a sample set, once every 10 samples or every two hours during a run, and after the last analytical sample.

If the control limit for analysis of the calibration standard (Table 5-1) is not met, the initial calibration will have to be repeated. If possible, the samples analyzed before the calibration check that failed the control limit criteria should be reanalyzed following the recalibration. The laboratory should begin by reanalyzing the last sample analyzed before the calibration standard which failed. If the relative percent difference (RPD) between the results of this reanalysis and the original analysis exceeds 30 percent, the instrument is assumed to have been out of control during the original analysis. If possible, reanalysis of samples should progress in reverse order until the RPD between the initial and reanalysis results is less than 30 percent. If it is not possible or feasible to reanalyze samples, all earlier data (i.e., since the last successful calibration control check) should be flagged.
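Numerically, the reanalysis rule above reduces to a relative-percent-difference comparison. The following sketch (hypothetical values; not part of the prescribed protocol) walks backward through original/reanalysis result pairs until agreement within 30 RPD is found:

```python
def rpd(a, b):
    """Relative percent difference: absolute difference as a percentage
    of the mean of the two results."""
    return abs(a - b) / ((a + b) / 2.0) * 100.0

def samples_out_of_control(pairs):
    """pairs: (original, reanalysis) result pairs, most recently
    analyzed first.  Counts results presumed out of control, stopping
    at the first pair that agrees within 30 RPD."""
    n = 0
    for original, reanalysis in pairs:
        if rpd(original, reanalysis) < 30.0:
            break  # instrument judged in control from this sample back
        n += 1
    return n

# Most recent sample disagrees badly (RPD = 40); the one before agrees.
print(samples_out_of_control([(100.0, 150.0), (80.0, 82.0)]))
```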
5.1.8 Laboratory Reagent Blank

Laboratory reagent blanks (commonly called method blanks) are used to assess laboratory contamination during all stages of sample preparation and analysis. For both organic and inorganic analyses, one reagent blank should be run in every sample batch or for every 12-hour shift, whichever is more frequent. Control limits for blanks will be based on the recommended detection limits presented in Table 5-2. As indicated earlier, these limits are based on empirical results and will be refined as the method detection limits are developed.

5.1.9 Internal Standards

Internal standards (commonly referred to as surrogate spikes or surrogate analytes) are compounds chosen to simulate the analytes of interest. The internal standard represents a reference against which the signal from the analytes of interest is compared directly for the purpose of quantification. Internal standards must be added to each sample, including QA/QC samples, prior to extraction, purging, or digestion. The reported concentration of each analyte should be adjusted to correct for the recovery of the internal standard, as is done in the NOAA National Status and Trends Program. The internal standard recovery data therefore should be carefully monitored; each laboratory must report the absolute amounts and the percent recovery of the internal standards along with the target analyte data for each sample. If possible, isotopically labeled analogs of the analytes should be used as internal standards.

Recommended control limits for internal standard recoveries are not specified for the EMAP-NC Demonstration Project. Instead, each laboratory must set its own warning and control limits based on the experience and best professional judgement of the analyst.
It is the responsibility of the analyst to demonstrate that the analytical process is always "in control" (i.e., highly variable recoveries are not acceptable).

5.1.10 Injection Internal Standards

For GC analysis, injection internal standards are added to each sample just prior to injection to enable optimal quantification, particularly of complex extracts subject to retention time shifts relative to the analysis of standards. Injection internal standards are essential if the actual recovery of the internal standards added prior to extraction is to be calculated. The injection internal standards also can be used to detect and correct for problems in the GC injection port or other parts of the instrument. The compounds used as injection internal standards must be different from those already used as internal standards. The analyst must monitor injection internal standard retention times and recoveries to determine whether instrument maintenance or repair, or changes in analytical procedures, are indicated. Corrective action must be initiated based on the experience of the analyst and not because warning or control limits were exceeded. Instrument problems that may have affected the data or resulted in the reanalysis of the sample must be documented in the analyst's logbook and on the raw data report. Justification for reanalysis must be submitted with the data package.

5.1.11 Laboratory Fortified Sample Matrix

A laboratory fortified sample matrix (commonly called a matrix spike) will be used to evaluate the effect of the sample matrix on the recovery of the compound(s) of interest. This type of sample should be analyzed with every sample batch, or once every ten samples, as appropriate. The compounds used to fortify samples should include a wide range of representative analyte types.
These compounds should be added at 1 to 5 times the concentration of the compounds in the sample. The recovery data for each fortified compound, which must be reported along with the rest of the data for each sample, ultimately should provide a statistical basis for determining the prevalence of matrix effects in the sediment samples analyzed during the Demonstration Project.

If the percent recovery for any analyte is less than the recommended warning limit of 50 percent, the chromatograms and raw data quantitation reports should be reviewed. If an explanation for a low percent recovery value is not discovered, the instrument response may be checked using a calibration standard. Low matrix spike recoveries may be a result of matrix interferences, and further instrument response checks may not be warranted, especially if the other laboratory QC samples indicate that the analysis for that batch of samples was in control. An explanation of low percent recovery values for matrix spike results should be provided in a cover letter accompanying the data package. Corrective actions taken and verification of acceptable instrument response must be included.

5.1.12 Laboratory Duplicates

One sample per batch should be split in the laboratory and analyzed in duplicate to provide an estimate of analytical precision. Duplicate analyses also are useful in assessing potential sample heterogeneity and matrix effects. If results fall outside the control limit (Table 5-1), a replicate analysis is required to confirm the problem before the data are reported. If results continue to exceed the control limit, subsequent corrective action is at the discretion of the program manager or QA officer, because matrix effects or laboratory error may be contributing factors.
A discussion of the results of duplicate sample analysis should include probable sources of laboratory error, evidence of matrix effects, and an assessment of natural sample variability. Data outside the control limit may be flagged pending QA review of the probable laboratory or field sources of variation.

5.1.13 Field Duplicates and Field Splits

For the EMAP-NC Demonstration Project, sediment will be collected at each station using a grab sampler. Each time the sampler is retrieved, the top 2 cm of sediment will be scraped off, placed in a large mixing container, and homogenized, until a sufficient amount of material has been obtained. At 10% of the stations, the homogenized material will be placed in four separate sample containers for subsequent chemical analysis. Two of the sample containers will be submitted blind (i.e., unknown) to the primary analytical laboratory; these two samples are called field duplicates. The other two containers, also a pair of field duplicates, will be sent to a second, reference laboratory. Together, the two pairs of duplicates are called field splits. Analysis of the field duplicates within each laboratory will provide an assessment of intralaboratory precision; analysis of the field splits across the two laboratories will provide an assessment of interlaboratory precision, as well as an assessment of the efficacy of the field homogenization technique. If the recommended control limit for analysis of these samples is not met, the QA officer must initiate action to determine whether the source of the error is field or laboratory based, so that appropriate corrective actions can be taken.
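The precision information obtainable from the field-split design can be illustrated with hypothetical results (all numbers below are invented for the example; RPD is the absolute difference as a percentage of the pair mean):

```python
def rpd(a, b):
    """Relative percent difference of a duplicate pair."""
    return abs(a - b) / ((a + b) / 2.0) * 100.0

# Hypothetical field-split results (ppb) for one analyte at one station:
primary_lab = (10.0, 11.0)     # field duplicates sent blind to the primary lab
reference_lab = (12.0, 12.5)   # field duplicates sent to the reference lab

# Intralaboratory precision: RPD within each laboratory's duplicate pair
intra_primary = rpd(*primary_lab)
intra_reference = rpd(*reference_lab)

# Interlaboratory precision: RPD between the two laboratory means
inter = rpd(sum(primary_lab) / 2, sum(reference_lab) / 2)

print(round(intra_primary, 1), round(intra_reference, 1), round(inter, 1))
```

A large interlaboratory RPD coupled with small intralaboratory RPDs would point toward a laboratory-based source of error, whereas large differences within both duplicate pairs would implicate the field homogenization step.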
5.2 OTHER SEDIMENT MEASUREMENTS

5.2.1 Total organic carbon and acid volatile sulfide

Quality control for the measurement of total organic carbon and acid volatile sulfide in sediment samples is accomplished by strict adherence to protocol, as well as through analysis of QA/QC samples. If levels of precision or accuracy do not fall within the MQO windows (see Table 4-1), the measurements should be stopped and the system corrected before continuing the analyses.

For both parameters, precision will be determined by duplicate analysis of a single, homogenized sample. Minimally, one set of duplicate analyses should be performed each day or for every ten samples, whichever is applicable. The relative percent difference (RPD) between the two duplicate measurements should be less than 10.

For the measurement of total organic carbon, accuracy will be determined by analysis of a NIST-traceable standard reference material; at least one standard should be analyzed every 10 samples. The RPD between the laboratory value and the standard value should be less than 10. In addition, a method blank should be analyzed with each batch of samples. If the induction furnace does not appear to be operating properly, the manufacturer's instructions for troubleshooting and repair should be followed.

Total organic carbon should be reported as a percentage of the dry weight of the unacidified sediment sample, to the nearest 0.1 unit. Results should be reported for all determinations, including QA duplicates, standards, and method blanks. Any factors that may have influenced sample quality should also be reported.

A standard reference material does not exist for the measurement of acid volatile sulfide in marine sediments. For each batch of samples, accuracy of the method should be determined by analyzing a sodium sulfide crystal of known weight.
The crystal should be carried through the entire analytical process, with the results agreeing within ±10% of those expected based on the amount of sulfide in the crystal. If this accuracy goal is not met, the samples in that batch should be re-analyzed, if possible, or the data flagged. Results of the analysis of the sodium sulfide "standard" must be included with the data package, and any failure to meet the recommended accuracy goal should be explained.

5.2.2 Clostridium perfringens spore concentrations

Sediment levels of spores of Clostridium perfringens will be measured as an indication of sewage loading (Bisson and Cabelli, 1980) at stations occupied during the Demonstration Project. Every tenth sample will be homogenized and split in the laboratory for duplicate analysis; the results should agree within 10%. Failure to achieve this level of precision will result in a review of the possible causes of variability and appropriate corrective actions. Ten percent of the samples also will be selected for colony verification; at least five colonies from each plate should be verified.

As a final QC check, the acceptability of the test medium used for each batch of samples will be determined by first preparing a fresh batch of non-inhibitory control medium. Two equal volumes of a solution containing C. perfringens spores should be passed through individual membrane filters, and the filters placed on the test medium and the control medium, respectively. If the test medium recovers at least 90% of what the control medium recovers (in terms of colony formation), the test medium will be considered acceptable.

5.2.3 Sediment grain size

Quality control of sediment grain size analysis is accomplished by strict adherence to protocol and documentation of quality control checks. Several procedures are critical to the collection of high quality particle size data. Most important to the dry sieve analysis is that the screens are clean before conducting the analysis, and that all of the sample is retrieved from them. To clean a screen, it should be inverted and tapped on a table, while
Most important to the dry sieve analysis is that the screens are clean before conducting the analysis, and that all of the sample is retrieved from them. To clean a screen, it should be inverted and tapped on a table, while ------- Section 5 Revision 0 Date 4/90 DRAFT 1 Page 23 of 50 making sure that the rim hits the table evenly. Further cleaning of brass screens may be performed by gentle scrubbing with a stiff bristle nylon brush. Stainless steel screens may be cleaned with a nylon or brass brush. The most critical aspect of the pipet analysis is knowledge of the temperature of the silt-clay suspension. An increase of only 1 °C will increase the settling velocity of a particle 50 /im in diameter by 2.3 percent. It is generally recommended that the pipet analysis be conducted at a constant temperature of 20 °C. However, Plumb (1981) provides a table to correct for settling velocities at other temperatures; this table is included in the EMAP-NC Laboratory Methods Manual (Graves et al., in prep.). Thorough mixing of the silt-clay suspension at the beginning of the analysis is also critical. A perforated, Plexiglas disc plunger is very effective for this purpose. If the mass of sediment used for pipet analysis exceeds 25 g, a subsample should be taken as described by Plumb (1981). Silt-clay samples in excess of 25 g may give erroneous results because of electrostatic interactions between the particles. Silt-clay samples less than 5 g yield a large experimental error in weighing relative to the total sample weight. ------- Section 5 Revision 0 Date 4/90 DRAFT 1 Page 24 of 50 The analytical balance, drying oven, sieve shaker, and temperature bath used in the analysis should be calibrated at least monthly. Quality assurance for the sediment analysis procedures will be accomplished primarily by reanalyzing a randomly selected subset of samples from each batch. 
A batch of samples is defined as a set of samples of a single textural classification (e.g., silt/clay, sand, gravel) processed by a single technician using a single procedure. Approximately 10% of each batch completed by the same technician will be reanalyzed. If the difference between the original value and the second value is greater than 10% (in terms of the percent of the most abundant sediment size class), the second value will be flagged and added to the database. In addition, all the other samples in the same batch must be re-analyzed, and the laboratory protocol and/or the technician's practices should be reviewed and corrected to bring the measurement error under control. If the percent of the most abundant sediment size class in the original sample and the re-analyzed sample differs by less than 10, the original value will not be changed and the sediment analysis process will be considered in control.

5.3 TOXICITY TESTING OF SEDIMENT AND WATER SAMPLES

Standard water column toxicity tests will be conducted in the Demonstration Project to evaluate their utility for regional-scale assessments of environmental conditions. Three short-term methods will be used to estimate the chronic toxicity of water collected at various stations: the sea urchin (Arbacia punctulata) fertilization test, the red algal (Champia parvula) sexual reproduction test, and the bivalve (Mulinia lateralis) fertilization and larval growth test. The toxicity of sediments collected in the field will be determined as an integral part of the benthic indicator suite, using 10-day acute toxicity tests with either the freshwater amphipod Hyalella azteca or the marine amphipod Ampelisca abdita. Complete descriptions of the methods employed for the water column and sediment toxicity tests are provided in the Laboratory Methods Manual (Graves et al., in preparation).
Quality assurance/quality control procedures for water column and sediment toxicity tests involve: (1) sample handling and storage; (2) the source and condition of the test organisms; (3) condition of facilities and equipment; (4) test conditions; (5) instrument calibration; (6) replication; (7) use of reference toxicants; (8) record keeping; and (9) data evaluation. These procedures are described in the following sections.

5.3.1 Sample Handling and Storage

Techniques for sample collection, handling, and storage are described in the field methods manual (Strobel et al., in preparation). Both water and sediment samples for toxicity testing should be chilled to 4°C when collected, shipped on ice, and stored in the dark in a refrigerator at 4°C until used. Water column toxicity tests should begin within 48 hours of sample collection. Sediment for toxicity testing should be stored for no longer than two weeks before the initiation of the test, and should not be frozen or allowed to dry.

Sample containers should be made of chemically inert materials to prevent contamination, which might result in artificial changes in toxicity (see Strobel et al., in preparation). To avoid contamination during collection, all sampling devices and any other instruments in contact with water or sediments should be cleaned with water and a solvent rinse between stations (see Strobel et al., in preparation). Contact of the samples with metals, including stainless steel, and with plastics (including polypropylene and low-density polyethylene) should be avoided, as contaminant interactions may occur. Only sediments not in contact with the sides of the sampling device should be subsampled, composited, and subsequently homogenized using Teflon instruments and containers.
The adequacy of the field homogenization technique for sediments will be documented in a special study prior to the start of field work.

5.3.2 Quality of Test Organisms

All organisms used in the tests should be disease-free and should be positively identified to species. If organisms are collected from the field prior to testing, they should be obtained from an area known to be free of toxicants and should be held in clean, uncontaminated water and facilities. Organisms held prior to testing should be checked daily, and individuals which appear unhealthy or dead should be discarded. If greater than 5 percent of the organisms in holding containers are dead or appear unhealthy during the 48 hours preceding a test, the entire group should be discarded and not used in the test.

Whenever test organisms are obtained from an outside source (e.g., field collected or obtained from an outside culture facility), their sensitivity must be evaluated with a reference toxicant in an appropriate short-term toxicity test performed concurrently with the water column or sediment toxicity tests. For the sediment tests using amphipods, a 96-hour toxicity test without sediment will be used to test sensitivity by generating LC-50 values. If the laboratory maintains breeding cultures of test organisms, the sensitivity of the offspring should be determined in a toxicity test performed with a reference toxicant at least once a month. If preferred, this test also may be performed concurrently with the water column or sediment toxicity tests.

5.3.3 Facilities and Equipment

Laboratory and bioassay temperature control equipment must be adequate to maintain recommended test temperatures. Recommended materials must be used in the fabrication of the test equipment in contact with the water or sediment being tested, as specified in the laboratory methods manual (Graves et al., in preparation).
The acceptability of new holding or testing facilities should be demonstrated by conducting "non-toxicant" tests in which test chambers contain control sediment and clean seawater or dilution water, as appropriate for a given method. Such tests may be performed concurrent with, and serve as controls for, the reference toxicant tests used to assess single laboratory precision. These tests will demonstrate whether facilities, water, control sediment, and handling techniques are adequate to produce acceptable control-level survival.

5.3.4 Test Conditions

Parameters such as water temperature, salinity (conductivity), dissolved oxygen, alkalinity, water hardness, and pH should be checked as required for each test and maintained within the specified limits (Graves et al., in preparation). Instruments used for routine measurements must be calibrated and standardized according to the instrument manufacturers' procedures (see EPA Methods 150.1, 360.1, 170.1, and 120.1; U.S. EPA, 1979a). All routine chemical and physical analyses must include established quality assurance practices as outlined in Agency methods manuals (U.S. EPA, 1979a,b). The wet chemical method used to measure alkalinity must be standardized according to the procedure in the specific EPA method (see EPA Method 130.2, U.S. EPA, 1979a).

Overlying water or dilution water for the tests described here must meet the requirements for uniform quality specified for each method (Graves et al., in preparation). The minimum requirement for acceptable dilution or overlying water is that it allows acceptable control survival without signs of organism disease or apparent stress (i.e., unusual behavior or changes in appearance).
The dilution water used in the water column toxicity tests and the overlying water used in the sediment toxicity tests with Ampelisca may be natural seawater, hypersaline brine (100 o/oo) prepared from natural seawater, or artificial seawater prepared from sea salts if recommended in the method. If natural seawater is used, it should be obtained from an uncontaminated area known to support a healthy, reproducing population of the test organism or a comparably sensitive species. Hypersaline brine prepared from uncontaminated, natural seawater also may be used to raise the salinity of fresh or intermediate-salinity water samples to the appropriate levels for water column toxicity testing. Distilled or deionized water from a properly operated unit may be used to lower test water salinity. Whatever dilution water ultimately is used should be appropriate to the objectives of the study and the logistical constraints.

Fresh overlying water used in the sediment tests with Hyalella may be reconstituted water prepared by adding specified amounts of reagent-grade chemicals to high-quality distilled or deionized water, or natural water obtained from an uncontaminated well, spring, or surface source. Sea salt or hypersaline brine prepared from uncontaminated, natural seawater may be used to raise the salinity of this water, as appropriate to the study design.

5.3.5 Test Acceptability

Survival of organisms in control treatments should be assessed during each test as an indication of both the validity of the test and the overall health of the test organism population. The results of the sea urchin test using Arbacia punctulata are acceptable if control egg fertilization equals or exceeds 70 percent. However, greater than 90 percent fertilization may result in masking of toxic responses.
The macroalga test using Champia parvula is acceptable if survival is 100 percent and the mean number of cystocarps per plant in the controls equals or exceeds 10. The bivalve larvae test using Mulinia lateralis is acceptable if greater than 60 percent of the embryos in the control treatments result in live larvae with completely developed shells at the end of the test. The amphipod tests with Ampelisca abdita or Hyalella azteca are acceptable if mean control survival is greater than or equal to 90 percent and if survival in individual control test chambers exceeds 80 percent. Additional guidelines for acceptability of the individual water and sediment toxicity tests are presented in the laboratory methods manual (Graves et al., in preparation). An individual test may be conditionally acceptable if temperature, dissolved oxygen (DO), or other specified conditions fall outside specifications, depending on the degree of the departure and the objectives of the tests. Any deviations from test specifications must be noted and reported to the QA Officer along with the data so that a determination of test acceptability can be made.

5.3.6 Precision

The ability of the laboratory personnel to obtain consistent, precise results will be demonstrated with reference toxicants before attempts are made to measure the toxicity of actual samples. The single-laboratory precision of each type of test used in the laboratory should be determined by performing at least five preliminary tests with a reference toxicant. For the amphipod tests, short-term (i.e., 96-hour) reference toxicant tests without sediments will be used for this purpose. The trimmed Spearman-Karber method (Hamilton et al., 1977) or the monotonic regression analysis developed by DeGraeve et al. (1988) can be used to determine an LC-50 or IC-50 value for each 96-hour reference toxicant test.
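The calculations just described can be sketched in Python. The first function below is a simplified, untrimmed Spearman-Karber estimate of the LC-50 (the plan cites the trimmed method of Hamilton et al., 1977; the zero-trim case is shown for brevity). The second summarizes replicate LC-50s by their coefficient of variation, the precision measure used in this plan. Both functions and the example data are illustrative assumptions, not implementations prescribed by the plan.

```python
import math
import statistics

def spearman_karber_lc50(concs, mortality):
    """Untrimmed Spearman-Karber LC-50 estimate.

    concs: test concentrations in increasing order.
    mortality: proportion dead at each concentration; the untrimmed
    estimator requires 0.0 at the lowest and 1.0 at the highest
    concentration, with non-decreasing proportions in between.
    """
    if mortality[0] != 0.0 or mortality[-1] != 1.0:
        raise ValueError("untrimmed estimator needs a full 0-to-1 response")
    if any(b < a for a, b in zip(mortality, mortality[1:])):
        raise ValueError("mortality must be non-decreasing (smooth first)")
    logs = [math.log10(c) for c in concs]
    # trapezoidal accumulation of log-concentration weighted by response step
    log_lc50 = sum((mortality[i + 1] - mortality[i]) * (logs[i] + logs[i + 1]) / 2.0
                   for i in range(len(concs) - 1))
    return 10 ** log_lc50

def single_lab_precision(lc50s, max_cv_percent):
    """Mean, standard deviation, and CV of replicate reference-toxicant
    LC-50s, checked against an acceptance limit on the CV."""
    mean = statistics.mean(lc50s)
    sd = statistics.stdev(lc50s)
    cv = 100.0 * sd / mean
    return {"mean": mean, "sd": sd, "cv": cv, "acceptable": cv <= max_cv_percent}

# Hypothetical 96-hour test: doubling concentrations, full dose response.
lc50 = spearman_karber_lc50([1.0, 2.0, 4.0, 8.0], [0.0, 0.25, 0.75, 1.0])

# Five hypothetical replicate LC-50s screened against a 40 percent CV limit.
precision = single_lab_precision([2.1, 2.4, 1.9, 2.6, 2.2], 40.0)
```

For a symmetric dose-response such as the example above, the estimate falls at the geometric center of the concentration series, which is the expected behavior of the method.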
Precision then can be described by the mean, standard deviation, and percent relative standard deviation (coefficient of variation, or CV) of the LC-50 or IC-50 values from the five (or more) replicate reference toxicant tests. Based on data reported by Morrison et al. (1989), a CV of 40 percent or less for the Champia test and a CV of 50 percent or less for the Arbacia test will be considered acceptable for demonstrating single-laboratory precision prior to testing of actual samples. If the laboratory fails to achieve these precision levels in the five preliminary reference toxicant tests, the test procedure should be examined for defects and the appropriate corrective actions taken. The tests will then be repeated until acceptable precision is demonstrated. Throughout the testing period, precision will be assessed continually through the use of control charts. Single-laboratory precision for the Mulinia lateralis larvae test and the amphipod tests using Ampelisca and Hyalella has not been previously determined, but will be assessed prior to and during the conduct of the Near Coastal Demonstration Project to establish acceptable precision levels for the future.

5.3.7 Control Charts

A control chart should be prepared for each reference toxicant-organism combination, and successive toxicity values should be plotted and examined to determine whether the results are within prescribed limits (see example in Figure 9-1). In this technique, a running plot is maintained for the toxicity values (Xi) from successive tests with a given reference toxicant. The types of control charts illustrated (U.S. EPA, 1979b) are used to evaluate the cumulative trend of results from a series of samples.
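The running-chart logic can be illustrated with a short sketch. Consistent with the mean ± 2S control limits this plan prescribes for regression-analysis results, the hypothetical function below flags each new toxicity value against limits computed from the tests before it; the function name and return structure are assumptions, and chart details are otherwise deferred to U.S. EPA (1979b).

```python
import statistics

def control_chart(toxicity_values):
    """For each successive toxicity value, compute the cumulative mean
    and +/- 2 standard deviation control limits from the PRIOR tests,
    and flag values falling outside those limits. Limits are undefined
    until at least two prior results exist."""
    records = []
    for i, x in enumerate(toxicity_values):
        prior = toxicity_values[:i]
        if len(prior) >= 2:
            mean = statistics.mean(prior)
            sd = statistics.stdev(prior)
            lcl, ucl = mean - 2 * sd, mean + 2 * sd
            outlier = not (lcl <= x <= ucl)
        else:
            lcl = ucl = None   # too few points to set limits
            outlier = False
        records.append({"value": x, "lcl": lcl, "ucl": ucl, "outlier": outlier})
    return records

# Four consistent reference-toxicant results followed by one suspect value.
chart = control_chart([2.0, 2.2, 2.1, 2.15, 5.0])
```

Because roughly one in twenty in-control tests will still fall outside ±2S limits by chance at the P = 0.05 level, an isolated flag should prompt review rather than automatic rejection.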
For regression-analysis results (such as LC-50s or IC-50s), the mean (X) and the upper and lower control limits (±2S) are recalculated with each successive point until the statistics stabilize. Outliers (values falling outside the upper and lower control limits) and trends of increasing or decreasing sensitivity are readily identified. At the P = 0.05 probability level, one in twenty tests would be expected to fall outside the control limits by chance alone. If the toxicity value from a given test with the reference toxicant does not fall in the expected range for the test organisms, the sensitivity of the organisms and the overall credibility of the test are suspect. In this case, the test procedure should be examined for defects and, if possible, the test should be repeated with a different batch of test organisms.

5.3.8 Record Keeping and Reporting

Proper record keeping is mandatory. Bound notebooks should be used to maintain detailed records of the test organisms (species, source, age, date of receipt, and other pertinent information relating to their history and health), as well as information on the calibration of equipment and instruments, the test conditions employed, and the test results. Annotations should be made on a real-time basis to prevent loss of information. Data for all QA/QC variables, such as reference toxicant test results and copies of control charts, should be submitted by the laboratory along with the test results.

5.4 BENTHIC COMMUNITY ANALYSIS

Sediment samples for benthic community analysis will be collected at each station using a Young-modified Van Veen grab sampler. To be considered acceptable, each grab sample must meet certain pre-established quality control criteria, as specified in the Field Operations Manual (Strobel et al., in preparation).
The collected sediment will be sieved in the field through a 0.5-mm screen, and the material retained on the screen will be preserved and returned to the laboratory for processing. Details of field and laboratory processing procedures can be found in Strobel et al. (in preparation) and Graves et al. (in preparation), respectively.

5.4.1 Species Composition and Abundance

Quality control for processing grab samples involves check systems for both sorting and counting. A check on the efficiency of the sorting process is required to document the accuracy of organism extraction. In addition to sorting QC, it is necessary to perform checks on the accuracy of sample counting. This can be done in conjunction with taxonomic identification and uses the same criteria presented below for taxonomic identification quality control.

Sorting QC can be separated into two levels of intensity. Inexperienced sorters require an intensive QC check system, while experienced personnel require a less frequent QC schedule. It is recommended that experienced sorters or taxonomists check each sample for missed organisms until proficiency in organism extraction is demonstrated by inexperienced personnel. Two types of QC sorting criteria are recommended to maintain control and comparability of the sorting process. One criterion for completion of sorting that has been used successfully in freshwater systems is to sort a sample until the sorter feels that the sample is finished, then continue to sort until no organisms or fragments can be found in a one-minute continuous examination (Pollard and Melancon, 1984; Peck et al., 1988).
The time criterion for completion of a sort will depend on the composition of the sample and will need to be established for marine benthic samples, but it must initially be based on the sorter's judgment that the sample sort is complete. The criterion used for the initial sorting of a sample should also be used for the quality control sort. The second criterion for sorting acceptability is the extraction efficiency of a given sorter. Acceptable sorting quality requires that no more than 10 percent of the original organism count be recovered in a QC check sort. A minimum of 10 percent of the samples processed by a given sorter should be subjected to a QC sort at regular intervals during sample processing. If a sorter fails QC sorts, then all samples processed since the last successful QC check are resorted, and any additional animals found are added to each sample. If a QC sort passes but some animals are found, these animals are not added to the original sample sort.

As organisms are identified and counted, a voucher specimen collection will be compiled. This specimen collection can be used for training new taxonomists and as a quality crosscheck by sending specimens to a separate laboratory for identification. All specimens will be taxonomically confirmed by an outside source and any discrepancies resolved. Identification and enumeration accuracy should be checked internally by a second taxonomist for at least 10 percent of the samples processed by a given technician. There should be no more than 10 percent total error (i.e., for all species) in identification or enumeration in any sample. The same procedures for sample reprocessing that are used for sorting apply to identification and counting.
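The sorting acceptance rule above lends itself to a simple check. The sketch below (hypothetical function and field names; the 10 percent threshold and corrective actions are from the text) applies the extraction-efficiency criterion:

```python
def qc_sort_check(original_count, found_in_qc_sort):
    """Apply the sorting QC criterion: a QC check sort may recover no
    more than 10 percent of the original organism count. On failure,
    all samples since the last passing QC check must be resorted (with
    any animals found added back); on a pass, animals found in the QC
    sort are NOT added to the original sample."""
    missed_fraction = found_in_qc_sort / original_count
    if missed_fraction > 0.10:
        return {"pass": False,
                "action": "resort all samples since last successful QC check"}
    return {"pass": True, "action": "none; do not add QC-sort animals"}

# e.g., 7 organisms found on recheck of a sample originally sorted to 150
result = qc_sort_check(150, 7)
```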
5.4.2 Biomass

Biomass determination procedures involve drying the sample and, as a consequence, cannot be controlled and corrected in the same manner as the sorting, identification, and enumeration processes. Duplicate weight measurements by a second technician will be taken before and after drying of the samples to control and document the precision of this measurement process. If the two technicians' results differ by more than 10 percent, the source of error must be identified and corrected before analysis proceeds.

5.5 LARGE BIVALVE SAMPLING

Large bivalves collected with a rocking chair dredge will be identified to species and measured in the field. Samples will be placed in bags and iced prior to transport and storage (see Strobel et al., in preparation, for details of field procedures). Quality of identification and measurement will be documented during training and during the final field audit by having a different person re-count, re-measure, and confirm the identification of the organisms collected. The acceptance criterion for abundance and composition is agreement within 10 percent of the original determination.

5.6 FISH SAMPLING

5.6.1 Species Composition and Abundance

Fish species composition and abundance will be determined in the field following protocols presented in the field methods manual (Strobel et al., in preparation). Documentation of the quality of these data will be accomplished by performing field crew training and QA audits using personnel qualified to verify the identification and enumeration made by the field crew. The accuracy goal for the fish abundance data is that the original results and the results of the field QA audit should agree within 10 percent. In addition, all species should be correctly identified.
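Several of the criteria in this section reduce to the same check: two independent determinations must agree within 10 percent (duplicate biomass weighings, bivalve re-counts, fish abundance audits). A generic sketch follows; computing the relative difference against the original determination is our assumption, since the plan does not name a denominator.

```python
def within_percent(original, recheck, tolerance_percent=10.0):
    """True if a re-measurement agrees with the original determination
    to within the given percentage of the original value."""
    return abs(recheck - original) <= (tolerance_percent / 100.0) * abs(original)

# e.g., a duplicate dry weight of 0.50 g re-measured as 0.53 g is acceptable
ok = within_percent(0.50, 0.53)
```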
If these goals are not met, corrective actions will include re-training the field crew and flagging the previous data from that crew for those species which had been misidentified. The fish sent to the EPA's Gulf Breeze laboratory for histopathological examination also will be checked for accuracy of taxonomic determination. The QA Officer must be informed immediately of any species misidentifications so that the appropriate field crew can be contacted and the problem corrected.

5.6.2 Fish Length Measurements

A random subset of the fish measured in the field will be set aside for duplicate measurements by a second technician. The acceptable error in this procedure is ±5 mm. If this re-measurement procedure cannot be followed due to logistical constraints, then quality assurance documentation of fish length will be accomplished during field auditing.

5.6.3 Fish Gross Pathology

The field procedures to be used for determination of fish pathology are detailed in Strobel et al. (in preparation). The quality of gross scanning for fish pathology will be documented during field training and QA audits. In addition, the quality of fixation techniques and laboratory techniques will be documented during the QA audits.

5.7 SEDIMENT-PROFILE PHOTOGRAPHY

The field procedures for sediment-profile photography are described in the field methods manual (Strobel et al., in preparation). The techniques for measuring various physical and biological parameters (e.g., sediment grain size, camera penetration depth, redox potential discontinuity (RPD) depth, infaunal successional stage) in the sediment-profile photographs are described in the laboratory methods manual (Graves et al., in preparation). The main features of the quality assurance/quality control protocol for sediment-profile photography are described in the following sections.
The camera will be operated in the field by a skilled, experienced technician who will accompany the various field crews on a rotating basis. At the beginning of each field operation, the time on the data logger mounted on the sediment-profile camera will be synchronized with the clock on the navigation system computer. Each photograph can then be identified by the time recorded on the film and matched with the time, and the vessel position, recorded on the computer. Redundant sample logs will be kept by the field crew and by computer printout. Test photographs will be taken on deck at the beginning and end of each roll of film to verify that all internal electronic systems are working to the proper design specifications. Spare cameras and charged batteries will be carried in the field at all times to ensure uninterrupted sample acquisition.

After deployment of the camera at each sampling site, the camera technician will check the frame counter (digital display) to make sure that the requisite number of replicate photographs has been taken. In addition, the prism penetration depth indicator on the camera frame will be checked to see that the optical prism has actually penetrated the bottom to a depth sufficient to acquire a profile image. If photographs have been missed (frame counter indicator) or the penetration depth is insufficient (penetration indicator), additional replicates will be taken. All film will be developed at the end of every survey day to verify successful data acquisition; strict controls will be maintained over development temperatures, times, and chemicals to ensure consistent density of the film emulsion and thereby minimize interpretive error by the computer image analysis system. After the film is developed, the technician will visually inspect it under magnification.
Any images that are of insufficient quality for computer image analysis will be noted, and, if possible, the appropriate sampling site will be revisited at a future date.

Computer analysis of the sediment-profile photographs must be performed only by experienced technicians who have demonstrated proficiency in the technique. During computer analysis, all measurements from each photograph are stored on disk, and a summary display is presented on the computer screen so the operator can visually verify that the values stored in memory for each variable are within the expected range. If anomalous values are detected, software options allow remeasurement and recalculation before storage on disk. All computer data disks are backed up by redundant copies at the end of each analytical day. All data stored on disk also are printed out on data sheets to provide a hard-copy backup; a separate data sheet is generated for each sediment-profile photograph that has been analyzed. As a final quality control check, all data sheets are edited and verified by a senior-level scientist before being approved for final data synthesis, statistical analyses, and interpretation.

5.8 DISSOLVED OXYGEN MEASUREMENTS

Dissolved oxygen will be measured using polarographic probes attached to either a Hydrolab DataSonde III unit or a SeaBird CTD instrument. Both probes are rated by their manufacturers as being accurate to 0.2 ppm (Strobel et al., in preparation). The probe attached to the CTD will be used for daily dissolved oxygen measurements, while the one attached to the Hydrolab unit will be used for long-term measurements (i.e., 10-day continuous deployments). The probes will be calibrated prior to deployment using the saturated-air calibration procedure recommended by the manufacturers.
In addition, a supersaturated solution of sodium sulfite will be used to provide a zero calibration check for either probe. All calibration values will be recorded prior to deployment of the probes. The calibration of the probe attached to the CTD will be checked once each day by taking a simultaneous water sample and measuring its dissolved oxygen concentration by Winkler titration. If the Winkler results and those obtained from the probe differ by more than 0.5 ppm, the probe must be checked for malfunctions, recalibrated, and then rechecked for calibration before it can be redeployed. All previous data (i.e., since the last successful calibration check) will be flagged. Simultaneous Winkler titrations also will be used to check the calibration of the probe on the Hydrolab unit both prior to deployment and following retrieval; the dissolved oxygen probe on the CTD will serve as a backup "instrument check" on the Hydrolab probe. If the Winkler results and those obtained from the Hydrolab probe differ by more than 0.5 ppm prior to deployment, the unit will not be deployed but will be replaced by a backup. If these results differ by more than 0.5 ppm when the probe is retrieved after the long-term deployment, the data will be flagged as being outside the quality control criteria and will be reviewed for validity prior to data release.

5.9 ANCILLARY MEASUREMENTS

5.9.1 Salinity

Salinity will be measured using the SeaBird CTD profiling recording probe, which is rated by the manufacturer as being accurate to 1 percent (Strobel et al., in preparation). Salinity meters are calibrated by the manufacturer; this calibration will be checked once each day using a refractometer. It is expected that the probe on the CTD will be more accurate than the refractometer; therefore, the refractometer measurement will act only as a gross check on the operation of the probe.
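The calibration checks in this section follow a common pattern: compare the instrument against an independent reference measurement and, if the difference exceeds a stated tolerance, take a corrective action and flag the affected data. A hedged sketch (function, field names, and return structure are ours; the tolerances are the ones quoted in the text):

```python
def calibration_check(instrument_value, reference_value, tolerance, fail_action):
    """Generic pre/post-deployment calibration check: pass if the
    instrument agrees with the reference within the tolerance;
    otherwise report the corrective action to take."""
    difference = abs(instrument_value - reference_value)
    return {"difference": difference,
            "pass": difference <= tolerance,
            "action": "none" if difference <= tolerance else fail_action}

# CTD dissolved-oxygen probe vs. Winkler titration (0.5 ppm tolerance)
do_check = calibration_check(7.8, 7.2, 0.5,
                             "recalibrate; flag data since last good check")

# CTD salinity vs. refractometer (1 part-per-thousand gross check)
sal_check = calibration_check(31.4, 30.8, 1.0, "check CTD thoroughly")
```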
However, if the refractometer reading differs from the probe value by more than 1 part per thousand, the CTD instrument will be checked thoroughly and a determination made of the need for recalibration.

5.9.2 Temperature

Temperature will be measured using the SeaBird CTD profiling recording probe, which is rated by the manufacturer as being accurate to 0.2 °C (Strobel et al., in preparation). The temperature sensor on the probe will be calibrated by the manufacturer using a National Bureau of Standards (NBS) certified thermometer, and the calibration value recorded prior to probe use. Probes will be tested for calibration stability each day using a thermometer. Drift from the original calibration will be used as a criterion for data quality acceptance and as a data flagging criterion. If calibration results differ from the original calibration by more than 0.5 °C, the data will be flagged as being outside the quality control criteria and will then be reviewed for validity prior to data release.

5.9.3 pH Measurements

Measurements of pH will be taken with the SeaBird CTD. The instrument will be calibrated to pH 7 and pH 10 as described in Strobel et al. (in preparation). Following calibration, a QC check will be performed using an intermediate-range buffer solution (pH 8 is suggested). The QC check should be within 0.2 pH units of the true value for the buffer solution. If the QC check is outside control limits, the instrument calibration should be checked. Quality control checks should be performed and recorded prior to and following deployment of the CTD.

5.9.4 Fluorometry

In situ fluorescence will be measured using a Sea Tech fluorometer attached to the SeaBird CTD.
The optical filters used in this fluorometer have been selected for optimum measurement of chlorophyll a fluorescence. Prior to each deployment, the instrument will be checked to ensure that it is functioning properly, following the manufacturer's instructions. At each station, a surface water sample will be collected simultaneously with deployment of the instrument. A pre-determined volume of the water sample will be filtered on board and the filter frozen for subsequent determination of chlorophyll a concentration. Over time, this will provide a means of calibrating each fluorometer (i.e., converting its fluorescence readings into chlorophyll a concentrations). At every tenth station, a second volume of the water sample, identical to the first, will be filtered to provide duplicate chlorophyll a measurements. These duplicate measurements should not differ by more than 10 percent. Failure to achieve this precision goal will result in a thorough review of the field and laboratory procedures to determine the cause of the discrepancy and eliminate it.

5.9.5 Transmissometry

A Sea Tech 10-cm pathlength transmissometer will be used to provide in situ measurements of beam transmission and the concentration of suspended matter at each station occupied. The manufacturer's procedures for internal calibration in air and instrument check-out must be followed prior to each deployment; these procedures are described in the Field Operations Manual (Strobel et al., in preparation). In general, optical devices such as transmissometers are useful for determining suspended particle concentrations in near coastal waters as long as the nature of the suspended matter does not change much from region to region. In the EMAP-NC Demonstration Project, each transmissometer will be calibrated based on field measurements of suspended particle concentrations.
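Converting in situ sensor readings (fluorescence, beam attenuation) into concentrations from paired field samples amounts to fitting a calibration curve. A least-squares straight line is one plausible model (the plan does not prescribe one); the function and paired data below are illustrative.

```python
def fit_calibration_line(readings, concentrations):
    """Ordinary least-squares fit of concentration = slope*reading + intercept,
    for converting in situ sensor output (e.g., fluorescence) into
    laboratory-measured concentrations (e.g., extracted chlorophyll a)."""
    n = len(readings)
    mean_r = sum(readings) / n
    mean_c = sum(concentrations) / n
    sxy = sum((r - mean_r) * (c - mean_c)
              for r, c in zip(readings, concentrations))
    sxx = sum((r - mean_r) ** 2 for r in readings)
    slope = sxy / sxx
    intercept = mean_c - slope * mean_r
    return slope, intercept

# Hypothetical paired data: fluorometer output vs. extracted chl a (ug/L)
slope, intercept = fit_calibration_line([1.0, 2.0, 3.0, 4.0],
                                        [2.5, 4.5, 6.5, 8.5])
```

Accumulating such pairs over many stations, as the plan describes, progressively tightens the fitted calibration for each individual instrument.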
Suspended particle concentrations will be determined in surface water samples taken simultaneously with the transmissometer reading. A known volume of the surface water sample will be filtered on board and the filter frozen for later laboratory measurement of suspended solids (i.e., particle) concentration. At every tenth station, a second volume of the water sample, identical to the first, will be filtered to provide duplicate suspended solids measurements. These duplicate measurements should not differ by more than 10 percent. Failure to achieve this precision goal will result in a thorough review of the field and laboratory procedures to determine the cause of the discrepancy and eliminate it.

5.9.6 Photosynthetically Active Radiation

Photosynthetically active radiation will be measured by a sensor mounted on the SeaBird CTD. This sensor is calibrated by the manufacturer; no QA/QC procedures are specified for this measurement other than those outlined in Strobel et al. (in preparation).

5.9.7 Apparent RPD Depth

The depth of the apparent RPD (redox potential discontinuity) will be determined through visual observation of clear plastic cores inserted into undisturbed sediment grab samples at each station. In fine-grained sediments, the apparent RPD depth is measured from the sediment surface to the point at depth where the color changes from light to dark. As a QC check, a random subset of samples will be re-measured by a second field crew member or field auditor. The result of this re-measurement should be within ±5 mm of the first measurement. Failure to achieve this level of precision will result in re-training of crew members.
SECTION 6

FIELD OPERATIONS AND PREVENTIVE MAINTENANCE

6.1 TRAINING AND SAFETY

A critical aspect of quality control is to ensure that the individuals involved in each activity are properly trained to conduct it. Field sampling personnel are being asked to conduct a wide variety of activities using comparable protocols. Each field team will consist of a Team Leader and two 4-member crews. Each crew will have a Crew Chief (one of whom is the Team Leader), who will be the captain of the boat and the ultimate on-site decision maker regarding safety, technical direction, and communication with the Operations Center. Qualifications for the Team Leaders and Crew Chiefs are an M.S. degree in the biological/ecological sciences and three years of experience with field data collection activities, or a B.S. degree and five years of experience. The remaining three crew members generally will be required to have a B.S. degree and, preferably, at least one year's experience. All field team members will be required to take part in an intensive one-month training period.

Classroom training will be conducted by the University of Rhode Island's Marine Advisory Service and Fisheries Department. The instructors and staff of this department have wide-ranging experience in training scientific personnel in routine sampling operations (e.g., collection techniques, small boat handling). Their expertise will be supplemented by recognized experts in such specialized areas as fish pathology (Dr. Linda Despres-Patanjo, NMFS, Woods Hole, Massachusetts, and Mr. John Ziskowski, NMFS, Milford, Connecticut); fish identification (Dr. Don Flescher, NMFS, Woods Hole); benthic sampling (Ms. Anna Shaughnessy, Versar, Inc., Columbia, Maryland); first aid, including cardiopulmonary resuscitation (CPR) (American Red Cross); and field computer/navigation system use (Mr.
Jeffrey Parker, Science Applications International Corporation, Newport, Rhode Island). All EMAP equipment (e.g., boats, sampling gear, computers) will be used during the training sessions, and by the end of the course all crew members must demonstrate proficiency in:

o Towing and launching the boat.

o Making predeployment checks on all sampling equipment.

o Locating stations using the appropriate navigation system (LORAN and/or GPS).

o Entering and retrieving data from the onboard lap-top computers.

o Using all the sampling gear.

o Administering first aid, including CPR.

o General safety practices.

In addition, all field crew members must be able to swim and will be required to demonstrate that ability.

Some sampling activities (e.g., fish taxonomy, gross pathology, net repair) require specialized knowledge. While all crew members will be exposed to these topics during the training sessions, it is beyond the scope of the training program to develop proficiency in these areas for all crew members. For each of the specialized activities, selected crew members, generally those with prior experience in a particular area, will be provided intensive training. At the conclusion of the training program, at least one member of each crew will have demonstrated proficiency in fish taxonomy, mollusk taxonomy, gross pathology, net repair, gear deployment, and navigation.

All phases of field operations are detailed in the field methods manual (Strobel et al., in preparation), which will be distributed to all trainees prior to the training period. The manual will include a checklist of all equipment, instructions on the use of all equipment, and the sample collection procedures that the field crews will be required to conduct. In addition, the manual will include flow charts and a schedule of activities to be conducted at each sampling location.
It will also contain a list of potential hazards associated with each sampling site.

6.2 FIELD QUALITY CONTROL

Quality control of field measurements will be accomplished through the use of a variety of QC sample types. Specific field QC protocols can be found in Strobel et al. (in preparation). A description of the general protocols, control limits, and sample types used for this purpose can be found in Sections 4 and 5 of this document.

6.3 FIELD AUDITS

Initial review of the field team observations will be performed by training personnel during the training program. Following training, an initial site assistance audit should be performed by a combination of QA and training personnel. This audit should be considered a "shakedown" assistance procedure to help the field teams adopt a consistent approach to the collection of samples and generation of data. At least once during the program, a formal site audit will be performed by the QAO and the Demonstration Project Manager to determine compliance with the QA plan and the field operations document. Checklists and audit procedures similar to those presented in U.S. EPA (1985) will be developed for this audit. Corrective action and/or retraining of crew personnel will be initiated if discrepancies are noted.

6.4 PREVENTIVE MAINTENANCE

The importance of proper maintenance of all gear cannot be overstated. Failure of any piece of major equipment, especially since back-up equipment will be used by a fourth team, could result in a significant loss of data. Maintenance of equipment should be performed as described in Strobel et al. (in preparation). It will be the responsibility of the Team Leader to maintain a record of equipment usage and to ensure that proper maintenance is performed at the prescribed time intervals. The following equipment will be regularly checked and/or serviced as specified in Strobel et al.
(in preparation): boat trailers, boats, outboard engines, electronics, hydraulics, rigging, vehicles, GRiD computers, Seabird CTDs, and DataSonde III Hydrolabs.

Section 7  Revision 0  Date 4/90  DRAFT 1

SECTION 7
LABORATORY OPERATIONS

7.1 LABORATORY PERSONNEL, TRAINING, AND SAFETY

Laboratory operations and the preventive maintenance necessary for proper operation of laboratory equipment are discussed in detail in Graves et al. (in preparation). This section addresses only general laboratory operation considerations; laboratory QA/QC considerations are presented in Sections 4 and 5.

The toxicity or carcinogenicity of individual compounds or reagents used in this project has not been precisely determined. Therefore, each chemical should be treated as a potential health hazard, and good laboratory practices should be implemented accordingly. Laboratory personnel should be well versed in standard laboratory safety practices, and the standard operating procedures (SOPs) presented in Graves et al. (in preparation) should be strictly followed. It is the responsibility of the laboratory manager and supervisor to ensure that safety training is mandatory for all laboratory personnel. The laboratory is responsible for maintaining a current safety manual, in compliance with Occupational Safety and Health Administration (OSHA) regulations, covering the safe handling of the chemicals specified for this project, along with individual chemical safety data sheets. These procedures and documents should be made available to, and followed by, all personnel involved in this project.

7.2 QUALITY CONTROL DOCUMENTATION

The following documents and information must be kept current and must be available to all laboratory personnel and to the principal investigators:

o Laboratory methods manual - A document containing detailed instructions about laboratory and instrument operations (Graves et al., in preparation).
o Quality assurance plan - Clearly defined laboratory protocols, including personnel responsibilities and the use of QA/QC protocols (this document).

o Instrument performance study information - Information on baseline noise, calibration standard response, precision as a function of concentration, and detection limits.

o Training and field operations manual (Strobel et al., in preparation), including quality control performance criteria (e.g., calibration routines and acceptance criteria).

7.3 SAMPLE PROCESSING AND PRESERVATION

Sample processing and preservation protocols are presented in Strobel et al. (in preparation) for field-collected samples and in Graves et al. (in preparation) for laboratory-processed samples. Strict adherence to the protocols provided in these documents is critical to maintaining data integrity.

7.4 SAMPLE STORAGE AND HOLDING TIMES

Water samples for toxicity testing should be shipped on ice, but not frozen. Transit and subsequent holding time should not exceed 48 hours. Sieved biota from sediments must be preserved on the boat according to procedures presented in Strobel et al. (in preparation). For the analysis of organic contaminants in sediments, it is recommended that the sediment samples be extracted within 10 days and the extracts analyzed within 40 days following extraction (Contract Laboratory Program [CLP], Statement of Work [SOW] 288). For inorganic sediment contaminants (except mercury), it is recommended that samples be digested within 180 days and the extracts analyzed within 1 day (for Sb, Pb, Hg, Se, and Ag), within 2 days (for As and Cd), and within 1 week (for Cr, Cu, Ni, and Zn). For mercury, the holding time is 26 days (CLP SOW 288). For the analysis of contaminants in fish muscle tissue, whole fish will be shipped frozen on dry ice and should be held frozen until the time of analysis.
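Holding-time limits of this kind lend themselves to a simple automated compliance check. The sketch below is illustrative only (the function name and record layout are not part of any EMAP system); the limits are transcribed from the recommendations above.

```python
from datetime import datetime, timedelta

# Selected holding-time limits from Section 7.4 (CLP SOW 288). The keys
# and this checker are illustrative assumptions, not an EMAP convention.
HOLDING_LIMITS = {
    "water_toxicity": timedelta(hours=48),      # transit + holding
    "organics_extraction": timedelta(days=10),  # collection -> extraction
    "organics_analysis": timedelta(days=40),    # extraction -> analysis
    "mercury": timedelta(days=26),
}

def holding_time_ok(sample_type, start, end):
    """Return True if the elapsed time is within the limit for sample_type."""
    return (end - start) <= HOLDING_LIMITS[sample_type]

collected = datetime(1990, 7, 1, 8, 0)
extracted = datetime(1990, 7, 9, 14, 0)
print(holding_time_ok("organics_extraction", collected, extracted))  # True
```

A check like this could be run when laboratories log sample receipt and analysis dates, flagging any exceedance for the QAO report.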
7.5 LABORATORY PERFORMANCE AUDITS

Initially, a QA assistance and performance audit will be performed by QA personnel to determine whether each laboratory is in compliance with the procedures outlined in the QA plan and to assist the laboratory where needed. Additionally, once during the study, a formal laboratory audit will be performed following protocols similar to those presented in U.S. EPA (1985). Checklists appropriate for each laboratory operation will be developed and approved by the QA Officer prior to the audits.

Section 8  Revision 0  Date 4/90  DRAFT 1

SECTION 8
QUALITY ASSURANCE AND QUALITY CONTROL FOR MANAGEMENT OF DATA AND INFORMATION

8.1 SYSTEM DESCRIPTION

The prototype of the Near Coastal Information Management System (NCIMS) will be developed at the Environmental Research Laboratory in Narragansett (ERL-N). The design for this system will be reviewed by the EMAP Information Management Committee and by the technical director of the Near Coastal Demonstration Project. Ultimately, the NCIMS will:

o document sampling activities and standard methods,
o support program logistics, sample tracking, and shipments,
o process and organize both the data collected in the field and the results generated at analytical laboratories,
o perform range checks on selected numerical data,
o facilitate the dissemination of information, and
o provide interaction with the EMAP Central Information System.

8.1.1 Field Navigation and Data Logging System

The primary means of data logging in the field will be manual recording of information on data sheets. However, portable microcomputers modified to withstand the rigors of use on small boats represent an important back-up component of the data management system for the Near Coastal project.
The software on these machines will provide navigation and real-time positioning of the boat, and will control some sampling activities, sample logging, and data storage through an interactive menu. The software to be used is a modification of the Integrated Navigation and Survey System (INSS) developed by Science Applications International Corporation. The INSS is a simple, automated, menu-driven software package with a complete logging facility; it has been used successfully on numerous environmental field programs during the past decade.

8.2 QUALITY ASSURANCE/QUALITY CONTROL

Two general types of problems must be resolved in developing QA/QC protocols for information and data management: (1) correction or removal of erroneous individual values, and (2) resolution of inconsistencies that damage the integrity of the data base. The following features of the NCIMS will provide a foundation for the management and quality assurance of all data collected and reported during the life of the project.

8.2.1 Standardization

A systematic numbering system will be developed for unique identification of individual samples, sampling events, stations, shipments, equipment, and diskettes. The sample numbering system will contain codes which will allow the computer system to distinguish among several different sample types (e.g., actual samples, quality control samples, and sample replicates). This system will be flexible enough to allow changes during the Demonstration Project, while maintaining a structure which allows easy comprehension of the sample type.

Clearly stated standard operating procedures will be given to the field crews with respect to the use of the field computer systems and the entry of data in the field. Contingency plans will also be stated explicitly in the event that the field systems fail.
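As an illustration of the kind of structured sample numbering described above, the sketch below packs an event number, station, sample-type code, and sequence number into one identifier that a computer can decode. The field layout and the two-letter type codes are invented for this example; they are not the actual EMAP-NC convention.

```python
# Hypothetical sample-type codes distinguishing the sample classes named
# in Section 8.2.1 (actual samples, QC samples, replicates).
SAMPLE_TYPES = {"AS": "actual sample",
                "QC": "quality control sample",
                "RP": "sample replicate"}

def make_sample_id(event, station, sample_type, seq):
    """Build an identifier such as 0017-VA05-QC-002 (layout is illustrative)."""
    if sample_type not in SAMPLE_TYPES:
        raise ValueError(f"unknown sample type code: {sample_type}")
    return f"{event:04d}-{station:>4}-{sample_type}-{seq:03d}"

def parse_sample_id(sample_id):
    """Recover the event, station, type, and sequence from an identifier."""
    event, station, stype, seq = sample_id.split("-")
    return {"event": int(event), "station": station,
            "type": SAMPLE_TYPES[stype], "sequence": int(seq)}

sid = make_sample_id(17, "VA05", "QC", 2)
print(sid)                           # 0017-VA05-QC-002
print(parse_sample_id(sid)["type"])  # quality control sample
```

Because the type code is an explicit field, new codes can be added to the table during the project without changing the identifier structure, which is the flexibility the section calls for.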
8.2.2 Prelabeling of Equipment and Sample Containers

Whenever possible, sample containers, equipment, and diskettes will be prelabeled to eliminate confusion in the field. Prelabeling will reduce the number of incorrect or poorly affixed labels. Containers holding all the required prelabeled sample containers, sample sheets, and data diskettes will be prepared for the field crews prior to each sampling event (an event is defined as a single visit by a crew to a sampling site). These containers will be called "event boxes." Each event box will have the event number affixed to it using both handwritten and bar code labels.

8.2.3 Data Entry, Transcription, and Transfer

To minimize the errors associated with entry and transcription of data from one medium to another, data will be captured electronically whenever possible. When manual entry is required, the data should be entered twice by different data entry operators and then checked for non-matches to identify and correct errors. In many instances, the use of bar code labels should eliminate the need for manual entry of routine information.

Each group transmitting data to the information center will be given a separate account on the Near Coastal VAX 3300. Standard formats for data transfer will be established by the Information Management Team. A specific format will be developed for each file type within each discipline. If data are sent to the Near Coastal Information Center in formats other than those specified, the files will be deleted and the sending laboratory or agency will be asked to resubmit the data in the established format.

The communications protocols used to transfer data electronically will have mechanisms by which the completeness and accuracy of the transfer can be checked. In addition, the group sending the information should specify the number of bytes and the file names of the transferred files.
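One simple form such a completeness check could take, assuming the sender's file names and byte counts arrive as a manifest, is sketched below; the file names, directory, and function are hypothetical, not part of the NCIMS.

```python
import os

# Illustrative receiver-side check for the transfer mechanism described in
# Section 8.2.3: compare a sender-supplied manifest of file names and byte
# counts against the files that actually arrived.
def verify_transfer(manifest, received_dir):
    """Return a list of (file name, problem) pairs; empty means verified."""
    problems = []
    for name, expected_bytes in manifest.items():
        path = os.path.join(received_dir, name)
        if not os.path.exists(path):
            problems.append((name, "missing"))
        elif os.path.getsize(path) != expected_bytes:
            problems.append((name, "size mismatch"))
    return problems

# Simulate a partial transfer: one file arrives, one does not.
os.makedirs("incoming", exist_ok=True)
with open(os.path.join("incoming", "fish.dat"), "w") as f:
    f.write("X" * 100)
print(verify_transfer({"fish.dat": 100, "sed.dat": 50}, "incoming"))
# [('sed.dat', 'missing')]
```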
These data characteristics should be verified upon receipt of the data. If the file cannot be verified, a new file transfer should be requested. Whenever feasible, a hard copy of all data should be provided with transfer files.

The data files transmitted from the field will be fixed-format text files. These files will be "parsed" by the system. The parsing process involves transferring records of similar type into files containing only those types of records. For example, observations on fish species and size will be copied from the original log file transmitted from the field to a "fish" data file. After the records have been parsed from the field log files, the individual data files will be checked automatically for erroneous values, as described in the following section. Records in the field log file which are not entered into the data base (e.g., comments in text form) will be archived for documentation or future extraction.

8.2.4 Automated Data Verification

Erroneous numeric data will be identified using automatic range checks and filtering algorithms. When data fall outside of an acceptable range, they will be flagged in a report for the quality assurance officer (QAO) or his designee. This type of report will be generated routinely and should detail the files processed and the status of the QA checks. The report will be generated both on disk and in hard copy for permanent filing. The QAO will review the report and release data which have passed the QA checks for addition to the data base. All identified errors must be corrected before flagged files can be added to a data base. If the QAO finds that the data check ranges are not reasonable, the values can be changed by written request. The written request should include a justification for changing the established ranges.
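A minimal sketch of the automated range check described above follows. The acceptance ranges and field names are invented for illustration; the actual NCIMS ranges would be set and revised by the QAO as the text describes.

```python
# Hypothetical acceptance ranges for selected numeric fields (Section 8.2.4).
RANGES = {"salinity_ppt": (0.0, 40.0),
          "temp_c": (-2.0, 35.0),
          "depth_m": (0.0, 200.0)}

def range_check(record):
    """Return (field, value, low, high) flags for every out-of-range value."""
    flags = []
    for field, (low, high) in RANGES.items():
        value = record.get(field)
        if value is not None and not (low <= value <= high):
            flags.append((field, value, low, high))
    return flags

rec = {"salinity_ppt": 31.2, "temp_c": 48.0, "depth_m": 12.5}
for field, value, low, high in range_check(rec):
    print(f"FLAG: {field}={value} outside [{low}, {high}]")
# FLAG: temp_c=48.0 outside [-2.0, 35.0]
```

Flags accumulated this way would feed the routine disk-and-hardcopy QAO report; the record itself is left untouched, since only the QAO releases or corrects data.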
If the QAO finds the need for additional codes, they can be entered by the senior data librarian. After such changes are made, the files may be passed through the QA procedure again. In the event that the QA check identifies incorrect data, the QAO will archive the erroneous file and request that the originator correct the error and retransmit the data.

Data base entries which are in the form of codes should be compared to lists of valid values (e.g., look-up tables) established by experts for specific data types. These lists of valid codes will be stored in a central data base for easy access by data base users. When a code cannot be verified in the appropriate look-up table, the observation should be flagged in the QAO report for appropriate corrective action (e.g., update of the look-up table or removal of the erroneous code).

8.2.5 Sample Tracking

Samples collected in the field will be shipped to analytical laboratories. All shipping information required to adequately track the samples (sample numbers, number of containers, shipment numbers, dates, etc.) will be transmitted by phone to the information center at the end of each sampling day, using modems built into the portable field computers. Once the field crews have transmitted the data, it will be the responsibility of the data management team to confirm that the samples arrive at their destination. To facilitate this, the receiving laboratories will be required, upon receipt of the samples, to record and similarly transmit all tracking information (e.g., sample identification numbers, shipment numbers, and the status of the samples) to the information center, using either microcomputers or the VAX. The information management team will develop special programs to create fixed-format records containing this information.
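The fixed-format tracking records mentioned above can be sketched as follows; the column widths and field set are assumptions for illustration, not the actual record layout.

```python
# Hypothetical fixed-width layout for a sample tracking record
# (Section 8.2.5): field name and column width, in order.
FIELDS = [("sample_id", 16), ("shipment", 8), ("date", 8), ("status", 10)]

def format_record(values):
    """Pack field values into one fixed-width text record."""
    return "".join(str(values[name]).ljust(width)[:width]
                   for name, width in FIELDS)

def parse_record(line):
    """Slice a fixed-width record back into its fields."""
    out, pos = {}, 0
    for name, width in FIELDS:
        out[name] = line[pos:pos + width].strip()
        pos += width
    return out

rec = format_record({"sample_id": "0017-VA05-AS-001", "shipment": "S0042",
                     "date": "19900712", "status": "RECEIVED"})
print(len(rec))                     # 42 (16 + 8 + 8 + 10)
print(parse_record(rec)["status"])  # RECEIVED
```

Because every record has the same length and column positions, records of this kind can be written by field crews and laboratories independently and still be parsed by one program at the information center.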
8.2.6 Reporting

Following analysis of the samples, the summary data packages transmitted from the laboratories will include sample tracking information, results, quality assurance and quality control information, and accompanying text. If the laboratory has assigned internal identification numbers to the samples, the results should include both the original sample number and the internal number used by the laboratory. The analytical laboratories will be responsible for permanent archiving of all raw data used in generating the results.

8.2.7 Redundancy (Backups)

All files in the NCIMS will be backed up regularly. At least one copy of the entire system will be maintained off-site to enable the information management team to reconstruct the data base in the event that one system is destroyed or incapacitated. In the field, information stored on the hard drive will be sent to the on-board printer to provide a real-time hard-copy backup. The information on the hard drive will also be copied to diskettes at the end of each day of sampling. At the Near Coastal Information Center in Narragansett, incremental backups to removable disk will be performed daily on all files which have changed. In addition, backups of all EMAP directories and intermediate files will be performed on a weekly basis to provide protection in the event of a complete loss of the Near Coastal Information Center facility. All original data files will be kept on-line for at least two years, after which the files will be permanently archived on floppy diskette. All original files, especially those containing the raw field data, will be protected so that they can only be read (i.e., write and delete privileges will be removed from these files).

8.2.8 Human Review

All discrepancies which are identified by the computer will be documented in hard copy.
These discrepancy logs will be saved as part of the EMAP archive. All identified discrepancies should be brought to the attention of the QAO or his designee, who will determine the appropriate corrective action to be taken. Data will not be transferred to the data base until all discrepancies have been resolved by the QAO. Once data have been entered into the data base, changes will not be made without the written consent of the QAO, who will be responsible for justifying and documenting the change. A record of all additions will be entered into a data set index and kept in hard copy.

8.3 DOCUMENTATION AND RELEASE OF DATA

Comprehensive documentation of information relevant to users of the NCIMS will be maintained and updated as necessary. Most of this documentation will be accessible on-line, in data bases which describe and interact with the system. The documentation will include a data base dictionary, access control information, data base directories (including directory structures), code tables, and continuously updated information on field sampling events, sample tracking, and data availability.

A limited number of personnel will be authorized to make changes to the Near Coastal data base. All changes will be carefully documented and controlled by the senior data librarian. Data bases which are accessible to outside authorized users will be available in "read only" form. Access to data by unauthorized users will be limited through the use of standard DEC VAX security procedures. Information on access rights to all EMAP-NC directories, files, and data bases will be provided to all potential users.

The release of data from the NCIMS will occur on a graduated schedule. Different classes of users will be given access to the data only after the data reach a specified level of quality assurance. Each group will use the data on a restricted basis, under explicit agreements with the Near Coastal Task Group.
The following four groups are defined for access to data:

I. The Near Coastal central group - the information management team, the field coordinator, the logistics coordinator, the Demonstration Project coordinator, the QA officer, and the field crew chiefs.

II. Near Coastal primary users - ERL-N, VERSAR, SAIC, and Gulf Breeze personnel, NOAA Near Coastal EMAP personnel, and EMAP quality assurance personnel.

III. EMAP data users - All other task groups within EPA, NOAA, and other federal agencies.

IV. General public - University personnel, other EPA offices (including regional offices), and other federal, state, and local governments.

Table 8-1 summarizes the policy of the Near Coastal Task Group with respect to the distribution of data. The Roman numerals in the table refer to the groups listed above. Requests for premature release of data will be submitted to the Information Management Team. The senior data analyst and the QAO will determine whether the data can be released. The final authority on the release of all data is the technical director of EMAP Near Coastal.

The long-term goal of the Near Coastal Information Management Team will be to develop a user interface through which all data will be accessed. This will improve control of security and monitoring of access to the data, and it will help ensure that the proper data files are being accessed.

Table 8-1. Data Distribution Levels for the Near Coastal Demonstration Project

                                       Synthesis level
QA level                   RAW (A)        FIRST SUMMARY (B)   FINAL SUMMARY (C)
No QA/QC                   I*             I*                  I*
Machine QA/QC              I, II*         I, II*              I, II, III
Human QA/QC                I, II, III*    I, II, III*         I, II, III*
Technical Data Analysis    I-IV           I-IV                I-IV

* Explicit restrictions on the uses and dissemination of the data must be made and agreed to by all participants in these groups.
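The graduated-release policy in Table 8-1 amounts to a lookup from QA level to permitted user groups. The sketch below is a simplified illustration of that idea: it collapses the synthesis-level columns into a single mapping, so it is an assumption about the policy's shape rather than a transcription of the table.

```python
# Simplified sketch of the Table 8-1 release policy: which user groups
# (I-IV, as defined in Section 8.3) may see data at each QA level.
# The mapping and level names are illustrative assumptions.
ACCESS = {
    "no_qaqc": {"I"},
    "machine_qaqc": {"I", "II"},
    "human_qaqc": {"I", "II", "III"},
    "technical_analysis": {"I", "II", "III", "IV"},
}

def may_access(group, qa_level):
    """Return True if the user group may see data at the given QA level."""
    return group in ACCESS[qa_level]

print(may_access("III", "machine_qaqc"))  # False
print(may_access("III", "human_qaqc"))    # True
```

A table-driven check of this kind would let the planned user interface enforce the release schedule automatically, rather than relying on ad hoc file permissions alone.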
Section 9  Revision 0  Date 4/90  DRAFT 1

SECTION 9
QUALITY ASSURANCE REPORTS TO MANAGEMENT

The first annual report for the Near Coastal project is scheduled for June of 1991, after completion of the Near Coastal Demonstration Project in the Virginian Province. This report will, in part, provide an assessment of QA activities and an evaluation of the design and research indicators initially used for the project. After full implementation of the Near Coastal component of EMAP, progress will be reported on an annual basis.

Control charts will be used extensively to document measurement process control. An example of a control chart is shown in Figure 9-1. Control charts must be used with QC check standards to control instrument drift; with matrix spike or surrogate recoveries to measure extraction efficiency or matrix interference; with certified performance evaluation samples and blank samples to control overall laboratory performance; and with reference toxicant data to assess laboratory precision and variability in bioassay test species sensitivity. These control charts should be maintained at the laboratory and included as part of the data packages.

A quality assurance report (or a section of the project report) will be prepared following the project's completion; it will summarize the measurement error estimates for the various data types using the QA/QC sample data (see Sections 4 and 5). Precision, accuracy, comparability, completeness, and representativeness of the data will be addressed in this document, and method detection limits will be reported.

[Figure 9-1. Example of a control chart: measured values plotted against time, showing the certified mean (x), warning limits at x +/- 2s (95% confidence), and action limits at x +/- 3s.]

Section 10  Revision 0  Date 4/90  DRAFT 1

SECTION 10
REFERENCES

Bisson, J. W., and V. J. Cabelli. 1980.
Clostridium perfringens as a water pollution indicator. J. Water Poll. Control Fed. 52:241-248.

Degraeve, G. M., N. G. Reichenbach, J. D. Cooney, P. I. Feder, and D. I. Mount. 1988. New developments in estimating endpoints for chronic toxicity tests. Abstract, Am. Soc. Test. Mater. 12th Symp. Aquat. Toxicol. Hazard Assess., Sparks, Nev.

Federal Register. 1984. Rules and Regulations. Vol. 49, No. 209, Friday, October 26, 1984. pp. 198-199.

Graves, R. L., J. Lazorchak, R. Valente, D. McMullen, and K. Winks. In prep. Laboratory Methods Manual for the EMAP-NC Demonstration Project.

Hamilton, M. A., R. C. Russo, and R. V. Thurston. 1977. Trimmed Spearman-Karber method for estimating median lethal concentrations in toxicity bioassays. Environ. Sci. Technol. 11:714-719; Correction 12:417 (1978).

Holland, A. F., S. Weisberg, K. J. Scott, S. Schimmel, R. Valente, J. Rosen, and K. Summers. In prep. Environmental Monitoring and Assessment Program - Near Coastal Program Plan for 1990. Environmental Research Laboratory, Office of Research and Development, U.S. Environmental Protection Agency, Narragansett, RI.

Hunt, D. T. E., and A. L. Wilson. 1986. The Chemical Analysis of Water: General Principles and Techniques. 2nd ed. Royal Society of Chemistry, London, England. 683 pp.

Kirchner, C. J. 1983. Quality control in water analysis. Environ. Sci. Technol. 17(4):174A-181A.

Krahn, M. M., C. A. Wigren, R. W. Pearce, L. K. Moore, R. G. Bogar, W. D. MacLeod, S. L. Chan, and D. W. Brown. 1988. Standard Analytical Procedures of the NOAA National Analytical Facility, 1988: New HPLC Cleanup and Revised Extraction Procedures for Organic Contaminants. NOAA Technical Memo. NMFS F/NWC-153. U.S. Dept. of Commerce, NOAA National Marine Fisheries Service, Seattle, Washington.

MacLeod, W. D., D. W. Brown, A. J. Friedman, D. G. Burrows, O. Maynes, R. W. Pearce, C. A. Wigren, and R. G. Bogar. 1985.
Standard Analytical Procedures of the NOAA National Analytical Facility, 1985-1986: Extractable Toxic Organic Compounds, Second Edition. NOAA Technical Memorandum NMFS F/NWC-92. U.S. Department of Commerce, National Oceanic and Atmospheric Administration, National Marine Fisheries Service, Seattle, Washington.

Morrison, G., E. Torello, R. Comeleo, R. Walsh, A. Kuhn, R. Burgess, M. Tagliabue, and W. Greene. 1989. Intralaboratory precision of saltwater short-term chronic toxicity tests. Res. J. Wat. Poll. Cont. Fed. 61:1707-1710.

Peck, D. V., J. L. Engels, K. M. Howe, and J. E. Pollard. 1988. Aquatic Effects Research Program, Episodic Response Project Integrated Quality Assurance Plan. EPA/600/X-88/274. U.S. Environmental Protection Agency, Environmental Monitoring Systems Laboratory, Las Vegas, Nevada.

Plumb, R. H., Jr. 1981. Procedures for Handling and Chemical Analysis of Sediment and Water Samples. Technical Report EPA/CE-81-1. U.S. Environmental Protection Agency/U.S. Army Corps of Engineers Technical Committee on Criteria for Dredged and Fill Material, U.S. Army Waterways Experiment Station, Vicksburg, MS. 471 pp.

Pollard, J. E., and S. M. Melancon. 1984. Field washing efficiency in removal of macroinvertebrates from aquatic vegetation mats. J. Freshwater Ecol. 2(4):383-392.

Rosen, J. S., H. Buffum, J. Beaulieu, and M. Hughes. In prep. Information Management Plan for the EMAP-NC Demonstration Project.

Stanley, T. W., and S. S. Verner. 1983. Interim Guidelines and Specifications for Preparing Quality Assurance Project Plans. EPA/600/4-83/004. U.S. Environmental Protection Agency, Washington, D.C.

Stanley, T. W., and S. S. Verner. 1985. The U.S. Environmental Protection Agency's quality assurance program. pp. 12-19 In: J. K. Taylor and T. W. Stanley (eds.), Quality Assurance for Environmental Measurements, ASTM STP 867. American Society for Testing and Materials, Philadelphia, Pennsylvania.

Strobel, C., S. C.
Schimmel, and R. Valente. In prep. EMAP-NC Training and Field Operations Manual.

Taylor, J. K. 1987. Quality Assurance of Chemical Measurements. Lewis Publishers, Inc., Chelsea, Michigan. 328 pp.

TetraTech, Inc. 1986a. Recommended Protocols for Measuring Metals in Puget Sound Water, Sediment, and Tissue Samples. Final Report TC-3090-04. Bellevue, Washington.

TetraTech, Inc. 1986b. Recommended Protocols for Measuring Organic Compounds in Puget Sound Sediment and Tissue Samples. Final Report TC-3991-04. Bellevue, Washington.

U.S. Environmental Protection Agency. 1979a. Methods for Chemical Analysis of Water and Wastes. EPA-600/4-79/020, revised March 1983. U.S. Environmental Protection Agency, Environmental Monitoring and Support Laboratory, Cincinnati, Ohio.

U.S. Environmental Protection Agency. 1979b. Handbook for Analytical Quality Control in Water and Wastewater Laboratories. EPA/600/4-79/019. U.S. Environmental Protection Agency, Environmental Monitoring and Support Laboratory, Cincinnati, Ohio.

U.S. Environmental Protection Agency. 1985. Standard Operating Procedures for Conducting Surplus and Sample Bank Audits. EPA/600/4-85/003. U.S. Environmental Protection Agency, Environmental Monitoring Systems Laboratory, Las Vegas, Nevada. 71 pp.

U.S. Environmental Protection Agency. 1989. Environmental Monitoring and Assessment Program: Conceptual Overview and Issues. Office of Research and Development, Washington, D.C. (Draft Report).