United States Environmental Protection Agency
Office of Water (4503F)
EPA 841-B-95-004
July 1995

EPA GENERIC QUALITY ASSURANCE PROJECT PLAN GUIDANCE FOR PROGRAMS USING COMMUNITY LEVEL BIOLOGICAL ASSESSMENT IN WADABLE STREAMS AND RIVERS

ACKNOWLEDGMENT

This document was developed by the U.S. Environmental Protection Agency through contract no. 68-C3-0303 with Tetra Tech, Inc. The project managers were Martin W. Brossman and Chris K. Faulkner, U.S. EPA Office of Wetlands, Oceans, and Watersheds. Principal authors include Dr. James B. Stribling and Ms. Christiana Gerardi, Tetra Tech, Inc., Owings Mills, Maryland. The document was improved with substantial input from the individuals named in the List of Reviewers.

FOREWORD

To help ensure that environmental monitoring data are of known quality, the U.S. Environmental Protection Agency (USEPA) has established specific requirements for the development of Quality Assurance Project Plans (QAPPs). These QAPPs are required for environmental monitoring tasks accomplished within USEPA, by its contractors, and by its grantees. Since 1980, the standard guidance for developing QAPPs has been the Quality Assurance Management Division's (QAMD) 005/80, "Interim Guidelines and Specifications for Preparing Quality Assurance Project Plans". This guidance has now been replaced by EPA QA/R-5, "EPA Requirements for Quality Assurance Project Plans for Environmental Data Operations," Draft Interim Final, August 1994. The new QAPP guidance provides considerable versatility in the preparation of QAPPs for particular data needs.
Among the new materials and approaches introduced by EPA QA/R-5 are inclusion of the Data Quality Objectives process in the QAPP; an expansion of the elements to be addressed in the QAPP; and an approach that permits "tailoring" the comprehensiveness of the QAPP to the nature of the work being performed and the particular use of the data.

For many years the major water monitoring efforts of USEPA have focused on chemical/physical measurements. Accordingly, guidance documents, such as those for developing QAPPs, have tended to use terminology and examples relevant to these monitoring measurements. The recent expansion of biological monitoring has brought new terminology and approaches that do not fit "comfortably" within the past chemical/physical descriptions. Within the USEPA Office of Water, it has become apparent that some means must be found to ensure effective control of data quality for these measurements. Accordingly, it was decided that a generic QAPP for biological measurements, following the structure of the QAPP that had evolved from chemical/physical measurements, would be of considerable value; hence, this document.

This guidance is based upon EPA QA/R-5. However, wherever appropriate, biological terminology and examples are given to facilitate use in the discipline of biological monitoring. In addition, "element" descriptions have been expanded to facilitate use by biologists and others who may not be familiar with the terminology and approaches typical of chemical/physical monitoring and laboratory analysis.

Development of this guidance has involved extensive input, reviews, and recommendations from a wide community of biologists expert in various areas of biological monitoring and analysis. USEPA Quality Assurance Officers well versed in the use of QAPPs for more typical chemical/physical measurement and analysis have also reviewed this document.
The Quality Assurance Management Division of USEPA, responsible for the USEPA QA program and its guidance documents, has provided assistance in this adaptation of EPA QA/R-5 to biological monitoring. As in the case of all new guidance, however, considerable insight for improvement will be gained from its use. Hence, users of this document are urged to send comments on utility and suggestions for improvement or expansion to USEPA (4503F), Assessment and Watershed Protection Division, Monitoring Branch, Washington, D.C. 20460, Attention: Biological Monitoring Coordinator. As experience is gained and use expands, revised editions of the document will be considered.

CONTENTS

Acknowledgment
Foreword
Contents
List of Tables
List of Figures
List of Reviewers
Introduction
1. Title and Approval Sheet
2. Contents
3. Distribution List
4. Project/Task Organization
5. Problem Definition/Background; Project Description
6. Quality Objectives for Measurement Data
7. Project Narrative
8. Special Training Requirements/Certification
9. Documentation and Records
10. Sampling Process Design (Experimental Design)/Sampling Methods Requirements
11. Sample Handling and Custody Requirements
12. Analytical Methods Requirements
13. Quality Control Requirements
14. Instrument/Equipment Testing, Inspection, and Maintenance Requirements
15. Instrument Calibration and Frequency
16. Inspection/Acceptance Requirements for Supplies and Consumables
17. Data Acquisition Requirements (Non-direct Measurements)
18. Data Management
19. Assessments and Response Actions
20. Reports to Management
21. Data Review, Validation, and Verification Requirements
22. Validation and Verification Methods
23. Reconciliation with Data Quality Objectives
Literature Cited
Appendix A  Abbreviated QAPP Form
Appendix B  QAPP Glossary of Terms

LIST OF TABLES

4-1  A list of key positions or areas of responsibility often included in a project organization framework.
6-1  Summary of some measurement (indicator) selection criteria.
6-2  Example summary table of some hypothetical data quality requirements.
10-1  Rule-of-thumb for number of replicate QC samples based on numbers of sites.
12-1  Comparison of reference and voucher collections.
14-1  Example of equipment and supply list for benthic macroinvertebrate sampling.
14-2  Example of equipment list for fish sampling in wadable streams.

LIST OF FIGURES

1  Generalized flow diagram for the preparation, approval, and implementation process of QAPPs.
1-1  Example of title page format for QAPPs.
1-2  Example of a document control header.
4-1  Organizational chart illustrating project organization and lines of communication.
6-1  The seven step DQO process.
10-1  Example of sample label information.
10-2  Alternative examples of sample identification numbering.
11-1  Chain-of-custody record.
12-1  Macroinvertebrate laboratory bench sheet.

LIST OF REVIEWERS

The following individuals have provided useful reviews and comments on previous drafts of this document.

Dr. Loren Bahls, Dept. of Health and Environ. Sciences, Ecosystems Mgmt. Section, Cogswell Building Room A-206, Helena, MT 59620
Ms. Celeste Barr, USEPA Region 1, 60 West View St., Lexington, MA 02173
Mr. Dan Boward, MD Department of the Environment, 2500 Broening Hwy., Baltimore, MD 21224
Mr. Bruce Duncan, USEPA Region 10, 1200 Sixth Ave., Seattle, WA 98101
Mr. Mark Gordon, Specialist-Watershed Management, Ministry of the Environment, Water Resources Branch, 135 St. Clair Avenue West, Toronto, Ontario, Canada M4V 1P5
Mr. Donald Hart, Senior Environmental Biologist, Beak Consultants Limited, 14 Abacus Road, Brampton, Ontario, Canada L6T 5B7
Ms. Gretchen Hayslip, USEPA Region 10, Mail Stop ES097, 1200 Sixth Avenue, Seattle, WA 98101
Mr. Terry Hollister, USEPA Region 6, Houston Branch, 10625 Fallstone Rd., Houston, TX 77099
Dr. Robert M. Hughes, Man-Tech Environmental, 200 S.W. 35th Street, Corvallis, OR 97333
Mr. Phil Johnson, USEPA Region 8, 999 18th Street, Suite 500, Denver, CO 80202-2466
Mr. Roy R. Jones, USEPA Region 10, 1200 Sixth Avenue, Seattle, WA 98101
Dr. Donald J. Klemm, USEPA EMSL, Bioassess. and Biotox. Branch, 3411 Church St., Cincinnati, OH 45268
Dr. James M. Lazorchak, USEPA EMSL, Bioassess. and Biotox. Branch, 3411 Church St., Cincinnati, OH 45268
Mr. Ed Liu, USEPA Region 9, 75 Hawthorne St., San Francisco, CA 94105
Ms. Suzanne Lussier, USEPA/ORD, 27 Tarzwell Dr., Narragansett, RI 02882
Dr. Eugenia McNaughton, USEPA Region 9, 75 Hawthorne St., San Francisco, CA 94105
Mr. Mike Papp, USEPA GLNPO, 230 South Dearborn Street, Chicago, IL 60605
Mr. David Peck, Lockheed Engineering and Sciences, Environ. Assessment Dept., 1050 E. Flamingo Road, Las Vegas, NV 89119
Dr. Donna Reed, USEPA/OWM, 401 M St., SW #4203, Washington, DC 20460
Mr. George Schupp, c/o Ms. Denise Boone, USEPA Region 5, 77 West Jackson Blvd., Chicago, IL 60604-3590
Ms. Janice Smithson, Office of Water Resources, Division of Environmental Protection, 1201 Greenbrier St., Charleston, WV 25311-1088
Dr. Jerry Smrchek, USEPA OPPT HERD, 401 M St., SW #7403, Washington, DC 20460
Mr. Michael Tucker, USEPA Region 7, 25 Funston Rd., Kansas City, KS 66115

INTRODUCTION

Quality Assurance (QA) - an integrated system of activities involving quality planning, quality control, quality assessment, quality reporting, and quality improvement to ensure that a product or service meets defined standards of quality with a stated level of confidence.

Quality Control (QC) - the overall system of technical activities whose purpose is to measure and control the quality of a procedure or service so that it meets the needs of users.
The aim is to provide quality data that are satisfactory, adequate, dependable, and economical. One example of a quality control element for biological sampling is taking replicate samples to ensure consistency among and within sampling crews.

Quality Assurance Project Plan (QAPP) - a formal document describing the management policies, objectives, principles, organizational authority, responsibilities, accountability, and implementation plan of an agency, organization, or laboratory for ensuring quality in its products and utility to its users. A QAPP is a technical planning document that defines the objectives of a project or continuing operation, as well as the methods, organization, analyses, and QA and QC activities necessary to meet the goals of that project or operation.

The EPA requires that all monitoring and measurement projects carried out by or supported by USEPA have written and approved Quality Assurance Project Plans (QAPPs). This document represents generic guidance for the development of QAPPs for specific bioassessment projects or programs. This generic QAPP is based upon "EPA Requirements for Quality Assurance Project Plans for Environmental Data Operations," EPA QA/R-5 (USEPA 1994, Draft Interim Final)[1]. The expanded descriptions and application guidance have benefited from use of the Office of Water Quality Management Plan and previous Office of Water QAPP guidance, OWRS QA-1, "Combined Work/QA Project Plans for Environmental Monitoring" (USEPA 1984). A variety of sources have provided materials assisting in the development of "biological" examples in the QAPP. These include the work of the Environmental Monitoring Systems Laboratory (Cincinnati, Ohio) to develop QA guidance for the establishment of biological assessment programs; technical QA literature (Smith et al. 1988); and selected bioassessment documents (Karr et al. 1986; Ohio EPA 1987; Plafkin et al. 1989).

[1] However, a slight modification in format has been made.
The "elements" of this QAPP guidance are numbered sequentially instead of being broken down into sections A, B, C, and D. The items covered have the same titles as in QA/R-5.

This guidance does not promote one bioassessment procedure over another; it does provide a QA framework to which different bioassessment programs may be adapted. It is designed to allow flexibility with regard to all components in developing a bioassessment program. It has been specifically designed for use by states using bioassessment protocols that focus on community-level responses as indicated by a multimetric approach and taxonomy to the genus/species level. Sampling gear should be appropriate for the habitat and region being sampled and may include active or passive collection devices such as square meter kicknets, dipnets, square foot Surber samplers, Ponar grabs, Hester-Dendy artificial substrate samplers, and basket samplers for macroinvertebrates; electrofishers, seines, and fyke nets for fish; and knives for scraping, eyedroppers, and containers for dislodging epiphytic algae from macroalgae for algae collections. As is customary for biological programs, pilot studies (or an initial year of data) are recommended to investigate sources of error, variability, and representativeness of the monitoring program.

Who is responsible for having QAPPs?

USEPA QA policy (Order 5360.1) stipulates that specific monitoring projects or continuing operations undertaken with all or partial USEPA funding be covered by a QAPP. A continuing operation is one in which the procedures are not modified significantly from year to year. For this type of environmental program, a single QAPP that describes these routine activities would be prepared. The QAPP serves as the blueprint for implementing the data collection activity and ensures that the technical and quality goals of the operation are met.
It also provides the necessary link between the required data quality constraints and the sampling and analysis activities to be conducted. Programs that have ongoing, repetitive, or small-scale sampling events that follow specific Standard Operating Procedures (SOPs) should develop a QAPP for the overall program; this alleviates the need for specific QA plans for each sampling event. The QAPP is then cited in the workplan.

State programs developing QAPPs should query other state agencies (e.g., Department of Fish and Wildlife, Department of Health, Department of the Environment, Department of Natural Resources) to determine if a base QAPP currently exists for their type of project. Agencies can draw from this base plan by outlining the rationale for any changes made in adapting it to their project; or, if it is suitable for a program, the base QAPP can be cited as the program QAPP.

Community - a group of interacting assemblages in a given geographic location. Consists of all living components: fish, amphibians, benthic macroinvertebrates, algae, macrophytes, microbes, etc.

Assemblage - a group of interacting populations of organisms in a given geographic location (for example, a fish assemblage or a benthic macroinvertebrate assemblage).

If no base plan exists in the state for community-level, organism-based biological assessments, the program should use this guidance as the template for developing its QAPP. In the case of single biological assessment events (i.e., a non-routine assessment or special study), abbreviated QAPPs can be developed (Appendix A). Such a plan will not need to include extensive language, rationale, or justification for all elements and can take the format of an outline. If individual elements of the QAPP guidance are not related to any aspect of the project, this should be noted in those sections as "not applicable". This short form also provides an overview of the QAPP.

What is the process for implementing the QAPP?
For internal USEPA projects, the QAPP is reviewed and approved by the Quality Assurance Officer (QAO). QAPPs are distributed to the personnel performing the assigned work and implemented as written unless modified as described below. The process of preparing and implementing QAPPs is shown in Figure 1. Review and control mechanisms are established for each project in the QAPP and will vary in complexity and scope depending on the particular project. Large-scale, national projects will form QA task groups to provide the lead in preparing Data Quality Objectives (DQOs) (Section 6) and QAPPs. These same groups will review the QA data on an ongoing basis, conduct audits, and recommend remedial action. If changes to work in progress are needed, the QAPP should be revised, reviewed, and approved by the Project Officer and the QAO, and then distributed to the personnel performing the work. For continuing operations, the QAPP is reviewed annually and revised whenever significant changes are made in procedures or organizational responsibilities. A QAPP must be approved by the QAO prior to the initiation of data collection activities.

How is this guidance document organized?

Sections 1 and 2 of this document give examples of an appropriate QAPP title page and table of contents. In addition, all QAPPs must be prepared using a document control header placed in the upper corner opposite the binding of each document page. At a minimum, the header should include the information indicated in Section 1.3. Possible techniques for presentation of project organization and lines of responsibility are outlined in Section 4. Section 5 provides suggestions for producing a project description that illustrates the background and rationale of the project.
[FIGURE 1 Generalized flow diagram for the preparation, approval, and implementation process of QAPPs. The diagram traces the path from the EPA Office of Water Quality Management Plan (QMP), which establishes policy on the applicability of QAPPs, program-level DQOs and policies for QAPP preparation, and procedures and schedules for oversight/auditing of QAPPs; through preparation of a draft QAPP by project staff (developing project-specific DQOs, with a process for modifying onsite decisions, designed in accordance with the QMP and in compliance with EPA QA/R-5); to receipt of the approved and signed QAPP by the QAO, implementation by project staff, audit or other QAPP implementation review, and integration of results into the project report, including implications of meeting or deviating from the QAPP. Key: QMP = Quality Management Plan; QAO = Quality Assurance Officer; QAPP = Quality Assurance Project Plan.]

Production of DQOs for the project is discussed in Section 6. Calculation and information presentation procedures for data quality requirements (precision, completeness, representativeness, comparability) are provided in Sections 6, 19, and 23. Procedural and QA guidance for biomonitoring field and laboratory activities are presented in Sections 10 through 12. Section 13 outlines specific QC activities. Section 19 describes required activities for rectifying project or procedural problems and reducing error sources. Section 20 presents guidance for reporting the endpoints of individual QA procedures, or sets thereof, within formal QA reports.
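Among the data quality requirements named above, precision for replicated biological samples is commonly summarized as the relative percent difference (RPD) between a primary and a replicate sample. The sketch below is illustrative only: the station names and taxa-richness values are hypothetical, and this guidance does not prescribe RPD as the required statistic.

```python
# Illustrative sketch of replicate-sample precision as relative percent
# difference (RPD). Values are hypothetical, not from the EPA guidance.

def rpd(primary: float, replicate: float) -> float:
    """Relative percent difference: |x1 - x2| / mean(x1, x2) * 100."""
    mean = (primary + replicate) / 2.0
    if mean == 0:
        return 0.0  # identical zero results: no difference to report
    return abs(primary - replicate) / mean * 100.0

# Hypothetical taxa-richness counts from a primary sample and a field
# replicate collected at the same station by the same crew.
pairs = {"Station A": (24, 26), "Station B": (18, 13)}

for station, (x1, x2) in pairs.items():
    print(f"{station}: RPD = {rpd(x1, x2):.1f}%")
```

A program would compare each RPD against the precision objective stated in its DQOs (Section 6) and trigger the response actions of Section 19 when the objective is exceeded.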
SECTION 1
TITLE AND APPROVAL SHEET

1.1 Each QAPP should include a title page noting the title of the plan and the name of the organization(s) implementing the project, as well as names and titles for:

Organization's Project Manager
Organization's Quality Assurance Manager
USEPA Project Manager (Required)
USEPA Quality Assurance Manager (Required)
Others, as needed (e.g., State, other Federal agency)

1.2 If the project is to be conducted by personnel from more than one institution, appropriate individuals from each institution should sign the title page. Figure 1-1 presents an example of the title page.

1.3 All QAPPs must be prepared using a document control header placed in the upper corner opposite the binding of each document page (Figure 1-2). The following information must be included in the header:

a) Section Number, which identifies the section or chapter.
b) Revision Number, which identifies the most recent revision.
c) Date, which is the date of the most recent revision.
d) Page __ of __, which identifies the specific page and the total number of pages in the section.

Section No. ___
Revision No. ___
Date ___
Page ___ of ___

FIGURE 1-2 Example of a document control header.

Section 1
Revision No. 5
February 16, 1995
Page 1 of 1

Quality Assurance Project Plan for (Project Name)

Prepared by: (Name) (Address) (Phone Number)
Prepared for: (Name) (Address) (Phone Number)
(Date)

Approvals:
Project Manager, Title/Date, Agency
Primary QA Manager, Title/Date, Agency
USEPA Project Manager, Title/Date, Agency
USEPA QA Officer, Title/Date, Agency

FIGURE 1-1 Example of title page format for QAPPs.

SECTION 2
CONTENTS

2.1 List the sections, figures, tables, references, and appendices included in the document. Corresponding page numbers should be provided for sections/chapters and the literature cited section.

2.2 In some cases, particularly where abbreviated form QAPPs are produced, the contents section is optional.
SECTION 3
DISTRIBUTION LIST

3.1 A list of the individuals, and their organizations, who will receive copies of the approved QAPP should be included; subsequent revisions should be compiled and included in the QAPP. All managers who are responsible for implementing any portion of the plan, as well as the QA managers and representatives of all groups involved, should be included.

SECTION 4
PROJECT/TASK ORGANIZATION

4.1 The organizational aspects of a project provide a framework for conducting tasks within the project. The organizational structure and function can also facilitate project performance and adherence to QC procedures and QA requirements. Key individuals, including the QAO, responsible for ensuring the collection of valid data and the routine assessment of the data analysis for precision and accuracy must be included in the project organization description. Also identify the data users and the person(s) responsible for approving and accepting final products and deliverables. An example of a project organizational diagram is presented in Figure 4-1. The relationships and lines of communication among all project participants and data users need to be included in the organizational chart. Where direct contact between managers and data users does not occur, such as between a project consultant for a Potentially Responsible Party and the USEPA risk assessment staff, the chart should illustrate the route by which information is exchanged. The chart should be realistic and practical, and should reflect only actual lines of authority and communication for the project.

4.2 Effective QA/QC procedures and a clear delineation of QA/QC responsibilities are essential to ensure the utility of environmental monitoring results. All aspects of the project (field operations, laboratory activities, and data handling and analysis) must be addressed for the organization process to be complete.
In order for a monitoring or assessment study to proceed smoothly and yield valid and usable data, it is essential that all individuals are clearly informed of and understand their responsibilities. Key positions and general duties often included in the project organization and responsibility section of the QAPP are listed in Table 4-1. It is recognized that some agencies have small staffs; therefore, WITH THE EXCEPTION OF THE PRIMARY QA OFFICER ROLE, TWO OR MORE OF THE DUTIES LISTED IN TABLE 4-1 MAY BE THE RESPONSIBILITY OF THE SAME INDIVIDUAL. These individuals must be identified by title, level of expertise, and a brief description outlining their responsibilities.

[FIGURE 4-1 Organizational chart illustrating project organization and lines of communication. The chart links an Advisory Panel and the Project Manager/Principal Investigator (supported by the QA Officer) to four ecological project activity classes: Sampling Design (Sampling Design Coordinator, statistician, senior personnel, user contacts, design QC); Field Activities (Field Leader, field QC); Laboratory Activities (Laboratory Manager/Leader; biota, water, and habitat samples; taxonomy, sample processing, sample handling; laboratory QC); and Data Analysis/Reporting (Data Processing Leader, data QC, data entry, data presentation, data interpretation; Document Production Coordinator, technical editor, reporting QC).]

TABLE 4-1 Key positions or areas of responsibility often included in a project organization framework (a sole staff member is NOT required for each of these positions; an individual may be called upon to perform one, two, or several of these sets of responsibilities).

Advisory Panel (if necessary) -- The Advisory Panel holds intermittent meetings to review the overall program in order to confirm or refute whether the objectives are being met. The group may make suggestions for changing specific procedures or the overall organization in the event that the program design fails to meet the stated goals.

Project Manager/Principal Investigator -- The Project Manager supervises the assigned project personnel (scientists, technicians, and support staff), directing their efforts either directly or indirectly to provide for their efficient utilization on projects. Other specific responsibilities include: coordinating project assignments, establishing priorities, and scheduling; ensuring the completion of high-quality projects within established budgets and time schedules; providing guidance, technical advice, and professional development to those assigned to projects by evaluating performance and implementing corrective actions; preparing and/or reviewing preparation of project deliverables; and interacting with clients, technical reviewers, and agencies to assure that technical quality requirements are met in accordance with contract or grant specifications.

Project QA Officer -- The QA Officer reports to the Project Manager and is independent of the field, laboratory, data, and reporting staff. Major responsibilities include monitoring QC activities to determine conformance, distributing quality-related information, training personnel on QC requirements and procedures, reviewing QA/QC plans for completeness and noting inconsistencies, and signing off on the QA plan and reports.

Sampling Design Coordinator -- The Sampling Design Coordinator is responsible for completion of the sampling design, coordinating input from the statistician and senior contributing personnel with the needs of the users or contacts relevant to the sample design.

Sampling Design QC Officer -- The Sampling Design QC Officer is responsible for performing QC evaluations to ensure that quality control is maintained throughout the sampling design process.

Field/Sampling Leader(s) -- The Field or Sampling Leader(s) is responsible for on-schedule completion of assigned field work with strict adherence to SOPs and complete documentation. The Field Leader(s) will supervise all field activities, including implementation of the QA/QC program.

Sampling QC Officer -- The Sampling or Field Operations QC Officer is responsible for performing QC evaluations to ensure that quality control is maintained throughout the entire field sampling procedure.

Laboratory Manager/Leader -- The Laboratory Manager is responsible for on-schedule completion of assigned laboratory analyses with strict adherence to laboratory SOPs. The Lab Manager will supervise all lab activities, including implementation of the QA/QC program.

Laboratory QC Officer -- The Laboratory QC Officer is responsible for performing QC evaluations to ensure that quality control is maintained throughout the entire sample processing procedures that occur within the laboratory.

Data Processing Leader -- The Data Processing Leader is responsible for on-schedule completion of assigned data processing work and complete documentation. The Data Processing Leader will supervise all data processing activities, including implementation of the QA/QC program.

Data QC Officer -- The Data Processing QC Officer is responsible for performing QC evaluations to ensure that quality control is maintained throughout the data analysis process.

Document Production Coordinator -- The Document Production Coordinator is responsible for on-schedule completion of assigned writing, editing, and data interpretation work. The Document Production Coordinator will direct all reporting activities, including in-house and outside review, editing, printing, copying, and distribution or journal submission.

Reporting QC Officer -- The Reporting QC Officer is responsible for performing QC evaluations to ensure that quality control is maintained throughout the entire reporting and document production process.

SECTION 5
PROBLEM DEFINITION/BACKGROUND; PROJECT DESCRIPTION

5.1 The specific problem to be solved or decision to be made is stated in the problem definition. Sufficient background information should be included to provide a historical perspective articulating the regulatory or alleged toxic exposure situation that led to the need for the project. The discussion of past history and the problem situation should include any previous work or data, as well as any regulatory or legal elements, that will allow a technically trained reader to understand the project objectives.

5.2 The purpose of the project description is to define the specific objectives of the bioassessment project and describe how the project will be designed to obtain the information needed to accomplish the project goals and uses. As this section supplies information needed by the intended users of the data, the project description should include a general overview, study/monitoring design features and rationale (methods for selecting sampling station locations, sampling period, etc.), and a project timetable.
5.2.1 The general overview contained in the project description should include, at a minimum, the following information:

- statement of the problem, decision, or specific questions to be resolved;
- description of the study site, facility, process, or operating activities to be evaluated;
- applicable technical, regulatory, or program-specific quality standards, criteria, or objectives;
- requirements for any special personnel or equipment;
- the assessment tools needed for the project (i.e., program technical reviews, peer reviews, and technical audits as needed and/or specified by the Quality Assurance Plan (QAP));
- anticipated uses of the data to answer questions and make decisions, and the consequences of Type I or Type II errors based on these results;
- historical conditions and existing datasets.

5.2.2 Evaluation of Historical Datasets

Previous projects that provide information on habitat, biota, or methods should be evaluated. Such evaluation can give invaluable guidance in study design, including selection of sampling gear and study sites. Use of comparable design, gear, site locations, and index period can considerably strengthen the temporal component of an ecological study. Aspects of historical datasets to be evaluated include (but are not limited to):

- sample types (assemblage, physical habitat assessment, accompanying water/sediment chemistry);
- dates of sample collection;
- location of sampling sites;
- type of sampling gear;
- intensity of laboratory processing (taxonomy, sorting).

5.3 The study/monitoring design features should be described and should include the following:

A list of all measurements or variables to be taken (e.g., assemblage metrics, physical and biological habitat parameters, or water chemistry variables), including a designation of which measurements are critical versus non-critical to the accomplishment of project goals.
It should be noted that many biological metrics and parameters will undergo revision and fine-tuning after evaluation of pilot studies and/or re-evaluation of their effectiveness for the program; these revisions can be appended to the overall program QAPP until the next annual review and revision.

A statement of how measurements will be evaluated; e.g., by comparison to reference data, literature, models, internal statistical properties, or other historical information. If statistics are used to analyze the data, the rationale used to select the statistic should be stated.

Explicit delineation of the ecosystems to which decisions will be applied, and a summary table listing the following for each sampling station:

- types of samples (benthos, fish, periphyton, plankton, physical and biological habitat assessment, or water quality);
- numbers of samples of each type (designating primary and quality control samples);
- sampling gear.

5.4 A project timetable is included with beginning and ending dates for the general project and for specific activities within the project. Any constraints, such as seasonal variations in biota or stream flow, sampling logistics, or site access, should be identified in the timetable. The timetable needs to be detailed yet flexible enough to account for unanticipated problems such as bad weather; guidance should be included for handling such problems.

SECTION 6
QUALITY OBJECTIVES FOR MEASUREMENT DATA

6.1 The QAPP must include a statement of the project's Data Quality Objectives (DQOs). DQOs are qualitative and quantitative statements, developed by data users at the programmatic, project, and measurement levels, to specify the quality of data needed to support specific decisions. Data users include representatives from the public and private sectors, stakeholders, and managers. The logic process of DQO development draws on several of the components of the project description outlined previously in Section 5.
DQOs encompass all aspects of data collection, analysis, validation, and evaluation. The DQO process provides a means of ensuring the required confidence level in the data needed by decisionmakers. Evaluation of candidate measurement parameters (indicators) relative to stated selection criteria (Table 6-1) ensures linkage of the decision process to DQOs. The process involves establishing the allowable uncertainty of a data set which may lead to Type I or Type II errors: false positives (a problem is found to exist when in fact it does not) and false negatives (a problem is not found when in fact it does exist). The acceptable probabilities of those errors as established by decisionmakers are the DQOs. The DQO process entails establishing action-triggering values and selecting rates of false positives and false negatives that are acceptable to the decisionmaker. The quality of a particular data set is a measure of the types and amount of error associated with the data. Data quality is described by qualitative and quantitative parameters, including precision, accuracy, representativeness, completeness, comparability, and sensitivity, all of which are included in the QAPP. The seven steps of the DQO process are shown in Figure 6-1.

TABLE 6-1 Summary of some measurement (indicator) selection criteria.

CRITERIA/QUALITY: DEFINITION(S)

Scientific Validity (Technical Considerations)

Measurable/Quantitative: Feature of environment measurable over time; has defined numerical scale and can be quantified simply.

Sensitivity: Responds to broad range of conditions or perturbations within an appropriate timeframe and geographic scale; sensitive to potential impacts being evaluated.

Resolution/Discriminatory Power: Ability to discriminate meaningful differences in environmental condition with a high degree of resolution (high signal:noise ratio).

Integrates Effects/Exposure: Integrates effects or exposure over time and space.
Validity/Accuracy: Parameter is a true measure of some environmental condition within constraints of existing science. Related or linked unambiguously to an endpoint in an assessment process.

Reproducible: Reproducible within defined and acceptable limits for data collection over time and space. Sampling produces minimal environmental impact.

Representative: Changes in the parameter/species indicate trends in the other parameters they are selected to represent.

Scope/Applicability: Responds to environmental changes on a geographic and temporal scale appropriate to the goal or issue.

Reference Value: Has a reference condition or benchmark against which to measure progress.

Data Comparability: Can be compared to existing datasets/past conditions.

Anticipatory: Provides an early warning of changes.

Practical Considerations

Cost/Cost Effective: Information is available or can be obtained with reasonable cost/effort. High information return per cost.

Level of Difficulty: Ability to obtain expertise to monitor. Ability to find, identify, and interpret chemical parameters, biological species, or habitat parameters. Easily detected. Generally-accepted method available.

Programmatic Considerations

Relevance: Relevant to desired goal, issue, or agency mission (e.g., fish fillets for consumption advisories; species of recreational or commercial value).

Program Coverage: Program uses a suite of indicators that encompass major components of the ecosystem over the range of environmental conditions that can be expected.

Understandable: Indicator is or can be transformed into a format that the target audience can understand (e.g., nontechnical for the public).

FIGURE 6-1 The seven-step DQO process.

6.2 The scope of the project should be described (e.g., the geographic locale, environmental medium, time period, etc.), as well as any constraints such as time (index period) or resources.
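The false-positive and false-negative framing of Section 6.1 can be sketched numerically. The following Python fragment is illustrative only and is not part of the guidance; the function name and the score values are hypothetical. It estimates both decision error rates for a candidate action-triggering value, given bioassessment scores from sites whose condition is already known:

```python
def decision_error_rates(unimpaired_scores, impaired_scores, action_value):
    """Estimate Type I and Type II error rates for a candidate
    action-triggering value. A score below the action value
    triggers a finding of impairment."""
    # false positive: an unimpaired site is wrongly flagged as impaired
    fp = sum(s < action_value for s in unimpaired_scores) / len(unimpaired_scores)
    # false negative: a known-impaired site is not flagged
    fn = sum(s >= action_value for s in impaired_scores) / len(impaired_scores)
    return fp, fn

# hypothetical scores from unimpaired reference sites and known-impaired sites
fp, fn = decision_error_rates([52, 55, 60, 42], [30, 35, 48, 20], action_value=40)
```

In this sketch, the decisionmaker would adjust the action value until both estimated rates fall within the acceptable probabilities established as DQOs.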
6.3 Sources of error or uncertainty associated with variables and indicators should be evaluated in pilot studies and include:

measurement error: the difference between sample values and in situ true values
analytical error: error associated with the measurement process
sampling error: a function of natural spatial and temporal variability and sampling design (non-measurement error); sources such as collection, handling, storage, and preservation.

QA activities and QA documentation procedures described in this guide are intended to assist in reducing the magnitude of these error sources and their frequency of occurrence.

6.4 The uncertainty component of a multimetric index can be represented by the range of values from aggregated reference sites. The values are either individual metric values or total bioassessment scores calculated from X number of reference sites within a stream class. Reference sites are selected based on the concept of minimal anthropogenic disturbance, their being typical for the ecoregion, subecoregion, and waterbody type, or their potential for consideration as a natural landscape (Hughes 1995; Hughes et al. 1994). Repeat sampling within the same index period from year to year will address natural interannual variability, which can be useful for evaluating uncertainty. Expression of uncertainty is difficult when there are insufficient statistically-valid measurements. Costanza et al. (1992) have developed a data-quality grading system intended to allow statements of uncertainty on data ranging from quantitative measurements to informed guesses. The approach is a notational system that attaches a five-part description of data quality to datasets, including numeric, unit, spread, assessment, and pedigree; it is introduced in this document to help fill a gap in the ability to report confidence in environmental measurements.
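The reference-range idea of Section 6.4 can be sketched as a simple summary of score spread. This Python fragment is a minimal illustration, not part of the guidance; the function name and the ten reference-site scores are hypothetical:

```python
import statistics

def reference_range(scores):
    """Summarize the spread of bioassessment scores from aggregated
    reference sites within a stream class; the range and quartiles
    are one expression of index uncertainty."""
    q1, median, q3 = statistics.quantiles(scores, n=4, method="inclusive")
    return {"min": min(scores), "25th": q1, "median": median,
            "75th": q3, "max": max(scores)}

# hypothetical total bioassessment scores from ten reference sites
summary = reference_range([38, 41, 35, 44, 40, 37, 42, 39, 43, 36])
```

Repeating this summary for each year's index-period data would show how much of the spread reflects natural interannual variability.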
Measurement Quality Objectives

6.5 For each major measurement or value, the QA objectives for precision, representativeness, comparability, accuracy, and completeness (measurement quality objectives or MQOs) should be presented. These features are defined in Smith et al. (1988) and USEPA (1989).

6.6 Precision is a measure of mutual agreement among individual measurements or enumerated values of the same property of a sample, usually under demonstrated similar conditions. Assuring consistency of sampling and sample processing, and striving for repeatability of measurements (Platts et al. 1983), will increase the precision of data. For example, take replicate samples at adjacent sites where different assessment results are not expected (due to the apparent absence of additional stressors) and measure the precision of the procedure. Precision, representativeness, and comparability can also be compared using raw data, metric index values, and, possibly, final bioassessment scores. Appropriate methodology and adequate training and instruction of personnel in methods application are the most certain ways to ensure consistency, repeatability, and precision.

6.7 If precision is to be calculated from two replicate samples, use Relative Percent Difference (RPD), calculated as

    RPD = (C1 - C2) x 100 / [(C1 + C2) / 2]

where C1 = the larger of the two values and C2 = the smaller of the two values. If precision is to be calculated from three or more replicate samples, use Relative Standard Deviation (RSD), calculated as

    RSD = (s / x̄) x 100

where s = standard deviation and x̄ = mean of replicate samples. The standard deviation of a set of replicate measurements (s) is calculated as

    s = sqrt[ Σ(xi - x̄)² / (n - 1) ]

where xi = measured value of the ith replicate, x̄ = mean of replicate sample measurements, and n = number of replicates.
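The precision statistics of Section 6.7 can be expressed in a few lines of code. This Python sketch is illustrative only (the function names and the replicate counts are hypothetical, not from the guidance):

```python
import statistics

def rpd(c1, c2):
    """Relative Percent Difference for two replicate values (Section 6.7)."""
    larger, smaller = max(c1, c2), min(c1, c2)
    return (larger - smaller) * 100 / ((larger + smaller) / 2)

def rsd(replicates):
    """Relative Standard Deviation for three or more replicates (Section 6.7)."""
    s = statistics.stdev(replicates)  # sample standard deviation, n - 1 denominator
    return (s / statistics.mean(replicates)) * 100

# e.g., two replicate benthos counts of 120 and 80 individuals
difference = rpd(120, 80)        # (40 x 100) / 100 = 40.0
# e.g., three replicate counts of 90, 100, and 110 individuals
variation = rsd([90, 100, 110])  # stdev 10, mean 100 -> 10.0
```

Note that `statistics.stdev` uses the n - 1 denominator, matching the standard deviation formula given above.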
Precision can also be expressed in terms of the range of measurement values.

6.8 Data representativeness is the degree to which data accurately and precisely represent a characteristic of a population or community, natural variability at a sampling point, or an environmental condition. Representativeness of a sample depends largely on randomized sampling of the target assemblage (Green 1979; Smith et al. 1988; Freedman et al. 1991) and therefore is highly dependent on the sampling program design. Generally, the sampling program should be designed to ensure representative sample collection of the habitat or population being sampled and adequate sample replication. The original Rapid Bioassessment Protocols (RBPs) (Plafkin et al. 1989) were developed primarily for higher-gradient streams with a predominance of riffles, which are considered to be the most biologically-productive habitat in such streams. However, in streams that are in a lower-gradient topography (as in many coastal plains and deltaic zones), there is often a lack of riffles. The Mid-Atlantic Coastal Plains Streams Workgroup has developed a multihabitat sampling procedure for that area (MACS 1993, draft). For example, for collection of benthic macroinvertebrates in low-gradient (primarily non-riffle) streams using the 20-jab dipnet collection method, the sampling would focus on representative sampling reach characteristics. That is, if the suitable sampling habitat within the sampling reach consisted of 70 percent snags, 20 percent banks/shorezone vegetation, and 10 percent submerged macrophytes, the collection effort would comprise 14 snags, four banks, and two submerged macrophytes. Representativeness is, in part, addressed by the description of the sampling techniques and the rationale used to select the sampling locations. Sampling techniques should be verified and validated in separate studies (Section 10).
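The proportional allocation of collection effort described above can be sketched as a small calculation. This Python fragment is an illustration only; the function name is hypothetical, and the habitat percentages are the worked example from the text:

```python
def allocate_jabs(habitat_percent, total_jabs=20):
    """Apportion dipnet jabs among available habitat types in
    proportion to their occurrence in the sampling reach."""
    jabs = {h: round(total_jabs * pct / 100) for h, pct in habitat_percent.items()}
    # rounding can leave the total off by one; adjust the dominant habitat
    shortfall = total_jabs - sum(jabs.values())
    if shortfall:
        dominant = max(habitat_percent, key=habitat_percent.get)
        jabs[dominant] += shortfall
    return jabs

# the reach composition given in the text: 70% snags, 20% banks, 10% macrophytes
effort = allocate_jabs({"snags": 70, "banks/shorezone": 20, "macrophytes": 10})
# -> 14 snag jabs, 4 bank jabs, 2 macrophyte jabs
```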
6.9 Comparability is a measure of the confidence with which one data set can be compared to another. It is often described in non-quantitative terms, but must be considered in designing the sampling plans, index period, critical habitat characteristics, topographic, geological, and hydrogeologic information, analytical methodology, quality control, and data reporting. The use of standardized sampling techniques and USEPA-approved analytical methods enhances the comparability of parameters being measured with data similarly generated from other sources. Reporting of data in units used by other organizations improves comparability. For biological assessments, comparability of data would need to be determined by classifications such as ecoregion (or smaller geographic unit), index period, and sampling gear. For example, samples collected within the same ecoregion using the same gear but collected during different seasons (index periods) may not be comparable.

6.10 Completeness is defined as the percentage of measurements made that are judged to be valid according to specific validation criteria and entered into the data management system. To achieve this objective, every effort is made to avoid sample and/or data loss through accidents or inadvertence. Accidents during sample transport or lab activities that cause the loss of the original sample will result in irreparable loss of data. Collection of sufficient samples allows reanalysis in the event of an accident involving a sample. The assignment of a set of continuous (serial) laboratory numbers to a batch of samples which have undergone chain-of-custody inspection makes it more difficult for the technician or taxonomist to overlook samples when preparing them for processing and identifications. The laboratory serial numbers also make it easy during the data compilation stage to recognize if some samples have not been analyzed.
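The completeness percentage defined above is a simple ratio, and checking it against a project MQO can be done directly. This Python sketch is illustrative only; the function name, sample counts, and the 95% target are hypothetical:

```python
def percent_complete(valid, total):
    """Percent completeness: the share of measurements judged valid."""
    return (valid / total) * 100

# e.g., 3 of 60 benthos samples lost in transport, the rest judged valid
completeness = percent_complete(valid=57, total=60)  # 95.0
# compare against a hypothetical 95% completeness MQO
meets_mqo = completeness >= 95.0
```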
6.11 Percent completeness (% C) for all measurements can be defined as follows (USEPA 1989):

    % C = (v / T) x 100

where v = the number of measurements judged valid and T = the total number of measurements.

6.12 Table 6-2 provides hypothetical examples of MQOs for precision and completeness. For example, when comparing two samples to determine precision, a relative percent difference of 50% of the number of individuals (benthos) may be an acceptable difference depending on the objectives and MQOs stated for the project. Data quality requirements should be based on prior knowledge of the sampling procedure or measurement system by use of replicate analyses, reference conditions (site-specific or ecoregional), or requirements of the specific project (USEPA 1989).

TABLE 6-2 Example summary table of some hypothetical data quality requirements.

MEASUREMENT PARAMETER | REFERENCE METHODS | PRECISION (e.g., RPD) | COMPLETENESS (%)
Benthos: No. individuals | Plafkin et al. 1989 | 50 | 95
Benthos: No. taxa | | 15 | 95
Fish: No. individuals | Karr et al. 1986 | 25 | 95
Fish: No. species | | 15 | 95

DQOs that cannot be expressed in terms of precision, accuracy, or comparability should be reported by describing the specified method that will satisfy all regulatory requirements; all other QAPP requirements would still need to be fulfilled.

SECTION 7
PROJECT NARRATIVE

7.1 Discuss in narrative form the following issues as they pertain to the project:

anticipated use(s) of the data
how the success and/or failure of the project will be determined
survey design requirements and description
sample type and sampling location requirements
sample handling and custody requirements
selection of analytical methods
calibration and replicate samples
SOPs for field sampling activities
plans for peer reviews prior to data collection, and any ongoing assessments during actual operations (oversight).

The narrative should allow technical or QA readers to relate the project to the DQOs and to the problem definition.
Since this element addresses many other QAPP elements in narrative form, it is not necessary to repeat information for those categories that are covered in more detail elsewhere. For example, SOPs for field sampling are discussed in detail in Section 10 of a QAPP and therefore would not have to be repeated in great detail in this section.

SECTION 8
SPECIAL TRAINING REQUIREMENTS/CERTIFICATION

8.1 Specialized training or certification requirements needed for personnel to successfully complete the project should be identified and described. Describe how the training will be provided and how the required skills will be assured and documented.

8.2 STAFF TRAINING

8.2.1 All personnel conducting assessments must be trained in a consistent manner, preferably by the same person(s), to maximize the likelihood of standardization and properly conducted assessments. If possible, sampling crews should be trained by the most experienced individual; this individual would be considered "certified" to train and may verify others to train upon sufficient demonstration of knowledge and consistency of techniques. The certified trainer is considered part of the QA program and should conduct revisits of sampling sites to determine bias. Crews could revisit some of their own assessment sites to determine within-crew precision. Precision of multiple sampling teams can be determined after adequate evaluation of among-crew precision.

8.2.2 The designated QAO is responsible for confirming consistency among investigators. At regularly scheduled intervals, a different field group should visit selected overlap sites and perform assessment techniques to use as a replicate of a previous assessment. Results from two separate assessments conducted by two different teams can determine if reproducible results are being attained. This is an ongoing process that should, over time, allow for consistent investigations of all sampling teams.
Quality control of picking, sorting, and taxonomic identifications can also be evaluated in this way.

8.2.3 Training is not only for the inexperienced, but also is used to maintain consistency among all crews. Training should be conducted at regularly scheduled intervals and should cover all aspects of a program. This can be accomplished through workshops, seminars, or field demonstrations. Management should periodically assess the training needs of all personnel engaged in fieldwork and recommend and support their participation in appropriate and relevant seminars, training courses, and professional meetings. Biologists and technicians should be expected to participate regularly in evaluation and/or certification programs where appropriate. Participation in these programs should be documented in the current resumes which are on file for each person responsible for the sampling, analysis, evaluation, and reporting of biological data.

SECTION 9
DOCUMENTATION AND RECORDS

9.1 State the types of information and records that must be included in a report package for the project or task, and specify the reporting format if applicable. Documentation can include raw data or QC checks. Specify required laboratory turnaround time and whether a field sampling and/or laboratory analysis "case narrative" is required to provide a complete description of any difficulties encountered during sampling or analysis.

9.2 Specify any requirements for the final disposition of records and documents from the project, including location and length of retention period. This section can also identify the length of time that a voucher collection is to be maintained for a particular project. Since for most institutions space is at a premium, long-term or indefinite maintenance may be accomplished by storing voucher collections at universities and museums.

9.3 Be specific regarding the preparation, maintenance, and location (address) of voucher collection(s), and the primary person responsible for it (them).
Also identify the maximum time period for which voucher materials will be maintained.

SECTION 10
SAMPLING PROCESS DESIGN (EXPERIMENTAL DESIGN)/SAMPLING METHODS REQUIREMENTS

10.1 SAMPLING PROCESS DESIGN

The experimental design and anticipated project activities, including the types of samples required, sampling frequencies, measurement parameters, and the rationale for the design, should all be outlined. Specific techniques or guidelines for selecting sampling sites or use of sampling equipment should be described. All specified measurements should be classified as critical (required to achieve project objectives) or non-critical (informational purposes only). Complete documentation and validation of the sampling and analytical methodologies must be included. If non-standard methods or unusual gear will be used, the rationale and appropriate methods validation information need to be included. In the event that validation studies were not previously performed, they must be conducted during the project study, and the results included as part of the project results. Methods citations that allow for various options to be selected should identify the exact option chosen for the project.

10.2 SAMPLING METHODS REQUIREMENTS

Collecting representative samples is crucial to subsequent decision making. Even good results obtained from non-representative samples are of little value, because such results could lead to incorrect decisions. Specific implementation requirements for the selected method and gear should be outlined, and any procedures for onsite modification of sampling methods or corrective actions for disruptions in the sampling methodology should be included.
For example, if a sampling method usually produced one liter of sample, but due to habitat and seasonal effects during a sampling event the same sampling method produced 10 liters of sample (due to large amounts of detritus), the adjusted sampling methodology or subsampling measures needed for sample processing should be described. For all sampling events, the sampling crew needs to have access to the Field/Sampling Leader or the Principal Investigator to discuss any onsite adjustments that may not be outlined in the SOPs or QAPP. The sample containers used and preservation methods appropriate for the selected assemblage should also be described. For example, the collection of periphyton may follow Procedure 6.2.2 in the Field Procedures Manual of the Montana Water Quality Bureau (DHES 1989), using a pocket knife for scraping, and a large-bore eyedropper for lifting microalgae. Specific requirements for sample collection include the scraping of the entire surface of several rocks selected at random; lifting microalgae from mud or silt substrates with an eyedropper; hand picking macroalgae in proportion to their abundance; and removing epiphytic algae from macroalgae by shaking in a separate container (Bahls 1992). The goal of this SOP is to obtain a single composite sample that is a miniature replica of the algae present at the site. The algae are then placed in a watertight, non-breakable sample container (125-ml capacity) and preserved with enough ambient water to cover the sample and iodine potassium iodide (Lugol's solution) to a reddish-brown tint. Any onsite modifications should be made by the crew leader or with the (phone) approval of the Principal Investigator, with documentation of the rationale used for the modification along with the appropriate methods validation information. The sample should be kept dark and cold until processing (refrigeration is not needed for transport).
Samples stored at room temperature in daylight should have the Lugol's solution replenished every few weeks. Any further specific requirements for assemblage collections would also be included in this section.

10.3 STANDARD OPERATING PROCEDURES (SOPs)

10.3.1 SOPs are written procedures that detail the precise methods to be used during each step of the sample collection, handling, transportation, and holding-time process. When existing written procedures are applicable and available, they may be incorporated into the QAPP by reference. The SOPs should provide an outline of the steps to be taken to assure the quality of the samples and sample data. A complete set of SOPs should be approved by the Project Manager and bound together in a looseleaf notebook that is easily accessible to personnel for referral (ASTM 1991). Any deviations from the SOPs should be documented with the reason for the deviation and any possible effect the deviation might have on the resulting data. Modifications made to SOPs due to addition of new information or correction of errors need approval by the Project Manager. SOPs should contain document headers (as described in Section 1) in order to track the latest revisions; old versions should be discarded.

10.3.2 QAPPs should provide for the review of all activities which could directly or indirectly influence sample quality and the determination of those operations which must be covered by SOPs. The SOPs should describe the following:

method summary/rationale
selection of target assemblage(s)
sampling methodology, including decontamination procedures
physical and biological habitat assessment methodology
equipment/materials
reagents
details of preservation, holding times, and transport
use and calibration of instruments
replication and QC requirements
safety
sampling site selection (including reference sites)
sample labeling
sample subsampling
data reduction formulas.
10.4 GUIDELINES USED TO SELECT SAMPLING SITES

10.4.1 Proper selection of sampling sites should be directed toward maximizing accuracy, minimizing uncertainty or, at least, providing a means by which variability may be reduced. This is related to the concept of comparing community-level data (between reference and test sites) only where similar habitat exists. Ideally, site selection criteria would be concise enough that if two researchers were to follow them, each would choose similar locations. The criteria should minimize the amount of subjectivity that enters into the site selection process. For example, the two primary criteria used in selecting reference sites for stream bioassessment are minimal impairment and representativeness (Gibson 1994). The conditions at reference sites should represent the best range of minimally-impaired conditions that can be achieved by similar streams within a particular ecological region (Hughes et al. 1995) (Section 6.4).

10.4.2 A probabilistic site selection process is one whereby sampling sites are selected at random to ensure representativeness. Random selection provides a statistically-valid estimate of the condition of a waterbody class or other habitat class (e.g., lakes, large rivers, streams). These gross-level classes can be further stratified into finer divisions based on geographic or other ecological, physical, and chemical factors, which are used to group sites that share them. Probabilistic site selection is the foundation of the regional monitoring approach of the USEPA Environmental Monitoring and Assessment Program (EMAP). For point source assessments, the sampling site selection should include stations upstream and downstream of the source, as well as at least three regional reference stations. In all instances, the selection of sampling sites should be conducted in such a way as to reduce variability and uncertainty by ensuring that the physical characteristics between sampling sites are similar.
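The probabilistic site selection described above can be sketched as a random draw within each class or stratum. This Python fragment is a minimal illustration, not an EMAP procedure; the function name, class names, and candidate reach labels are all hypothetical:

```python
import random

def select_sites(candidates_by_class, n_per_class, seed=None):
    """Draw sampling sites at random within each waterbody or stream
    class, a minimal sketch of probabilistic (stratified random)
    site selection."""
    rng = random.Random(seed)  # a fixed seed makes the draw reproducible/auditable
    return {wb_class: sorted(rng.sample(sites, n_per_class))
            for wb_class, sites in candidates_by_class.items()}

# hypothetical candidate reaches grouped by stream class
chosen = select_sites({"riffle-dominated": ["R1", "R2", "R3", "R4", "R5"],
                       "low-gradient": ["L1", "L2", "L3", "L4"]},
                      n_per_class=2, seed=7)
```

Recording the seed alongside the QAPP would let an auditor reproduce exactly which sites were drawn.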
If surveys are conducted to determine use designations, sampling locations should be representative of the stream reach. Reference conditions should include minimally impaired sites in the same ecoregion, size class, and stream type (width, depth, gradient). The objectives of the study will determine the selection of specific sampling habitat and the gear best suited for the physical habitat sampled within a sampling reach.

10.5 HABITAT CHARACTERIZATION

10.5.1 An evaluation of physical and biological habitat quality is critical to any assessment of ecological integrity (Karr et al. 1986; Barbour and Stribling 1991; Plafkin et al. 1989; Kaufmann 1993). Habitat assessment supports understanding of the relationship between physical habitat quality and biological conditions and should consist of parameters appropriate for the assemblage being sampled, since habitat features important for one assemblage may be different than those for another. Such assessment identifies obvious constraints on the attainable potential of the site, assists in the selection of appropriate sampling stations, and provides basic information for interpreting biosurvey results. Different habitats or habitat types often become strata in the design.

10.5.2 QA methods for physical and biological habitat assessment should be documented. Standardized field data sheets should be employed, as should multiple observers at some percentage of the study sites, which should range from physically impaired to unimpaired.

10.5.3 Each investigator involved in the physical and biological habitat characterization must be appropriately trained to minimize variability in the final conclusions.

10.6 SAMPLING PERIOD

10.6.1 Sample timing should be consistent to reduce variability within or among datasets. A critical consideration for data variability is the most appropriate period for sample collection, which can vary with the target assemblage; both season and time of day should be considered.
Pilot studies should be conducted to determine an index period with the least sampling variability. The sampling period, like the sampling area, defines the domain of study and should be documented. By considering the following issues, some QA concerns can be addressed:

Seasonal Influence - Time of year should be considered to determine its influence on the objectives of the project. For example, food availability, flow, and temperature are important seasonal factors that influence condition of the biota. This determination can be accomplished through literature searches on similar ecosystem monitoring studies as well as through reconnaissance and pilot studies.

Community Succession and Life Stages - Familiarity with life cycles may be critical in monitoring some community assemblages.

Habitat/Substrate Disturbance - For example, elapsed time since the last storm event.

As general guidance, sampling periods should be selected in order to:

maximize the efficiency of the chosen sampling gear
maximize the accessibility of the targeted assemblage or species
minimize natural variability
maximize the availability of technical personnel.

Sampling periods should also be selected so that extremes in climatic conditions are avoided (e.g., water temperature and flow [drought, rainy season, snowmelt]) unless the object of the study is to investigate the limiting effects of seasonal variations on the biota, or system complexity and recovery following storm events (which could be hazardous to the sampling crew). Sampling conducted during periods of stress should be well documented, since natural stressors (i.e., high and low flow, temperature, etc.) can mask or accentuate impacts.

10.7 DOCUMENTATION

10.7.1 The field data sheets should be filled out completely and accurately to provide a record in support of the survey and analysis conclusions.
Abbreviations commonly used in documentation (e.g., scientific names) should be standardized and defined to decrease data manipulation errors. Portable data recorders (PDRs) may be used to increase the completeness and accuracy of field data and to reduce computer entry time.

10.7.2 Each sample collected should also be documented by assigning a unique identification number, log number, and internal and external labels. An example of the sample labelling information can be found in Figure 10-1; sample numbering examples are presented in Figure 10-2. Data should be documented to allow complete reconstruction from initial field record through data storage system retrieval.

Agency/Client
Project No.
Sample No.
Location/Waterbody
Station
Assemblage
Habitat
Method (Gear)
Preservative
Collected by
QC Sample
Date
Time
Log #

FIGURE 10-1 Example of sample label information.

1) A13 / 91 / 006 corresponds to:
   Alex Branch, Station 13 / Year / Serial log number of project

2) 6874-01 / A13 / 006 corresponds to:
   Project number / Alex Branch, Station 13 / Serial log number of project

FIGURE 10-2 Alternative examples of sample identification numbering. Example 1 relates the sample to the year; example 2 relates the sample to a contract or project number.

10.7.3 The Field/Sampling Leader and Laboratory Leader personnel should keep complete and permanent records of all conditions and activities that apply to each individually-numbered sample. All field and laboratory data sheets should be dated and signed by the individual doing the sampling and the analyst, respectively; taxonomic reference documents should be approved and noted. Notebooks, data sheets, and all other records that may be needed to document the integrity of the data should be archived at study completion and kept permanently filed in a safe and fireproof location. Records and voucher samples should be maintained at least 3 years or other specified time as stated in the workplan and/or contract.
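The two alternative numbering formats of Figure 10-2 can be generated programmatically, which helps keep serial log numbers zero-padded and consistent. This Python sketch is illustrative (the function names are hypothetical, and spacing around the slashes is normalized):

```python
def sample_id_by_year(station, year, serial):
    """Format 1 of Figure 10-2: station / two-digit year / serial log number."""
    return f"{station} / {year % 100:02d} / {serial:03d}"

def sample_id_by_project(project_no, station, serial):
    """Format 2 of Figure 10-2: project number / station / serial log number."""
    return f"{project_no} / {station} / {serial:03d}"

# the two worked examples from Figure 10-2
id1 = sample_id_by_year("A13", 1991, 6)          # "A13 / 91 / 006"
id2 = sample_id_by_project("6874-01", "A13", 6)  # "6874-01 / A13 / 006"
```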
Archived records are the responsibility of the study sponsor, unless the responsibility is delegated.

10.8 REPLICATION

10.8.1 Field sampling validation involves two procedures: 1) collection of replicate samples at a randomly selected 10 percent of sampling stations to document the precision of the collection effort; and 2) for visual-based qualitative habitat structure assessments, two or more observers should independently complete field sheets for at least 10 percent of stations. The precision of this procedure can then be evaluated by relative percent difference (RPD) (Section 6.7). The 10 percent figure should be viewed as rule-of-thumb guidance for replication. For large projects, 10 percent replication would probably be too many; similarly, for small projects, 10 percent would likely not be enough. General recommendations for different levels of replication are presented in Table 10-1.

TABLE 10-1 Rule-of-thumb for number of replicate QC samples based on numbers of sites.

No. of sampling sites in project (n) | No. of replicates
n > 150 | 15
30 <= n <= 150 | 10%
n < 30 | 3
n = 2 | 3
n = 1 | 3

10.9 METHOD AND GEAR SELECTION

10.9.1 Selection of sampling gear should be appropriate for the target assemblage, habitat, and analytical methods employed and depends on the DQOs of the decisionmakers, the expertise of the biologist, and the unresolvable components of variation (USEPA 1990a). Where appropriate, methods should include decontamination SOPs. For example, when using a net for collection of small organisms, the nets must be thoroughly cleaned between samples so that organisms captured at one station do not get carried over and included in the next station. In cases where it is suspected that an organism has been carried over from one station to the next, the field notes should be consulted to determine if the (outlier) organism could indeed be left over from a previous sample.
If it becomes obvious that an organism was left on the gear from a previous sample, a statement should be included on the data sheet that the carry-over organism will not be included in the metric calculations. This situation is rare, but when it occurs, it will most often involve gear used in sampling small organisms.

10.9.2 The type of gear used in the sampling process depends on the assemblage and the specific habitat sampled within a site. For instance, macroinvertebrate sampling gear may include kicknets, Hess samplers, Surber samplers, or other stream-net samplers (USEPA 1990a) used in riffles; dipnets are used in low-gradient streams in shorezones, banks, snags, and submerged vegetation. Electrofishers, seines, and fyke nets may be used for fish. Knives used for scraping algae from rocks, spoons or large-bore eyedroppers for lifting microalgae from silt substrates, hand picking of macroalgae, and dislodging epiphytic algae from macroalgae by shaking in a container are some gear types or methods used in algal collections.

10.10 LEVEL OF EFFORT (LOE)

10.10.1 The level of effort required for the completion of a specific task should be outlined before the task is undertaken. For example, for field sampling, the appropriate number of people for unbiased and consistent operation of field gear should be available (e.g., two people for collections with a net or hand collections, three to four people for electrofishing), and the appropriate amount of time per site needs to be allowed for consistent and proper application of methodologies. The amount of time necessary for effective gear operation will vary per site with the target plant or animal assemblages and the gear used. For planning purposes, Plafkin et al. (1989) provided approximate estimates of the time needed for sampling protocols using the square meter kicknet to sample benthic macroinvertebrates in riffles.
Approximately 1.5 person-hours are required per site (two people, approximately 40 minutes each). The 40 minutes of actual sampling time includes writing labels, preparing sample containers, taking a double composite square meter sample, field preservation, and collection of supplemental Coarse Particulate Organic Material (CPOM) samples. An additional 15 to 20 minutes is required to complete the habitat assessment forms. More time is added to the total station LOE for additional observers completing duplicate habitat assessments. Additionally, time must be allocated for traveling to the site.

SECTION 11

SAMPLE HANDLING AND CUSTODY REQUIREMENTS

11.1 SAMPLE HANDLING

11.1.1 Sample handling requirements vary with the assemblage being studied in the survey. For most biological assessments, the minimum sample size needed to fulfill the data quality objective for representativeness should be incorporated into the sampling design so that sampling produces minimal environmental impact. For those samples that will be analyzed in the laboratory, the organisms are sacrificed, and field preservation, labelling, and transport protocols must be followed. For many fish surveys, experienced fish biologists are proficient in field identifications, and thus most specimens are returned to the water following identification and enumeration. Exceptions are juveniles, hybrids, and difficult-to-identify species. Also, if temporary crews are used, samples of all collections should be verified. Voucher specimens are appropriate for all species and should be stored in fish museums or universities whenever possible. For a review of fish methods, readers should refer to USEPA (1993a), Meador et al. (1993), or Ohio EPA (1989). Most of the following information applies to those types of samples that are returned to laboratories for processing and identification (benthic invertebrates, phytoplankton, and periphyton).
11.1.2 All activities categorized under sample handling should be documented (as SOPs) and followed closely to prevent or minimize the introduction of error. Consistency should be the rule in all of the following activities:

field preservation
labelling
storing or transportation.

11.1.3 The following information associated with each sample should be documented:

exact location and ambient conditions associated with sample collection, maintained in field notebooks, field collection sheets, or PDRs;
possession and analysis logs, maintained in the laboratory;
chain-of-custody forms, sample preservation (if any), and dates and times of sample transfer and analysis;
procedures for transferring and maintaining custody of samples.

11.1.4 Certain sampling protocols (e.g., the Tier 2 protocols of Plafkin et al. [1989] and most fish sampling) involve sorting, identification, and enumeration of specimens in the field. When benthic macroinvertebrate and fish samples are field-identified, the field data sheets become the item of custody, as do any preserved specimens used for taxonomic verification. All header information should be completely filled out and copies of all sheets distributed. As specimens are laboratory-identified, they can be preserved, archived as vouchers, and placed in a repository. The location of the repository and a record of the specimen preservation are entered into a log book.

11.2 SAMPLE CUSTODY

11.2.1 The primary objective of the chain-of-custody procedure is to create a written record that can be used to trace the possession of the sample from the moment of collection through the entire data analysis. Field crews as well as laboratory personnel should follow written chain-of-custody procedures for collecting, transferring, storing, analyzing, and disposing of samples. Sample custody procedures are important to ensure the integrity of the samples, whether for legal or other purposes.
Explicit procedures must be followed to maintain the documentation. An example chain-of-custody form is presented in Figure 11-1; separate forms should be filled out for each sample if the samples are likely to become separated. Notations should be entered in the logbook regarding the condition of the samples. All sample labels, as well as the chain-of-custody form, should contain the following information at a minimum:

ID or log number (can be the same as the sample no.)
Location - state, county, approximate distance from nearest town, name of waterbody being sampled
Date - date of sample collection
Time - time of sample collection
Sampled By - initials of personnel collecting the sample
Type of Sample - e.g., benthos, periphyton
Preservative - e.g., formalin, 95 percent ethanol, Lugol's solution
Station - numbers or letters to designate station location
Sampling Gear - e.g., kicknet, seine, eyedropper.

CHAIN-OF-CUSTODY

Agency Name:                        Project Manager or Contact:
Address:                            Address/Phone:
Project Number:    Project Name:    Sample Location:    Page ___ of ___

No. samples   No. jars   Sample Type   Gear/Method   Log Number   Date   Time   Sample Station

Sampled by: (signature)        Date/Time:
Relinquished by: (signature)   Date/Time:
Received by: (signature)       Date/Time:
Received by: (signature)       Date/Time:
Received by: (signature)       Date/Time:
Received by: (signature)       Date/Time:

FIGURE 11-1 Chain-of-custody record.

11.2.2 Samples from which courtbound data are to be derived are kept in sample storage areas of the laboratory where access is limited to laboratory personnel and controlled by locked doors; treating all samples as if they were courtbound decreases the likelihood of mishandling actual courtbound samples. The samples are routinely retained at the laboratory, as required by the project, after the data have been forwarded to the appropriate person(s) so that any analytical problems can be addressed.
The samples are discarded at the end of a specified time period (see Table 12-1); 1 to 5 years may be appropriate, depending on project requirements. Long-term preservation methods for biological samples can be found in Klemm et al. (1990). A sample evidence file should be maintained which includes copies or originals of laboratory bench sheets, field notes, chain-of-custody forms, logbooks, sample location and project information, and the final report. The location of the evidence file and the agency responsible for it should be named in the project plan.

11.2.3 Specific tasks/conditions for sample storage may include the following:

Samples will be stored in a secure area. The secure area will be designed to comply with the storage method(s) defined in the contract (e.g., fireproof, ventilated).

The samples will be removed from the shipping container and stored in their original containers unless damaged.

"Damaged and unusable samples" (for example, a sample container that broke and part or all of the sample was not recoverable) will be disposed of in an appropriate manner, and the disposal will be documented.

"Damaged and usable samples" (for example, a sample container that broke in such a way that all organisms could be salvaged) will be documented and transferred to a new container, if possible and necessary.

The field leader and the Project Manager will be notified immediately of any damaged or disposed samples.

The storage area will be kept secure at all times. The sample custodian will control access to the storage area. Duplicate keys for locked storage areas should be maintained only by the appropriate personnel.

Whenever samples are removed from storage, the removal will be documented; all transfers of samples can be documented on internal chain-of-custody records.

Samples, reference specimens, and voucher specimens (Section 12.7) will be stored after completion of analysis in accordance with the contract or until instructed otherwise by the Project Manager.
The location of stored reference or voucher specimens will be recorded.

Reference or voucher specimens will not be stored with samples.

The sample storage area will be described.

SECTION 12

ANALYTICAL METHODS REQUIREMENTS

12.1 Methods of sample and data analyses should be well documented. These methods must be appropriate for all parameters and should be USEPA-approved or otherwise validated/published standard methods.

12.2 For USEPA-approved or standard methods, pertinent literature should be referenced. Pertinent literature would include appropriate validation data for the methods to be used.

12.3 For non-standard, state-developed, or modified methods (Caton 1991), detailed SOPs should be provided which include methods for all sample preparation, picking and sorting, and identification procedures.

12.4 BIOLOGICAL SAMPLE LABORATORY PROCESSING

12.4.1 Biological sample laboratory processing generally falls into two broad divisions. The initial or primary sample processing may include sorting, subsampling, and re-sorting checks. Secondary or final phase processing may include taxonomic identification and verification procedures, tabulation, enumeration, and measurements. The secondary phase might also include calculation of metrics or indices. An example of a macroinvertebrate data sheet is presented in Figure 12-1.

12.5 PRIMARY PHASE SAMPLE PROCESSING

12.5.1 Subsampling - In biomonitoring programs where resource limitations restrict expendable sampling and analytical effort, subsampling is recommended as a cost-effective and valid procedure for (a) selecting a representative estimate of the total sample collected and (b) standardizing the level of effort expended on each sample. Subsampling methods vary according to the assemblage. For example, methods may include procedures for cleaning diatom strewn mounts and establishing counting transects on the coverslip (Bahls 1992).
Caton (1991) has developed a gridded screen technique for increased objectivity in field or laboratory subsampling of benthic macroinvertebrates. As subsampling methods are developed, every attempt should be made to reduce bias. SOPs should, therefore, be developed to standardize the unit of effort and to eliminate subsampler subjectivity and errors in sorting and picking. Subsampling error should be quantified, depending on the type and volume of the subsample.

12.5.2 Sorting - Sorting of macroinvertebrates includes rough segregation of individuals within a sample or subsample by some predetermined taxonomic grouping into pre-labelled containers. Sorters should be trained so that they can distinguish the organisms from the surrounding debris; QC checks on new sorters should be frequent until it is clear that the sorter knows what he or she is looking for. Subsequent sorting results in containers of finer taxonomic groupings. For algae samples, this step may include estimates of relative abundance of non-diatom (soft-bodied) algal taxa determined by cells per field of view of a composite wet mount (Bahls 1992).

12.5.3 Sorting Checks (postprimary sorting) - A portion of sample residues must be re-checked by intralaboratory QC personnel for missed specimens (under-recovery). Re-sorting checks can be used to measure repeatability. A portion of sample residues may also be re-checked by separate laboratories for interlaboratory QC.

12.6 SECONDARY PHASE SAMPLE PROCESSING

12.6.1 Taxonomic Identifications, Verification Procedures - Training, experience, and possession of proper laboratory equipment and taxonomic literature are crucial factors affecting the quality of identification activities. Abbreviations commonly used in documentation (e.g., for scientific names) should be standardized and defined in the data package to decrease data manipulation errors.
A general guide is that specimens should be identified to the lowest possible taxonomic level using the most current literature available. Some parameters or other analytical techniques, however, may only require identification to the ordinal, familial, or generic level (Plafkin et al. 1989; Ohio EPA 1987). An argument against such an approach is that sensitivity and tolerance information is more accurate at the species level of identification. All questionable taxonomic identifications should have a post-determined level of uncertainty identified. For instance, define a scale of uncertainty (e.g., 1-5, where 1 is most certain and 5 is least certain) for each identification and specify reasons for any uncertain identification (e.g., missing gills, headless specimen). Define the criteria for assigning tolerance values to uncertain identifications. For example, if the generic level of identification is questionable, determine an average tolerance value for the family level. For those taxa that are in good condition and easily identified by the taxonomist, the rating can either be noted as 1 (certain) or left blank.

Benthic Macroinvertebrates Laboratory Bench Sheet

Header fields: Site/Project #; Location; Sampling Station; Type of Sample (Gear); Subsample: Total / 100 / 200 / 300 / Other; Taxonomist; Date Sampled; Sorter.

Enter family and/or genus and species name on the blank line; a = adult, i = immature. For each taxon, record number of organisms, a/i, and TCR. Taxa listed include: Diptera (Chironomidae, Other), Heteroptera, Coleoptera, Neuroptera and Megaloptera, Trichoptera, Crustacea, Oligochaeta, Plecoptera, Ephemeroptera, Hirudinea, Bivalvia, Gastropoda, Odonata, and Other.

Taxonomic certainty rating (TCR) 1-5: 1 = most certain, 5 = least certain. If the rating is 3-5, give a reason (e.g., missing gills). Totals: Total No. Organisms; Total No. Taxa.

FIGURE 12-1 Macroinvertebrate laboratory bench sheet.

12.6.2 Verification should be done in one of two ways. Comparison with a pre-established reference or research specimen collection can yield rapid and
accurate results. A reference collection is defined as a set of biological specimens, each representing some taxonomic level, and not necessarily limited to specific projects or activities. Reference collections should have expert confirmation of each taxon. Reference collections are used for verifying identifications of subsequent samples. One potential problem with this approach is the possible previous misidentification of the reference specimens. The approach most likely entailing the least uncertainty is to send samples for confirmation to taxonomic experts familiar with the group in question (Borror et al. 1989). Detailed documentation of independent taxonomic verification by recognized experts should be provided, along with addresses and telephone numbers. Potential problems can be reduced by establishing a set of contacts among recognized experts in various groups of organisms. The taxonomist should always be contacted by telephone or correspondence prior to sending specimens. Just as important is the receipt of advice on proper methods for preserving, packing, and shipping samples. Damaged specimens are often useless and impossible to identify; thus, careful preservation and packing are essential.

12.7 VOUCHER COLLECTION

12.7.1 The true data of a project are the actual specimens collected in a survey for that project. Following identification and enumeration, these specimens should be maintained in a voucher collection. Voucher collections can be maintained for any specified length of time for the project. For instance, if space is a critical issue, the voucher collection can be disposed of after the data have been reviewed and the report finalized.

12.7.2 Voucher collections may sometimes serve as reference collections, but usually not vice versa.
This is primarily because reference collections are arranged/curated based on taxonomic and/or phylogenetic order and are not usually associated with particular projects or specific waterbodies (although that information will be included with the label data). If there are ever questions regarding the accuracy of taxonomic identifications that have been used in parameter calculation and reporting, referral to the voucher collection will be an initial step taken in resolution. Also, a complete list of taxonomic references used should be compiled for each project, such as is found in USEPA (1990a). A comparison of various attributes of reference and voucher collections is presented in Table 12-1.

TABLE 12-1 Comparison of reference and voucher collections.

CONSIDERATION: Usual curatorial arrangement
  REFERENCE: Taxonomic and/or phylogenetic.
  VOUCHER: By project or sample lot.

CONSIDERATION: Taxonomic verification
  REFERENCE: By expert.
  VOUCHER: By expert, or comparison to reference collection.

CONSIDERATION: Number of individual organisms by designated taxon
  REFERENCE: At least one, but several may be included (or added over time) to illustrate sexual dimorphisms or other morphological variability (including deformities) as well as to document geographic distribution.
  VOUCHER: If the full number of individuals enumerated from the sample and used in data calculation is archived, it serves as the sample voucher; if a selected number of individuals from a sample (e.g., 10-20 out of 100 total) represents an identifier's concept of a particular taxon, the specimens serve as the taxonomic voucher.

CONSIDERATION: Slide-mounted specimens
  REFERENCE: Permanent.
  VOUCHER: Permanent or temporary.

CONSIDERATION: Required storage space
  REFERENCE: Large, but would tend to increase only as fast as additional taxa are incorporated; will be somewhat dependent on the geographic area of responsibility.
  VOUCHER: Could be very large, and could continue to grow at a rapid pace if vouchers are retained from all samples from all projects; particularly a problem with archiving of fish samples, progressively less so with benthos and periphyton.
Fish vouchers should be deposited in a state or regional fish museum or university, where they are also useful for systematic and biogeographical research by others.

CONSIDERATION: Can serve as reference collection?
  VOUCHER: Sometimes, if curated in taxonomic arrangement within each project or sample lot, or if project-oriented samples are segregated by taxon and distributed in the collection among appropriate taxonomic groupings.

CONSIDERATION: Can serve as voucher collection?
  REFERENCE: Usually not; reference collections do not contain the totality of a sample, and if they do, it could (but not necessarily) take considerable effort to reassemble the sample; also, a reference collection will not normally contain representative specimens from all project samples, thus limiting its utility as a voucher collection.

CONSIDERATION: Duration of maintenance
  REFERENCE: Permanent, ongoing.
  VOUCHER: As required by contract or project specifications (e.g., project terms may specify maintenance of vouchers for a period of 5 years following final report approval, after which they may be discarded); may be permanent; after this, and if not already done, voucher specimens may be incorporated into the reference collection.

CONSIDERATION: Major function and use
  REFERENCE: Taxonomic verification of future identifications.
  VOUCHER: Data verification for specific projects.

SECTION 13

QUALITY CONTROL REQUIREMENTS

13.1 The QC checks that are needed are determined by the project QA objectives and the anticipated uses of the results. QC checks apply to field activities, laboratory activities, and data analysis.
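The quantitative checks described in this section reduce to a few simple calculations. The sketch below is illustrative only (the function names are not from this guidance): relative percent difference for duplicate habitat assessment scores, percent recovery for sorting checks, and the Table 10-1 rule of thumb for replicate counts. The percent-recovery formula here assumes recovery is computed as organisms found in the first sort divided by the total found after the QC re-check:

```python
def relative_percent_difference(x1: float, x2: float) -> float:
    """RPD between duplicate measurements (Section 6.6): the absolute
    difference expressed as a percentage of the mean of the two values."""
    return abs(x1 - x2) / ((x1 + x2) / 2.0) * 100.0

def percent_recovery(first_sort: int, qc_recheck: int) -> float:
    """Sorting efficiency: organisms found in the first sort as a
    percentage of the total found after the QC re-check (Section 13.3)."""
    return 100.0 * first_sort / (first_sort + qc_recheck)

def replicates_needed(n_sites: int) -> int:
    """Replicate QC sample counts following the Table 10-1 rule of thumb."""
    if n_sites > 150:
        return 15
    if n_sites >= 30:
        return round(0.10 * n_sites)
    return 3  # small projects: a minimum of 3 replicates

# Duplicate habitat assessment scores of 152 and 138:
print(relative_percent_difference(152, 138))  # roughly 9.7 percent
# 188 organisms in the first sort, 7 more found on re-check:
print(percent_recovery(188, 7))               # roughly 96.4 percent
print(replicates_needed(60))                  # 6
```

A recovery of about 96 percent in the example would satisfy a 90 percent minimum criterion; a value below the criterion would trigger re-checking of additional samples as described in Section 13.3.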
13.2 Field activity QC checks should include:

* collection of replicate samples at various stations (usually 10 percent of the total number of samples collected; Section 10.8 and Table 10-1) to assess the consistency of the collection effort;

* repeat (and/or parallel) field collections and analyses performed by separate field crews to provide support for the bioassessment (Table 10-1);

* occasional alternating and mixing of field personnel to maintain objectivity (minimize individual bias) in the bioassessment; and

* in visual-based physical habitat assessment, recognition that final conclusions are potentially subject to variability among investigators. This limitation can be minimized by ensuring that each investigator is appropriately trained in the evaluation technique and that periodic cross-checks are conducted among investigators to promote consistency. Consistency among parallel and independent physical habitat assessments can be evaluated by rank order comparisons of the evaluated sites. Thus, comparing the score for each parameter is not as important as comparing the total score for each habitat assessment, which gives the rank order of sites (their placement in the assessment from good to bad).

13.3 Laboratory activity QC checks should include:

* Periodic sorting checks of samples to verify that a minimum established percent recovery (at least 90 percent) is maintained, thereby documenting sample processing and sorting efficiency. When the established percent recovery is not met, an appropriate number of samples should be re-checked until the percent recovery is within accepted limits. A record of all samples sorted, along with a list of QC checks, should be maintained to document the QC process for the samples.

* Taxonomists who are identifying organisms should have adequate taxonomic references to perform the level of identification required.
These references should be on file at the laboratory so that periodic checks can be made, facilitating the acquisition of new references or the updating of existing references needed for identification of specimens to the lowest taxonomic level possible.

* Representative specimens of all taxa identified should be checked and verified by a specialist in that particular taxonomic group. These specimens should be properly labelled as reference or voucher specimens (including the name of the verifying authority), permanently preserved, and stored in the laboratory for future reference.

13.4 Data management QC checks should include:

* Hard copies of all computer-entered data should be reviewed by the data entry personnel by direct (side-by-side) comparison with the field or laboratory handwritten data sheets.

13.5 Data analysis QC checks should include the following:

* Periodic checks by trained staff or peer review throughout the data analysis process. Data validation and verification QC checks include examination of outliers, total numbers, odd numbers, and unusual species. Errors can occur if inappropriate statistics are used to analyze the data.

* Transcription errors and poor presentation can occur if care is not taken to provide adequate training and appropriate review. QC checking of data reports by peer review, the use of a technical editor, and adherence to a standard format will help to ensure complete and relevant data analyses and reporting.

SECTION 14

INSTRUMENT/EQUIPMENT TESTING, INSPECTION, AND MAINTENANCE REQUIREMENTS

14.1 To help ensure collection of consistently high-quality data, a plan of routine inspection and preventive maintenance should be developed for all field and laboratory equipment and facilities.
The following types of preventive maintenance items should be considered and addressed in the QAPP:

a schedule of important preventive maintenance tasks that must be carried out to minimize downtime in the field and laboratory should be kept;

a list of any critical spare parts that must be on hand to minimize downtime in the field and laboratory (Section 15.3) should be included;

personnel whose duties include operation of specific pieces of sampling gear or detection instruments should have primary responsibility for inspection of such equipment;

personnel should be assigned responsibility for locating and gathering all necessary field equipment at least 24 hours in advance of departure to sampling stations.

14.2 Example equipment and supply lists for benthic macroinvertebrates and fish are presented in Tables 14-1 and 14-2.

TABLE 14-1 Example of equipment and supply list for benthic macroinvertebrate sampling (Plafkin et al. 1989).

1. Square meter kicknet, standard no. 30 mesh (595 µm openings), including pole attachments
2. Additional kicknet, as backup
3. Sample containers, two to three 1-liter, plastic, opaque, straight-sided, with screw tops (per station)
4. Maps of site location and access routes
5. Two internal labels (per station)
6. Pencils (12), no. 2 soft lead
7. Grease pencils, two to three (per trip)
8. Scissors, one pair
9. Forceps, three or four pairs
10. Gridded screen subsampling equipment (Caton 1991)
11. Wash bottle, 1-liter capacity
12. Sieve bucket, standard no. 30 mesh (595 µm openings) (Wildco cat. no. 90)
13. Two 1-gallon buckets (plastic)
14. One clipboard
15. 95 percent ethanol, or 10 percent formalin; 0.5 gallon per station (container should be appropriate for pouring into sample containers with minimum spillage)
16. Funnel
17. Hip waders, one pair per crew member
18. Log book (bound)/field notebook
19. Data sheets (may be rite-in-rain) or PDR
20. Box or cooler for sample transport
21. Dice for random numbers determination
22. First aid kit
23.
Rubber gloves, heavy gloves
24. Rain gear for each person
25. Waterproof tape
26. Compass
27. Kimwipes in ziplock bags
28. Watch with timer or stopwatch
29. Camera
30. Patch kit for waders

TABLE 14-2 Example of equipment list for fish sampling in wadable streams.

1. Backpack electrofisher
2. Spare batteries (or gasoline) and spare electrofisher if distant from base
3. Insulated rubber gloves, one pair per person
4. Waders, hip or chest, one pair per person
5. Long-handled nets, two
6. Plastic buckets, three or four, 1-5 gallon capacity
7. Block nets, two, 20 meters in length
8. Measuring tape, 100 meter
9. Fish measuring board (length)
10. Weight scales
11. Clipboard
12. Data sheets (may be rite-in-rain) or PDR
13. Pencils (12), no. 2 soft lead
14. Ear plugs (if gasoline generator)
15. Patch kit for waders
16. Fish field guide
17. Anesthetic (MS-222)
18. Plastic collection jars with tight-fitting lids (multiple sizes)
19. Electrician's tape
20. Labels
21. Preservative (formaldehyde or isopropyl alcohol)
22. Probe
23. Calipers
24. Stopwatch with timer
25. Small dip net
26. White enamel pan
27. Conductivity pen
28. Camera

SECTION 15

INSTRUMENT CALIBRATION AND FREQUENCY

15.1 The purpose of this section is to document a detailed description, or reference, of the appropriate SOPs for ensuring that field and laboratory equipment are functioning optimally. Instruments used for measuring water quality, current velocity, or any other measurable parameters should be calibrated with certified equipment and/or standards (with known, valid relationships to nationally recognized performance standards) prior to gathering data at each sample location. In the absence of nationally recognized standards, documentation of the basis for the calibration is needed. Permanent records with dates and details of these calibrations and checks must be maintained.
Documentation is necessary to identify each specific measuring device, where and when it is used, what maintenance was performed, and the dates and steps used in instrument calibration. This information should be traceable to each instrument. Acceptance criteria should be defined for all calibration measurements. All field measurements should be accompanied by documentation of the type and identification number of the instrument used.

15.2 For biological field equipment, there should be routine procedures to ensure that the equipment is appropriate for the needed sample and is in proper working order. For example, benthic macroinvertebrate and algal collections using artificial or introduced substrata should have confirmation of surface area. Multiplate samplers (e.g., Hester-Dendy) should have the number, area, and thickness of plates and spacing dividers confirmed and documented prior to departure from storage. Rock baskets (introduced substrate) should have the surface area of the rocks confirmed and documented prior to departure from storage. For benthic macroinvertebrate net collections, there should be measurement of the device dimensions and knowledge of the size of the mesh net openings. There should also be effort toward repairing holes or replacing nets. Zooplankton and phytoplankton pumps, traps, and nets should be checked for proper working order and size of mesh net openings; hand collection gear, bottles, knives, and droppers should be clean and in good working order. For fish (electrofishers), there should be consistent checks of voltage, amperage, wattage, and field pattern in the context of conductivity. Calibration of electrical instruments should occur at each sampling site. Confirmation and notation of the condition of biological sampling gear should occur at least 24 hours prior to scheduled departure for fieldwork.
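The gear documentation and pre-departure checks described above amount to a simple record per gear item. The sketch below is one hypothetical way to structure such a record; the field names and the readiness rule are illustrative only, not prescribed by this guidance:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class GearRecord:
    """Illustrative gear-tracking record for pre-fieldwork confirmation."""
    gear_id: str
    description: str
    mesh_size_um: Optional[int] = None       # e.g., 595 for standard no. 30 mesh
    surface_area_cm2: Optional[float] = None # for introduced substrata
    checks: List[str] = field(default_factory=list)  # dated condition notes

    def log_check(self, date: str, note: str) -> None:
        """Record a dated confirmation of gear condition."""
        self.checks.append(f"{date}: {note}")

    def ready_for_fieldwork(self) -> bool:
        # Illustrative rule: ready only if at least one condition check is logged.
        return len(self.checks) > 0

kicknet = GearRecord("KN-01", "Square meter kicknet", mesh_size_um=595)
kicknet.log_check("1995-07-10", "Mesh intact; pole attachments confirmed")
print(kicknet.ready_for_fieldwork())  # True
```

In practice such records, whether kept on paper or electronically, make it straightforward to confirm, at least 24 hours before departure, that every gear item has a documented condition check.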
15.3 For biological field gear, several components need to be checked initially and then again prior to each field effort. When gear is constructed or received from the manufacturer, initial documentation of equipment specifications should be recorded, for example:

gear dimensions
gear specifications
gear condition
net dimensions
mesh size
appropriateness of gear for study objectives.

Prior to each field effort, gear should be checked, for example:

gear condition
working order
spare parts
repair kits
extra units.

15.4 In essence, taxonomic identification performance is ensured in part by use of the most current technical taxonomic literature, by development and use of an appropriate reference collection, and by use of an expert taxonomist (Sections 12.6, 12.7).

SECTION 16

INSPECTION/ACCEPTANCE REQUIREMENTS FOR SUPPLIES AND CONSUMABLES

16.1 Discuss how and by whom supplies and consumables, such as sample bottles, reagents, and nets, will be inspected and accepted for use in the project. Identify the acceptance criteria for such supplies in order to satisfy the technical and quality objectives of the project.

SECTION 17

DATA ACQUISITION REQUIREMENTS (NON-DIRECT MEASUREMENTS)

17.1 Identify the types of data that will likely be acquired from non-measurement sources such as computer databases, spreadsheets, and literature files; for example, metric calculations performed with a personally designed spreadsheet routine, tolerance values identified through literature review, information from topographic maps, historical reviews, and raw data received electronically in addition to bench sheets from laboratories. Define acceptance criteria for the use of these data in the project. Discuss any limitations on the use of the data based on uncertainty in the quality of the data, and explain the nature of that uncertainty.
For instance, if raw data are entered electronically from laboratory bench sheets, each entry should then be confirmed.

SECTION 18
DATA MANAGEMENT

18.1 Outline the project data management scheme by tracing the path of the data from receipt from the field or laboratory to the use or storage of the final reported form. Describe the standard record keeping procedures, document control system, and the approach used for data storage and retrieval on electronic media. Explain the control mechanism for detecting and correcting paperwork errors and for preventing loss of data during data reduction, data reporting, and data entry to forms, reports, and databases. Provide examples of any forms or checklists to be used.

18.2 Identify and describe all data handling equipment and procedures that will be used to process, compile, and analyze the data. This includes procedures for addressing data generated as part of the project as well as data from other sources. The specifications should include any required computer hardware and software and should address any specific performance requirements for the hardware/software configuration used. Describe the procedures that will be followed to demonstrate acceptability of the hardware/software configuration.

SECTION 19
ASSESSMENT AND RESPONSE ACTIONS

19.1 Identify the number, frequency, and type of activities needed to assess and evaluate the project. Assessments include system audits and performance audits, which are part of every quality control program. Each QC plan must describe the internal and external performance and system audits required to monitor the capability and performance of the project.

19.2 A systems audit consists of a review of the total data production process, including onsite reviews of the field and laboratory operational systems and facilities for sampling and processing of samples.
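The entry-confirmation step described in Sections 17 and 18 (checking each electronic entry against the laboratory bench sheets) is commonly implemented as a double-entry comparison. The sketch below is illustrative only; the function and data layout are hypothetical, not procedures specified by this guidance.

```python
def compare_entries(bench_sheet, database):
    """Double-entry verification: compare independently keyed copies
    of the same records and report mismatched sample IDs for correction.
    Both arguments map a sample ID to the recorded value; this layout
    is an assumption for illustration."""
    mismatches = []
    for sample_id, value in bench_sheet.items():
        if database.get(sample_id) != value:
            mismatches.append(sample_id)
    # also flag records present in the database but absent from bench sheets
    mismatches.extend(k for k in database if k not in bench_sheet)
    return sorted(mismatches)

bench = {"S-01": 34, "S-02": 17, "S-03": 25}   # values from bench sheets
keyed = {"S-01": 34, "S-02": 71, "S-03": 25}   # electronic entries (S-02 transposed)
print(compare_entries(bench, keyed))  # ['S-02']
```

Every flagged ID is then resolved against the original bench sheet before the data are accepted, which is the control mechanism Section 18.1 asks the QAPP to describe.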
19.3 A performance audit is a type of audit in which the quantitative data generated (e.g., species enumeration and identification) are independently enumerated and identified. This type of audit can test accuracy.

19.4 To the extent possible, these audits should be conducted by individuals who are not directly involved in the measurement process. Audits serve three purposes: 1) to determine whether particular personnel or an organizational group has the capability to conduct the monitoring before the project is initiated; 2) to verify that the QAPP and associated SOPs are being implemented; and 3) to detect and define problems so that immediate corrective action can begin.

19.5 The QAPP should specify who will conduct the audit, their relationship within the project organization or their independent affiliation, what the acceptance criteria will be, if or what type of audit will be used, and to whom the audit reports will go. A list should be prepared of the approximate schedule of activities, outlining the information expected from the audit. The QAPP should also explicitly define under what conditions the assessor has the authority to order a work suspension.

19.6 The QAPP should explain how and by whom response actions to nonconforming conditions will be addressed, and identify the person(s) responsible for implementing the corrective action. The plan should also describe how corrective actions will be verified, validated, and documented. A corrective action program must have the capability to plan and implement measures to correct identified problems, maintain documentation of the results of the corrective process, and continue the process until each problem is eliminated. Corrective action is the process of remediating defects.

19.7 Corrective actions may be initiated as a result of the following QA activities: 1) performance audits; 2) systems audits; 3) internal quality control checks.
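The independent re-identification described in 19.3 is often summarized as a percent-agreement statistic between the primary analyst and the auditing taxonomist. The computation below is an illustrative sketch under assumed data structures, not an audit procedure specified by this guidance.

```python
def percent_agreement(primary, audit):
    """Percent taxonomic agreement between the primary analyst's
    identifications and an independent auditor's re-identifications.
    Each argument maps an organism (or vial) ID to the taxon name
    assigned; this layout is hypothetical."""
    shared = set(primary) & set(audit)
    if not shared:
        return 0.0
    matches = sum(1 for k in shared if primary[k] == audit[k])
    return 100.0 * matches / len(shared)

primary = {"v1": "Baetis", "v2": "Hydropsyche",    "v3": "Chironomidae", "v4": "Baetis"}
audit   = {"v1": "Baetis", "v2": "Cheumatopsyche", "v3": "Chironomidae", "v4": "Baetis"}
print(percent_agreement(primary, audit))  # 75.0
```

An agreement value falling below the acceptance criterion defined in the QAPP (Section 19.5) would trigger the corrective actions described in 19.8 onward.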
19.8 When sampling or data analyses are shown to be unsatisfactory as a result of audits or QC sample analysis, a corrective action should be implemented. In addition, corrective actions should be taken during the course of sample and data analysis by field and laboratory crews when the routine QC check criteria are not met. The Project Manager, Laboratory Manager, Quality Assurance Manager, and support technicians may be involved in the corrective action. If data are affected by the situation requiring correction, or if the corrective action will impact the project budget or schedule, the action should directly involve the Project Manager.

19.9 Corrective actions are of two basic kinds:

1) Immediate - the need for such an action will most frequently be identified by the field or laboratory technician as a result of calibration checks and QC sample analyses.

2) Long-Term - the need for such actions may be identified by audits. Examples of this type of action include: staff training in technical skills or in implementing the QA/QC program; rescheduling of field, laboratory, and data handling activities to ensure analysis within allowed holding times; and reassessment of field and laboratory operating procedures and personnel.

19.10 For either immediate or long-term corrective actions, the following steps should be taken:

* Specify what types of conditions require corrective action.
* Define the specific problem.
* Assign responsibility for investigating the problem.
* Establish who initiates, approves, implements, evaluates, and reports corrective action.
* Investigate and determine the cause of the problem.
* Determine a corrective action to eliminate the problem.
* Assign and accept responsibility for implementing the corrective action.
* Establish the effectiveness of the corrective action and implement the correction.
* Verify that the corrective action has eliminated the problem.
19.11 Internal auditing of field, laboratory, and data handling activities may result in the discovery of nonconforming procedures that, left uncorrected, could jeopardize the quality and integrity of project data and results. When such auditing is part of a project and a nonconformance is found, corrective action is initiated by documenting the finding and recommendations of the audit. The corrective action undertaken by the designated responsible party is documented with an implementation schedule and management approval. The implementation is verified by the auditor, and the verification is then made part of the project audit report record.

SECTION 20
REPORTS TO MANAGEMENT

20.1 A formal QA report should be issued to inform appropriate management of the performance and progress of the project workplan. The purpose of the report is to identify the individuals responsible for reporting QC results, and to present the QC data so that management can monitor data quality effectively. Assume that all readers of the report will potentially use the document for establishing additional biomonitoring or biosurvey programs, for validation of models, or for validation of a project. Availability of complete QA/QC program descriptions and data quality requirement calculations is essential (Smith et al. 1988).

20.2 The following items should be described in the QA report:

Individuals Preparing and Receiving Reports
Type of Report
  Written or oral, frequency
  Interim or final
Contents
  Status of the project
  Results of performance evaluation audits
  Significant QA/QC problems, recommended solutions, and results of corrective actions
  Changes in the QAPP
  Summary of QA/QC program, training, and accomplishments
  Uncertainty estimates
  Data quality assessment in terms of precision, accuracy, representativeness, completeness, and comparability
  Reporting of whether the QA objectives were met, and the resulting impact on decision making
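Of the data quality indicators listed above, completeness is the simplest to report: it is the percentage of planned samples that yielded valid data (see also the completeness table in Appendix A). A one-function illustration, with example counts that are hypothetical:

```python
def percent_complete(valid_obtained, planned):
    """Completeness = valid data obtained / planned amount, as a percentage."""
    return 100.0 * valid_obtained / planned

# e.g., 47 of 50 planned macroinvertebrate samples yielded valid data
print(percent_complete(47, 50))  # 94.0
```

The QA report would pair this percentage with the reasons for any missing data, as discussed under 20.6 below.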
20.3 The majority of points within the above list are either previously detailed in this document or are largely self-explanatory. The following elaboration on reporting uncertainty and data quality requirements is taken primarily from Smith et al. (1988). See also Section 6 of this document, Green (1979), and Freedman et al. (1991). Uncertainty estimates may be either qualitative or quantitative. Estimates may focus on the probabilities of false positives or false negatives in hypothesis-testing designs, or they may be in the form of confidence intervals around parameter values.

20.4 PRECISION

20.4.1 Precision can be reported as an index, either as standard deviation, relative standard deviation, or relative percent difference (Smith et al. 1988; USEPA 1989). These numbers can be presented as a function of the measured value within a range, on a graph illustrating actual measurement values, or as data points with a best-fit curve, including confidence intervals.

20.4.2 Numbers should be presented in tabular form with the data quality assessment values, standard deviations, and, if appropriate, regression equation coefficients.

20.4.3 Additional appropriate information should be included that indicates interlaboratory versus intralaboratory precision, procedures for arriving at the estimates and assigning outlier status, and a description of the temporal acceptability of the interlaboratory estimates.

20.5 REPRESENTATIVENESS

20.5.1 Representativeness cannot be quantified (Smith et al. 1988). In lieu of quantification, a description of program/project design and implementation activities, along with photographs and the distribution of sampling sites across the drainage area (to reflect the degree of ecological stratification), and an assessment of resulting representativeness should be presented.

20.6 COMPLETENESS

20.6.1 Missing data should be identified, and the practical reasons that caused their deletion from the dataset should be presented.
This information will aid in the identification of specific procedural problems and in their rectification prior to subsequent sampling events.

20.7 COMPARABILITY

20.7.1 A rationale for the validity of comparing one dataset to another should be given. Selected reasons might include: time/date of sampling; comparison of site selection criteria; measured parameters and recorded observations; field and laboratory methods; comparison of QA/QC programs; and comparison of data quality requirement estimates.

20.8 ACCURACY

20.8.1 Accuracy is the degree of agreement between an observed value and an accepted reference value. Data should be checked for transcription errors through the entire sample processing and analysis phases. Each data entry should be checked against the original field sheet, and random quality control checks should be made on subsequent data that have been manipulated.

SECTION 21
DATA REVIEW, VALIDATION, AND VERIFICATION REQUIREMENTS

21.1 The purpose of this section is to ensure good data by maintaining quality throughout data reduction, transfer, storage, retrieval, and reporting. The project management scheme should outline the path of the data from the field or laboratory; topics to be addressed include details of data storage, data reduction, data validation, and data reporting. All data handling equipment, required hardware and software, and procedures to be used should be identified and described in the plan.

21.2 For each step in the data handling, state the criteria used to review and validate data (accept, reject, or qualify) in an objective and consistent manner. List any calculations that are necessary to prove or disprove the project objectives.

SECTION 22
VALIDATION AND VERIFICATION METHODS

22.1 Outline the process used for validating and verifying data, including the chain-of-custody for data throughout the life cycle of the project. Describe how issues shall be resolved and what authorities will handle such issues.
Describe how results are conveyed to data users. The review can include checks of field and laboratory QC data, instrument calibration, technical system audits, and statistical data treatments.

22.2 RAW DATA

22.2.1 Data such as species names and numbers of individuals should be legibly recorded by hand, whether on standardized field or laboratory bench sheets or in notebooks. These sheets should be checked by intralaboratory QC personnel. Raw (non-manipulated) data should be stored in hard copy in one or more separate locations and in an electronic database medium with ample backup (if possible). For data validation, compare every computer entry to the field sheets to ensure correct data entry.

22.3 DATA REDUCTION

22.3.1 Data reduction is the process of transforming raw data by arithmetic or statistical calculations and collation into a more useful form (such as the Index of Biotic Integrity [IBI] or total taxa). Errors are commonly introduced in the calculation, reduction, and transfer of data to various forms and reports and into data storage systems. Therefore these data should be quality checked to ensure accuracy.

22.3.2 This subsection should highlight at least the following information: names of individuals responsible (Table 4-1); examples of data sheets; a summary of the statistical approach for reducing data; a summary of data reduction procedures; and control mechanisms for detecting and correcting errors.

22.4 DATA VALIDATION

22.4.1 Data validation is the process of substantiating specified performance criteria. Each program must establish technically sound and documented data validation criteria which will serve to accept or reject data in a uniform and consistent manner. Pilot studies may be used to determine the metrics with the least variability and to evaluate metrics for their biological relevance; the rationale for their use should be documented.
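One common screening rule used in documented validation criteria of the kind described above flags replicate values that fall more than a chosen number of standard deviations from the replicate mean. The sketch below is offered only as an illustrative example; the cutoff and data are hypothetical, and each program must establish and document its own criteria.

```python
import statistics

def flag_outliers(values, k=2.0):
    """Flag values more than k sample standard deviations from the mean.
    The cutoff k is an arbitrary example parameter, not a criterion
    prescribed by this guidance."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)  # sample standard deviation (n - 1)
    return [v for v in values if abs(v - mean) > k * sd]

replicates = [22, 25, 24, 23, 58]  # hypothetical total-taxa counts, one suspect
print(flag_outliers(replicates, k=1.5))  # [58]
```

Flagged values are not automatically discarded; they are investigated, and the accept/reject decision and its rationale are documented, consistent with 22.4.2.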
22.4.2 Information for substantiating data validation should include: names of individuals responsible (Table 4-1); procedures for determining outliers; and identification of critical control points.

22.5 DATA REPORTING

22.5.1 Data are collected from the summary sheets, bound notebooks, or computerized databases by the data management group and transferred to a draft report table and/or graphical representation. The assembled data and the raw data are then examined for nonsensical, computational, and transcriptional errors. For example, field data sheets should be thoroughly and routinely compared to computer printout data. After reviewing the data, the laboratory leader and laboratory QC officer sign off on the data report, and the report is forwarded to the Project Manager.

22.5.2 Important information that should be included in this subsection is: key individuals who will handle the data reporting (Table 4-1); a flowchart of the data handling process covering all data collection, transfer, storage, recovery, and processing steps, and including QC data for both field and laboratory; and identification of critical control points (i.e., what are the criteria for data points to be considered outliers, and when are individual data points rejected from a database?).

SECTION 23
RECONCILIATION WITH DATA QUALITY OBJECTIVES

23.1 The purpose of this section is to describe how the results obtained from the project will be reconciled with the DQOs and how any issues will be resolved. Any limitations on the use of the data must be discussed and reported to decisionmakers. Detailed plans for data assessment procedures for precision, accuracy, and completeness must be identified. Routine assessment procedures, including statistics, equations, reporting units, and assessment frequency, should be summarized (USEPA 1989). Section 6 details formulae for calculating precision (relative percent difference [RPD] and relative standard deviation [RSD]).
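The RPD and RSD formulae referenced here (given in full in Section 6 and Appendix A) can be restated directly in code. This is a sketch for illustration; the replicate values shown are hypothetical.

```python
import statistics

def rpd(c1, c2):
    """Relative percent difference of two replicate values:
    |C1 - C2| x 100 / ((C1 + C2) / 2)."""
    return abs(c1 - c2) * 100.0 / ((c1 + c2) / 2.0)

def rsd(values):
    """Relative standard deviation of three or more replicates:
    (s / mean) x 100, with s the sample standard deviation (n - 1)."""
    return statistics.stdev(values) / statistics.mean(values) * 100.0

# hypothetical replicate total-taxa counts from adjacent reaches
print(round(rpd(26, 22), 1))        # 16.7
print(round(rsd([26, 22, 24]), 1))  # 8.3
```

Computing these immediately after each sampling event, as 23.2 recommends, lets corrective actions (Section 19) be taken before the next data-gathering effort.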
Accuracy is usually calculated as "percent recovery". Percent recovery normally applies to chemical analytical laboratory procedures; however, in the case of biological laboratories, percent recovery can be applied in the form of sample sorting checks. Usual procedures for calculating accuracy in this sense are thus related to the laboratory. Precision should be calculated based on replicated samples taken from adjacent reaches.

23.2 RPDs or RSDs and completeness should be calculated as soon as possible after each sampling event in order to implement corrective actions (Section 19) prior to subsequent data-gathering efforts. Further statistical approaches which could be calculated and reported (USEPA 1980) are:

Central tendency and distribution
  Arithmetic mean
  Range
  Standard deviation
  Pooled standard deviation
  Geometric mean
  Data distribution by percentiles
Measures of variability (USEPA 1989)
  Accuracy
  Bias
  Precision
  Coefficient of variability (C.V.)
  Confidence limits (Platts et al. 1983)
  Testing for outliers

23.3 Additional statistical guidance can be obtained from Sokal and Rohlf (1969); the statistics associated with the multimetric approach are described by Fore et al. (1994).

LITERATURE CITED

ASTM. 1991. Standard Guide for Documenting the Standard Operating Procedures Used for the Analysis of Water. ASTM Designation: D5172-91. Annual Book of ASTM Standards 11.0:56-57.

Bahls, L. 1992. Periphyton bioassessment methods for Montana streams. Montana Water Quality Bureau, Helena, MT.

Barbour, M.T., and J.B. Stribling. 1991. Use of habitat assessment in evaluating the biological integrity of stream communities, pp. 25-38. In Biological Criteria: Research and Regulation, 1991. EPA-440/5-91-005. U.S. EPA, Office of Water, Washington, DC.

Borror, D.J., C.A. Triplehorn, and N.F. Johnson. 1989. An Introduction to the Study of Insects. 6th edition. Saunders College Publ., Philadelphia, PA.

Caton, L.W. 1991.
Improved subsampling methods for the EPA "rapid bioassessment" benthic protocols. Bulletin of the North American Benthological Society 8(3):317-319.

Costanza, R., S.O. Funtowicz, and J.R. Ravetz. 1992. Assessing and communicating data quality in policy-related research. Environmental Management 16(1):121-131.

DHES. 1989. Field procedures manual: collection, analysis and reporting of water quality samples. Department of Health and Environmental Sciences, Water Quality Bureau, Helena, MT.

Freedman, D., R. Pisani, R. Purves, and A. Adhikari. 1991. Statistics. W.W. Norton and Company, New York. 617 pp.

Fore, L.S., J.R. Karr, and L.L. Conquest. 1994. Statistical properties of an Index of Biotic Integrity. Canadian J. Fish. Aquat. Sci. 51:1077-1087.

Green, R.H. 1979. Sampling Design and Statistical Methods for Environmental Biologists. John Wiley and Sons, New York. 257 pp.

Gibson, G.R. (editor). 1994. Biological Criteria: Technical Guidance for Streams and Small Rivers. EPA-822-B-94-001. U.S. Environmental Protection Agency, Office of Science and Technology, Washington, DC. 162 pp.

Hughes, R.M. 1995. Defining acceptable biological status by comparing with reference conditions, pp. 31-47, Chapter 4. In W.S. Davis and T.P. Simon, eds., Biological Assessment and Criteria: Tools for Water Resource Planning and Decision Making. Lewis Press, Boca Raton, FL.

Hughes, R.M., S.A. Heiskary, W.J. Matthews, and C.O. Yoder. 1994. Use of ecoregions in biological monitoring, Chapter 8, pp. 125-151. In S.L. Loeb and A. Spacie, eds., Biological Monitoring in Aquatic Systems. Lewis Publishers, Ann Arbor, MI.

ITFM. 1994. Water quality monitoring in the United States; 1993 report of the Intergovernmental Task Force on Monitoring Water Quality, Technical Appendixes. Intergovernmental Task Force on Monitoring Water Quality, January 1994.

Karr, J.R., K.D. Fausch, P.L. Angermeier, P.R. Yant, and I.J. Schlosser. 1986.
Assessing biological integrity in running waters: a method and its rationale. Illinois Natural History Survey, Special Publication 5. 28 pp.

Kaufmann, P.R. 1993. Physical habitat, pp. 59-69. In R.M. Hughes, ed., Stream Indicator and Design Workshop. EPA/600/R-93/138. U.S. EPA, Corvallis, OR.

MACS. 1993 (draft). Standard operating procedures and technical basis: macroinvertebrate collection and habitat assessment for low-gradient, non-tidal streams. Prepared by the Mid-Atlantic Coastal Streams Workgroup. August 24, 1993. (For further information, contact John Maxted, Delaware Department of Natural Resources, 302-739-4590.)

Meador, M.R., T.F. Cuffney, and M.E. Gurtz. 1993. Methods for sampling fish communities as part of the National Water-Quality Assessment Program. Open-File Report 93-104. U.S. Geological Survey, Raleigh, NC. 40 pp.

Ohio Environmental Protection Agency. 1987. Biological Criteria for the Protection of Aquatic Life: Volumes I-III. Ohio EPA, Division of Water Quality Monitoring and Assessment, Surface Water Section, Columbus, OH.

Ohio Environmental Protection Agency. 1989. Standardized biological field sampling and laboratory methods for assessing fish and macroinvertebrate communities. Subsection 4: Fisheries. Ohio EPA, Columbus, OH. 43 pp.

Plafkin, J.L., M.T. Barbour, K.D. Porter, S.K. Gross, and R.M. Hughes. 1989. Rapid Bioassessment Protocols for Use in Streams and Rivers: Benthic Macroinvertebrates and Fish. EPA/444/4-89-001. U.S. EPA, Office of Water, Washington, DC.

Platts, W.S., W.F. Megahan, and G.W. Minshall. 1983. Methods for Evaluating Stream, Riparian, and Biotic Conditions. General Tech. Rep. INT-138. U.S. Dept. Agric., U.S. Forest Service, Ogden, UT.

Smith, F., S. Kulkarni, L.E. Myers, and M.J. Messner. 1988. Evaluating and presenting quality assurance data, Chapter 10, pp. 157-168. In L.H. Keith, ed., Principles of Environmental Sampling. ACS Professional Reference Book, American Chemical Society, Washington, DC.

Sokal, R.R., and F.J.
Rohlf. 1969. Biometry: The Principles and Practice of Statistics in Biological Research. W.H. Freeman and Co., San Francisco, CA. 776 pp.

U.S. Environmental Protection Agency. 1980. Interim Guidelines and Specifications for Preparing Quality Assurance Project Plans. QAMS-005/80. USEPA, Office of Research and Development, Quality Assurance Management Staff, Washington, DC.

U.S. Environmental Protection Agency. 1984. Guidance for Preparation of Combined Work/Quality Assurance Project Plans for Environmental Monitoring. OWRS QA-1. U.S. EPA, Office of Water Regulations and Standards, Washington, DC.

U.S. Environmental Protection Agency. 1989. Preparing Perfect Project Plans: A Pocket Guide for the Preparation of Quality Assurance Project Plans. EPA/600/9-89/087. USEPA, Office of Research and Development, Risk Reduction Engineering Laboratory, Cincinnati, OH. October 1989.

U.S. Environmental Protection Agency. 1990a. Macroinvertebrate Field and Laboratory Methods for Evaluating the Biological Integrity of Surface Waters. Klemm, D.J., P.A. Lewis, F. Fulk, and J.M. Lazorchak. EPA/600/4-90/030. USEPA, Office of Research and Development, Environmental Monitoring Systems Laboratory, Cincinnati, OH.

U.S. Environmental Protection Agency. 1990b. Quality Assurance Glossary and Acronyms. USEPA, Quality Assurance Management Staff, Office of Modeling, Monitoring Systems and Quality Assurance, Office of Research and Development, Washington, DC. July 1990.

U.S. Environmental Protection Agency. 1994 (Draft Interim Final). EPA Requirements for Quality Assurance Project Plans for Environmental Data Operations. EPA QA/R-5. USEPA, Quality Assurance Management Staff, Washington, DC. July 1994.

U.S. Environmental Protection Agency. 1993a. Fish Field and Laboratory Methods for Evaluating the Biological Integrity of Surface Waters. Klemm, D.J., Q.J. Stober, and J.M. Lazorchak. EPA/600/R-92/111. U.S.
EPA, Office of Research and Development, Environmental Monitoring Systems Laboratory, Cincinnati, OH.

U.S. Environmental Protection Agency. 1993b. Master Glossary: Environmental Monitoring and Assessment Program. EPA/620/R-93/013. U.S. Environmental Protection Agency, Office of Research and Development, Washington, DC.

U.S. Environmental Protection Agency. 1994. Quality Management Plan. U.S. EPA, Office of Water, Washington, DC.

APPENDIX A
ABBREVIATED QAPP FORM

Appendix A. TABLE A-1. Form for completion of an abbreviated-format QAPP.

1. TITLE PAGE

(Project Name)
(Responsible Agency)
(Date)

Project Manager: Signature / Name / Date
Project QA Officer: Signature / Name / Date
USEPA Project Manager: Signature / Name / Date
USEPA QA Officer: Signature / Name / Date

2. TABLE OF CONTENTS - List sections with page numbers, figures, tables, references, and appendices (attach pages).

3. DISTRIBUTION LIST - Names and telephone numbers of those receiving copies of this QAPP. Attach additional page, if necessary.
i. ii. iii. iv. v. vi. vii. viii. ix. x. xi. xii.

4. PROJECT/TASK ORGANIZATION - List key project personnel and their corresponding responsibilities. Please note that an organizational diagram should be presented with this section.

Name / Project Title
  Advisory Panel (contact)
  Project Manager/Principal Investigator
  QA Officer
  Sample Design Coordinator
  Sample Design QC Officer
  Field/Sampling Leader
  Sampling QC Officer
  Laboratory Manager/Leader
  Laboratory QC Officer
  Data Processing Leader
  Data QC Officer
  Document Production Coordinator
  Reporting QC Officer

5. PROBLEM DEFINITION/BACKGROUND; PROBLEM/TASK DESCRIPTION

A. Objective and Scope Statement
B. Intended Usage of Data
C. General Overview of Project
D. Sampling Station Network Design/Rationale
E. Project Timetable
  Activity / Anticipated Date of Initiation / Completion

6.
MEASUREMENT QUALITY OBJECTIVES

Parameter | Detection Limit | Estimated Accuracy | Accuracy Protocol* | Estimated Precision | Precision Protocol**

* Accuracy Protocol Formula - Percent recovery.

** Precision Protocol Formulas - If precision is to be calculated from two replicate samples, use Relative Percent Difference (RPD), calculated as

  RPD = (|C1 - C2| x 100) / ((C1 + C2) / 2)

where C1 = the larger of the two values and C2 = the smaller of the two values. If it is to be calculated from three or more replicate samples, use Relative Standard Deviation (RSD), calculated as

  RSD = (s / x-bar) x 100

where s = standard deviation and x-bar = mean of replicate samples. The standard deviation of the replicate measurements (s) is calculated as

  s = sqrt[ sum over i of (xi - x-bar)^2 / (n - 1) ]

where xi = measured value of the ith replicate, x-bar = mean of replicate sample measurements, and n = number of replicates. Precision can also be expressed in terms of the range of measurement values.

B. Data Representativeness

C. Data Comparability

D. Data Completeness

Parameter | No. Valid Samples Anticipated | No. Valid Samples Collected and Analyzed | Percent Complete

7. PROJECT NARRATIVE - Paragraph relating project to the Data Quality Objectives and problem definition.

8. SPECIAL TRAINING REQUIREMENTS AND CERTIFICATION

Position Title | Requirements | Date of Training/Certification

9. DOCUMENTATION AND RECORDS

10. SAMPLING PROCESS DESIGN/SAMPLING METHODS REQUIREMENTS

Type of Sample/Parameter | Sampling Gear/Method (SOP No., if available) | Number of Samples | Sampling Frequency (Number per year) | Method of Analysis
  Biological
  Physical
  Chemical

B. Rationale for Selection of Sampling Sites

11. SAMPLE HANDLING AND CUSTODY PROCEDURES

12. ANALYTICAL METHODS REQUIREMENTS

A. Sample processing procedures
B. Location of voucher collection

13. QUALITY CONTROL REQUIREMENTS

A. Field QC checks
B. Laboratory QC checks
C.
Data Analysis QC checks

14. INSTRUMENT/EQUIPMENT TESTING, INSPECTION, AND MAINTENANCE SCHEDULE

Item | Serial No. | Date of Last Examination

15. INSTRUMENT CALIBRATION AND FREQUENCY

16. INSPECTION/ACCEPTANCE REQUIREMENTS

17. ACQUISITION OF NON-DIRECT MEASUREMENT DATA

18. DATA MANAGEMENT PROGRAM/SYSTEM

19. ASSESSMENT AND RESPONSE ACTIONS

20. REPORTING PLANS

21. DATA REVIEW AND VALIDATION REQUIREMENTS

22. VALIDATION AND VERIFICATION

23. RECONCILIATION WITH DQOs

APPENDIX B
QAPP GLOSSARY OF TERMS

Acceptance criteria - criteria specifying the limit above which data quality is considered satisfactory and below which it is not [modified from USEPA (1990b), "acceptable quality level"].

Accuracy - the degree of agreement between an observed value and an accepted reference value. Accuracy includes a combination of random error (precision) and systematic error (bias) components which are due to sampling and analytical operations; a data quality indicator. EPA recommends that this term not be used and that precision and bias be used to convey the information usually associated with accuracy [USEPA (1993a)].

Assemblage - an association of interacting populations of organisms in a given waterbody; for example, a fish assemblage or a benthic macroinvertebrate assemblage [Gibson (1994)].

Bias - the systematic or persistent distortion of a measurement process which deprives the result of representativeness (i.e., the expected sample measurement is different from the sample's true value); a data quality indicator [USEPA (1993a)].

Biological assessment / Bioassessment - an evaluation of the condition of a waterbody using biological surveys and other direct measurements of the resident biota in surface waters [Gibson (1994), USEPA (1991)].
Biological criteria / Biocriteria - numerical values or narrative expressions that describe the reference biological condition of aquatic communities inhabiting waters of a given designated aquatic life use. Biocriteria are benchmarks for water resources evaluation and management decision making [Gibson (1994)].

Biological integrity - the condition of an aquatic community inhabiting unimpaired waterbodies of a specified habitat as measured by an evaluation of multiple attributes of the aquatic biota. Three critical components of biological integrity are that the biota (1) is the product of the evolutionary process for that locality, or site, (2) is inclusive of a broad range of biological and ecological characteristics such as taxonomic richness and composition and trophic structure, and (3) is found in the biogeographic region of study [Gibson (1994)].

Biomonitoring - multiple, routine biological assessments over time using consistent sampling and analysis methods for detection of changes in biological condition.

Calibration - to determine, by measurement or comparison with a standard, the correct value of each scale reading on a meter or other device, or the correct value for each setting of a control knob. The levels of the calibration standards should bracket the range of planned measurements [USEPA (1990b)].

Community - any group of organisms belonging to a number of different species that co-occur in the same habitat or area; an association of interacting assemblages in a given waterbody.

Comparability - the degree to which different methods, data sets, and/or decisions agree or can be represented as similar; a data quality indicator [USEPA (1990b, 1993b)].

Completeness - the amount of valid data obtained compared to the planned amount, usually expressed as a percentage; a data quality indicator [USEPA (1990b, 1993a)].
Confidence level - the probability, usually expressed as a percentage, that a confidence interval will include a specific population parameter; confidence levels usually range from 90 to 99 percent [USEPA (1990b)].

Confidence interval - an interval that has the stated probability (e.g., 95 percent) of containing the true value of a fixed (but unknown) parameter [Gibson (1994)].

Corrective action - measures to correct identified problems, maintain documentation of the results of the corrective process, and continue the process until each problem is eliminated; the process of remediating defects.

Damaged and unusable samples - samples that have been damaged such that part or all of the sample was destroyed or not recoverable.

Damaged and usable samples - samples that have been damaged but the entire sample was salvageable (i.e., all organisms were saved).

Data quality objectives (DQOs) - qualitative and quantitative statements developed by data users to specify the quality of data needed to support specific decisions; statements about the level of uncertainty that a decisionmaker is willing to accept in data used to support a particular decision. Complete DQOs describe the decision to be made, what data are required, why they are needed, and the calculations in which they will be used, as well as time and resource constraints. DQOs are used to design data collection plans [Gibson (1994)].

Data reduction - the process of transforming raw data by arithmetic or statistical calculations, standard curves, concentration factors, etc., and collation into a more useful form [USEPA (1990b)].

Data validation - see validation.

Data verification - see verifiable.

Ecological integrity - the condition of an unimpaired ecosystem as measured by combined chemical, physical (including habitat), and biological attributes [Gibson (1994)].
Ecoregion - a geographic region of ecological similarity defined by similarity of climate, landform, soil, potential natural vegetation, hydrology, or other ecologically relevant variables [Gibson (1994)].

Endpoint - a measurable ecological characteristic [USEPA (1993a)].

Environmental monitoring - the periodic collection of data to be used to determine the condition of ecological resources [USEPA (1993a)].

In situ - used to describe measurements taken in the natural environment.

Index period - the sampling period, selected on the basis of the temporal behavior of the indicator and practical considerations for sampling [ITFM (1994)].

Indicator - a characteristic of the environment, abiotic or biotic, that can provide quantitative information on ecological resources [USEPA (1993a)].

Interlaboratory - activities that occur among different laboratories [USEPA (1990b)].

Intralaboratory - activities that occur within a laboratory [USEPA (1990b)].

Level of effort - the amount of effort (e.g., person-hours, sampling effort per time, or sampling vigor) needed to complete a task or project.

Measurement parameter - any quantity, such as a mean or standard deviation, that characterizes a population. Commonly misused for "variable," "characteristic," or "property" [USEPA (1990b)].

Measurement quality objectives - the QA objectives for precision, representativeness, comparability, and completeness for each measurement [this document].

Metric - a calculated term or enumeration representing some aspect of biological assemblage structure, function, or other measurable characteristic of the biota that changes in some predictable way with increased human influence [Gibson (1994)].
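As an illustration of the metric entry above, a minimal sketch of one widely used benthic metric, EPT taxa richness (the count of distinct mayfly, stonefly, and caddisfly taxa in a sample). The taxon records below are hypothetical examples, not data from this document:

```python
# Orders comprising the EPT metric: Ephemeroptera (mayflies),
# Plecoptera (stoneflies), and Trichoptera (caddisflies).
EPT_ORDERS = {"Ephemeroptera", "Plecoptera", "Trichoptera"}

def ept_taxa_richness(taxa_records):
    """Count distinct taxa belonging to the EPT orders.

    taxa_records holds (order, taxon_name) pairs for one sample.
    """
    ept_taxa = {name for order, name in taxa_records if order in EPT_ORDERS}
    return len(ept_taxa)

sample = [
    ("Ephemeroptera", "Baetis"),
    ("Ephemeroptera", "Baetis"),   # duplicate taxon counted once
    ("Plecoptera", "Perlesta"),
    ("Trichoptera", "Hydropsyche"),
    ("Diptera", "Chironomus"),     # not an EPT order
]
print(ept_taxa_richness(sample))  # -> 3
```

Because EPT taxa are generally intolerant of pollution, this metric tends to decline predictably with increased human influence, which is what qualifies it as a metric under the definition above.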
Monitoring design features - include a listing of all measurements or variables to be taken; a statement of how measurements will be evaluated; the rationale used to select the statistics that will be used to analyze the data; explicit delineation of the ecosystems to which decisions will be applied; and a summary table listing the types and numbers of samples and the sampling gear.

Multimetric approach - an assessment approach that uses a combination of multiple metrics to provide synthetic assessments of the status of water resources [Gibson (1994)].

Percent recovery - the form in which accuracy is usually calculated; it is applied in the form of sample sorting checks [this document].

Performance audit - a type of audit in which the quantitative data generated in a measurement system are obtained independently and compared with routinely obtained data to evaluate the proficiency of an analyst or laboratory [USEPA (1990b)].

Pilot studies - studies, implemented in response to questions that require field work, used to evaluate indicators, sampling strategy, methods, and logistics [USEPA (1993a)].

Potentially Responsible Party - an individual or group of individuals that may be liable for degradation of a natural resource.

Precision - the degree of variation among individual measurements of the same property, usually obtained under similar conditions; a data quality indicator. Precision is usually expressed as standard deviation, variance, or range, in either absolute or relative terms [USEPA (1990b)].

Preventive maintenance - an orderly program of activities designed to ensure against equipment failure [USEPA (1990b)].

Primary sample processing - the first phase of sample processing for those samples that require more than field processing, identification, and counting; for example, laboratory subsampling of macroinvertebrate samples.

Probabilistic site - a sampling site selected at random to ensure representativeness.
Random site selection and sampling can provide a statistically valid estimate of the condition of a waterbody class or other habitat class (e.g., lakes, large rivers, streams).

Qualitative - non-quantitative or subjective.

Quality Assurance (QA) - an integrated system of activities involving quality planning, quality control, quality assessment, quality reporting, and quality improvement to ensure that a product or service meets defined standards of quality with a stated level of confidence [USEPA (1990b)].

Quality Assurance Project Plan (QAPP) - a formal document describing the detailed quality control procedures by which the data quality requirements defined for the data and decisions in a specific project are to be achieved [USEPA (1990b)].

Quality Control (QC) - the overall system of technical activities whose purpose is to measure and control the quality of a product or service so that it meets the needs of users. The aim is to provide quality data or results that are satisfactory, adequate, dependable, and economical [USEPA (1990b)].

Quality objectives - the upper and lower limiting values of the data quality indicators as defined by the data user's acceptable error bounds [USEPA (1990b)].

Quantitative - non-subjective.

Rank order comparisons - comparing the position of a site in its assessment relative to other sites.

Rapid bioassessment protocols - a framework for assessing the biological condition of streams and wadable rivers using scientifically valid and cost-effective procedures [Plafkin et al. (1989)].

Raw data - data that have not been manipulated; the actual measurements taken.

Reference site - a specific locality on a waterbody which is minimally impaired and is representative of the expected ecological integrity of other localities on the same waterbody or nearby waterbodies [Gibson (1994)].
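The probabilistic site entry above describes drawing sampling sites at random from a waterbody class. A minimal sketch of a simple random draw without replacement (the site identifiers, seed, and sample size are hypothetical):

```python
import random

# Hypothetical candidate list of stream sampling sites in a waterbody class.
candidate_sites = [f"SITE-{i:03d}" for i in range(1, 201)]

rng = random.Random(42)  # fixed seed so the selection is reproducible
selected = rng.sample(candidate_sites, k=20)  # 20 sites, without replacement

print(len(selected))  # -> 20
```

Because every candidate site has an equal probability of inclusion, summary statistics computed from the selected sites can support statistically valid estimates for the class as a whole.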
Reference collection - a set of biological specimens, each representing some taxonomic level, and not necessarily limited to specific projects or activities.

Reference condition - the set of selected measurements or conditions of minimally impaired waterbodies characteristic of a waterbody type in a region [Gibson (1994)].

Representativeness - the degree to which data accurately and precisely represent the frequency distribution of a specific variable in the population; a data quality indicator [USEPA (1990b)].

Risk assessment - qualitative and quantitative evaluation of the risk posed to human health and/or the environment by the actual or potential presence and/or use of specific pollutants [USEPA (1993a)].

Sample evidence file - a file containing anything pertaining to the sample, including copies or original laboratory bench sheets, field notes, chain-of-custody forms, logbooks, sample location and project information, and the final report.

Secondary sample processing - the second phase of sample processing for those samples that require more than field processing, identification, and counting; for example, taxonomic identification of macroinvertebrate samples.

Selection criteria - a set of statements describing suitable indicators; rationale for selecting indicators [ITFM (1994)].

Sensitivity - the capability of a method or instrument to discriminate between measurement responses of a variable of interest [USEPA (1990b)].

Subsampling - the taking of a subset of a sample; a subsample may be taken from any laboratory or field sample [USEPA (1990b)].

System audit - a review of the total data production process, including onsite reviews of the field and laboratory operational systems and facilities for sampling and processing of samples [this document].

Tolerance values - numeric values given for biota to reflect their relative tolerance to chemical pollution or other environmental degradation.
Values may be pollution-specific and may be given at the family, genus, and/or species level.

Type I error - (alpha error) an incorrect decision resulting from the rejection of a true hypothesis (a false positive decision) [USEPA (1990b)].

Type II error - (beta error) an incorrect decision resulting from the acceptance of a false hypothesis (a false negative decision) [USEPA (1990b)].

Uncertainty of data - a measure of the total variability associated with sampling and measuring, taking into account two major error components: systematic error (bias) and random error [USEPA (1990b)].

Validation - the process of substantiating specified performance criteria [USEPA (1993b)].

Verifiable - the ability to be proven or substantiated [USEPA (1993b)].

Voucher collection - a curated collection consisting of the actual specimens collected in a survey that is maintained following identification and enumeration.
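The Type I error entry above can be illustrated with a seeded simulation on entirely hypothetical data: when the null hypothesis is in fact true, a two-sided test at the 95 percent confidence level should falsely reject it in roughly 5 percent of repeated experiments (the sample size, seed, and normal-approximation test are illustrative choices):

```python
import random
import statistics

rng = random.Random(0)  # fixed seed for reproducibility
trials, false_positives = 1000, 0

for _ in range(trials):
    # Null hypothesis is true: the population mean really is 0.
    sample = [rng.gauss(0.0, 1.0) for _ in range(30)]
    sem = statistics.stdev(sample) / 30 ** 0.5   # standard error of the mean
    z = statistics.mean(sample) / sem
    if abs(z) > 1.96:            # two-sided normal-approximation test
        false_positives += 1     # a Type I (false positive) error

rate = false_positives / trials  # expect roughly 0.05
```

A Type II error rate would be estimated the same way by simulating data for which the null hypothesis is false and counting failures to reject.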